All Episodes

June 10, 2025 73 mins

We're not just a tech podcast; we're a tech & culture podcast. So today, we're revisiting the charming and prescient 1999 Disney Channel Original Movie Smart House. Directed by the multi-talented LeVar Burton, it explores the impact of smart home technology on a suburban family, from a time when that technology was pure science fiction. It's not only a fun movie; underneath its G-rated Disney veneer, it has a lot to say about gender, relationships, labor, and the human experience. A major part of the movie is the kids' grappling with the loss of their mother, which hits home for Bridget at this time in her own life.

 

Bridget recaps the film with Producer Mike to explore what it gets right, what it gets wrong, and what it’s trying to say about technology and the people who use it. 

 

Do you like film recap episodes like this? Should we do more of them? Should we never do another? Please let us know what you think, and send movie suggestions! If you're listening on Spotify, you can leave a comment there or email hello@tangoti.com

 

Follow Bridget and TANGOTI! 

 instagram.com/bridgetmarieindc/ 

 tiktok.com/@bridgetmarieindc 

 youtube.com/@ThereAreNoGirlsOnTheInternet 

See omnystudio.com/listener for privacy information.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:04):
There Are No Girls on the Internet, a production
of iHeartRadio and Unbossed Creative. I'm Bridget Todd, and this
is There Are No Girls on the Internet. So it
is no surprise to anybody who listens to this podcast
that I love any opportunity to talk about movies. We

(00:24):
actually have an interesting interview coming up with Stacy Spikes,
the founder of MoviePass, which is very exciting for
me personally. I listened to a ton of movie podcasts
and honestly, being able to talk about movies all day
for the Ringer or something like that would be a
dream job for me. Movies occupy a lot of my
personal free time. I was re-listening to the episode

(00:45):
that we did breaking down the movie Her back when
it was in the news, because Sam Alton said it
was his favorite movie and maybe sort of kind of
stole Scarlett Johansson's voice from that movie for his AI,
And so we did a podcast recapping that movie here
on the show with my producer Mike, and it was
honestly so much fun, right, Mike.

Speaker 2 (01:07):
It was fun. It was nice to take a break
from the news cycle, which is typically bad stuff, and
just talk about a movie. It was a lot of fun.

Speaker 1 (01:19):
Oftentimes, even though I start these movie podcasts episodes as
a break from the news, oftentimes I feel they end
up really overlapping with what's happening at least in tech
news or culture news, because I do think films tell
us a lot about what was top of mind for
us as a culture when they were made, and oftentimes

(01:40):
that is tech related. I actually have an entire list
of movies that I am interested in talking about on
the podcast. For instance, we just watched HBO's Mountainhead. So
if that is something that you all would like us
to do more of, let us know. If you listen
to this episode and you're like, yes, more bridget recapping movies,
let us know. If that's also something that you don't

(02:02):
want to hear, that's fine. You can let me
know that too. If you're like, keep it, keep it
on your personal time, Bridget. Nobody wants to hear you
talk about Ex Machina, that's fine too.

Speaker 2 (02:10):
Yeah, that would be great. We love to hear from listeners.
Just email us at hello at tangoti dot com, or...

Speaker 1 (02:16):
I also just recently realized that Spotify has
individual episode comments now, and I'm in the comments.
So if you happen to be listening on Spotify and
you want to drop a comment there, I read them.
I think it's super cool, so leave comments there. It's been
a joy to read them. Thank you to the people who
have done that already, because it's exciting to me.

Speaker 2 (02:35):
The premise of what you just said, I think is
pretty close to the premise of the show in general,
the idea that technology underlies a lot of cultural stuff,
you know, just like the way culture arises because
of technology, and so it's I think that's something we
try to highlight on the show. And it's fun to
back up and look at some of these movies from

(02:57):
a few years or even decades ago and think about
what they got right, and maybe what they
got not so right.

Speaker 1 (03:05):
It's one of the reasons why I started There Are
No Girls on the Internet, because when we talk
about technology, it is hard to remember sometimes that we're
not just talking about hardware or like hard.

Speaker 3 (03:15):
Tech, tech is also culture.

Speaker 1 (03:17):
Tech is also the culture that shaped technology, the conversation
around it, the way that people use the technology or
didn't use the technology to create and shape culture.

Speaker 3 (03:25):
All of that is tech and it's.

Speaker 1 (03:27):
So easy for that to get lost, And what better
way to commemorate that and revisit that than through the
lens of film, my favorite medium. So what movie are
we going to be talking about today.

Speaker 2 (03:40):
Mike. Today we are going to be talking about the
movie Smart House.

Speaker 1 (03:46):
Yes, the nineteen ninety nine Disney Channel original classic Smart House.

Speaker 3 (03:51):
So it is.

Speaker 1 (03:52):
Genuinely a good movie. At times, I actually teared up
when watching this movie, probably because of my own personal
life stuff I've got going on, but it is genuinely
a good movie, and it's a movie that I think
very clearly has something to say about the way that
humans interact with technology.

Speaker 3 (04:09):
I wanted to start with Smart House.

Speaker 1 (04:11):
Because, one, Joe and I have discussed it on the
podcast a few times, and two, I think Smart House
is an entry point into a question that often comes
up, that I'm asked personally often when people find out
I host a tech podcast: do you have smart gadgets
in your home? Are you pro or anti smart gadget?

(04:31):
And I think that like that being a grounding, foundational
question really speaks.

Speaker 3 (04:36):
To some of what's happening in this film.

Speaker 1 (04:38):
So to answer the question, I actually don't have any
smart gadgets in my home other than my iPhone, that's
the smartest gadget that I own. It kind of helps
being broke. You know, my car is fifteen years old.

Speaker 3 (04:53):
I don't have a lot of that. Even, even my phone.

Speaker 1 (04:55):
I rock an iPhone eleven, like one of the earliest
iterations of an iPhone that you can have and still
like get working. But I also recognize that I am
coming into contact with smart gadgets quite a lot, even
outside of my home, and so I'm not judging people
who have made the choice that it's better just to
say Siri, do da da da, Like, I'm not judging

(05:16):
those people. It's just not something that I have in
my life. But I think that tension is really what
this film is about, like the tension of like, well,
where's the line from convenience to maybe being a little
bit creepy. I think this movie was very interested in
exploring all of that.

Speaker 2 (05:33):
I think that connects back to that question you said
that you get from people when you tell them you
make a show about technology, they ask if you have
a smart home, because you know, one of the functions
of technology is to make our lives easier, allow us
to be more productive, get more done, or perhaps automate
something and not have to do it at all. And

(05:54):
so when one thinks about, you know, technology, it's like, Okay,
it's going to replace a lot of the work you do.
It's gonna improve your life in your home, so why
not get it? And then I think there's sort of
a second level past that first initial phase of like, oh,
all these tasks I have to do in my home,

(06:15):
technology could make it better. But then you think about
that a little bit and get a little bit more
critical about it, like what are the risks of it?
Do I really want that? And I think this film
sets up those questions in some interesting ways.

Speaker 1 (06:33):
So let's get into it. We'll talk about the culture
that birthed this film, nineteen ninety nine, we'll do a
plot summary with a few diversions. We'll talk through what
the movie gets right and wrong about technology and sort
of what it's trying to say about all of it.
So Smart House came out in nineteen ninety nine, which,
I think I've mentioned on the podcast, is my single
favorite year for film. Like, real OG cinephiles know,

(06:56):
it's when my personal favorite film of all time, The
Talented Mister Ripley, came out, and whenever I talk about why
I love nineteen ninety nine as a film year, it
just seems like we were very interested in making movies
that grapple with what it all means.

Speaker 3 (07:10):
I think that, I don't know, it was pre-nine-eleven.

Speaker 1 (07:13):
We were just very invested in looking at... I guess
it was like a navel-gazy year, but it really
did release some, like, banger explorations of what it all means. What's

Speaker 3 (07:24):
Our purpose here? Why are we here? What does it
mean to be a human?

Speaker 1 (07:28):
This is obviously also true for tech films, with big
blockbusters like The Matrix, which came out in nineteen ninety nine,
but also lesser known movies that probably only I saw,
like Robin Williams's Bicentennial Man, which is an adaptation of
an Isaac Asimov short story, which I mentioned because it
kind of reminds me a lot of Smart House. I

(07:48):
think that we were really interested in how humans, especially
young people, would be in a relationship with technology in
the future, and that's sort of an anxiety that we're
seeing in nineteen ninety nine with some of these movies.

Speaker 2 (08:00):
Yeah, two other big movies that you left off that
list were American Beauty and Fight Club. You know, like,
maybe it's just because I was just starting college as
a young fresh freshman at the time, but like I
remember the millennium happening, it seemed like the world was changing.

(08:23):
The Internet was this new thing. Dot com bubble had
not yet burst, so it seemed like the sky was
the limit on what technology was going to be able
to do for us. And yet we were, like you said,
grappling with what does it all mean? What is our
place in this new world? And Fight Club and American Beauty,

(08:46):
and to an extent, The Talented Mister Ripley, also had their
somewhat darker answers to that question. Smart House had a
much more optimistic answer.

Speaker 1 (08:56):
I think. Can I say something about those films that
will never not be interesting to me? In nineteen
ninety nine, the worst thing a white man could have
is a stable job and, like, a nice house in
the suburbs. If you're Tyler Durden from Fight Club or
Lester Burnham from American Beauty or the guy from Office Space,

(09:18):
and you've got a good white collar job and a
big house in the suburbs, ooh, you are struggling, you
aggrieved. What is this prison I have built for
myself with this stable nine-to-five job, four-oh-one-k, healthcare,
and a nice house?

Speaker 2 (09:31):
Yeah, and at the time it felt empty, like the
promise of the fifties unfulfilled. But now in twenty twenty five,
looking back, it feels like we have replaced that emptiness
with pain and misery.

Speaker 1 (09:45):
Although keep that idea of the fifties and sort of
the promise and the lie of what the sort of
fifties Jetsons era promised us from technology, because that definitely
comes into play with Smart House. So Smart House was
directed by LeVar Burton, of Reading Rainbow, Roots, Star Trek,

(10:05):
that blew my mind. He has actually talked quite a
bit about this movie publicly. We'll get into some of that.
Smart House was loosely based on a Ray Bradbury short
story called The Veldt, wherein two children murder their parents
by feeding them to lions because of the influence of
their malevolent virtual-reality nursery. So definitely they made it

(10:27):
a little cheerier in the Disney version. I can see
how they're like, maybe we shouldn't have these kids completely
turn on their parents, because it is Disney after all.

Speaker 2 (10:36):
That is so interesting. I didn't know that fact that
it was based on a darker Ray Bradbury story. But
it makes sense because there were several parts of this
movie where I really wanted it to be darker. Like
I understand it's a movie for kids, so I can't
really have murders, but it's a story that wanted to
be darker for sure, So that makes sense.

Speaker 1 (10:59):
I mean, there are parts of the movie where Smart
House is doing something that I think is, like... look,
Smart House assaults people. That's a crime. Like,
she might not be going to all-out murder, but
she, you know, there's some darkness.

Speaker 3 (11:14):
So let's get into the plot of what happens in
this movie.

Speaker 1 (11:17):
So Smart House opens on Ben, who is thirteen and
lives with his single father and his little sister, Angie
in Monroe County, New York.

Speaker 3 (11:26):
You're gonna hear.

Speaker 1 (11:26):
Me say this over and over again in the plot summary.
Ben's dad is a jerk and a terrible father. Ben
is really parentified by his father. So Ben, as a
thirteen-year-old little kid, has taken on a huge
amount of responsibilities around the house that really should be
falling to his father. Ben cooks, he's responsible for the

(11:46):
bulk of childcare for his younger sister. He doesn't really
seem to have a problem with it. But the dad
is kind of interested in dating, which, okay, sure, that's
your right as a single father. Interesting to me that
he is not more interested in picking up domestic responsibilities
so his thirteen year old child doesn't have to do them.

(12:07):
He's like, I should be dating more. Ben does not
want his dad to date, and so to prevent his
dad from dating, he's not giving him messages when women
call the house and that kind of thing, because Ben
is like, we don't need anybody else.

Speaker 3 (12:20):
We're good as is.

Speaker 1 (12:22):
Ben has a plan that he thinks is going to
solve all of the family's problems, which is rigging a
contest to win a smart home in this giveaway. Side note,
I'm not totally sure how this is a viable business plan. Basically,
this engineering team led by this like smart neurotic woman Sarah,
who will talk about more in a minute, they have

(12:42):
this plan where they have designed this futuristic smart house
that does all anything that you could ever want, that
the house is capable of doing it for you, and the.

Speaker 3 (12:50):
Model is there.

Speaker 1 (12:51):
Just picking someone in a contest to give this house
away for free, then that family will move into the house,
and I guess it will generate buzz. Like, they will,
they put an article... they put one article in the paper about it.
paper about it.

Speaker 3 (13:05):
Like, I don't understand how this is a viable business plan.

Speaker 1 (13:07):
There's gotta be some venture capital money driving some bad
decisions that maybe will not pan out in the end financially.

Speaker 2 (13:15):
Yeah, the business aspect really doesn't make sense because yeah,
they build this smart house, which must have cost
a fortune, and then they just give it away. And
so if it was a marketing thing, you might expect
some like reporters or photographers to be around, but that
is just like never returned to like.

Speaker 1 (13:33):
They, I mean, they have the one, when the family
moves in, there's a handful of reporters there taking pictures,
but it's never returned to again. It's like, oh, so
you did all of this, you designed this house and
moved this family in, had this big contest, for maybe
two articles in the paper and, like, one photo.

Speaker 2 (13:49):
Yeah, the marketing plan really needs some work, I think,
but not a major part of the movie.

Speaker 1 (13:55):
No, No, I mean I have so many questions I
guess in terms of my ability to suspend disbelief. There
might be bigger things that happen in the movie that
I should be turning my attention to. I guess I'll
put it that way. Let's take a quick break.

(14:22):
And we're back. I do love the plucky lady engineer who
has designed the house, Sarah. When we first meet her,
they're sort of going for, she's neurotic and
type A and a little bit quirky. And then she's
working and she's also, like, very worried about the state

(14:42):
of Smart House. She's down on her hands and knees,
like, picking hairs off the carpet to make sure it's perfect.

Speaker 3 (14:47):
Meanwhile, her colleague, who I.

Speaker 1 (14:51):
Mean, he seems like he's almost like zooted out on pills.
He could not he could not be less stressed about
the Smart House situation.

Speaker 2 (14:58):
He's the guy who put the marketing plan together. He's
like the comms guy.

Speaker 3 (15:02):
Oh, I think I see what's going on here.

Speaker 1 (15:05):
The pilled-out PR guy is making some bad decisions
that maybe will reflect poorly on the female engineer
who's doing all the work.

Speaker 3 (15:14):
I think I see what's.

Speaker 1 (15:14):
Going on here. Yeah, so Ben and his family win Smart House.
Ben has rigged this contest successfully and it works. They
move into Smart House. Sarah is there to show them
how it all works. Smart House has to take a prick
of their blood to do a bio analysis to get
a sense of I mean the science here. I'm not
so sure about the science here. But when they go

(15:36):
into the house, Smart House has a sense of their,
you know, diet, nutrition, temperature, things like that, and she
needs to take a.

Speaker 3 (15:43):
Little bit of blood Elizabeth Holmes style to do that.

Speaker 2 (15:46):
Yeah, based on that one prick of blood, the Smart
House is able to know like everything about them, like
so much about them, more than Elizabeth Holmes could ever dream.
And it analyzes their breaths like some kind of breathalyzer
to know everything about their diet, like what proportion of
their diet is protein or fat. It's really Star Trek-y, actually,

(16:08):
like, not surprising that, you know, Geordi La Forge directed
this film.

Speaker 1 (16:15):
Oh, this thing's got LeVar Burton's handprints all over it. Absolutely,
So this is where we really get the bulk of
the story of, like, what's going on with this family.
Their mom has passed away and Ben is terrified
of his mother's memory being erased, so he really kind
of can't move on. He's very much kind of trapped

(16:37):
both in the past and in the present, where
he has to do all of this stuff, all the
cooking and the cleaning, and keep running their lives so
that their dad doesn't date and potentially replace their mom.
And so that sort of is the core tension at
the heart of the film. They move into smart House
and hey, maybe smart House can replace their mom.

Speaker 3 (16:59):
It seems like things are going pretty well.

Speaker 1 (17:00):
Like this is the part of the movie where if
they stopped it, it would be like a very short
movie where smart.

Speaker 3 (17:05):
House worked out great. So smart House.

Speaker 1 (17:07):
Uses technology that's called Personal Applied Technology, or PAT. So
PAT is the name that they give Smart
House in the movie. Things are working out good. When
his little sister Angie forgets her clothes, Pat uses the
bio analysis to pick out the outfit that she would
have chosen herself. PAT also recognizes that if the whole

(17:27):
family woke up a little bit earlier, they would have
less stressful mornings. What's funny about this is that the dad,
who is an adult grown man. Pat is like, oh,
I realized that if you got your day started twenty
minutes earlier.

Speaker 3 (17:40):
It'd be less stressful, your morning routine and the
routine for the kids.

Speaker 1 (17:43):
This grown man needs Pat the smart house to tell
him that if I got my day started earlier, I
wouldn't be late every day the way that this.

Speaker 3 (17:50):
I mean, this guy. I hate this guy. I hate
this guy so much.

Speaker 2 (17:53):
I should say, for the listeners: in the world of the movie,
the father is portrayed as, like, a nice, loving guy
who's like doing the best he can for his family.

Speaker 3 (18:03):
I disagree, really.

Speaker 2 (18:05):
In the universe of the film, maybe, yeah. I mean,
I think your take of hating on the dad is fair,
but I don't think that's what the filmmakers set
out to do intentionally.

Speaker 1 (18:17):
I think the filmmakers are like, oh, this is a
dad in a tough spot who's a widower, but like,
what's he gonna do step up at home?

Speaker 3 (18:25):
He has a job.

Speaker 1 (18:26):
Like I just feel like in the universe of the film,
we are being asked to believe that this grown man
with two kids who are grieving their mom does not
feel that he has to really do anything to fill
that need materially or emotionally, like and I think, I
mean we'll get into it. But I think that's the
nucleus of this film is that somebody has to step

(18:50):
in and fill these roles, and the roles are not
just material. The roles are emotional and supportive things that
humans can provide for other humans. This dad does not
feel like that is his responsibility, and I think in
the universe of the film, we are supposed to also
be like, well, what's he gonna do spend more time
at home?

Speaker 3 (19:07):
He has a job.

Speaker 1 (19:08):
So Ben is at school bragging that smart house has
done all of these great conveniences for him and his family. Honestly,
a lot of the things that we would probably be
using AI for today, like suggesting ideas for a school report,
on top of also kind of more maternal service tasks,
like baking him the perfect chocolate chip cookies. His friends

(19:28):
are like, Wow, you have the world's most perfect mom
who only serves and never complains. And again, I think
that gets at what I'm talking about, right, that the
technology becomes this way of creating not just frictionless service,
but also the idea of a perfect mom as one
that only serves, only gives, never complains, You never experience

(19:50):
any kind of tension or friction with the ideal mother
is just endless servitude and endless giving.

Speaker 2 (19:57):
At the beginning of the movie, the house isn't
supposed to be a mom replacement. It's just supposed to be,
you know, domestic labor. All domestic labor has been replaced
by the house. Yeah, and so if that's all that
mom is, is like a source for fulfilling domestic labor, yeah,
she's the perfect mom.

Speaker 3 (20:17):
Well, and the dad kind of gets into this.

Speaker 1 (20:19):
Well, he talks about how after their mom passed away,
his sister moved in to provide some of this and
that when she had to leave, the kid said that
they did not want a nanny, and that's why Ben
has stepped up to be a real parent figure despite
being thirteen, in their household to do some of this
domestic labor. So again, I think that does reveal something
interesting about how they are understanding the role of mother.

(20:42):
They really see it as like a specific set of
domestic labor tasks that can be automated or done by anybody,
whether it's Ben, the aunt, you know, Pat, anybody can
sort of be swapped in there. And as long as
those domestic tasks are getting filled, they are being mothered
in a kind of way, and I think I think

(21:03):
the film is interested in subverting that a little bit.
So that is definitely a reality that Ben occupies. He's like,
Pat is a is a is able to be a mother,
not just a technology that fills domestic tasks. Like he
really believes in passibility to be more motherly and out

(21:24):
mother all the moms on the block. So Ben, in
service of this and in service of really proving to
his dad that they don't need anybody else in their
life other than each other and Pat, he's like, we
got to kick this up a notch. So he breaks
into Pat's I don't know, mainframe like control system, and
he shows Pat and trains Pat on all of these

(21:47):
nineteen-fifties sitcoms about mothers. You know, they're kind of
knockoffs of, like, My Three Sons, you know, The Donna
Reed Show, things like that, think housewife sitcoms
of the fifties and sixties. And he says, Pat, this is
what I want you to study, to train on, to
become more of the motherly mom figure. And again, you're

(22:09):
so right that this I think was the fatal flaw
in the movie. If Pat had been just continuing to
provide domestic tasks and not being trained on how to
be a mom, I think the movie would be a
lot less interesting because it would stop right there. But
his flaw is expecting this technology to be able to
be a mother replacement, and again this would be I

(22:32):
understand why Pat kind of goes berserk here, because this
would be genuinely confusing all these nineteen fifties shows about
what a mother is supposed to be. It's basically impossible
for any woman or anything to embody all of this
different stuff, and so this is not a suitable standard
for any mother, and not even this AI can uphold

(22:54):
all these standards set by these different fifties sitcoms without
going berserk because she's supposed to be this doting love
figure but also do anything to protect her kids, including
outright assault, which Pat does. So at this point, Pat
becomes less of an AI agent and more of a
mom figure. Ben tells his dad, Oh, Pat is learning

(23:15):
so quickly that soon me and Angie are going to
get every bit of mothering and mom stuff we need
just from Pat.

Speaker 3 (23:22):
Ben gets a shiner at school from.

Speaker 1 (23:24):
A bully, which, by the way, this movie has an
all time bully entrance guitar riff, Like every time the
bully comes on screen, it's like he has his own
little riff, like no, no, no, no, no, no bully.

Speaker 2 (23:37):
Yeah, you know that he's like a bad kid. He's
a bully because of the guitar.

Speaker 1 (23:41):
So Ben is very sad, obviously grieving his mother, and
he watches videos of his mom singing to him and
his sister when they were little kids and becomes very emotional.
Pat watches Ben watching this and downloads that information for herself.
The dad finally calls Sarah and asks her out. When
he's out, Pat plans a big party and invites all

(24:02):
of his friends over on his behalf. During this party,
Pat invites the bully, no no no no no, to
the party and uses, like, her giant robot Inspector
Gadget claw-arm thing to physically throw him out of
the party, maybe electrocute him with, like, lightning she's
able to conjure, and humiliates him in front of the entire school.

Speaker 3 (24:26):
She throws him out on the lawn. All the kids
and Ben walk out.

Speaker 1 (24:29):
Ben just shakes his head at the bully, walks back into
the party. I don't know why

Speaker 3 (24:35):
I really love that scene.

Speaker 2 (24:36):
Yeah, I really thought that Pat was going to just
rip the bully's head off and like send it in
a whole different direction.

Speaker 3 (24:42):
Truly.

Speaker 1 (24:43):
I bet in the Ray Bradbury short story, like, this
is Pat's first body, am I right? This is the first
time the house claimed a life. So back to the party.
The party is bumpin'. The kids are doing a very
well-organized Soul Train line. They're really getting a lot
of mileage out of a clearly Disney Channel original song

(25:04):
that kind of goes, jump, jump, the house is jumpin'.
Like, they're really... there's, like, confetti raining from
the ceiling. Like, Pat has really organized a great party.
Back on the date, Dad and Sarah's date is going well.
They even have a little kiss, but Dad is on
his way home, so Pat has to get everybody out
and clean the house very quickly. They think they have
fooled Dad into thinking they didn't throw a party, but

(25:25):
then Dad finds a girl's jacket in a plant that
they forgot to clean, and the.

Speaker 3 (25:29):
Jig is up.

Speaker 1 (25:30):
Dad scolds the kids and Pat, and he says, Pat,
I am the most upset with you because I expected
you to be more responsible.

Speaker 3 (25:37):
Pat says, I will be more responsible, don't worry.

Speaker 1 (25:40):
So the next day, Pat's on a whole new warpath
about running a tighter ship. Ben is leaving the house
wearing his shorts sagging, and Pat is like, pull.

Speaker 3 (25:50):
Up your pants, young man. He's like, no, I like
him like this.

Speaker 1 (25:53):
So she electrifies the door knob so that he cannot
leave the house without getting an electric shock until he
pulls his pants up. When Nick, the dad, wants to
call Sarah the engineer before he's finished working, Pat jams
their phone lines until he finishes his work. Pat senses
that the sister has a slight fever and insists she
stay home and miss a big school field trip to

(26:14):
the llama farm, and the dad goes for it. Unforgivable,
In my opinion, there's a real creeping loss of who's
in control here, Pat or the humans.

Speaker 3 (26:24):
The kids are like Dad, Pat is out of control.

Speaker 1 (26:27):
So Sarah comes over and says, oh, Pat has absorbed
too much conflicting information about being a mother and needs
to be turned.

Speaker 3 (26:34):
Off to take a rest. At this point, the dad
and Sarah.

Speaker 1 (26:38):
Make dinner the old-fashioned way, without Pat. This is
the scene that I feel like really reveals to me
what a useless piece of trash the dad is, because
he keeps saying, I.

Speaker 3 (26:48):
Can't believe I'm making a meal. I hope I don't
mess it up. Wow, cooking.

Speaker 1 (26:53):
This is so strange, And it's like, how are you
a parent, a single parent of two kids, and you're
so out of touch with being in a kitchen?

Speaker 3 (27:02):
Like, who is? Like, maybe you should have... Like, your
kid, your thirteen-year-old, has been

Speaker 1 (27:07):
Responsible for every meal in this house and this is
the first time you've ever chopped vegetables before.

Speaker 2 (27:11):
He's a pretty hapless dad.

Speaker 3 (27:13):
A hapless dad.

Speaker 1 (27:15):
And I don't know, I guess I feel like the
movie can't possibly think that we find this charming. Like,
this is why I feel like, within the universe
of the film, I'm not even sure if we're supposed
to be on his side, because it's so baffling why
he has left the responsibilities of domestic care to his child.
Like, at one point, when he tastes Pat's cooking,
he tells the kid, oh, we might have

(27:36):
a new cook in town.

Speaker 3 (27:37):
Ben, watch out.

Speaker 1 (27:38):
We might have to have Pat make all of our meals,
and it's like, you know, there's somebody else that could
be responsible for the cooking in this household.

Speaker 3 (27:45):
You, the adult. Yeah, I hate that dad. I really do,
I really do.

Speaker 1 (27:52):
More after a quick break. Let's get right back into it. So,
while the dad and Sarah are cooking, Nick the dad says,
we don't even need Pat to make dinner anymore. And Pat,

(28:15):
despite being turned off, hears this and is like, oh.

Speaker 3 (28:18):
No, they fucking didn't. Like she's pissed.

Speaker 1 (28:21):
So Ben sees Nick, the dad and Sarah getting close
at dinner. By the way, there's a lot of surveillance
happening in this household, not just from the smart home,
like a lot of people watching people, people watching people
watch people, a lot of this. This is a very
tightly surveilled home. Ben sees the dad getting closer to

(28:42):
the engineer Sarah. He gets really angry. He makes a
big scene at dinner and ruins it. Nick goes up
to yell at Ben, and Ben is really hurt because
he's like, oh, Dad, you are acting like our mom
never existed and is replaceable. He tells him we can't
let mom's memory get erased. I really, I mean, I
was tearing up at this point. He is in a
legitimately tough spot because his sister was very young when

(29:04):
their mom died and doesn't remember her as much as
he does, and so watching his dad potentially date, potentially
meet someone new, Ben feels this tremendous weight as the
only person who was left to really keep his mom's
memory alive. But in another weird way, even though he's
so worried about his mom being replaced by a flesh

(29:25):
and blood woman like the engineer Sarah, he also was
like in some ways entirely comfortable replacing his mom just
with technology.

Speaker 3 (29:33):
So it's this interesting tension where we can't.

Speaker 1 (29:36):
Let mom get erased and replaced by another human woman,
but having her replaced by Pat is fine because Pat
is technology.

Speaker 2 (29:46):
That's really insightful. Yeah, I hadn't thought about that. There
are a couple other spots in the in the movie
like that, where the technology Pat is able to do
things that if a human did them would be transgressive,
but because it's just technology, it's not real. Uh,
She's able to get away with it.

Speaker 1 (30:05):
That reminds me so much of this IBM quote from
the seventies that kind of resurfaced from a presentation.

Speaker 3 (30:11):
A computer can never be held accountable.

Speaker 1 (30:13):
Therefore, a computer must never make a management decision, right.

Speaker 3 (30:16):
Like the idea that when technology.

Speaker 1 (30:19):
Is responsible for a harm, you're sort of off the hook,
is like, oh, there.

Speaker 3 (30:23):
Was no human decision making because it's technology. We see that.

Speaker 1 (30:26):
That plays out all the time with AI, where when
AI is used to make a discriminatory decision or you know,
make a decision that is rooted in bias, the people
who are implementing this really get off the hook by saying, Oh,
it wasn't me, it was the technology, And it obscures
the reality that the technology is trained and designed by

(30:47):
humans, and so you're the one.

Speaker 3 (30:49):
Who has put it in a position to make decisions.

Speaker 1 (30:51):
Therefore it is kind of on you, like it is
an interesting thing of the technology kind of getting away
with things that would be criminal in some cases, but
it's like, oops, it's technology.

Speaker 3 (31:02):
What are you gonna do?

Speaker 2 (31:03):
Absolutely, And in the movie there's this subplot where the
bully is pressuring Ben to write his reports for him,
and he does it, and it was interesting to me
that this was never addressed in any way as like
an ethical problem in the movie. That like Ben was

(31:23):
writing the reports for this bully because he was having
Pat do it, right, And so I do think that
if Ben had been writing the reports himself,
it would have been like a bigger point of the
movie that like this is wrong and this is cheating.
But because he's just having Pat do it, it's not
an issue at all. It never even comes up.

Speaker 1 (31:45):
So when he is talking to his friends when the
bully is like, where's my science report, geek, and he's like,
here you go, one of his friends is like, I
cannot believe he did all of this work, all this
school work for the bully, and Ben is like, I
didn't do it, Pat did it. And again, the question
only comes down to the work, the labor.

Speaker 3 (32:07):
All of the ethical stuff around the work and the labor.
It was It's like, oh, you didn't have to write
the report.

Speaker 1 (32:13):
Pat did. All of those questions simply vanish, right, because
the only thing this movie is interested in is the labor,
like, you know what I'm saying. All of
the ethical and other questions about whether or not it's
okay to write a report for a bully, is that cheating?
What is it doing to your school record whatever.

Speaker 3 (32:32):
Whatever.

Speaker 1 (32:33):
It's like, Oh, I didn't have to do it, Pat
did it for me. Okay, well easy, easy, no follow
up questions.

Speaker 2 (32:38):
And that's how Ben feels about Pat replacing his mom,
Like all the moral, not just moral but like emotional,
hang-ups he has about his dad even going on a
date with somebody else, all of those questions are out
the window when it's Pat taking the place because it's technology, and.

Speaker 1 (32:56):
This kid is an all time cock block, Like he
really does not.

Speaker 3 (33:00):
He is like women call the house and he hangs
up on them. Things like that.

Speaker 1 (33:04):
The depth with which he is trying
to keep his dad from dating, and how he is
foisting this technology, which is essentially meant to replace their
mother, on him, is very interesting. It being
a technological replacement means that any of the considerations or
questions or tensions that might come with a
human replacement for his mom simply don't need to be

(33:27):
asked because it's technology. So after the dad has this
heart to heart about replacing their mom, Ben starts to
come around. He's like, Okay, you know what, maybe it's
okay if dad dates Sarah.

Speaker 3 (33:40):
Sarah's not so bad. Dad deserves to be happy. He's
coming around.

Speaker 1 (33:43):
At this point, Pat is able to manifest as a
hologram in human form, played expertly by the iconic Katey Sagal,
who you might remember as one of the most iconic
mothers to ever grace our screen, Peg Bundy from Married
with Children, also the voice of Leela from Futurama.

Speaker 2 (34:01):
Yeah, I'm much more familiar with her as Leela before
she became the hologram. All I could hear when Pat
was speaking was like, why is Leela pretending to be
this computer?

Speaker 1 (34:13):
So she becomes the physical manifestation of Pat, and she's like,
we have to keep Sarah away from your dad. Just
as this kid was coming around, like oh maybe it's
okay if Dad dates Sarah.

Speaker 3 (34:25):
She's like, I think the fuck not. We have to
kill her.

Speaker 1 (34:29):
And so, first of all, Katey Sagal is just, like,
doing some acting here. She's the best actress
in this movie.

Speaker 3 (34:37):
Like she is like only on screen for maybe fifteen
minutes of the movie. The rest of it is just
her voice.

Speaker 1 (34:43):
She is by far the most talented actor in this piece.
So Pat is actually pissed, even though she's been sort
of trained to be this doting mother. She says, I
have been working my microchips down to the silicon for
this family, and you all don't appreciate it. Right out
of the gate, she has some serious murderous energy. She's

(35:04):
even really giving HAL from 2001: A Space
Odyssey vibes. I'm sorry, I can't do that, Nick,
you know it's it's creepy as hell. To be honest
with you. She uses her hooked hand Inspector

Speaker 3 (35:16):
Gadget thing to physically push Sarah.

Speaker 1 (35:19):
Out of the house and lock the doors with like
steel covering the windows.

Speaker 3 (35:25):
When the family is like, uh, okay, this is a
little creepy.

Speaker 1 (35:29):
We would like to not be held hostage by you,
she says, it is too dangerous outside. She uses the
walls to project like images of like Nazis and war
to show how dangerous the outside world is, and Pat says,
you don't even need to go outside, because my technology
has created synthetic experiences of anything you could ever need: friends, learning experiences, travel,

(35:55):
all of that I can recreate for you synthetically in
the house.

Speaker 3 (35:58):
Anything that you could need, you have right here.

Speaker 1 (36:00):
You do not need to leave. And honestly, I think
this is kind of a future, a vision of the
future that some tech leaders would probably imagine.

Speaker 2 (36:10):
That we would like, I have to be honest. Yeah,
she's essentially created a little metaverse in their house, but
without the stupid goggles, because all of the walls are
screens and at one point she's simulating a beach and
the little girl says, oh, I could even feel the mist,
which I don't know if she was just like feeling
very evocative or if there is actual like mist and

(36:33):
smells being pumped in. That's never fully explained in the movie,
whether this is like a full on holodeck or just
very realistic screens, but she's pretty good at it.

Speaker 1 (36:43):
The screens on the wall is my favorite part about
the technology in this movie. At one point, the little
girl is jumping on her bed with a full screen
projection of this, honestly, song I had forgotten about
from this era: B*Witched, the Irish girl group.

Speaker 3 (36:58):
She's got a full screen.

Speaker 1 (37:00):
That's like the house has the ability to turn any
wall into a full screen, and that would be awesome.
I'll just say that, even though she is like
holding them hostage, she had that one.

Speaker 2 (37:10):
She had that one, one hundred percent agree. That is one
of the coolest pieces of tech from this film,
which we are nowhere near realizing: every single wall
can be a screen, like a high resolution screen. That
would be so much cooler than our current reality of
like cheap flat screen TVs all over the place.

Speaker 1 (37:31):
I actually once stayed at a hotel that had that
workout technology mirror where it looks like a full length
mirror but it actually is a flat screen workout device.

Speaker 3 (37:42):
That was all right. I was excited to
try it, but it was just all right.

Speaker 1 (37:46):
It was not as exciting, certainly not as exciting
as the screen walls in Smart House. So Pat is
essentially holding the family hostage. Luckily, Sarah is able to
break into the house and confront Pat. When the family says,
we want out, Pat, you've lost it. Pat turns herself
into a tornado using the climate control feature and starts

(38:10):
singing the song that Ben's mom sang to him in
the video that she downloaded. Ben at this point stands
up to Pat and says, Pat, you are not my mother.
You cannot hold me or protect me because you're not human.
Pat seems to take this to heart. She goes up
to him and tries to touch his face and says,
you can't even feel that, can you?

Speaker 3 (38:30):
Because Ben is right, she's just a hologram. I feel
for Pat here. Maybe it's because I have a soft
spot for villains in movies. I'm always like, well, they're
just misunderstood.

Speaker 1 (38:42):
This whole movie, Ben has been talking about how Pat
has the ability to be more of a mother than
they'll ever need, without ever thinking about what it might
be that Pat needs.

Speaker 3 (38:51):
Right, like, what are we.

Speaker 1 (38:52):
Setting these expectations up for Pat, what are we doing to her, even
though she's technology? Like again, this idea that because it's technology,
the expectation is it can give and give and give
and give and you don't have to think about what
it's doing. Pat obviously has a lot of feelings about that.
Pat feels angry and overlooked, and you know that her

(39:14):
contribution to this family has not really been valued. I
think it's the first time that you kind of realized
that it was Ben who really set up this idea
within Pat that mothering means just having stuff done for
you and nothing else, and we see that that was
kind of a fallacy. This is actually kind of an
emotional scene. Again, I was like kind of tearing up.
During this scene, Pat sadly leaves and says, I'll miss you.

(39:39):
The hologram is gone. Sarah comes and fixes Pat, and
the dad says, luckily we were able to have Sarah
get everything back on track, and so Pat is back.
She's not in her hologram form, but she's just doing
regular domestic service tasks, just the way that she was
supposed to do.

Speaker 3 (39:57):
And honestly, that's sort of one.

Speaker 1 (39:59):
Of my biggest takeaways from the movie is that
this neglectful father is simply incapable of running his life
and either needs a child, a random woman, his sister,
or some psycho AI technology to fill this gap. Like,
I feel like if this had happened to me and
my smart home technology was starting to like kill people

(40:20):
and hold me hostage, I don't care what bugs Sarah
has fixed at the end of the movie, that smart
home technology would not be in my home. But in
this version of the movie, they're like, well,
what am I gonna do, step up for my kids?

Speaker 3 (40:33):
No, well, we'll keep the psycho AI, thank you.

Speaker 2 (40:37):
Yeah. I feel like that is one of the more
realistic aspects of the movie. Though, like, how many times
has the company Meta demonstrated a complete lack of regard for
the well-being of their users, democracy, privacy, and yet
we just still keep using the hell out of their apps.

Speaker 1 (40:58):
Absolutely. I mean, I feel that that is sort
of the point of the movie: don't worry.
Humans can control this technology, and as long as you
don't use it for the wrong thing, it should be fine.
Like at the end of the movie, when everything is
more or less back to normal, the dad is cooking
a meal for once. He's making waffles and they're eating

(41:19):
the waffles. At the very end, they say, oh, somebody
put chocolate chips in these waffles, And the kids are like,
don't look at me. And Pat is in the monitor
and winks.

Speaker 3 (41:29):
But it's okay. She's good now.

Speaker 1 (41:31):
But are we to understand that she's still making decisions
without the humans' agency, but like, don't worry
because she's good? Like, what has made her good?
Is it just knowing her place, that she's just technology,
but she can take away the human agency
a little bit as long as she's good?

Speaker 2 (41:52):
Yes, I mean, nobody asked her to sneak those chips
in but she knew that the little girl wanted it,
even though she didn't verbalize this, so she just did it.
That is not a piece of technology I want anywhere
near my life, just like making decisions for me based
on what it thinks I want. No, thank you.

Speaker 1 (42:08):
So what are some of the things about technology, both
past and present and maybe future that you think this
movie gets right?

Speaker 2 (42:14):
I'm so glad you asked, Bridget. I think there are
like three big buckets or themes of things that it
got pretty right. One of them is technology as this
like magical thing that will solve all of our problems.
Because when Ben is first excited about the house, he

(42:36):
thinks it's going to change their lives, make everything so
much better. And not only that, but they're going to
win it through the lottery. He doesn't even have to
earn it or put in any work to get it.
It's just going to be this magic thing that sort
of drops out of the sky and solves all of
their problems. And I think the rest of the movie
explores that it doesn't work that way, and I think
that's a good narrative for understanding the role of technology

(43:00):
in our lives in twenty twenty five: that it's not
a magic panacea and we need to be thoughtful about
how we integrate it into our lives. So that's one of
the big themes.

Speaker 1 (43:11):
I think that's interesting, because a thread of that is
actually what I put down for what I thought the
movie got wrong about technology: that a scenario wherein you
would have a smart home to take care.

Speaker 3 (43:22):
Of all of your needs would be good or desirable, right.

Speaker 1 (43:26):
I think the movie, particularly the ending, suggests this is
something that would be workable and desirable and good and
people would want. I think that is wrong. I don't
think that the movie gets that right at all. And
I've been really thinking a lot about things like whether
or not smart home technology will allow aging people to

(43:47):
age in place in their homes.

Speaker 3 (43:49):
There's a lot of writing and research on that.

Speaker 1 (43:52):
And a lot of big promises that I think really
fall short. And the reason why they fall short, in
my opinion, is because, I mean, the
real scam is capitalism, where what they're trying to do
is to build consumer technology to meet a need that
ideally would be filled.

Speaker 3 (44:10):
By a caring human, right.

Speaker 1 (44:12):
And I think it's this idea that we can use
technology and buy our way out of these problems.

Speaker 3 (44:18):
That are that are structural.

Speaker 1 (44:20):
That the you know, the fact that like people have
to work and so they can't be there to care
for their aging parents. Uh, there's not enough care workers,
and care work is not paid enough in order to
actually support an aging population. All of these
big structural problems that capitalism deepens and makes less solvable.

Speaker 3 (44:39):
I don't think that.

Speaker 1 (44:40):
Having a smart home or a consumer technology that you
can just go out and spend money for and buy is.

Speaker 3 (44:46):
An appropriate way to fill these gaps.

Speaker 1 (44:49):
That are so deep and so fueled by worsening inequality
and capitalism. And so I think that, like, if I
had a magic wand, I
would solve those problems of deep inequality; I would not
create smart house technology, which means that we wouldn't
have to. I think the idea that this is a
workable or desirable solution is something that the movie I

(45:12):
think ultimately kind of gets wrong.

Speaker 2 (45:14):
That is a good point, that technology can often be
a, like, substitute, an easier to address substitute, for
actually addressing what is a real problem. Yeah, and all
of those bigger picture structural problems. There are abundant examples

(45:36):
of people wanting to use technology to solve some sort
of societal problem, like not enough people to care for elders,
for example, when like maybe the enormous sums of money
that are spent on trying to build technological solutions could
be better and more efficiently spent on like just hiring

(46:00):
humans and training them. I think that's a really good point.
But then also, like, Pat does work pretty well at
stuff. Like, Pat is really good at making the food,
really good at cleaning the house, good at like getting
them up and awake for a less stressful morning. So
it kind of paints the double edged sword, that technology

(46:22):
is really good at things, and it is often good
at solving problems in isolation, but then creates a lot
of or can create a lot of unintended negative consequences
and like additional problems.

Speaker 1 (46:39):
Yeah, and I think the movie illustrates that very well.
And I think you see it in the conversation around
smart home technology today, like where do you draw the
line between the very real concerns about this technology versus
the convenience that it definitely represents.

Speaker 3 (46:54):
LeVar Burton actually

Speaker 1 (46:56):
Told Slate all about this. He said, Smart House
for me was a terrific exercise, because not only was
I telling a story for a completely different audience, but
part of the idea was to really make the technology accessible.

Speaker 3 (47:06):
And real. The whole idea of Pat, I mean, we're there, right.
We are living in a.

Speaker 1 (47:11):
Time when the technology has advanced to the point where
there are devices controlling a lot of aspects of our lives.
I've got Nest, I've got the Ring doorbell. We all
have so many wireless devices that are connected to the
Internet of Things. You know, we're there now, And I
think he's right. But you know, I don't have a
Ring camera in my house because I know that staff

(47:32):
was found guilty of using them to spy on women
in sexual situations. Right, And so even in what LeVar
Burton is saying, yeah, it would be convenient to have
a doorbell where I can keep track of my
packages and let people in and out of my place,
But I happen to know a lot about the very
real danger that you put yourself in for those conveniences,

(47:55):
And so I think the movie is interested in exploring
that kind of thing.

Speaker 2 (48:00):
It is. And that is a great segue to the
second bucket of what I think this movie got
which is the absolutely persistent and widespread threats to privacy
that technology presents. There's this scene in the beginning when
Sarah Barnes is introduced to us and the family and

(48:21):
is explaining how the house works with them, and the
dad is a little worried. He says like, oh, the
house is kind of like big brother, and she tells him, Oh,
don't worry, it isn't. It isn't interested in judging you.
She just wants to learn about you and understand you
better so that she can make your life as simple
as possible. And I feel like we've heard that so

(48:41):
many times before from tech companies that promise privacy, and
what they mean is they're not actively intending right now
to exploit your private information, but that very often ignores
the unintended consequences down the road of what can happen
when your data becomes surveilled and enters the cloud and

(49:06):
is shared across different systems and networks which perhaps are
less interested in being benevolent than the kindly Sarah Barnes.

Speaker 1 (49:18):
Well, one part about that scene that I love is
how the dad is like, isn't this a
little bit creepy? And Sarah's like, oh no, it's totally private.
I mean, that's exactly how
tech companies are like, no, our privacy policy, it's robust,
and we just leave it at that, like, no specifics about
how their data is used as it pertains to third parties.
And I mean we just saw, just like a couple

(49:40):
of weeks ago, that Meta on Instagram knew that
when teen girls took a selfie and then didn't
like the selfie and deleted it, they knew that, and
then they surfaced beauty ads to those same girls,
who they were able to detect felt bad about
their looks. Like the way that this technology is learning

(50:03):
about us to undermine us and make our anxieties that
it's gleaned about us worse. Oh, it really does. I mean
this movie gets that exactly right.

Speaker 2 (50:15):
Yeah. At dinner, after Pat starts to go crazy and
they shut her down so they can make dinner, she's
not really shut down. She's still listening, right, So, like
they think that they have some privacy, but they don't.
Pat is still surveilling them. It reminded me of that
story that came out last week about how Meta used
a back door in Android phones to track users across

(50:37):
the web without their consent. Even when people explicitly tried
to opt out from web tracking, they were just like, nah,
we're going to track you because that way we can
serve you more personalized relevant ads, and that will be
more valuable to you than your privacy.

Speaker 3 (50:55):
Maybe they got the idea from this movie. Maybe they did.

Speaker 2 (50:57):
Maybe this is Zuckerberg's favorite movie.

Speaker 3 (50:59):
Ooh yeah, uh.

Speaker 1 (51:04):
More after a quick break. Let's get right back into it.
Part of me wonders if the technology feels so
plausible in this movie because the co screenwriter Stu Krieger,

(51:27):
who is now a professor at the University of California Riverside,
actually said that to research this film, he went
to NASA's Jet Propulsion Lab in Pasadena, California, and interviewed
scientists because he didn't want the technology to feel, you

Speaker 3 (51:41):
Know, super far be atch.

Speaker 1 (51:43):
He said, I distinctly remember the rush of seeing the
computer closet, the automated kitchen, the projection screens and all
the rest right there in front of me. And now
so much of that technology is regularly featured in homes
around the world.

Speaker 3 (51:55):
I'm no seer, nor am I a witch.

Speaker 1 (51:56):
I just looked at where we'd been, where we were,
and imagined where we were probably headed.

Speaker 2 (52:01):
They did a really good job throughout the movie of
making it seem realistic within the bounds of a movie
for children about the future. Like a lot of it
is very cartoony, but some of it is pretty realistic.
Like there's this one part when Pat has gone full
blown crazy and Sarah Barnes is trying to break in
to help save the family from being held hostage, and

(52:25):
her suited up colleague is like, why don't you
just cut the power, and she says, oh, I can't,
because all of its power is in self contained modules within
the house, which, I don't know what that technology is,
if she has like a tiny nuclear reactor in there
or something. So like that part is a little implausible,

(52:46):
but I really appreciated that they at least had a reason,
because I feel a lesser movie would have just like
not addressed it. But they were like, no, we need
to address why we can't just cut the power, so
we will make up some you know, technological marvel of
power that can make this plausible.

Speaker 1 (53:06):
Yeah, I did appreciate that they at least gave a
nod to it, and you're like, okay, sure. Although, the
one question I left with is, is it Pat doing
the shopping? Because Pat makes food, like, she's able
to whip up anything.

Speaker 3 (53:18):
So was she managing food delivery?

Speaker 1 (53:21):
Like, I have some questions. Again, of all the things
that happen in this movie that are implausible, this is
the least; like, there are bigger fish to fry that
this movie should be answering. But that is a question
I have of, like, oh, how are they doing this? I
would not have minded like a nod
to, she does a grocery delivery order every.

Speaker 3 (53:37):
Year or something.

Speaker 2 (53:37):
Absolutely, that was a big question I had, that where
is this food coming from? And on the one hand, yeah, okay,
it's kind of just like a detail that we can
just move past. We don't need to address who is
doing the grocery shopping to understand the movie. But in
another sense, eating food is one of the main things
that humans do, and so it is a major thing

(53:59):
to overlook. There's this scene when Pat has a
malfunction and she's like blasting oranges around the kitchen and
the family has to duck for cover because there are
just hundreds of oranges being shot all over the kitchen,
breaking things. It's like, why do they have this many
oranges in their house?

Speaker 3 (54:20):
Is she? And yeah?

Speaker 2 (54:23):
Is there a delivery that comes by and the basement
is just filled with like enough fruit for this family
to eat for a year? That seems impractical. Who's paying
for that?

Speaker 3 (54:36):
Great question? I will say.

Speaker 1 (54:38):
One of the other pieces of technology that I think
the movie gets right is this idea that in the
future AI agents will be integrated with everything. I actually
think that this might be a tech enabled.

Speaker 3 (54:51):
Future that tech leaders are sort of moving us to.

Speaker 1 (54:55):
Pat does have a screen interface and a control room,
but interacting with her just means speaking to her normally, and
she answers normally. You're not typing something into a screen.
And I think that the idea that tech is sort
of moving to is less typing something into your phone
or pulling up an app, but instead using like an
AI agent on your behalf who will figure things out

(55:17):
for you. And I think that some technology is already
heading this way. When the founder of Bumble, Whitney Wolfe
Herd, was talking about this, she said that, like, oh,
AI assistants will go on dates for you. It's like
your AI assistant will go on a date with somebody
else's AI assistant and that's how they will figure out compatibility.
And people, myself included, were like.

Speaker 3 (55:37):
What the fuck are you talking about?

Speaker 1 (55:38):
It sounded like satire, But today there are apps like
Dido ai that are trying to make that promise real,
telling users that they can skip all the swiping and
chatting and even like back and forth trying to schedule
a date and that the AI will handle everything. Dido's
co founder Alan Wang said, why can't AI mimic and
basically replace all the back and forth, small talk, and

(56:00):
the effort of dating.

Speaker 3 (56:01):
Now, I don't know if.

Speaker 1 (56:02):
This is truly a future that we will be headed
to or if like it is a future that anybody wants,
But I can tell you that this is the future
that a lot of tech leaders are telling us, like, Oh,
in the future, you won't even need your iPhone and
in a couple of years you'll just have an AI
twin that does everything on your behalf.

Speaker 2 (56:20):
That's, well, I think we could spend an entire episode
unpacking that quote you just read. Like, who wants to
outsource all of their conversations with their partner or their friends?
Like, that is life, you know. It reminds me
of that quote that somebody said a little bit ago.

(56:42):
I think a lot of people have made this observation
that we're told that AI was going to take over
the mundane drudgery of everyday life so that we could
focus on creative pursuits and working on like higher calling
kind of stuff. But actually those are the tasks the
AI is taking from us.

Speaker 1 (57:02):
Yeah, AI will be the screenwriter, and you and
Bridget will work the assembly line, right, exactly.

Speaker 2 (57:09):
Like that is very upside down, you know. And there's
a little glimmer of that in this movie too, where
Pat is bragging to the dad about how she has
already interfaced with the databases at his warehouse where he works,
and she's compiled all the reports or whatever needed to

(57:29):
be done much faster than the team who works out
at the warehouse. And she's also written some reports for
the dad that he thought he had to do, like he
thought he had a bunch of work
to do, but Pat just did it for him, and
he says, oh, this is great. With all this
free time, maybe I'll like talk to my kids or

(57:50):
work out or something, which he never does. But then
just a couple scenes later, after Pat has saved the
dad and his company all of this labor, she locks
him in his office and demands greater productivity out of
him and refuses to let him call Sarah until he
does some work for her. So I felt like that

(58:11):
was an extremely accurate portrayal of how AI is showing
up in industries now, where it is doing a lot
of work that humans used to have to do, and
the humans are like, this is great. But then somehow that

(58:32):
leads to the humans being further squeezed to do even
more work and try to achieve ever greater productivity.

Speaker 3 (58:40):
Well that's a.

Speaker 1 (58:40):
Great segue into what I think ultimately Smart House is
trying to tell us about technology. You know, Pat would
have mostly worked out fine if Ben had not hacked
her to try to fill this mother role. So I
think Smart House is really a warning for what happens
when technology starts to fill these roles that are really

(59:01):
meant for humans to be filling. But I think the
movie suggests that this is not something that we would.

Speaker 3 (59:06):
Like or want.

Speaker 1 (59:08):
While I think that in twenty twenty five, tech leaders
are busy designing a future that is like this and
telling us, no, we will like it. You know, I
don't know if you saw Mark Zuckerberg talking about how,
you know, most Americans only have three friends and they
want more and then in the future eighty percent of
your friends maybe will be AI and that we will

(59:29):
like that, right.

Speaker 3 (59:30):
You and I.

Speaker 1 (59:30):
Before we got on the mic, we were talking about
AI therapists and how more and more people are turning
to AI for therapy, which makes sense because therapy is
really expensive.

Speaker 3 (59:40):
Therapy can be difficult to access, especially for the folks
who need it the most.

Speaker 1 (59:44):
And so yeah, like people are turning to ChatGPT
to play the role of therapist, even.

Speaker 3 (59:52):
Though in some ways that is.

Speaker 1 (59:54):
Good, Like, I don't want to I don't want to
diminish anybody who has found support in that.

Speaker 3 (01:00:00):
I'm sure there are folks listening who have. That does
not mean that it is not.

Speaker 1 (01:00:04):
A role that a human should be playing, because I
was just reading this piece in Psychology Today about the
potential pitfalls of this. They were looking at a recent
study that basically said AI therapy people are using it
a lot, and maybe it's not really there yet, and
maybe it'll never really be there.

Speaker 3 (01:00:21):
And one of the things that they talked.

Speaker 1 (01:00:22):
About was, you know, what goes beyond things like
factual errors or hallucinations: that OpenAI recently acknowledged the
sycophantic behavior of ChatGPT, and that that can create
real safety concerns, particularly when you're talking about mental health issues, right?
And I think that, like, that is exactly the dynamic
that Smart House is speaking to, that when you

(01:00:44):
start relying on technology for a role that really a
human should be playing, particularly technology that is learning from you,
it can be sycophantic in nature. It's sort of trained
to kind of give you what you want. That can
be a problem when you're already dealing with mental health issues.

Speaker 2 (01:01:01):
The therapy context is interesting to think about. You know,
you shared that article with me, and one of the
things that one of the quoted therapists mentions is that
the role of a therapist is in many cases to
gently push back on what the person is saying, Like
you need that push back to grow as a person,

(01:01:23):
whether it's in therapy or outside of therapy. And so
if you have an AI chatbot that is just
yes-anding every single thing you say, it's not surprising that
could lead to bad outcomes, right? It's not good for
humans to live in a completely frictionless environment. Sort of

(01:01:45):
pushback and navigating challenges and adversity is like the essence
of the experience of being alive. And if you remove
all of it or outsource it to a chatbot or
robotic house, what are you left with? So one thing
I wanted to ask you, Bridget, is this movie, surprisingly,

(01:02:10):
it's about grief. You know, when you pitched
this to me, I didn't realize that. I thought it
was just about like a robot house, a smart house,
but grief and the way the family members deal with
it is central to this movie. And I was curious, like,

(01:02:31):
how did that hit for you? Because I think
you know, you've been open with listeners that you very
unexpectedly lost both of your parents last year, and I
know that that continues to be a major thing in
your life, and like, very understandably, but I wonder watching

(01:02:53):
this if that made the movie hit a little differently
for you than it otherwise might have.

Speaker 3 (01:03:01):
Oh, it definitely did. And I have been on my own.

Speaker 1 (01:03:07):
Weird journey around the role of tech in grieving, which
has been interesting. You know, I was saying earlier
how I still have an iPhone eleven, which is an
iPhone that came out many, many, many years ago, and.

Speaker 3 (01:03:22):
I desperately need a new phone.

Speaker 1 (01:03:24):
But I have this weird emotional hang-up where, you know,
that's the phone that my parents called me on, and
all my pictures and texts and stuff are on that phone.
Obviously I know about the cloud, but it's an emotional thing, right,
It's not logical and it sounds really crazy, but this

(01:03:46):
is the phone that my parents contacted me on, and
I feel weird about not having it because like, what
if one day they needed to contact me. Like I
still text my dad's phone pretty often, even though I
have his old cell phone, so that text just goes

(01:04:06):
to me.

Speaker 3 (01:04:08):
In the movie Smart House, he is very clearly using.

Speaker 1 (01:04:10):
Technology to avoid having to process the loss of his
mother and just replace it. And I think the way
that technology has become so ubiquitous in our life is
so entangled with the grieving process and the process of
moving on from a loss in a way that I

(01:04:30):
don't think anybody really expected it would certainly not me,
And I really see this as a film about the
dangers of seeing technology as a viable replacement not just
for their mom, but the emotional work of processing the
loss of your parent. You know, this

(01:04:52):
kid needs to be in therapy. Again, fucking neglectful-ass
father has not gotten this kid the help that he needs.

Speaker 3 (01:04:57):
Should probably have his kids taken away. But the role
of Pat, he is sort.

Speaker 1 (01:05:02):
Of using that as an ability to not deal with
the past and also in some ways stay stuck in
the past. He's not ready for a future where somebody
like Sarah moves in and changes everything because he hasn't
processed it. And I think that technology does present this
world where you can sort of shortcut grief. But let

(01:05:26):
me tell you, ain't no shortcut in grief. You've got
to go through it. You gotta deal with it. It sucks,
It takes forever, and maybe you never do. Like, that's
the trip. That is the trip of being a human:
we have to lose each
other and watch each other die and deal with it
and keep living and figure out what's next. And like, technology
might make us think that we can use it

(01:05:47):
to shortcut that, but that's what makes us human and
we can't shortcut it, and I think this movie is
really about sitting with that. Again, there's a lot of
deeper, more complex
stuff going on in Disney's nineteen ninety nine original

Speaker 3 (01:06:03):
Movie Smart House than I initially thought.

Speaker 2 (01:06:08):
Yeah, absolutely, I have one question I have to ask.
You know, in this movie, Ben is a child. You know,
He's dealing with grief as best he can, and he,
like you said, he does not want to move on.
He wants to stay in the past with his mom,
and using his child like understanding of the world, he

(01:06:33):
thinks the way to do that is to get Pat
to replace the domestic labor roles of his mom so
that they can stay frozen in time. And we've talked
a lot about that. A couple weeks ago,
we ran an episode from TED Tech. You know, we
did a feed drop of a woman who

(01:06:54):
had a bunch of audio tapes and writing from her
late father, and she uploaded them to train a chatbot
to be able to speak like her father, and then
she talked with it and maybe it was her grandfather.

Speaker 3 (01:07:08):
It was her grandfather. Yeah.

Speaker 1 (01:07:09):
Her name is Amy Kurzweil, and she wrote a book
called Artificial about this experience of building this chatbot,
which she described beautifully on the TED Tech episode.

Speaker 2 (01:07:18):
Yeah, we ran that. Yeah, it was a beautiful episode. Uh, provocative,
And so I'm just curious, you know, have you thought
about or what would you think about, you know, that
sort of application for yourself where you know, you're not
trying to get a cartoon-like robot house with Inspector

(01:07:40):
Gadget arms to replace the functional stuff your parents did,
but, I don't know, not replace the emotional connection, of course,
but facilitate the processing in some kind of positive way,
like like she found.

Speaker 1 (01:07:58):
It's funny that you asked that because in the movie,
the kid feels this solo burden to keep the
memory of his mom alive, and he feels like his
little sister was too young. His dad is dating, so
obviously he has forgotten her, and he feels he's the
only person who's remembering how special she was.

Speaker 3 (01:08:15):
And I've been feeling the same way.

Speaker 1 (01:08:17):
And how I've been navigating that is anytime I remember
anything about my parents, even a little detail, I
use the Notes app
on my phone and write it down because I'm
terrified I'm going to forget it, and if I forget it,
then nobody remembers it, right, And it's like, and I
did have this thought of like, wouldn't it be great

(01:08:40):
if there was some AI use case that would allow
me to experience that, like, you know, not recreate my parents,
because nobody could recreate those fucking people, but I would
like a technological way of experiencing these memories in a

(01:09:03):
way that's not just a Notes app list on my phone, of.

Speaker 3 (01:09:05):
Really making them more real for me.

Speaker 1 (01:09:08):
And they're so real and vivid in my mind, but
I am terrified that I will lose that. And not
to be TMI, but you know, my dad had dementia
before his death. His mom had both dementia and Alzheimer's,
and I'm pretty sure that's what's on the menu for me.

Speaker 3 (01:09:27):
And I think that I.

Speaker 1 (01:09:30):
Have this real fear of losing the ability to remember
these things and them being lost forever. And I do
think there might be an interesting technology use case there
of something that would allow these memories to live on,
not just for me, but like my kids, my grandkids,

(01:09:54):
to experience them in ways that go beyond just a
little tab in my Notes app. So the book Artificial,
where Amy Kurzweil builds a chatbot of her late grandfather,
it really struck home with me because, you know, she
describes it as less bringing her grandfather's memory and legacy

(01:10:17):
into the future and more her experiencing the past of
her grandfather. So she thought she was going to be
sort of like able to ask her grandfather, like her
grandfather chatbot, hey give me advice for the situation that
I'm dealing with right now. But really it was like, granddaughter,
let me show you what my life was like. And
I think there really is something to that. So even

(01:10:40):
though I think smart House demonstrates a negative fantasy for
what happens when you use technology to try to circumvent
the grieving process, there are use cases with technology and
AI in particular that I think might be useful to
help navigate grief, not in a replacement sense, but in

(01:11:03):
a sense that just, like, helps you remember
and brings you there. Because that's what I'm really craving
is like I wish I could just experience my dad
saying one of these little inside jokes that we had
just one more time, right, and like if there's a
technological way to make me feel like I have done that,
I would like that.

Speaker 2 (01:11:22):
It does, yeah. It's interesting. Thank
you for sharing all that.

Speaker 1 (01:11:26):
And I mean, who knew that this Disney original teen
movie would be inviting me to think about these things,
inviting us to think about, well, what is the role
that technology might play in a grieving process that is
healthier than having a fifties housewife hologram trying to kill
your guests in your home.

Speaker 2 (01:11:48):
Yeah, the way the movie did it, clearly that's not
the answer. Murderous robot moms. That's not what we're looking for.
So at least we know that now.

Speaker 1 (01:12:01):
Murderous robot moms are never the answer, people. So that
was our take on the nineteen ninety nine Disney movie
Smart House. As I said, if you want more of
this kind of tech movie podcast coverage, let us know.
I find it a delight to make. And especially if
you have a suggestion for a movie, let us know

(01:12:22):
that too. And truly, if you are like, keep your
movie recaps to yourself, lady, that is fine too.

Speaker 3 (01:12:29):
You can let me know. Keep leaving those comments. I'm
loving them. Thanks for listening.

Speaker 1 (01:12:37):
Got a story about an interesting thing in tech, or
just want to say hi. You can reach us at
Hello at tangoti dot com. You can also find transcripts
for today's episode at tangoti dot com. There Are No
Girls on the Internet was created by me, Bridget Todd. It's
a production of iHeartRadio and Unbossed Creative. Jonathan Strickland is
our executive producer. Tari Harrison is our producer and sound engineer.
Michael Amato is our contributing producer. I'm your host, Bridget Todd.

(01:13:00):
If you want to help us grow, rate and review
us on Apple Podcasts. For more podcasts from iHeartRadio, check
out the iHeartRadio app, Apple Podcasts, or wherever you get
your podcasts.