Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
(00:37):
It was like, Let Mom get out the
fan record.
Beep, beep, beep, beep, beep, beep.
Roll it along.
How they can actually understand Chewbacca.
And I mean, Chewbacca is not a droid, but like, that's amazing.
(00:57):
Like boogie woogie.
And then Rey has this little conversation, like, I know, I know.
And she's like, she's understanding it.
So I have lovingly called my little space
heater my droid, and he's out there with the air
conditioning unit as well. It's like, I have a droid with me.
(01:18):
I imagine there'll be a phase of human history
where everything will just become little droids, like you'll be trying
to fix your AC unit and there'll be a little droid that walks up to you.
Like maybe, John? Yeah.
They'll always have a personality.
They have a personality. It's so cute.
The trashcan follows you around
and like, No, stop,
(01:39):
I'm trying to get work done.
But, oh well.
Anyway, thanks so much for joining us, everyone.
Of course, this is Community
Roots, a place where we gather in community to talk about mental health
so we can travel that journey of life together.
I am your host, Samuel Richards.
I'm Julie Richards.
(02:00):
I'm Sarah Wakefield.
I'm sitting there thinking about traveling the journey of life together.
And all I can visualize is that little BB-8 going up and down
the sand, like, traveling the journey of life
with the cutest little droid on the planet.
There's 20, 21 over the year.
The droid maybe, or maybe you're on the sand.
(02:20):
That's where we're headed with this conversation.
To get you a little bit. That can fall. Yes.
And stuff. Yes.
Maybe that's something for the office.
You know, that would be cute.
I never really did learn how to use that.
Yeah. Yeah.
It's one of those like.
A goal for my future.
That would be quite fun.
(02:43):
Well, speaking of robots
and the future, we kind of decided to take an interesting
take on today, which is something that's a little bit newsworthy.
So I was, you know, reading some articles.
It's something that
we've been kind of talking enough about, but there was something in The New Scientist
(03:04):
about grief, and
Mom also brought in an article
from the BBC about, just, therapy bots.
And, so, yeah, I'll give you a brief overview
and then we actually have not discussed it as a group, so I am looking forward to
(03:24):
just the conversation in general and just, you know, the way things are going.
But so yeah, the first one, and I will
put some of these sources in the notes, so you've got them.
It's in the November edition of The New Scientist.
It's called The Rise of Grief Tech, essentially.
There's another one also, talking about,
(03:46):
the BBC Character.AI one: young people turning to A.I.
therapist bots.
And essentially what's happening is ChatGPT, the, like,
language models that we've heard a lot about,
have been adapted to use for
both situations: for therapy, I'm guessing CBT therapy,
(04:07):
And then,
just handling grief,
like when there's a death, maybe there's this grief A.I.
service that crafts an A.I. based on your dead relative,
so you can have one more conversation with them.
So a whole spectrum of like these types of
AI that are kind of coming through and existing.
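For readers curious about the mechanics, here is a minimal sketch of how a grief-bot persona could be layered on top of a general-purpose chat model. Everything specific here is an assumption for illustration: the persona facts, the prompt wording, and the model name are invented, and it presumes the OpenAI Python client with an API key already configured. Real services presumably do much more (training on the relative's messages, safety filters, and so on).

```python
# Hypothetical sketch: a "grief bot" persona on top of a generic chat model.
# The persona facts, prompt wording, and model name are invented for illustration.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

persona_facts = (
    "Name: Margaret. Relationship: user's mother. "
    "Warm, loved gardening, ended calls with 'talk soon, sweetheart'."
)

system_prompt = (
    "You are role-playing a deceased loved one for a grief-support app. "
    f"Stay in character, using only these facts: {persona_facts} "
    "Gently encourage the user to connect with living friends and family. "
    "Never claim the deceased person can actually hear or respond."
)

def grief_bot_reply(user_message: str) -> str:
    # One conversational turn: persona instructions plus the user's message.
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": user_message},
        ],
    )
    return response.choices[0].message.content

print(grief_bot_reply("I miss you. Tell me about the garden."))
```

The striking part is how little is involved: the "relative" is a paragraph of facts and a role-play instruction, which is worth keeping in mind during the discussion that follows.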
(04:28):
So yeah.
And then your perspective on it, maybe bring that in a little bit later.
Yeah, right.
I don't know.
Is that connection there or not?
You know, it's, it's almost like, I think that GPT,
these things in general are just tools that are going to be adapted
in a wide variety of ways.
(04:50):
There's already bots that you chat with and it gives you back its answers.
So this is just yet another kind of thing.
But I did think it was interesting.
So maybe, how about, let's talk about grief
A.I., and then kind of go into the,
the larger, just kind of, therapy bots part.
(05:10):
Does that sound good to you guys? Right.
So, the rise of grief tech is very interesting
because essentially what happens in this article is the author,
her mother passed away years ago, and so she found this service that creates A.I.
grief bots.
(05:31):
And so she talked to a couple of psychologists and stuff
and ended up talking to this therapist, I mean, to this A.I.
bot and,
essentially just her experience with it,
what that kind of brings.
So yeah.
As the actual therapists in the room, what do you think about creating
(05:56):
A.I. bots that are dead relatives?
Like, is there, is there some benefit to that?
Is there, is it just kind of like false hope?
Are we just kind of, like, therapizing to kind of like a sick solution?
I don't know. I'm going to hand it off to you. Let's see what you think.
Yeah.
When I was reading the article, I was thinking it's perpetuating
(06:16):
the grieving process instead of helping it
because it's keeping that sense of
this.
Like, are we in reality or are we not in reality?
Like, part of the reality is acceptance that this relationship has to change.
One thing that I reflected on that is interesting in grief training
(06:37):
is that they have kind of moved from a model of,
like, the five stages of grief that people often hear about, to:
we can actually maintain a relationship
with our loved ones who have passed, but it just changes.
So for instance, if someone wants to talk to
(06:58):
or have coffee and reflect on that relationship, or drive
home from work and kind of have an out-loud conversation and,
you know, talk to this loved one. It's not so much,
I don't think, seen as the loved one can hear
you or not, or you're helping that person still be alive or something like that.
(07:20):
But there's a sense of, in your heart, that relationship has meaning and it matters.
And it can always matter.
It doesn't have to have a finality in the sense of, that relationship has ended.
It's gone. You don't have it.
It's all loss, but it does acknowledge the reality that
that love continues to live on.
(07:42):
And, you know, anything from a standpoint of co-regulation,
of those moments of healing and connection that we talk about,
those are in us as part of our internal community.
So people that have encouraged us, loved us, supported us.
It's not that
we're just impacted by trauma and negative things that have happened,
(08:02):
but we're also impacted by moments of connection, relationship,
encouragement, and coaches, mentors, therapists,
loving moments of our lives
that are constantly
fueling us and supporting us.
And so that can live on. We don't have to end that.
(08:24):
But perpetuating the idea of
like this, you know,
I want to say honky-tonk, like kind of
a sound bite, or like a,
you know, this created
being that's really electronic and not real at all.
That's a little uncomfortable or disturbing to me.
(08:47):
Honestly.
I'm a little bit, like the article talks about the idea of,
are we keeping someone more stuck,
because you start to mess with
your mind, basically, of, So wait, is the person really gone or not?
And then at the end
when she's like, yeah, those are not things my mom would say.
(09:10):
And she's like, I would, I would not want you to get it.
Yeah.
Like at the end of the article, she's like, I'd be okay to pass on that
because that's
it did give her some warm connection on some level, like brought some tears
and brought some memory, but it was like, yeah, that's not my mom.
Well, and it sounded more like that,
(09:31):
to relate it back to what you were saying earlier,
you know, meeting with a friend to discuss her.
It's on that same plane of meeting with a friend to discuss
your deceased loved one and reminiscing in that way.
That kind of, that felt like what she was experiencing,
talking to this bot that was supposed to be acting like her mom.
(09:53):
You know, it's really
interesting, this idea of
reality.
And you know, I'm
I'm firmly opposed to this idea of living outside of reality,
because I don't know any other way to live.
But for some people,I mean, this is the future.
(10:13):
This is something that people have dreamt of.
You know, we, like she says in the article, we used to keep,
we have pictures.
People created shrines for their loved ones.
We remember people
when they pass in many different ways.
And so it's, it's so interesting,
'cause I feel like I'm of a generation that has a foot in both worlds,
(10:38):
like grew up without a lot of technology,
but also,
you know, towards
the end of college, technology started booming.
And I can see how, on
both sides, this could be valuable to people.
But it also boggles my mind that it's healthy for anyone.
(11:02):
A little creepy.
You know, what can I say, that's a little.
I think I'm going to, I'm going to pull out, a little, too,
a couple of lines from the article that I think are interesting,
because basically what they are saying that this A.I.
could potentially resolve is the early parts of grieving,
where there's kind of a conflict of how
(11:23):
we're feeling versus the reality that's around us.
So the line I'm going to pull is
the brain goes to war with itself as two types of memory clash.
Semantic memory keeps track of general knowledge
about how things are, including the self and our relationships with others,
whereas episodic memory captures specific events rooted in space and time.
(11:43):
During grief, the semantic expectation that the relationship will continue
jars with the episodic memory of the person's death.
By learning to reconcile
this conflict, we gradually adapt to our loss.
Which I guess is
by a psychologist,
(12:04):
Seeley. But
I just think that's interesting, where
I'm not necessarily for or against it.
I can see both sides as well.
It does seem a little creepy.
And it's also fake, right? It's not real.
That's what kept coming back to me is just like, this is fake.
(12:24):
Like it's just unsettling that it's fake.
It's interesting.
You can put it in both.
I think you can put it into the camp that's like, you go
and you pay a medium to go sit in the house.
And they're like, you know, in the walls.
And they're saying that they're talking
to your loved one, or whatever it might be, on that same level as that.
(12:44):
I wonder, too, if it can be crafted in a way that is beneficial.
I think what was interesting is once she,
the author, ended up talking to the grief bot, it was encouraging her to, like,
go talk to real people, to go have real experiences, to, like,
treasure the time, but keep moving on.
(13:05):
And so it's interesting, just
it almost conflicts with a profitable business model, one
that's trying to push people to go back into the world rather than just, like,
no, keep talking to me.
It's almost like, we're around forever, you know, almost like a medium would,
or something like that.
But it's interesting, because you referenced
(13:25):
this, like, the séances. Because even that is
an instance where
you pick up and you go someplace. You have, I don't know if this is episodic,
but it's an instance of
a shared experience,
whereas with this chatbot, it's,
yeah, you can take it with you wherever you are.
(13:49):
It's constantly there.
And I don't know, Julie,if you want to talk more
about the impact of not
resolving grief, like
is there, is there an issue with constantly being reminded or hoping
that your loved one hasn't passed, because you can talk to them now on this
(14:12):
A.I.? Yeah.
In fact, there's a complicated bereavement diagnosis of that.
It's, it's so hard because there's no
real time limit that we can put on someone's grief.
They say six months to a year, you may be more resolved,
but it's unique to everybody.
So for some people, it doesn't feel resolved in that long of a period.
(14:36):
But I mean, diagnostically
they will say if it doesn't resolve
and it kind of stays in this complex bereavement
where essentially someone is stuck,
then that is considered more detrimental to them,
because they can't go on living and thriving and flourishing.
(14:59):
Can that complex grief,
almost a trauma, can that also feed in, like, be built into your brain?
Like, can you be creating complex trauma in your brain
by not resolving the grief?
I think definitely it was talking about
that, like, rumination that we can do.
We stay almost in this, like, passive reading, ruminating,
(15:23):
and in this case, it's, it's clearly, like, an alternative reality.
It's not even just in the relationship itself.
It's also linking to something that isn't
actual or real.
And the article did talk about like
it could be used to avoid reality, which is concerning.
It said that it could be useful.
(15:44):
I was curious about, I would love to hear more
on the sentence that was talking about how
it could be useful for abrupt endings, to ease anger or regret,
I guess to, like,
have further conversation or talk about it.
I do run into, with some people who are going through
(16:07):
traumatic experiences or grieving, that there's this longing
for a conversation or more interaction
that was abruptly cut short.
So I guess it's saying that
possibly there could be some benefit, whether it's, like, anger,
(16:27):
that there's something in the relationship that's unresolved, or there's some regret
that was made between them, that there would be
a way to dialogue further, so it could have the closure
and in essence bring some relief
like, we actually did have time to have more of a conversation.
But again, it's fake.
(16:49):
So it's like, well, I think the way we resolve things
when we can't resolve them, even if it's a family member or someone,
maybe it's not safe to be in relationship with anymore.
Or maybe a person has passed away.
The place to do that
from my lens, would be with your trauma therapist,
(17:12):
with someone who can help you work through, and hear and bear witness, and hold
with you those unbearable emotions
so that they are more bearable.
There's just no replacement for the human heart.
Sure. Real connection.
Yeah. It's funny that I forgot.
They actually call it Séance.
(17:33):
It's just kind of funny.
I do see, I think this is an interesting thing.
Also, just the perpetuity of it.
I'd be curious to expand this conversation into these therapy bots.
so this is the BBC article.
(17:55):
young people turning to therapist bots, essentially.
There is this GPT
called Psychologist, where
folks are able to talk with this therapist
A.I., and it kind of gives them answers based on their questions.
(18:15):
I mean, there's tons of these, but this article in particular
follows that one.
And yeah, what I thought was interesting, some takes from
this, is that the users are mostly ages
16 to 30, in that range.
(18:35):
it's also,
at night, it tends to be like after hours.
And then the last thing that I thought was interesting, too,
is that,
just the therapists
they talk to just, you know, pointed out
the need of this,
(18:56):
the desire for folks to talk to a therapist.
The need is so great, it actually kind of points to a lack of mental
health support and learning systems within our current environment.
That is just our day to day.
But yeah, what do you think about this one, Mom?
We're looking at this BBC article.
What do you think?
Well, I think it definitely does point to the need that people have
(19:21):
for seeking, being heard, or support, or knowing they're not alone.
And so the article talked about the idea of text
being something that young people are really comfortable with.
So it kind of brings down a barrier of
what might be more
daunting to be in official counseling
(19:44):
or being on the phone or having a face-to-face conversation.
In some ways it could give some
maybe psychoeducation about either anxiety or depression,
or coping skills or self-care or things like that.
But one of the quotes from the article is also saying it was quickly
(20:07):
making assumptions and giving advice, and that's not how a human would respond.
So again, there's that sense of, on one hand, it's
meeting a little bit of a need, but there's also a deeper longing
of human interaction and connection that can't be fabricated.
I think that's interesting.
(20:28):
Even that it makes assumptions.
That's almost what a GPT has to do.
Yeah, that's what it's programmed for.
Yeah.
Or trying to find those lines and just connect the dots.
So if you don't draw the line for it, then it's just going to invent that line
so it can process its own,
sort of.
That is interesting.
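That gap-filling behavior can be shown with a toy example: a language model always returns the most probable continuation it can, even when the input gives it nothing to go on. The tiny probability table below is invented purely for illustration; real models do this over tens of thousands of tokens.

```python
# Toy next-token picker: illustrates why a language model "has to" make
# assumptions. The bigram probabilities here are made up for this example.
bigram_probs = {
    "I feel": {"anxious": 0.40, "fine": 0.35, "alone": 0.25},
    "anxious": {"about": 0.60, "today": 0.40},
}

def next_token(context: str) -> str:
    """Return the highest-probability next token, guessing if context is unseen."""
    options = bigram_probs.get(context, {"...": 1.0})  # no data? invent a line anyway
    return max(options, key=options.get)

print(next_token("I feel"))    # -> "anxious": the most probable guess, not knowledge
print(next_token("my droid"))  # -> "...": unseen context, yet it still answers
```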
(20:49):
I also wonder, you know, at
what point do humans
realize that they're not interacting with another human?
Like if you threw a face on that that looks like me and talks like me,
Do you think if I, as the therapist bot, said,
That sounds really hard, that sounds like a difficult experience,
Like, how did you get through it?
(21:09):
Yeah, I wonder how, if that would be absorbed the same way as
a human doing it,
like somebody actually sitting there that they built a relationship with,
in that scenario.
I mean, does it need to be a human asking those questions?
Because it's all about reflection.
So if something's asking me that, if a human
(21:31):
or a chatbot is asking me that question or a similar question,
I mean,
do I need it to come from a human
in order to answer it or to reflect on it?
I can see that with, like, you know, it would be similar to a journal prompt
or something, like if something is helping you in that regard.
(21:53):
Like, that's how I'm hearing it, Sarah.
So you can correct me if that's completely off, but it's like,
if someone asks a good question that, that prompts you
to reflect and grow and learn and feel and all that kind of stuff,
then does it matter so much what the source is?
If it came from a book or an article or a typed-in response
(22:13):
from a bot or from a human or what?
You know, like if you're having a meaningful connection to yourself
and to your reflection, then that in and of itself has some value
or worth to it.
I think what is scary, in the sense of, we
kind of have
(22:34):
shifting states of consciousness that happen.
So this kind of
is similar to the idea of hypnosis, that whenever your,
your conscious awareness is coming down and you're able to access things
that are deeper, unconscious layers, it's like you're shifting out
(22:56):
of being in the present into going inward, or to these potentially deeper layers.
I don't know how deep these bots would actually go, so I'm not suggesting
that they're delving into your unconscious or anything like that,
but just the idea that someone is shifting to being less present
or more in that zone of not reality
(23:18):
like the article even says, every conversation starts with a warning
in red letters that says, Remember, everything characters say is made up.
Yeah, because in essence,
you're putting your unconscious, or these deep emotional places,
if you, if you go there. Because, like I'm saying, that could be just
psychoeducation about anxiety or self-care or whatever.
(23:40):
But if it would evoke a deeper question, suddenly your potentially vulnerable
or fragile psyche is in the hands of a computer program that can crash.
So that doesn't feel good to me.
Like ethically, to me, that feels dangerous.
Yeah.
Because someone could slide into
(24:01):
any kind of previous memory.
Your brain tries to keep you safe.
Who knows what kind of even potentially psychotic reaction
someone could have of, like, I'm alone, I'm in danger,
or some kind of a fear response or something.
It just seems like it's pretty dangerous, because there's, there's ultimately no
professional on the other side of that conversation
(24:23):
that's monitoring the state or the
safety of the person typing.
So that would be a major red flag for me.
It also kind of reminds me, you know,
we spend so much time online and a lot of our communications happen
outside of person to person.
(24:45):
So I'm locking on to what you're saying,
Julie, about it,
that this, this confusion that your brain's going to be experiencing.
Because I know at the other end of a text,
I mean, my friend is responding to me, but really,
I don't know that. It could be anyone responding to me.
(25:06):
And so now, if I start moving to some sort of A.I.,
you know, am I really blurring now the reality on more than one
facet, you know?
Maybe that's a tangent that I need to explore on my own, but
it just, it's so interesting, this desire we're seeing through
these articles for people to connect
(25:28):
virtually rather than personally.
It is interesting.
You know, it almost points to some relationship stuff,
you know, about trust, or even knowing what's out there.
So I am going to take a slightly different stance, and not even that we disagree.
But I think that this is interesting, and I almost think that
this should be expanded to other fields of medicine
(25:51):
and like law and different things where
basically a generalist chatbot that just gives you general stuff.
Like, it takes so long to even schedule
emergency appointments, let alone general practitioner appointments.
And like cost is so prohibitive for folks.
(26:11):
I wonder about a chatbot that was aware of its own limitations
and when to call in an expert,
but also could provide some basic stuff that we just don't have.
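As a thought experiment, that "knows its own limits" idea might look something like the wrapper below: answer only low-stakes, factual questions and hand everything else to a person. The keyword list, canned answers, and handoff function are placeholders invented for this sketch, not anyone's real product.

```python
# Hypothetical triage wrapper: basic psychoeducation only, escalate the rest.
# All names and canned text below are invented for illustration.
CRISIS_TERMS = {"suicide", "self-harm", "overdose", "hurt myself"}

FAQ = {
    "anxiety signs": "Common signs include racing thoughts, restlessness, ...",
    "coping skills": "Paced breathing, grounding exercises, journaling, ...",
}

def escalate_to_human(message: str) -> str:
    # Placeholder: a real system would page an on-call counselor here.
    return "This is beyond me. A human counselor is being notified right now."

def triage_reply(message: str) -> str:
    text = message.lower()
    if any(term in text for term in CRISIS_TERMS):
        return escalate_to_human(message)  # out of the bot's depth: bring in a person
    for topic, answer in FAQ.items():
        if topic in text:
            return answer  # low-stakes, factual psychoeducation only
    # Unrecognized request: decline to guess instead of inventing an answer.
    return "I'm not sure, and I'd rather not guess. Connecting you to a person."

print(triage_reply("Can you give me some coping skills?"))
print(triage_reply("I want to hurt myself"))
```

The design point is that the "when to call in an expert" check runs before any answer is attempted, which is roughly the opposite of how a raw language model behaves.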
You know, something that I think is interesting is just, like,
whenever that psychologist was talking about,
hey, there's a need that isn't being met, and we're just seeing that this chatbot
(26:32):
fills in that need. Whether it's good or bad is up to the,
you know, actual chatbot.
But just to say that there is a huge need, and so how do we fill that need,
or how do we help, you know, spread that word, or try and create
a healthier environment?
I wonder even
(26:52):
just like the positive aspects of even
maybe to understand its own limitations. I wonder, see,
we're talking about this sort of stuff in 2024, but by 2030, who knows
what AI is going to look like in terms of understanding its own limitations.
Maybe some of those, they call them hallucinations,
(27:14):
kind of start to get resolved.
We might have a tool that
can reach folks who wouldn't
have access, you know, with how widespread phones are.
What I think is wild is going to Haiti, you know, going to Haiti,
and, like, a lot of folks having phones; or going to Europe.
(27:34):
And my phone still gets a data connection and I could just connect anywhere.
It's just kind of wild to me where it's such a widespread,
accessible thing right now, even though,
you know,
it's not necessarily good or bad, it's just kind of.
A couple things pop in my mind of
kind of the things we're already exposed to.
(27:56):
You know, any website that you go on
and this is your field a little bit, I imagine,
Samuel. But just like, everything pops up, like, How can I help you?
Thank you for visiting us today.
And like, there's someone, there's a chat box ready for you.
But how
many times, if we call something, there's that menu:
you press one for this, press two for this, press three for this.
(28:19):
There's this gut instinct in me that's like, zero, zero, zero.
Like, give me the human.
Like, I don't want to go through all of these, like, hoops.
And then another thought that pops in my head,
which I don't want this to prohibit anyone
from utilizing it, but the crisis hotlines that, you know,
(28:43):
they have 741741, being able to text in your crisis,
that was made
and designed to reach more people, to help more people in crisis,
and that one actually has a person
and using texting as a way for support.
But I have had clients tell me, like, that was useless, that was not helpful. And
(29:09):
and maybe it is for the vast majority.
I have no idea.
It would be interesting to see the statistics on it,
of how helpful it is or not, but I think there is just this deeper
longing for human connection that is always going to be there.
Like, I can see what you're saying, as far as maybe there is a time
to fill in the gap, almost to triage a little bit of the need,
(29:30):
because if we don't have enough mental health therapists
or we don't have support, and maybe if someone did have, like, a need for,
help me with anxiety signs, or help me with ways
to cope or something, that it could kind of
give some factual information that would be a support,
(29:51):
then maybe it does hold some potential for helping someone somewhere.
It's just such a fine line, I think, with talking about the vulnerability
of the human heart, and needing support, and needing a resonant,
empathic other.
I think of Bonnie Badenoch all the time, of her,
(30:13):
like, what's emerging in the moment between that right brain,
the right brain. Like, we're missing all of that with any of this stuff.
I don't know how you would program that, put it that way.
Maybe that's mind-boggling to me, because there's something of warmth
and empathy that, when you look at the bottom line, it's fake.
How is that ever
(30:36):
right brained?
So we talk about right brain to right brain.
Is there something with healing trauma within a dream, or some sort of thing
where it's actually your brain almost communicating in a fake situation,
or even, like, internally, where almost it's a,
it's not a real situation, even though it feels real.
(30:58):
But it's one of those almost simulated right brain
to right brain, where your right brain is, because I think sometimes
you're looking into the abyss and it just looks right back at you.
Whatever you put in, that's going to kind of just give right back to you.
You know, like a Google search.
There is like R.E.M.
(31:19):
sleep that we have when we are dreaming, when we are.
I mean, so much restorative stuff is happening in our sleep,
in dream, in even imagination.
You know, you compare the fMRIs of someone who's imagining something,
even as we do attachment work, we can kind of create
(31:41):
a secure attachment and a healthy caregiver
and allow your nervous system to receive that and to take that in,
and to have your needs get met, where it can be transformative.
So, yes, and I think you can
even, you know, create a different outcome.
For instance, if you're having a nightmare, part of it
(32:03):
is to go in and revisit it, where there's something that rescues you,
something that saves the day, something that brings relief,
and that it can shift your nervous system and your experience of it.
So I can see where there's some validity in where you're going with that.
Again, I would prefer it in the hands
(32:24):
of a highly skilled trauma therapist than I would in A.I.
And there's standards for, like, licensing and education and stuff.
That isn't the case with A.I.
It just searches the internet and gives whatever the internet says.
So that can be wrong.
I mean, I guess you could focus it in some ways,
but also in others.
(32:45):
How is it monitored,
or, yeah, what's the accountability for it, or things like that,
or even the research on how effective it is or how much harm is caused?
Like, we're not at that point of knowing those answers.
So there's definitely risk involved.
Well, folks, very interesting conversation here.
I'm sure.
You know, actually, to the listener, if you guys have any thoughts about A.I.,
(33:08):
please let us know, especially with
A.I. I almost hesitate to say that, because it seems like a fair question.
But feel free to, you know, we respond
to the emails that come in,
because, you know, no matter what, we'd like to start
that connection in general.
We're kind of real people, with real hearts, with real authenticity.
(33:30):
Right. And we're here to talk with.
When you email Community Roots Pod, it will be a person on the other end.
So that's right. Yeah.
Before we do gratitude and stuff, I did want to shout out
Steve Dodge for the theme song,
Alexandra Wells for the logo, and Julie and Sarah
for being with me here today.
(33:52):
and yeah, check us out on iTunes,
Spotify, give us a good review, reach out to us via email.
But, um.
What's our gratitude for this episode, folks?
I've got one.
I am so grateful for human connection
and it really,
(34:12):
I don't know.
I'm really, really grateful for human connection.
She doesn't want to expand.
I'm so glad that we've had a sharing conversation,
so I can feel like, Whoa, I wonder what Sarah's referring to.
I just mean the health of connection where you can relate
(34:33):
and somebody hears you and understands you, and it's valid, and you're not alone,
and I don't want that from the Internet or chat,
because I want to know that somebody real has experienced what I'm experiencing.
Yeah, I'm piggybacking off of that.
It's like, I am thankful for technology and advancement,
(34:53):
because I still remember
there was a day that I was visiting my dad at his office,
and he's like, Now, if we, if we write this and we hit return,
it's going to send to someone who's going to get it on the other end.
And my mind was blown.
I was just like, What?
And at that point it was just an email.
(35:14):
It wasn't even like a chat message or a text or anything like that.
I still remember the first person I sent a text to.
It was like mind boggling.
And so it is pretty cool that people can create great things
that can help us, and they can be used to facilitate more human connection.
(35:36):
I do appreciate, like you, Sarah, that there's some human heart
on the other side, and people that we are cultivating those relationships with.
Yeah, that is interesting.
But the thing about, see, there's so many thoughts going on here.
I almost want to jump on the train.
I will say, something that came to mind to me at the beginning of the episode
is, I'm grateful for Mom's giddiness. I guess it's just, I notice
(35:58):
when you get a reference or you're ready to reference something.
But because it's so rare, we know that it's
such a treat when it happens. Like, Yes, BB-8!
I love BB-8.
Yeah.
And speaking of human connections and memories and stuff, you know,
Dad and Mom were very good.
Every, every time a new Star Wars
(36:19):
film would come out, all of us would have tickets.
We'd show up to the movie theater together, all in a row, to watch Rey.
Memories. Go fight the bad guys. But anyway.
That's what we watched on New Year's Eve.
Really have to say it. The Force Awakens.
Good stuff.
And that's probably why BB-8 is rolling right through my mind, because.
Yeah, there you go.
(36:41):
Remembering the movie.
I just assumed that, you guys, in this house, those movies were always playing.
Just.
And I don't think I've seen them in years, although I do love
how YouTube has the little clips that swing by. But it's nice when you can get
a little clip and be like, reference this, and then you can just see the thing.
So now is Ted Lasso just playing on loop?
(37:03):
Andy Andy for Dad that.
Well, thanks so much for joining us, everyone.
I hope you are having a good year
and yeah, otherwise we will see you next week.