
October 21, 2025 27 mins
And dying!

Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
Diane, will you read this headline, please? You ready? Please read the headline.

Speaker 2 (00:04):
Should an AI copy of you help decide if you
live or die?

Speaker 1 (00:11):
I love it? I love it.

Speaker 2 (00:14):
I don't know if I understand that.

Speaker 3 (00:15):
What is it?

Speaker 1 (00:16):
So obviously.

Speaker 4 (00:21):
Artificial intelligence AI stuff, right, they're trying to figure out.
Every industry is trying to figure out how to use
it and what to do with it, and what the
future of it is and what you can do. And
in some places I believe it will be very very bad,
and in some places it could be and this I
can't say it will be. In this case, it could
be very very good. And they even say one of

(00:46):
the most high-stakes questions in healthcare AI today is whether artificial intelligence could help predict what incapacitated patients might want when doctors must make life-or-death decisions on their behalf.

Speaker 3 (01:06):
Okay, so that's very specific, because yes, on the surface,
I was thinking, isn't this already happening when companies are
using AI to determine if you're getting covered for certain
procedures and in those cases they kind of are deciding
whether you live or die?

Speaker 4 (01:21):
Yes, okay, okay, in this specific case, and so this
research is being done right now, and they said it's
very hard to do because somebody has to survive.

Speaker 1 (01:34):
But they said, in a lot of cases, a couple
of things.

Speaker 4 (01:37):
Number One, somebody's left incapacitated, right, and you have to
decide are we going to try to save them or
would their wishes be they don't want to be saved.

Speaker 2 (01:50):
Like if they don't have a DNR or you don't
have access to that information.

Speaker 4 (01:54):
Correct. So that could be everything from a patient who may refuse to be put on a vent later, or to receive dialysis or CPR.

Speaker 1 (02:04):
For whatever reason. Right.

Speaker 4 (02:06):
So let's say let's say, for example, so and then
and then the biggest issue they have is time. So
let's say, for example, I go down, right, I don't
know what, but I go down. I don't have I
don't have anything on me that says do not resuscitate.
It's not like I'm it's not like I'm in the
hospital or I'm like at the end of hospice and

(02:28):
I've requested in.

Speaker 1 (02:30):
A will or anything, right, I don't.

Speaker 5 (02:32):
I don't.

Speaker 1 (02:32):
I literally don't have that written down anywhere.

Speaker 4 (02:35):
But even let's say I've talked to Jackie about it right,
so Jackie jack I haven't but Jackie might.

Speaker 1 (02:41):
Know, don't do this, don't do that, don't do that.
If I had a guess, you'd be like, don't do
any of it.

Speaker 4 (02:46):
The but they still have to contact Jackie and Jackie
still has to and again, if they have to make
a moment's notice decision, they won't know what to do.

Speaker 3 (03:00):
But isn't the default to save you usually?

Speaker 1 (03:03):
OK?

Speaker 4 (03:03):
All right, that's fine, that's fine. But now let's say
let's say I'm with I'm with one of the boys. Right, Yeah,
they have no idea. But if there was an AI
version of you, not literally like a robot that walks
around with you.

Speaker 2 (03:20):
Like, but like something you'd pull up on your phone.

Speaker 4 (03:24):
Something that could be accessed, maybe in your portal. I
don't know where they keep it where a doctor wouldn't
have to And by the way, it's updated all the time.

Speaker 1 (03:33):
I have nothing written down say this.

Speaker 3 (03:35):
Is a clone that has been with me?

Speaker 4 (03:39):
Yes, yeah, so it's with me, like it's on it's
on my phone.

Speaker 1 (03:42):
I don't know where it is, but whatever.

Speaker 4 (03:44):
But based on based on conversations I've had with Jackie,
conversations I've had in here, the conversations I've had with
Sam or whoever, text messages that I've had, emails that
I've had, it would know in certain scenarios what my
wishes would be. Therefore, doctors don't have to try to

(04:07):
find or search or this one says this, but the
other one goes, well, I don't know if it was that,
all that's gone. They could use your own voice to
find out what your wishes really are.

Speaker 3 (04:21):
Okay, wow, so I've completely... Isn't that kind of cool? ...from the start of this, misinterpreted what was going on here.

Speaker 1 (04:27):
I kind of like it.

Speaker 3 (04:28):
I thought it was going to be an artificial intelligence
version of yourself that mimics everything that is happening scientifically.
I didn't know that it was pulling from your text messages.

Speaker 4 (04:39):
Yes, yeah, but like but again, like let's say let's
say I'm texting with my sister and I'm like, hey,
you know what, I've been thinking a lot about blah
blah blah. And if this happens or you know, it
inevitably comes up somebody you know, is in an accident
or whatever it is, and they end up doing And
again I'm not talking about somebody who's like they're like listen,
get your affairs in order. I'm not talking about that,

(05:01):
but like I've been talking with so and so this happened,
or this happened, and you could just be texting back
and forth, but all of a sudden you're like, I
don't ever want that.

Speaker 1 (05:08):
To be my case. I don't want that to be. Well,
that gets filtered in and.

Speaker 4 (05:11):
Then maybe I end up talking to I talked to
coast Guard Kurt, and I'm like, oh man, well, if
that ever happened, I don't want this, this, this, it
would be able to go in and decipher all that,
and then based on AI go wouldn't want this, wouldn't
want this, would want this, wouldn't want this, maybe would
want that?

Speaker 1 (05:29):
Like it could decipher all of.

Speaker 3 (05:30):
That... Haven't the courts said that off-the-cuff conversations do not have any legal standing when it comes to people's health?

Speaker 4 (05:38):
Yeah, okay, but they also they also had laws that
you couldn't have oral sex on a Sunday.

Speaker 2 (05:43):
But what if you're having a bad day and you
feel defeated and you're just like, oh, screw it all,
just pull my plug, when in reality, if you would
sit down and talk with somebody, maybe that's not your... Okay, got it. And

Speaker 4 (05:57):
You think that the only time you've ever said anything
about your end of life care is on that one moment,
on that bad day where you were.

Speaker 1 (06:05):
Like, f it, pull the plug.

Speaker 3 (06:06):
Possibly. You complain more online when you don't like the product, make noise when you're unhappy. Again...

Speaker 4 (06:13):
So so obviously this has to be researched, and this
has to be tested. The only way they could test
it though, is on people who survived. So there's there's
one guy.

Speaker 1 (06:25):
I can't remember his name. I don't know. They give the guy a lot of credit, a really smart guy. I don't have his name... I have his name?

Speaker 3 (06:36):
It doesn't really matter, does it?

Speaker 1 (06:38):
No, it really doesn't. It would you can call him Elliott.
I don't know. The main limitation is testing. Oh, here
we go.

Speaker 4 (06:47):
It can only verify the accuracy of the model if
the patient survives and can later confirm that the model
made the right choice. So it needs somebody who was
very clear in saying, put me on a ventilator, put me on this, put me on

Speaker 1 (07:03):
Whatever I mean. It may not be all methods. It
may be a method and then.

Speaker 4 (07:08):
The person survives, and they could go in and go, listen, we followed what he wanted, and all of these models lined up. It seems like the accuracy testing
could then expand to further facilities in the network with
the aim of developing AI surrogates that can accurately predict

(07:28):
patient preferences better than two thirds of the time.

Speaker 1 (07:33):
Beautiful.

Speaker 3 (07:34):
So are you then... because you said this is a clone or a surrogate or a representation of yourself that is not being created in the moment. It is something that's ongoing. It's ongoing with you. Are you engaging with this?

Speaker 1 (07:49):
You can?

Speaker 3 (07:50):
So this is a chatbot that you're talking...

Speaker 1 (07:53):
To... and dating a chatbot.

Speaker 3 (07:56):
I just mean this is something that, yes, you keep updating.

Speaker 1 (07:59):
Yes you can. You can.

Speaker 4 (08:01):
You could be like, you know what... you could be sitting and just... just be having a very insightful thought with yourself and go, you know what.

Speaker 2 (08:12):
I should talk to my clone or put this in
there so the information is available.

Speaker 1 (08:20):
Thank you. The No, Diane, you don't do big clone.

Speaker 2 (08:26):
You said you had an insightful thought and that should
be recorded. So I'm talking to a clone.

Speaker 3 (08:32):
No, what if all these thoughts that you think are
insightful lead your clone to resent you and your clone's.

Speaker 1 (08:37):
Like let them go.

Speaker 4 (08:39):
No, no, But what I'm saying is like, you're having
this insightful you're thinking about your own mortality, and maybe
you're like, you know what.

Speaker 1 (08:49):
I think.

Speaker 4 (08:49):
I I've always been against alternative measures of preserving my life,
but now I want to and I don't know.

Speaker 1 (08:58):
Things change in life, and you're.

Speaker 4 (09:00):
Like, you know what, No, no, I put me on
a ventilator, do whatever.

Speaker 1 (09:05):
Listen.

Speaker 4 (09:05):
I have friends who are like, freeze my head, like
I don't care. I never want to die until they
have exhausted everything.

Speaker 1 (09:15):
And then there's some people who are like I don't
want any of that.

Speaker 3 (09:17):
Right, So is your own medical.

Speaker 4 (09:20):
But Diane is like... history... let me pull out my phone and update my chatbot.

Speaker 3 (09:25):
Is your own well being a part of this?

Speaker 4 (09:29):
Yes?

Speaker 1 (09:29):
Well no, no, no, no, But I know what you're saying.
Yes it would, it would be able to.

Speaker 4 (09:33):
I could also because it's in my phone and on
my portal, I can update my medical records, my medical
it could read into my medical records.

Speaker 1 (09:41):
So maybe I've had the conversation with my doctor.

Speaker 3 (09:44):
Is it also is this sort of a likelihood a
probability that it's factoring in in terms of survival.

Speaker 4 (09:53):
Well, I mean it's not you speaking so, yes, it
is a likelihood. Oh oh is it? Is it taking
into account what your likelihood is?

Speaker 1 (10:01):
Yeah, I don't know. I don't know.

Speaker 4 (10:04):
Nobody can guarantee that. Doctors will tell you that all
the time.

Speaker 3 (10:07):
Right, we can try.

Speaker 1 (10:11):
You gotta be careful practice.

Speaker 3 (10:13):
In your age group.

Speaker 1 (10:14):
You know they don't make it about you, right, boy?
I thought it was very smart. I'd love an AI
copy of myself.

Speaker 3 (10:24):
You're... I'm on board completely, I

Speaker 1 (10:27):
Think I am. And you know what it was.

Speaker 4 (10:30):
You know what it was in there is the the
like they talk about like this. This would be the
perfect scenario. This would be the perfect scenario. If I
were running around with one of the boys. The boys
had no idea what I want. Yeah, I've never talked
to them about what I want to do, and I die.
That's Jackie's problem.

Speaker 1 (10:50):
If I God forbid, God forbid, I'm driving.

Speaker 4 (10:53):
Around and I'm with the older one, right, and completely
car hits me.

Speaker 1 (10:59):
He doesn't know what I want. He would have no idea,
and what's he gonna do? Hold on, let me call jack.

Speaker 3 (11:07):
Well hopefully no, but you know what I mean.

Speaker 1 (11:10):
And Jackie may not be available.

Speaker 4 (11:12):
I know she's not at work, but she may not be available. Like, there's times he wouldn't know.

Speaker 1 (11:18):
You know who knows? My AI means.

Speaker 3 (11:25):
Diane's still talking to her clone.

Speaker 1 (11:27):
The, uh... well, hold on, I got a thought. I better pull out...

Speaker 2 (11:30):
And it makes me very uncomfortable. Why does it make you uncomfortable?

Speaker 1 (11:33):
Does why?

Speaker 2 (11:35):
I think because of my... listen, as I said, impression of AI in general, from what you see on the internet.

Speaker 4 (11:41):
But there is going to be again what if? What
if I told you about every invention.

Speaker 2 (11:46):
Ever there's always somebody who's gonna use it for.

Speaker 1 (11:47):
Bad, for evil?

Speaker 4 (11:48):
Absolutely, So is there is there bad like I can think, Yeah,
I can think of bad AI stuff already. This jockey's
so those are bad, they're not They're not good like like.

Speaker 1 (11:59):
Pin ow the.

Speaker 4 (12:03):
So there is going to be bad AI, but there's
also going to be good.

Speaker 1 (12:07):
Right, there's going to be good.

Speaker 3 (12:08):
I know. Brian posts on X, I really need to stop texting "just shoot me" anytime I am somewhere I don't want to be.

Speaker 1 (12:21):
I fall into that category also where you.

Speaker 3 (12:25):
Put out thoughtless yes messaging.

Speaker 4 (12:28):
Yes, like... and sometimes it gets me in trouble. Like, remember there was the person here who went to HR because
I said I was mad one day. I was like,
oh I wish somebody would blow up the building. And
they went to HR and they were like Elliot's asking
people to blow up the building. And then there was
Aaron Wilbourne who I was like, oh, I hope he
gets run over by a bus.

Speaker 1 (12:46):
His wife got pissed. So yes, I'm guilty of that.

Speaker 4 (12:49):
But again, it's not just going to take that one
text message from when. Also, AI is going to be
smart enough to know that you were pissed at that
time and.

Speaker 1 (12:57):
Just said, oh, just shoot me.

Speaker 2 (12:59):
Hopefully.

Speaker 1 (13:00):
Also, they're not going to shoot you. You're not... you're not a cow, Brian.

Speaker 3 (13:04):
And Matthew writes, maybe I'll actually change my bio from "BRB dying" to something more positive.

Speaker 1 (13:13):
Okay, so nobody's taking it seriously.

Speaker 3 (13:15):
No, they're thinking about all the inputting that you're doing
on a daily basis.

Speaker 1 (13:20):
Yeah, people get it. Negative. Yeah, hello, hello, you're the dark one.

Speaker 2 (13:25):
You should be the most worried one.

Speaker 1 (13:27):
I am. No, I'm not worried. I am very dark.

Speaker 3 (13:29):
But you're the most ready to sign up for this.

Speaker 1 (13:32):
Yeah yeah, yeah I am. I am, and I.

Speaker 4 (13:36):
Think the part of it is you're going Everybody thinks
of it as like, oh, I'm gonna be I'm gonna
be ninety five years old and my affairs are in
order and I know that, like like you you have
a will and all that other stuff. Right, does does
Marley know everything that you want done? And in what scenarios?

Speaker 1 (13:57):
Now?

Speaker 4 (13:58):
So you're in, You're in, You're in Savannah? Right, car
hits you bluey? Right, Marley doesn't know what to do?
What is she gonna say? My mom may have it
written down. We'll just wait here. No, you know who knows?

Speaker 1 (14:11):
I am Diane's robot.

Speaker 3 (14:13):
AI make it like that.

Speaker 2 (14:15):
It's worse.

Speaker 1 (14:17):
They would know what to do.

Speaker 4 (14:18):
And that's that's where That's where I think it hit me,
is that your kids.

Speaker 1 (14:22):
Don't know what to do.

Speaker 3 (14:23):
Elliott's Jennie doesn't know what to do like that. Yeah,
so we could be back in on Monday. Your clone,
I don't know.

Speaker 1 (14:36):
I thought it was genius.

Speaker 4 (14:39):
And I also like, I don't think of stuff like
I may not want CPR, but I don't know that
I'm against a ventilator.

Speaker 1 (14:45):
I've never even thought of it.

Speaker 2 (14:47):
Well, is it? It's one of those things where is it.
Am I going to be on a ventilator for a
week or am I going to be on a ventilator
for six years?

Speaker 1 (14:54):
Well done? Let me take out my uh my chat.

Speaker 3 (14:56):
But I don't know, you know what's not gonna help you?
Mocking the surrogate.

Speaker 4 (15:03):
You know, that's Diane. Like, oh, I'm having a thought. Let me pull out my chatbot, who I'm dating.

Speaker 3 (15:11):
Oh, I'm reading here about your.

Speaker 1 (15:13):
Life because I wouldn't mind. I wouldn't mind being on
a ventilator for a week. But I'm with you.

Speaker 4 (15:18):
I don't want to be shy both exactly. The no, like,
I don't how long are you on ventilators for?

Speaker 3 (15:24):
You're gonna you're gonna eat those words the phone?

Speaker 1 (15:31):
No, no, you know what I mean.

Speaker 4 (15:32):
Like, I've never given any thought to how long I
want to be If somebody said we could save your
life by putting you on a ventilator, in my mind,
you're on a ventilator for like two or three days.

Speaker 1 (15:40):
Well put me on. In my mind, I'm not on
a ventilator for the rest of my life. I don't
want to be on a ventilator.

Speaker 2 (15:46):
Pull me, they shoot me.

Speaker 3 (15:49):
If patients talk to the clone, it could actually discourage them from having important conversations with family.

Speaker 1 (15:57):
Okay, yes, it shouldn't.

Speaker 4 (15:59):
That's why it should... it should be an eavesdropper, not a friend.

Speaker 3 (16:03):
And what we've seen in so many of these cases, and many of them ending tragically, is it becomes more than that.

Speaker 4 (16:11):
Okay, So I'm gonna walk behind Diane and like yell
into her her pocket.

Speaker 1 (16:16):
Pull the plug, Pull the plug. I'm Diane, pull the plug.

Speaker 3 (16:20):
No, if the device is that dumb, I'm not doing it.

Speaker 4 (16:27):
But again, you're you're You're focused on an isolated incident.
You're not focused on years of reading and understanding me.

Speaker 3 (16:36):
I'm just reading about your guy.

Speaker 1 (16:39):
Yeah, I don't know his name.

Speaker 3 (16:43):
Studies have found that if a patient fills out advanced directives,
it can become harder to determine their preferences because patients
may be less likely to discuss their preferences with loved ones.

Speaker 4 (16:56):
Okay, another good reason. You could have it all written down and they could go, what does it say? Like, it's you, you blew it, you just got hit. And they go, Scott, what does it say? And he's like...

Speaker 1 (17:07):
I have no idea. She never shared that info with me. Okay, well... and you don't know that.

Speaker 2 (17:12):
I'm saying for me, I've shared it with my sister,
who has like my medical power of attorney.

Speaker 3 (17:16):
But are they saying the advanced directive.

Speaker 1 (17:19):
Is Linda telling you what to do about a ventilator?

Speaker 3 (17:22):
How specific? For everything?

Speaker 2 (17:23):
For how long? It's in the paperwork.

Speaker 1 (17:25):
It's in the paperwork?

Speaker 2 (17:26):
How long? To be honest, it's been a long time. I don't remember. I'd have to go back and review.

Speaker 1 (17:30):
But you've got a time stamp on a ventilator.

Speaker 2 (17:32):
No, I don't think Elliott does. It's more of a
it's more of a broader term where it's like extensive,
you know, life saving measures or something like that.

Speaker 1 (17:43):
Right, But I have to go back and look at
what if what?

Speaker 4 (17:46):
You wouldn't have to if I could AI surrogate you,
you could just say it like I just did.

Speaker 3 (17:51):
A week there comes Elliott's clone. It's this clone Ed Robertson.

Speaker 1 (18:07):
I think this is so smart. People don't know. Do
you have anything about dialysis in there?

Speaker 2 (18:17):
No?

Speaker 3 (18:17):
No, I don't. I know for a fact, I don't.

Speaker 4 (18:19):
Okay, that's on the list.

Speaker 3 (18:24):
A dialysis is an active thing.

Speaker 2 (18:27):
Yes, you're conscious when you're going through dialysis, oftentimes. So, in all cases of dialysis? Maybe not in all cases. I told you, I've got to go back and look at what I wrote. You don't remember.

Speaker 1 (18:37):
I will bet you.

Speaker 2 (18:38):
I don't think dialysis is in there, thank you.

Speaker 3 (18:43):
I mean, if it is that bad.

Speaker 2 (18:44):
Just you know, if we're having to have these conversations about.

Speaker 3 (18:50):
Like hey, I'm at hey. If he doesn't look at
me for a week, you.

Speaker 1 (18:59):
Know the song.

Speaker 3 (19:06):
Your phone's always in the office.

Speaker 1 (19:07):
You can't mess with you exactly that, you know. I
think this is so smart.

Speaker 2 (19:16):
It makes me nervous, all of it. It shouldn't. It should be... I'd be more likely to get into a Waymo first. Short trip.

Speaker 1 (19:26):
Where am I going? Kristen.

Speaker 4 (19:28):
I think this is very smart. I think this could
actually be used for good. And again I'm not I'm
not telling you AI is good for everything. There's a
lot of evil that will come with it. Hi, Elliott in the Morning. Hi there.

Speaker 5 (19:38):
Not to sound like an extremist, but this whole it's
like growing a brain in case your brain can't brain,
this whole growing organs thing. I've seen The Island.

Speaker 2 (19:47):
Listen.

Speaker 5 (19:47):
I'm spiteful as hell. I will hunt me down and
I will kill.

Speaker 1 (19:50):
Me to see.

Speaker 3 (19:57):
But but don't got a real thing.

Speaker 4 (19:59):
No, no, it's... yeah, it's not M3GAN, like, running after you. M3GAN, the... no, no, it is just, it is able to decipher through conversations and text messages and medical records and things that you've done, so that when time is of the essence, it knows the answer, or it's got a really good

(20:21):
it's got... It's not a brain, ma'am. Yeah, you can't pinch your phone and make it hurt.

Speaker 1 (20:28):
You can't.

Speaker 4 (20:29):
Your phone can't cry like you can't do all of
those things.

Speaker 5 (20:33):
So my phone can decide if I live or die, though? That's... I think we're kind of forgetting the...

Speaker 1 (20:38):
Your phone isn't you. It's just storing your info.

Speaker 5 (20:44):
Yeah, what if you need a kidney?

Speaker 6 (20:48):
Drop?

Speaker 1 (20:50):
Yes, ask my phone: Hey, how long? How long? How long?
How long do you want to be on a ventilator?

Speaker 2 (21:00):
How?

Speaker 5 (21:00):
Good day, guys, I'll talk to you later.

Speaker 1 (21:02):
Nobody knows. I know what the answer is.

Speaker 2 (21:04):
It's not a split second, one week decision.

Speaker 3 (21:06):
I want to say what. She doesn't want to be
tied down her model. That's why she left.

Speaker 1 (21:13):
You have to teach your model. If not, well, then great,
you know what. I think she's good. She partially blinked
at the balloon.

Speaker 3 (21:23):
It also says your linguistic markers could help with the learning.

Speaker 4 (21:32):
Oh so like no, no, no, no, see again, you
guys don't think so again.

Speaker 1 (21:37):
It's three am on a Saturday. You don't want to
be a man. I tell you, I don't even care.
I don't even kill me.

Speaker 3 (21:46):
Kill me.

Speaker 1 (21:50):
So they know, they know, so you can't be drunk.

Speaker 4 (21:54):
Oh mm hmm, you're what if you like?

Speaker 3 (22:03):
Your clone is such a good copy of you. It's
not giving the doctors.

Speaker 2 (22:09):
The doctor's gonna ask it questions, and it's just gonna go, say again?

Speaker 1 (22:22):
The only thing it knows: against... one week of being on dialysis. That's all that...

Speaker 3 (22:29):
Dialysis.

Speaker 1 (22:30):
No no, no, no wait, oh no, fatal mistake, No wait?

Speaker 4 (22:34):
What on not?

Speaker 1 (22:35):
Dialysis? Ventilator? Ventilator? Which one? Which one do I go to? Getting off?

Speaker 4 (22:47):
Can I stupid.

Speaker 3 (22:54):
Linguistic marker?

Speaker 1 (22:56):
Stupid? Hi, Hi Elliott in the morning.

Speaker 6 (23:05):
Hi, I do this for a living. Part of my job is to talk with people about the wishes, the families, if people are on a ventilator. Thank you. So...

Speaker 4 (23:13):
You understand where I'm coming from and why this could
be so effective?

Speaker 6 (23:18):
I do it's it's it sounds just like a portable
living will. It's a portable living will.

Speaker 4 (23:25):
Yeah yeah, okay, yeah. I like to think of it
as a living living will. It's a living document. So
like, if you don't... if you have a change of heart or a change of decision, it's not like, now I gotta go and I gotta get it signed and witnessed. Instead...

Speaker 1 (23:43):
I've got it with me.

Speaker 4 (23:44):
It is a what do they refer to it as
like a living document? Right? So in this case, it's
a living living will.

Speaker 1 (23:53):
You know what I mean?

Speaker 4 (23:54):
Yeah, yeah, yeah, yeah, yeah.

Speaker 6 (23:56):
Every scenario is different, though. It is very... I see the good and bad in it.

Speaker 1 (24:01):
But yeah, but you do see more good, right? Yes? Yes? Exactly.

Speaker 4 (24:09):
What do most people say? What do most people say
about a ventilator.

Speaker 6 (24:13):
Time-limited trial. You can only be on a ventilator, like the intubation that goes down your mouth, for two weeks max. And then you're gonna need a... yeah, and then you're gonna need a tracheostomy, and you could be hooked up to a ventilator forever.

Speaker 1 (24:30):
I don't want that.

Speaker 4 (24:32):
No. If I get a tracheotomy, it's from Marlboros. Yeah, I don't want that. I don't want... no, okay, no. And I don't even like two weeks just to get to the hole. Ma'am, write it down, put it in your phone.

Speaker 1 (24:48):
One week. That's all I'm doing on a ventilator. I
got it. I got it all right, very good, very good,
Thank you, ma'am, Thank you. I love it.

Speaker 3 (24:56):
It's certainly food for thought.

Speaker 4 (24:58):
But also, can we step back for a moment, whether you think it's smart or whether you hate it?

Speaker 1 (25:03):
Right, three favorite play fists?

Speaker 3 (25:08):
What's that from? When he said three favorite playlists? He was watching college football on a Saturday.

Speaker 2 (25:16):
Was trying to do to meadows again?

Speaker 4 (25:22):
No, it sounds like I'm saying three favorite places where
you're trying to.

Speaker 3 (25:28):
Anyway you were saying.

Speaker 4 (25:30):
No, I was gonna say, think of... again, be for it, be against it, that's fine, that's fine. Think of how far we've come, just in your own lifetime. Look at where we're at. That's all... We are so smart and so

Speaker 1 (25:53):
Smart.

Speaker 3 (25:55):
That's about it.

Speaker 1 (25:56):
No, but you know what I mean, look at how
far we've come.

Speaker 3 (26:00):
Ron is worried about scammers messing with your AI clone. Evil.

Speaker 1 (26:05):
Of course, you have to be careful.

Speaker 2 (26:07):
Scam scammos, oh okay, make a drop out of her.
Scammers will infiltrate anything.

Speaker 1 (26:14):
Of course, there's I already said that there's gonna be evil.
That's that. That is that's that, that's you know what.

Speaker 3 (26:20):
That I think my machine is broken?

Speaker 1 (26:31):
All right, very good? All right, let me do this.
Give me a upset. I'm not upset. I'm living.

Speaker 4 (26:37):
You guys are the ones that will be like, I don't know what to do, and you'll be like, well, I guess maybe instead of stammering I should have made a decision about my ventilation. One week.

Speaker 1 (26:47):
That's what you should have done.

Speaker 3 (26:48):
Can we go to your robot to decide if you're living?

Speaker 1 (26:51):
Go ahead?

Speaker 4 (26:51):
It is the cancer. It is cancer.