Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
The Brian Mudd Show podcast is driven by Braman Motorcars.
My family is a Braman Motorcars family. Your family
should be too. Visit BramanMotorcars dot com. Faith, freedom in Florida.
This is the Brian Mudd Show.
Speaker 2 (00:19):
Is AI as smart as your doctor? Is AI as
smart as your doctor? Joel has a very nervous look
on his face about this one.
Speaker 1 (00:32):
Yeah, I'm hoping not.
Speaker 2 (00:35):
I'm hoping the humans win out, but I have my
doubts. Well, the long-running joke since the onset of
the Internet age is that if you search for what
ails you online, all roads will eventually lead to death. Right,
doesn't matter what's going on, you will find a way
in which you will die if you keep going. The
takeaway has been that if you know what ails you well,
(00:59):
you might be able to find additional information about how
to treat your condition or to go about living your
life online. If you don't, you probably won't, and you're
likely to overreact. Now, in the age of AI,
everything is changing once again, and when it comes to
the matter of obtaining a proper diagnosis and or coming
(01:20):
up with a treatment plan, the question has become whether
AI has an edge over your PA. A recent study put
this to the test. The study, entitled Artificial Intelligence Versus
Human Clinicians: A Comparative Analysis of Complex Medical Query Handling
Across the USA and Australia, tried to get to
(01:42):
the bottom of this. And the stated purpose of the study,
they said: this study has sought to explore the practical
application and effectiveness of AI-generated responses in healthcare and
compare these with human clinician responses to complex medical queries
in the USA and Australia. Okay, so what were the findings?
The comparative analysis used seventy-one hundred and sixty-five
(02:06):
patients to assess AI-generated responses versus human clinicians on accuracy, professionalism,
and real-time performance using machine learning algorithms and various tests.
The study evaluated AI and human responses, again, in both
the US and Australia. What was the conclusive statement? What
(02:28):
did they find? They said the results showed AI-generated responses
were generally more accurate than human responses, suggesting potential benefits
like increased efficiency, lower cost, and enhanced patient satisfaction. Sounds
(02:49):
pretty good, Joel. I know you're scoffing at this, but hey,
if there's more efficiency, lower cost, and greater satisfaction? Yeah, right.
It does say, however: significant concerns, such as AI's lack
of emotional depth, data bias, and the risk of displacing
human clinicians, must be addressed to fully utilize AI in
(03:10):
clinical settings. I tell you, that to me is kind
of like, you know, AI is to the point. So let's
say that you did get, like, a hey, you're going to
die kind of diagnosis. AI's gonna say, yeah, you're going to.
Speaker 1 (03:19):
Die, right, right. Rather, no bedside manner.
Speaker 2 (03:23):
And I was thinking about that, because that is the
weakness here, right? They're saying that AI's biggest weakness is
the lack of emotional depth. Okay, I don't know about you,
but I've come across some doctors here and there. Yeah,
who don't exactly excel in that area either. But anyway, what
was the actual result here? The overall average performance of
AI tracked at about eighty percent in overall quality, which
(03:49):
notably is two percent higher than current patient ratings within
the United States. So you put everything together, and in
real time in the study, AI beat your doctor, beat
your medical service provider, by two percent.
Speaker 1 (04:06):
So I don't want a robot doctor.
Speaker 2 (04:09):
Brian