
December 10, 2025 33 mins

Waymo is causing traffic chaos around the country, including in school zones. In San Francisco, a driverless Waymo got stuck on a dead-end street and was soon joined by two other Waymos, ending in a driverless stalemate that blocked the street. The crew goes to war over the use of the em dash, en dash, and hyphen. Meanwhile, President Donald Trump wants to take AI oversight away from U.S. states and sign an executive order on federal AI rules. Medicare wants to use AI to look at your medical insurance claims. Cigna is being sued for using AI to deny hundreds of thousands of claims without physician review. Dr. Oz was on CNBC's "Squawk Box" talking about private companies using AI as weapons to deprive Americans of their healthcare.

See omnystudio.com/listener for privacy information.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
You're listening to KFI AM six forty on demand.

Speaker 2 (00:10):
I was thinking of Mark when I saw the story
out of San Francisco. And it's a few Waymos. Have
you seen that? You haven't ridden in one of
these cars, right? No, and nor will I. Okay, out
of principle or out of fear? Well, both. And until
we're in a complete Minority Report situation where they're on
tracks and they can't screw up, there's no way you're

(00:31):
gonna get me in one of those things. Okay. So
I lived in the Phoenix metro when they were
starting to roll those things out, and so they became
normal everyday sights. They were all over the place. Just
quick background. I grew up in a very small town.

(00:53):
My folks still live there in northern Michigan. And the
weirdest thing happened. Pops came out to visit me. This
would have been about twenty sixteen, I guess. And I lived
in Mesa, Arizona, which is a suburb of Phoenix for
those unfamiliar, and Waymo's primary testing site was
in Chandler, Arizona. So Chandler and Mesa border, and I

(01:16):
was only, I don't know, four or five miles away
from ground zero of that stuff. So it came by
my house all the time, because it was doing neighborhoods
to start with, and then it would go to the
arterial streets and then eventually out to the freeways. But
I saw it all the time, and so it was
one of those things where you go, oh,
that's a self-driving car. And for the first, I
don't know, two or three years, it had somebody that
was driving it, you know, that was sitting behind the wheel,

(01:37):
or at least they were monitoring it. And
then eventually it went without anybody there, no passengers, but
no drivers. And now it's being used as basically
a taxi service. So Pops is out to visit. I mean,
my folks came out to visit, and he had, I
think, only recently retired. And he comes out and
he's there with us. We're outside, I'm doing

(01:59):
yard work or something, and a Waymo comes by,
a driverless vehicle, and I go, hey, take a look,
nobody's behind the wheel of that car.
He goes, oh, that's cool. And you know how they've
got the, you know, the spinny deal up on top
and the flared fender cameras and things like that? And
I go, isn't that cool? It just drove by. I go,
there's nobody there, nobody's driving that car. It's a

(02:21):
it's autonomous. He goes, yeah, I heard about that. That's cool.
I thought, wow, he's really not all that impressed by
this driverless vehicle. I would have thought that a guy
born in the fifties would have been like, wow, the
future is here, sci-fi, all this stuff come to life.
But you think that, like, they had candlestick phones when
they were kids, and anything that reeks of a
little bit of that, yeah, yeah, yeah. I mean, my mother

(02:41):
didn't have a TV until she was four or
something like that, right? And my father
grew up with a poster of the Moon
landing on his wall, and so I would have thought
he would have just acknowledged, like, wow,
look at the advancement of technology. I would have thought that would
have been a cool thing. All right. But here's where
the story takes a little bit of a twist.

(03:01):
A few minutes later, a car comes by. And do
you remember when, uh, Lyft used to put those, uh,
those pink mustaches on the front of the car? Those
were cute. Yeah. So this Lyft comes by with a
pink mustache. Dad, like, drops his rake or whatever it
is that we were doing, and he goes,

(03:23):
is that... is that one of those Lyft cars?
Is that one of those Lyfts? And I go, yeah.
He goes, that's one of those Lyfts, isn't it? I
go, yeah. He goes, I knew because of the mustache. I'm like, yeah, Dad,
it's a ride share, it's a Lyft, you know.
He's like, man, I've never seen one of those before.

(03:43):
So no joke, we go inside, we're having lunch. My
mother's inside, and he goes, honey, guess what we saw.
We saw one of those Lyft cars. She goes, oh,
I've never seen one of those before. I've never seen
somebody get so excited about a ride share. And yet

(04:06):
I go, yeah, we also saw a driverless vehicle. Oh,
they just didn't care at all. It blew
my mind. They're weird people, man. They're weird people. Well,
back when they were young, those old Corvairs
without the seat belts... they didn't have the mustaches, I
guess. Did Ralph Nader do anything about mustaches? You're

(04:28):
exactly right. Yeah. Oh, my old dad never
had one of those. So I had a mustache. Crazy, crazy.
So I saw this video out of San Francisco, and
I've seen this before, where the Waymos go down
a dead-end street. And you know how people
park at the end of a cul-de-sac or
dead-end street and kind of fill up
the sides? Well, Waymo didn't know how to turn around.

(04:49):
They didn't know how to do, you know, the skill
that we all learned driving in LA, and that is
the nine-point turn. Waymo just didn't know how to
do that. So I've seen that before, where it kind
of gets stuck, almost like a robot vacuum in a closet.
And then in this case, another one comes by. So
you're on a dead-end street. One Waymo's trying

(05:10):
to turn around, another one doesn't know that it needs
to turn around yet, and so they end up with
this standoff. And then a third Waymo comes by, and
it just blocked the entire end of
the street. It's just this automated standoff, which from a human
standpoint looks like the dumbest thing in the world. As
if you're walking on the street downtown,

(05:33):
any downtown, pick a downtown, and you're walking towards someone,
they're walking towards you, and you have that awkward moment
where you're, oh, I'll go left, you go right. Oh,
you know how it is, and then they're like, oh,
excuse me, and then you figure it out,
you move around. Imagine if you had two people that
were like, oh, I'm just gonna go... oh, excuse me,
and then they just froze, and they just stared
at each other, and they just didn't know what to do.
That's what happened with the Waymos. They just didn't know

(05:54):
what to do. They were stuck. A robot pileup,
that's exactly what it was. I'm not gonna lie, I
would have stayed to watch to see what would have happened,
but it probably would have stayed like that for hours.
It became a thing. Other people were like, are they
going to fix this? One guy, a man, can be
heard on the video saying, are they just going to
stay there forever? Now, eventually a Waymo employee went to

(06:15):
the scene and, I don't know, programmed the car,
took it over, whatever, and got it out
of there. Anyway, they got it figured out, as you
knew they would. But it just feels like, you're so dumb.
And then you've got Waymo that's doing some really
weird stuff, like passing school buses with red flashing lights.
Uh oh.

Speaker 3 (06:35):
From NBC 7 this morning, Waymo is tapping the
brakes on some of the software in their self-driving vehicles.

Speaker 2 (06:42):
They're tapping the brakes. Oh, that's wild. Oh, did you
get it? They're tapping the brakes. Oh, that is
a funny guy.

Speaker 3 (06:54):
After a series of close calls in Texas, some of
the robotaxis have been seen performing illegal maneuvers, like
this one passing stopped school buses. The Austin school district
says this has happened at least nineteen times since the start of the school year.

Speaker 2 (07:08):
Holy cow, nineteen times. It's the start of the school year.

Speaker 1 (07:12):
Some of them with children crossing the road.

Speaker 2 (07:14):
They don't need to be operating on a road with
federal investigators. Oh no. Yeah. People that are like
you, Mark, they're like, no, we should never have these
on the roads, never. They're looking into the issue.

Speaker 3 (07:23):
Waymo says it's issuing a voluntary software recall, saying in
a statement, holding the highest safety standards means recognizing when
our behavior should be better. We will continue analyzing our
vehicles' performance and making necessary fixes.

Speaker 2 (07:38):
I gotta tell you, that apology sounds like the crap
that ChatGPT gives me when it violates the rules
that I set up. I'm like, here's how I want
my, uh, you know, my
synopsis of a story written out, and it does it wrong,
and I go, why'd you do that wrong? It's like, oh,
you're right. I need to recognize when I do things wrong,
and you're right to call me out. I won't do
it again. There's nothing like being patronized by an AI.

(08:01):
It's the worst. Anything AI writes, by the way, has
a real tip-off: it uses the em
dash. If you're unfamiliar with that, basically, instead of a
comma or an ellipsis, it will use a dash. The
New York Times loves to use those. That offends me,
because that is actually correct according to the Associated Press.

Speaker 4 (08:19):
Use.

Speaker 2 (08:21):
Yeah, that's an insulting topic. I get you, Nikki, but
there are options. Not every sentence needs to have one
in it.

Speaker 1 (08:30):
Well, I do agree that ChatGPT, or whatever it's
called, does overegg the pudding.

Speaker 2 (08:34):
But the use is correct. It's not incorrect punctuation. But
I don't need to put an exclamation point on
everything I write either. Even though it's not incorrect, there's
overusage, and it's terrible. It's a real tip-off.
Like, college professors are looking for the em dash, because
they know kids don't write with that, right? It's a

(08:55):
tip-off. So I said, don't use the em dash.
Don't put yourself in a situation where it's not X,
it's Y, with em dashes. I said, don't do that,
and it does it anyway. People remember this from Blade Runner,
when they're giving the replicants the Voight-Kampff test to
find out if they're real or robots. Well, I love
it when you make a nineteen sixties film reference. It's nineteen eighties,

(09:15):
you swine. If a replicant uses an em dash, they
just get burned on the spot. Hey, you're completely right
about this. It's unnatural. It's unnatural, and I'm okay with it
on occasion. It should not be in every sentence. Drives
me nuts. You go ahead and use that, Nikki,
I'm gonna know exactly when you're cheating. I use those

(09:35):
dashes correctly, damn it. I know what a hyphen is,
I know what an en dash is, and I know
what an em dash is. Okay, good. I'm taking a look
at what you've written on today's show notes. By
the way, I'm looking at it... you know what, you
didn't use a single one. I'm clever? Oh, judicious? That's it.

(09:56):
It is not judicious. It's a crutch and it makes me angry.
So there are just certain tip-offs here when it
comes to the technology. But when it comes to that technology,
good news: we are going to fix it all on
a federal level. You're gonna find out why, because at
first you thought every state had the right to make
their own rules. They do not. Why? Because that em

(10:18):
dash is being overused everywhere. But fortunately, we've got new
laws against the em dash. Banning the em dash from
ChatGPT is next. Chris Merrill.

Speaker 1 (10:30):
You're listening to KFI AM six forty on demand.

Speaker 2 (10:36):
We have a consideration about AI. Or, Mark, you love
AI. Oh, man, you're never gonna ride a Waymo, and
I think I've talked you out of using any
sort of ChatGPT because it loves the em dash.
I didn't realize we were going to have a
grammar war between Old English Nikki over here and
me. Mark, which side are you on in

(10:58):
this? Neither. I'm just waiting for somebody to whip out
a knife, because this is serious. I'll tell you, the
one thing that the Waymo innovation has done for
me is that it's made me appreciate how much I
never really considered how much I like the horrible cologne,
the unbearable music, and the awful intrusive questions of a real

(11:21):
life cab driver. I took that for granted for so long.
So long. Oh, listening to them scream at someone overseas
in a different language for your entire trip, racing through
those railroad track barriers. Yeah, last time I did a
ride share, the guy backed into a bus, and I

(11:44):
didn't appreciate it like I should have at the time.
And now you just look back fondly. Yeah. It's like
these things are basically like those robot vacuum cleaners.
Who in the hell wants to ride in a robot
vacuum cleaner? I get it. I get it. So one
of the things that I think is really interesting when
it comes to Waymo is that Arizona basically said, no regulations.

(12:06):
You guys can do whatever you want, just don't kill anyone. California,
on the other hand... and it's no surprise,
we had more regulation. We like our regulations in California.
But you would have thought that around San Francisco and the
Bay Area, where they were developing the technology, they would
have been a little more accommodating to what is a

(12:26):
major source of their economy. But they were really hesitant,
really standoffish on it. So really, the question is,
do you have open regulation or do you have heavy regulation?
I think the answer is always somewhere in the middle,
and that we have to weigh
the pros and the cons, and
I think there also has to be a little bit
of local autonomy that goes into these things. The President,

(12:50):
in talking about AI, says, no, no, no, we can't
be doing that. We're gonna have to just take the
control away from the states. And that honestly surprised me.
around the lobe.

Speaker 5 (13:03):
Everyone is talking about artificial intelligence.

Speaker 2 (13:06):
I find that too artificial, I get it. I can't stand it.
I don't even like the name, you know.

Speaker 1 (13:11):
I don't like anything that's artificial.

Speaker 2 (13:12):
There you go, Mark, he's just like you. So could
we straighten that out, please? We should change the name.
I actually mean that.

Speaker 5 (13:20):
I don't like the name artificial anything because it's not artificial.

Speaker 2 (13:23):
It's genius. Oh, it's pure genius. There you go, Mark. Maybe
you don't agree with... Leave me out of this, please. Okay,
fair enough. So now the President says we are going
to take action, and so he has stepped up with
his one rule in dealing with AI.

Speaker 6 (13:41):
The evolution of artificial intelligence and its integration into our
daily lives is happening so rapidly Congress simply isn't keeping
pace to regulate it, leaving a vacuum that states are
trying to fill with a patchwork of laws on AI safety,
copyright protection, and more. Monday morning, President.

Speaker 2 (13:59):
Trump... This, by the way, is from the National Desk.

Speaker 6 (14:03):
There must be only one rule book if we are
going to continue to lead in AI. We are beating
all countries at this point in the race, but
that won't last long if we are going to have fifty states,
many of them bad actors, involved in rules and the
approval process.

Speaker 2 (14:15):
Bad actors. Can I just ask something here?
Go on. Doesn't that reporter sound like an AI? Yes.
But really, I mean, how many small-market or
inexperienced reporters do that? I wish we had tape of
the first newscast I did. The first
time I got hired full-time in radio, it was
to anchor news for a few months, and then they

(14:38):
were gonna slide me into a morning show. This was
a really small market. Basically, they couldn't find anybody
that could speak English to do it, because everybody else
spoke, you know, farmer. So I got the
gig when I was twenty-two, but they had me
reading news, and I was so bad, because I
thought news guys read like this. Yeah. I was so bad,

(14:58):
so bad. And they're good. Pump the brakes, pump the
brakes on that. I will.

Speaker 6 (15:04):
Be doing a one rule executive order this week. You
can't expect a company to get fifty approvals every time
they want to do something.

Speaker 2 (15:11):
Great. The President is likely.

Speaker 6 (15:12):
To find backers beyond his party. Last year, Colorado's
Democratic Governor Jared Polis signed first-in-the-nation
comprehensive AI regulation.

Speaker 7 (15:21):
To do this right, we really need federal action to
establish a national regulatory framework for AI, to preempt the
states and avoid a patchwork of state laws that would
deter innovation and make it less efficient for consumers
as well.

Speaker 2 (15:34):
All right. So there you've got the Colorado governor, a Democrat,
who says we have to have a federal standard,
let's not leave this up to the states,
which I would have expected from a Democrat. Democrats tend
to favor a larger centralized government with a
regulatory body. They trust government. Conservatives tend to be a
little bit more suspicious of government, and they see the
corruption that happens in different places. And I understand where

(15:56):
this suspicion comes from. Where I am a little bit
lost on this is that you've got President Trump,
who's all about states' rights. States should make their own
decisions when it comes to abortion. States should make their
own decisions when it comes to gay marriage. States should
make their own decisions when it comes to elections. Well, no,
the federal government should take care of that, states shouldn't have

(16:16):
the opportunity. Wait, I thought states should have the right? Well,
when it comes to education, well, the federal government should
take care of that. And both Democrats
and Republicans agree with that, just that the Republicans want
to cut some of it and the Democrats want to
fund education more and clamp down more and put
in more regulations. Right. So we run into this argument,

(16:38):
and ultimately I think it comes down to who we're
willing to allow to have control. You've got groups that say
everything should be decided by the states. Oh, okay,
so elections, then, should be decided state by state, which they
are right now. Well, no, no, no. That's
leading to a bunch of fraud and stolen elections. Oh, okay,
so education should... well, no, no, no,

(17:00):
because that's leading to a bunch of DEI and wokeness.
Or no, no, no, because that's leading to
substandards and No Child Left Behind, and that doesn't work.
So you've got the left and the right arguing
over what the other side does, and they don't want
certain states to have certain rights. So my question is this:
what's the criteria for deciding, on any particular issue, whether

(17:20):
or not the states get to make their own decisions?
Because it seems that the parties
aren't really interested in the states making their own decisions unless they're
not getting their way on a national level. So let
me make this recommendation. I'm gonna come to you with
a solution, not just whining. When it comes
to AI, I'm all for federal guidelines. I am all

(17:42):
for it, one hundred percent support it. However, I think
we should also allow the states to add different enhancements
if they feel like the federal law doesn't have enough teeth. So,
for instance, consider civil rights. We go, here's the
baseline for civil rights nationally. That's it. Civil rights: national issue.
Nobody disagrees with that. Okay, what

(18:06):
are protected classes? California has different protected classes than what
we have at a federal level, and I'm okay with that.
I think it's good that California can say, no, no, no,
we're gonna make sure that people are
protected based on gender identity, for instance. Right? On a
federal level, they don't recognize gender identity, they recognize gender.
California is a little bit different, and I'm okay

(18:27):
with that. I think you have to have a baseline
when it comes to civil rights. You have a baseline:
here's how everyone is going to get these protections.
Different states may have different needs. We're a very large country,
almost three hundred and fifty million people in
this country, and we live in different pockets that
have different interests. And I think it's important that states
still have some rights when it comes to the AI stuff. Yeah,

(18:50):
guardrails, baselines, federal, but states still have to have some say.
All right. You trust your doctor to make the call
on your care, but somehow now somebody wants to bring
an algorithm into your conversation, right there in the exam room.
The real reason that people flinch when AI touches their
medical bills is next. Chris Merrill.

Speaker 1 (19:09):
You're listening to KFI AM six forty on demand.

Speaker 2 (19:15):
Come Chris Merrill is in any time on demand of
the iHeartRadio app. Remember when you are on that app,
you can hit the talk back button. Questions, comments, qus quotes, criticisms, compliments,
whatever you got to say, as long as it's illiterative. Yeah,
that's what I like the best. Yeah, well, David, it
was good I played on the air. If we've moved on,
I won't, but I do see all of them that

(19:35):
come through. There is... boy, Mark, you're such a
Luddite when it comes to AI. We're back to
this. Here we are, back to this. Because let me
see if I can convince you on this one. I
don't think this is a tall order, trying to convince
you of anything, but I do think there are places where
it is beneficial. And one of the places is that

(20:00):
Medicare wants to start using AI to take a look
at claims and pre-auths. So this is from 9News
Denver, which was talking with one of the providers there.

Speaker 4 (20:12):
Well, so it sounds like a good idea, right? Right?
I mean, AI has kind of penetrated every aspect of
our lives. Why don't we have AI write the prior authorization,
submit it to the insurance company? But if a human
being is not looking on the other end, the AI algorithm,
which many think is designed to reject everything that's high-cost,
rejects the services.

Speaker 2 (20:33):
Okay, just quick background. This is my wife's forte. You
know, she's... I'm so fortunate. I always say you
should marry someone that complements you. I married someone that
can do my taxes and also take care of the
medical insurance. It's all I need. That's great. You know what, honey,

(20:55):
you take care of that. I'll cook dinner, that's fine. I'll do
the laundry, that's fine. I'll shovel the driveway. I'll
raise the children, that's fine. You fight with the insurance companies,
that's great. That's what she does, her job: contracting, compliance, billing.
She's a coder, she does all this stuff. She runs
a consulting business now, and I'm very fortunate. She's the
breadwinner. She's the best, love her. She's awesome. I can't tell

(21:16):
you how many conversations we have where she says, sweetheart,
they're all idiots. Now, granted, if you work in any environment,
you probably deal with people from other companies or even customers,
and you think that they're all idiots. That's fair, we
all do, I get that. But here's where this really
takes a turn. Insurance companies make money by hiring idiots.

(21:40):
If you are intelligent, you get promoted very quickly at
an insurance company. Do you know why that is? It's
because they don't want you dealing with the customers. They
want idiots dealing with the customers, so they have plausible deniability.
They go, oh, did you call on that? Because we
don't see it in the notes here. Well, did your
person write notes? Oh, I don't know. They might not

(22:01):
have. They're idiots. But if it's not in the notes,
you know, we can't go back and say
that we talked to you about it. So we're
gonna deny that claim. My wife was telling me about this.
She had to leave a company she worked for early on in her career.
It was an insurance company; she was on the insurance side.
Their policy was... okay, briefly: duplicate claims happen all the time.

(22:23):
It's when a provider, a doctor or a
clinic, will send a claim in and say, look, insurance company,
you owe us for this service that we provided, and
then, if they don't hear back from the insurance company,
they'll send another one. That's called a duplicate, or a dupe.
And what happens is the insurance companies will see that
second one and go, you know what, thanks, thanks for

(22:44):
reminding us, we already have it, and then they'll
basically say, ignore that one. Just ignore that one. We
call it a dupe, and that one just gets automatically
denied, because we already have the first one in the system.
Totally normal practice, right? That's normal. What's not normal is...
this happened in her office, and she
was telling me about it. She was just mortified. The
policy was: when in doubt, dupe it out. In other words,

(23:08):
if you see a claim and it looks funny to you,
stamp it as a duplicate and throw it away. That
was the insurance company's policy. If that claim was too much
and you weren't sure if we should pay it, pretend
we didn't get it, throw it out. When in doubt,
dupe it out. That was the internal policy of this company.

(23:33):
That's been thirty years now. I think this company's been absorbed
by about six other insurance companies since then. But the
whole policy was: if it's too much, deny the claim.
Now we're worried about AI doing that, but I'm telling
you, that's no different.

Speaker 4 (23:45):
Starts to question the doctor's decision making, and patients are
sort of left fighting for their own care. So it's really contro-

Speaker 2 (23:53):
Patients are already fighting for their own care.

Speaker 4 (23:55):
versial in medicine, because a lot of private companies have
already launched this, and physicians feel
that there has been an increase in denials since this
came through.

Speaker 2 (24:05):
That's, again, policy. That's not because some private companies
have started using AI, it's because it's policy, that's their practice.
This is why people are so upset with the American
healthcare system. It's why one guy even assassinated
the head of UHC, UnitedHealthcare, the largest insurer in

(24:27):
the country. Now, that is absolutely not justified,
never justified. I absolutely abhor that sort of thing. But
you know what his motivation was. You don't agree with it,
but you know what his motivation was: it was
the policy. So now we're saying, well, we're gonna have
the computers do it.

Speaker 4 (24:44):
Now the insurance companies are saying, no, that's not the case,
our doctors always look at it. But certainly the
concern is, could this be abused?

Speaker 2 (24:52):
Could it be abused? Yeah. And now Medicare is gonna
start using this. You're gonna hear from Doctor Oz after
we get a news update from Mark here in a minute,
Doctor Oz talking about the policy and the practice. But
I'm telling you, it can't get any worse. It absolutely cannot.

(25:13):
I'll tell you what Doctor Oz says about this in
just a few moments, what this program is called and
what you can expect, and also why all of the arguments
about it are just baloney. There's a new plan by
the administration that could put your health care in
the hands, or rather in the circuit boards, of the AI
judge. Your demise is next. Chris Merrill.

Speaker 1 (25:37):
You're listening to KFI AM six forty on demand.

Speaker 2 (25:45):
But first, after Mark's nine o'clock news, we'll talk about
this hostile bid takeover, and how it has absolutely got
the entertainment gurus, people whose livelihoods depend on entertainment, just
spinning, after Paramount steps in and says, no, no, no, Warner,
don't Netflix. That's coming up here after
nine o'clock. In the meantime, we were just discussing that

(26:07):
Medicare is going to start using AI to evaluate pre-
authorizations and potentially reject claims. Now, private companies have been
doing this for a while. In fact, Cigna is being
sued for using AI. One of the country's largest health
insurance companies is facing a class action lawsuit, and
it's actually been going on for a couple of years.

(26:31):
The allegation was that they used artificial intelligence to deny
hundreds of thousands of claims without a physician's review. And
they say that Cigna denied policyholders the thorough, individualized
physician review of claims guaranteed to them by California law,
and the payments for necessary medical procedures owed to
them under Cigna's health insurance policies. They've got about
two million members in the state. So here's the thing.

(26:53):
If you have hundreds of thousands of claims, who do
you think is reviewing that? Physicians? Yeah, practicing physicians?
They are people that the insurance company paid to come
in and write MD or DO next to their names
when they denied claims. It's not as though the insurance

(27:17):
company is offering you a second opinion. The insurance company
is paying doctors to not be doctors, but to use
the MD and the DO after their names in order
to say denied. It is not as though there's an
individual physician who really gives a damn that your name
is on a claim. All they care about is, who's
paying me? The insurance company. And what do they want? Denied.

(27:42):
That's how it works right now. Doctor Oz was on
Squawk Box talking about the new plan here, with AI being
used for the Medicare claim process.

Speaker 5 (27:54):
You can't be a wealthy nation without being a healthy nation,
but we need to make sure that the money is
spent in the right place. When
you go to a checkout counter, you put your debit
card in, and you're immediately told whether you have money
in the bank, so that the vendor knows that they
are getting paid and you know you can actually
pay for it.

Speaker 2 (28:11):
We need something equivalent to that with prior authorization.

Speaker 5 (28:14):
That's not what's happening now. For private insurance companies and
for Medicare Advantage, oftentimes these programs are used as weapons
to deprive care. It's done for a variety of reasons,
but if we could make it work the way it's

Speaker 2 (28:25):
Supposed to work. It's done for one reason: profit.

Speaker 5 (28:29):
To allow us to adjudicate for sure that you're getting
the right care in the right place, that you're not having
some unscrupulous individual doing an unnecessary procedure or giving you
an unneeded medication, that's a helpful thing for the system.
We think it could actually be about one hundred billion
dollars of benefit to the American people if
we can use these services appropriately. So the President
used the power to convene, as he often does. He

(28:51):
realizes you can be nimble if you can get industry
to start doing the right thing. So he asked us
to pull together the insurance companies in America. We got
about the roughly eighty percent of the people in America's
insurance companies now pledging to do something with prior authorization.
They announced it at HHS's headquarters with Secretary Kennedy and myself.

(29:12):
We had members of Congress there and they have said, oh.

Speaker 2 (29:15):
Yeah, is that the one where the guy had the
medical emergency? And Oz jumped in, and Bobby Kennedy
like excused himself. Oh that's right. He fled the room.
You can see like a cartoon cloud of smoke beyond. Yeah.
I gotta give Oz credit.

Speaker 4 (29:28):
Man.

Speaker 2 (29:28):
He jumped right on that. He was like, I mean,
he went straight into doctor mode when he saw somebody
having a problem.

Speaker 5 (29:33):
By the end of this calendar year, they will reduce
the number of procedures for which prior authorization is required,
only the ones that are most important. They're going to
be very clear on what the criteria are for getting these
procedures or these medications, and by next year they'll do
it digitally.

Speaker 2 (29:49):
So just like that debit card.

Speaker 5 (29:50):
Goes in and tells you immediately, not in two weeks,
right that moment, whether you have money to buy that cappuccino,
we'll be able to do the same thing for the
American people.

Speaker 2 (29:57):
Look, I actually think this is good. And I know
a lot of you are concerned about AI reviewing your
medical claims, but I do think it's good because another example,
my wife called on a claim. She was representing
her client, in this case a rehab clinic, and
so she was calling to follow up on a claim

(30:18):
and they were trying to discuss whether or
not it was eligible based on the contract with that clinic.
And so she was going through and she was using
basic terminology, and the customer service rep on the other
end didn't know what the words were. She didn't know
what it meant, she didn't know what the daily rate was.
She had no idea. What do you mean daily rate?

(30:39):
What does that mean? Well, it's a facility, they have a
daily rate. She didn't know what a daily rate was.
So again, the insurance companies, in this case it wasn't
Medicare and Medicaid, it was private, the insurance companies hire idiots,
and then they don't train them. They might not even
be idiots. They might be perfectly intelligent people who just

(30:59):
are not being trained well. And they don't want to
train them well because if they train them well, they
might actually end up accidentally paying out claims they're supposed
to pay. At least with AI, it's
pretty cut and dry. Here's what the contract says.
Here's what the claim is. Here's what the doctor's notes are.

Speaker 5 (31:18):
Is that.

Speaker 2 (31:20):
Eligible for pre-auth? Is that eligible for payment? Yes? Yes, yes, checkbox, checkbox,
checkbox done. Streamline it, take the human error out of it.
I know, Mark, you're gonna say no, no, no.
You were just complaining earlier this hour. I was,
that you keep telling ChatGPT not to use the
em dash, but it keeps doing it anyway. What makes

(31:41):
you think it's going to be better at its job
when it comes to analyzing claims. Well, the thing I
love the most about the AI insurance decisions is that
they're essentially robot death panels. Do you remember we had
these discussions about this? Another Blade Runner reference? No, no, no,
I'm not going to hit you with any more movie stuff.
I got a figure here. When UnitedHealth, Humana and Cigna
used AI to process claims, claims were denied with a

(32:04):
ninety percent error rate. So you're just replacing human error with
computer error. I mean, it's just six of one, half
a dozen of the other. What do you like? Do
you like a human imbecile or a computer imbecile? Well,
I guess you could say at least with the human
imbecile that we're paying somebody, right? I mean, that
is the dark side of this, is that we're
going to be replacing an awful lot of people with those.

(32:26):
But I got another figure here. GoFundMe has
two hundred and fifty thousand pages for medical bills. Those
figures are wow. Okay, So every time somebody's denied, they
got to go somewhere. Yeah they need the care, Yeah
they do, And it's rude of them not to just die.
But yeah, how dare they here? We are a lot

(32:49):
of a lot of hospitals will offer hardship programs. A
lot of people don't know that. They just have to ask.
But these may not even be hospitals. It could be
that the hospital already settled their bill. But now you've
got to deal with the anesthesiologist and the doctor and
whatever else, all the other stuff. All right, You watch
the movies, you pay for the subscriptions. Meanwhile, studios are
getting swallowed up by billionaires and foreign investors and political insiders.
So who's ultimately winning in our studio game of Thrones?

(33:12):
That's next.

Speaker 1 (33:12):
Chris Merrill, KFI AM six forty on demand