Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:05):
Welcome to the Reverse Mullet Healthcare Podcast from BP2 Health. Today we are at Health, and I'm with my co-host, Ellen Brown.
Speaker 2 (00:13):
And I'm with my co-host, Justin Blaine. We're going to introduce each other today, and we have a guest co-host, Erin Martin, with FreshRx, because you know we might like food a little bit and think food is part of health.
Speaker 1 (00:24):
But yeah, we're inserting it here, aren't we?
Speaker 2 (00:27):
Super excited for our guest today, Andrew Toy. Yes, and we happen to be big fans of what Andrew's doing, and we were lucky enough to grab 15 minutes for a Live at Health edition. So, Andrew, tell us about yourself.
Speaker 3 (00:41):
Sure, yeah, my name's Andrew Toy. I'm the CEO of Clover Health. We're a Medicare Advantage plan mainly based out of New Jersey. We only serve Medicare Advantage, so that's the seniors and disabled folks, and we're very, very focused on the early detection and management of chronic disease, a major passion of mine.
Speaker 2 (00:59):
Okay, so I don't even think that Justin was with me. So we had lunch with, I won't use any names here, but we had lunch with, you'll actually really like what I have to say. So we were having lunch with a virtual primary care organization and one of their founders, I still don't think I'm going to give them away, and we were talking about platforms and
(01:22):
we were talking about Medicare Advantage plans, and he brought up that the physicians they employ love working in, love running into, the Clover platform because of the information that it brings them. Oh, that's fantastic. And I was like, oh my gosh, this is so great, because we think it's such an interesting spin on just Medicare Advantage in
(01:45):
general. So to hear it from the clinical side, the virtual guys.
Speaker 1 (01:49):
The virtual guys, yeah, they love it. He was like, it's such a great tool.
Speaker 3 (01:53):
I thought you would love to hear that. I love that, and I didn't intentionally plan that. I did not plan that either.
Speaker 1 (01:58):
Yeah, exactly, this is the first time I'm hearing about it, and I'm absolutely delighted. That's what we're going for. Like, I think the idea is to actually, I wish it wasn't so unusual, but to be a managed care company that wants to help clinicians manage care. Wow, what an idea. Yeah, like, I'm like.
Speaker 3 (02:15):
That seems like, I was like, isn't that the job?
Speaker 2 (02:18):
I thought that's what we were going to do. Yeah, so I'm delighted. Thanks for sharing that. That's awesome. I thought you would enjoy that. So talk for a minute about your AI assistant and just what you've built. Tell us more.
Speaker 3 (02:28):
Yeah, absolutely. So our whole goal is to serve seniors by empowering their physicians. And so the idea is, instead of having, like, a managed care, it's in vogue right now, you know, you have a lot of care management teams and things like that inside the plan. You guys all know about that. We don't employ, like, any of those kinds of people, really. We do have some of our own physicians, but what we do is we
(02:49):
build software so that the physicians who are already out there, the primary care physicians, can get access to more data and practice medicine the way they want, and by doing that they'll do a better job. Really, it's as simple as that.
Speaker 2 (03:04):
And then you deploy AI as well.
Speaker 3 (03:06):
Yeah, so that's behind the scenes, so we don't brand it across there. But I think that my thesis, speaking as a technologist, is AI just makes technology more human. That's its entire job, right? And so, behind the scenes, what the AI is doing is sort of anticipating what would be interesting and important for a primary care physician to see.
Speaker 2 (03:26):
So this person actually said that that's what their clinicians experience.
Speaker 3 (03:31):
I'm even more pleased.
Speaker 2 (03:31):
Like, described it just like that.
Speaker 3 (03:34):
That's what we're going for. Like, once, you know, I was talking to a payer and they were like, oh yeah, but you know, how are you going to make physicians do what you want? And I was like, I think the phrasing of your question is just inherently the wrong thing to say. Like, I mean.
Speaker 1 (03:51):
I think if you're going to try and make them do it.
Speaker 3 (03:54):
They're going to not want to do it, I mean.
Speaker 1 (03:56):
I already don't want to do what you want.
Speaker 3 (04:00):
So what the AI is doing is positioning. I always say, like, it should feel to a physician like they were doing something that felt really natural, that they were already going to do, even though in our data we clearly see that they might not have done it. That's where the AI really comes in, to smooth over that data, to frame it in a way that is really effective.
Speaker 2 (04:20):
I mean, I know with my 15-year-old that the power of suggestion is a lot better than authority. You just made me think of that analogy. I was like, you know, this applies to me as a parent.
Speaker 1 (04:32):
Natural behavior. It sounds so intuitive, but things are done so differently within managed care plans than that. How did you think of this? How did you come up with this?
Speaker 3 (04:43):
So I think about that a lot. I actually don't know the root. It was fairly intuitive to me, and that's not me saying that I'm smarter than everyone else, but I think what probably informs this is, I was giving a version of this talk once. I talked about the assistant and I talked about our approach, and then someone came up at the end. They're like, okay, they're talking to me and they're like,
(05:04):
because you're a doctor, right? Right. I was like, no, I'm not a doctor, I'm a computer scientist. And he's like, oh no, you must be a doctor. Like, no one would talk like this. And then I mentioned, and you probably might have seen, but I'm public that I have a condition called Marfan syndrome. It's a congenital condition, it's a connective tissue disorder. It's genetic, so I've had it since I was born. I've been very well managed, but I'm very high touch. I touch the healthcare system every year.
(05:27):
I go in for, like, scans and things like that. And then when I explained that, I think that this person was like, no, that's what it is. You're not a doctor, but because you're so engaged with the healthcare system, you see what the problems are. That's right. Like, he was literally like, this is not intuitive to most people, because most people just
(05:50):
aren't accessing it the way they are, and by the time they do.
Speaker 2 (05:51):
They're on Medicare and I'm looking after them. Yeah, well, and the other thing that I figured out is true in our conversations preparing for this is you are, in fact, an engineer. I am an engineer, yes. And as we have really jumped into this quest of finding the unicorns in healthcare, of people that are really truly doing things
(06:12):
differently, what I am blown away with is, I think it's probably, what, Justin, like nine out of ten people are engineers by training that fall into the unicorn category.
Speaker 1 (06:23):
I'm not joking.
Speaker 2 (06:24):
It would be an interesting study that, when you really look at who is not just doing what you're supposed to do, how you've always done it, but maybe slightly better, sure, it's the people that are design and systems thinkers, that process.
Speaker 1 (06:39):
Yeah, you're trained to solve problems. We started to purposefully stop asking about people's background until we hear a little bit more, and then we're like, oh, this is really interesting.
Speaker 2 (06:47):
And then it's like, what do you do? Engineer. I was an engineer, I'm an engineer. It's amazing. So, all right, I'm going to ask you the big question that we ask everybody. So what do you think can effect real, all caps, change in healthcare?
Speaker 3 (07:00):
So many things, so many points of pressure. Yeah, and you'll have, like, four minutes, right? Yeah, yeah, yeah. So I'll keep the room.
Speaker 2 (07:05):
Only one word, actually. No, I can do it in one word. Oh wow.
Speaker 3 (07:09):
Impressive. So my word is education, but it's probably not the kind of education you might think it's going to be. It's not clinical education. I can take care of that. I think AI will help a lot with that. The kind of education is, when I think about, do you know what an EOB is? And I deliberately say EOB for the listeners who are like, what
Speaker 1 (07:31):
is an EOB? I'm like, you can check, I can guarantee you have one. Yeah, what is it? What is an EOB?
Speaker 3 (07:38):
And you're all laughing because you know exactly what I meant by that. But I was the kind of person where I was like, okay, EOB, and at the top it says, this is not a bill. And I'm like, then why did you send it to me? I don't understand what this is. What do you want me to do with this? And then I would just give it to my wife. I'm like, here you go, you just get this and you can do what you like with it, right? I didn't understand co-pays.
(07:59):
I certainly didn't understand co-insurance, right? Or, like, pre-auth, all those things. And so, if you think about it, what do we normally do when someone doesn't understand really important things? We teach them those important things. Right, like, that's education. But how, where did you go to learn these things? Like, how many of your friends, I'm sure you guys have these
(08:21):
conversations where, like, they're like, oh, I don't understand it. I'm like, okay, I've got to take you through this.
Speaker 1 (08:25):
This is how this is going to work.
Speaker 3 (08:26):
And then their eyes glaze over very quickly. But then you're like, but this is so important. Like, it's so important. Yeah, you need to be looking at those. Or if they call and they're like, oh, my insurance won't pay for it, and I'm like, no, no, no, no.
Speaker 2 (08:44):
That's right. The conversation does not end there.
Speaker 1 (08:48):
Right.
Speaker 3 (08:49):
Exactly. Like, how, where do you even learn that, right? Like, who's teaching you those things? So to me, it's learning to navigate the healthcare system. Now, knowing what I know, right, I'm like, okay, ChatGPT is going to write a protest letter, you know, something like, we'll take care of this. And so I think that's a really important thing that we could do. That's really interesting.
Speaker 2 (09:09):
Okay, so you guys have any more questions? We have three extra, we have three additional minutes. Erin, your eyes just got big, you have a question? Yeah, I had one and then it just escaped me. Come on, it'll come back. Okay, so yes, so Casey and Calley Means were on Joe Rogan's podcast, and they were talking about writing, that doctors should be writing a medical
(09:32):
necessity letter for people, for food. Oh, for food, for food and exercise, and that way their HSAs could pay for that. And I was just interested to see, what do you think about that? Is that something that you've seen happen, or do you think that's something they could be asking their primary
Speaker 3 (09:53):
care doctors for? See, this is why we have the guest co-host, because she comes up with questions that Justin and I would never ask. Yeah, so a couple of different dimensions there. Like, I don't necessarily think, this is a procedural answer, as an insurance person I'm not a hundred percent sure that just because your doctor would write an order for it, that I would necessarily cover it out of an HSA, but I would have to read the statute and the rules around
(10:14):
that.
I think spiritually it makes sense. I think the issue that I see when you actually go in there, and we have done a lot of stuff, like, if you looked at, like, food is medicine and all these kinds of things, is that the dimension of this, and this is an innate part of healthcare, is that what people actually, and I'm sure you run into this all the time, what people want to do is not remotely correlated to
(10:37):
what is good for them.
Speaker 1 (10:38):
Oh sure, sure, sure, sure.
Speaker 3 (10:39):
And so my first, where my mind first goes, is the kind of person who already wants to use their HSA dollars to buy food that is good for them.
Speaker 1 (10:49):
It's a different person. It's different.
Speaker 3 (10:50):
Right. Now, could you find someone who wants to use their HSA
Speaker 1 (10:55):
to buy Doritos? Oh sure, can you find someone who would want to use it. No, we're not doing that.
Speaker 3 (11:01):
But you see what I'm saying there, right? Yeah.
Speaker 2 (11:03):
But if you're like, oh, you know, you have to be a pretty educated consumer to be asking your doctor for that. That's right, absolutely, that's right, and there's a massive selection bias around those people who would have that conversation in the first place. And so I'm all for it.
Speaker 3 (11:17):
I think it's a good idea, but I'm not sure it cracks the nut on that problem.
Speaker 2 (11:19):
I just got an idea, and I will leave my idea and it'll be like a mic drop walk. How do you incorporate food education for providers into an AI tool? So, like, ICD-9 codes are informing it, medical history is informing it, right? But then it's like, can you, is there,
(11:41):
it's not just food, it gets to, does that AI get to a point where it actually helps educate providers on the knowledge gap?
Speaker 3 (11:49):
Yes, so I have a very specific view here, but this is a podcast, so I'm going to give, like, this is not my official work view, but I'm going to give, like, a podcast view. So I have a personal view that one of the reasons why physicians have a difficult time on food as education is, at the end of the day, physicians are people too, and I think that
(12:10):
there's a certain layer of it where I don't think most physicians necessarily follow all the same rules.
Speaker 1 (12:16):
In fact, I'm pretty sure that they don't.
Speaker 3 (12:18):
I'm being very generous in that statement. And so it's difficult for them to do it, because it's hard to talk to somebody about it, to give them advice.
Speaker 2 (12:26):
Yeah, when you're a human.
Speaker 1 (12:27):
When you don't follow it yourself, or even know it.
Speaker 3 (12:30):
Or know it.
Speaker 1 (12:31):
So my answer is, and they're not trained in it.
Speaker 3 (12:33):
So my answer here is basically that we can actually just have the AI have the conversation on behalf of the physician. Yes, and even if we train them, take them out of that awkward position.
Speaker 2 (12:46):
Yes, sure, I like that.
Speaker 3 (12:47):
And then they can be like, look, I think you should have a chat with this nutritionist.
Speaker 1 (12:51):
Yes, and then?
Speaker 3 (12:52):
They'll have the nice chat.
Speaker 2 (12:53):
I'll do all those things, and the physician doesn't have to do that anymore. I love it. This is such a great conversation. I always like talking to you.
Speaker 1 (12:59):
Thank you, I was really looking forward to having you. Just a very unique point of view, and, like, it is refreshing. That's the word. It's refreshing.
Speaker 2 (13:07):
Well, we wish all the success to Clover Health.
Speaker 1 (13:09):
We really do.
Speaker 3 (13:10):
We're excited to see what all you do, and we're excited to see the success that you're realizing. It's, well, very good timing.
Speaker 2 (13:23):
People like it. Well, thanks, Andrew. Thanks so much, Andrew. Thank you. Thank you.