
December 2, 2025 18 mins

In this special two-part series, Rob Gerberry, Senior Vice President and Chief Legal Officer, Summa Health, speaks with Michael Peregrine, Partner, McDermott Will & Schulte, about corporate governance oversight of artificial intelligence (AI) in health care. In Part One, they discuss the board’s core role regarding AI, the specific details of that role, and the board’s connection to AI deployment decisions.

Watch this episode: https://www.youtube.com/watch?v=kKLPJAv0vGQ

Essential Legal Updates, Now in Audio

AHLA's popular Health Law Daily email newsletter is now a daily podcast, exclusively for AHLA Premium members. Get all your health law news from the major media outlets on this podcast! To subscribe and add this private podcast feed to your podcast app, go to americanhealthlaw.org/dailypodcast.

Stay At the Forefront of Health Legal Education

Learn more about AHLA and the educational resources available to the health law community at https://www.americanhealthlaw.org/.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
SPEAKER_00 (00:04):
This episode of AHLA Speaking of Health Law is brought to you by AHLA members and donors like you. For more information, visit americanhealthlaw.org.

SPEAKER_01 (00:16):
Hello, everyone. This is Rob Gerberry. I'm the Chief Legal Officer of Summa Health and the president-elect designate of the American Health Law Association. I'd like to welcome you to the latest in our continuing series of podcasts on corporate governance issues affecting healthcare organizations. Today's topic focuses on a critically important governance responsibility.

(00:37):
The oversight of artificial intelligence as deployed by the company. And as most of our listeners know well, there can be few issues as pressing to the healthcare industry as the use of artificial intelligence to support operations, administration, and patient care services. AI development offers the potential for blazing new trails in how healthcare is managed and delivered.

(00:58):
It also creates certain risks with its deployment that need to be monitored closely. The rapidity of change and innovation in this field places great focus on the possible role of the board of directors in connection with its oversight and its decision-making responsibilities. And it's a role that has not previously been well defined, but rather has been evolving. For those and similar reasons, we're treating the topic in a

(01:21):
special two-part episode of our podcast series to provide our listeners with hopefully useful information on the board's core role, the specific details of that role, how AI can be used in support of board duties, and the board's connection to AI deployment decisions that may have negative implications for the company's workforce.

(01:41):
And in that regard, we'll flag for you certain existing resources that may be helpful as you work with your management and board leaders in creating a balanced approach and an effective pathway forward for board involvement. And as always, we're lucky to be joined by our AHLA colleague and fellow, Michael Peregrine, who's also a fellow of the American College of Governance Counsel.

(02:03):
So, Michael, before we dive into this new and important topic, I have a basic question that's been sort of bugging me. We're recording this podcast in late November of 2025, but AI has been around now as an issue for at least three years. Are we a little late to the party in coming up with this podcast series?

SPEAKER_02 (02:23):
I don't think so, Rob, not at all, for a couple of reasons, which kind of go to the complexity of AI. First of all, the question of the board's proper governance role is really far from being settled. And that's an important point for our listeners to take into consideration. The important National Association of Corporate Directors report on this issue didn't come out until, what,

(02:45):
October of 2024. Second, there really aren't any statutory guidelines, at least at the federal level, or case law on the subject. Third, many boards really haven't yet resolved their own approach to governance of AI at the board level. There continue to be real questions about the board's capacity to address AI issues.

(03:06):
Just check out the newspaper reporting, including the Wall Street Journal, on today, the day we're recording this. Fifth, the trust and reputation issues that have long been associated with AI are never-ending. And sixth, human capital issues are now rising to the top with a bullet as it relates to AI deployment.

(03:27):
Is that helpful?

SPEAKER_01 (03:29):
So with that list, maybe we should be doing a three-part series. But seriously, what is it about AI that makes it such a challenging board concern?

SPEAKER_02 (03:39):
Well, first of all, I have to disclose I am not a tech specialist. So, you know, my knowledge about the pure efficiency and effective application of AI is gonna be limited. But I think looking at it from a governance perspective, it gets more simple. And I think the answer to your question is the sheer complexity of the technology, which may come easy to some board members

(04:00):
and certainly to members of the executive leadership team, isn't gonna come to all board members. There's just gonna be a wide diversity of AI proficiency within an organization. And in my experience, that's just bound to create tension, particularly between the board and management. That's where the problem is. And also it's gonna be between the board, the tech managers

(04:21):
within the organization, and executives who live constantly in this world, with innovators and the venture capital people and the like, constantly presenting new research and development opportunities and platforms. It can get messy, as there is a gap between the knowledge level of the board and the knowledge level of managers who are dealing with this and the opportunities on a day-to-day

(04:43):
basis. We've got to close that gap.

SPEAKER_01 (04:45):
So, Michael, with that potential imbalance, doesn't that potentially argue against strong board involvement?

SPEAKER_02 (04:51):
Well, we can't let that be the prevailing thought. The simple, inescapable fact is that in a corporate entity model, whether it's a for-profit or not-for-profit organization, the board of directors can, and I think must, play an important role in the healthcare company's approach to machine learning technologies like AI, and in the mitigation of the associated risk.

(05:12):
Period, exclamation point. We just can't let that narrative proceed, even though it's a logical one. And I think the urgency, Rob, is that the pace of AI evolution is so great that the board, if it delays at all in developing a corporate governance structure, can fall behind in proficiency, and then it becomes really unable,

(05:33):
it's always catching up, and it's not in a position to truly evaluate the tough AI calls. When that happens, management loses its confidence, and things spiral down.

SPEAKER_01 (05:45):
So maybe, Michael, let's take a step back and just think about building that foundational model for a board. What do you think it looks like as it relates to the oversight of AI?

SPEAKER_02 (05:53):
Well, I think it reflects the traditional oversight, decision-making, and information flow themes that are consistent with the core fiduciary duties that we always talk about. But I think the basic model should be broad enough in coverage to include things like core strategy development, research, acquisition, investments and partnerships on

(06:14):
innovation, deployment, compliance and risk, trust and reputation, and human capital, and more on that later. Again, I think the board's governance structure should have a finger on the pulse of all of those matters. And I think it also should be designed to supplement the AI supervision already in place at the operational level, and it should help communicate to corporate constituents how AI

(06:36):
is being used responsibly in the company's operations and the provision of healthcare-related services. So oversight, communication, decision making, a lot of the same traditional themes, really zeroed in on how AI operates in the organization.

SPEAKER_01 (06:52):
Well, with that answer, you make it all seem simple enough. So what is the barrier, then, to getting such a structure put in place, since it seems really consistent with how a lot of boards are already structured?

SPEAKER_02 (07:03):
Well, I'm gonna get in trouble here, but, you know, part of the problem lies in a couple of factors. One is the absence of any all-encompassing AI regulatory framework at the federal level. Not making a political statement here, but, you know, we went from one administration that was very focused on having specific regulations at the federal level and a comprehensive approach to

(07:24):
AI regulation, to another which is focused on eliminating barriers to initiative and development of AI. Second of all, we've got the release of the NACD Blue Ribbon Commission report that I mentioned before. That was a real game changer. Before that time, we never had any real accepted best

(07:47):
practices for AI oversight by the board. So again, we've got the alpha, the absence of any kind of real federal scheme or framework for AI regulation, and the omega, where we finally got some guidance at a best practice level on how boards should approach AI oversight. Both of those combine, I think, to really force specific

(08:12):
evaluation of what the board's doing, and it gives them a pathway forward. But I think all of this has been compounded by what I would say, and again, I'll get in trouble, is a significant lack of internal and external appreciation for the contributions that the board can make to address AI risks and strategies. And you see a lot of that in today's Wall Street Journal story, today being, what is it, November 12th, when we're

(08:34):
taping this, where you have a lot of experienced directors saying, I'm having a hard time understanding this stuff. That's a problem.

SPEAKER_01 (08:42):
So, with that lack of understanding, how do we move forward? How do we get everyone to appreciate the need for AI and then the oversight of it?

SPEAKER_02 (08:49):
Well, I think you've got to drill down with the people on the operational side to build an awareness of why the board has a real stake in this game. That's the biggest challenge, because I think in some organizations there's a fair amount of resistance from corporate leaders to a strong board role. If we can't overcome that, if we can't make the sale, so to speak, on why the board should be involved, we've got huge

(09:11):
problems. You know, I see this in my experience. We see the resistance coming from tech leaders, researchers, scientists, developers, the innovation people within the organization, all these folks who believe, maybe with some amount of justification, that any kind of corporate monitoring system will needlessly frustrate innovation and competitive advantage, and therefore, in our business,

(09:33):
be a detriment to the delivery of health care in the most efficient way. It's a theme that also shows up, as I said, in the approach to federal regulation of AI, i.e., the more regulation, the more that critical progress and innovation will be stifled. That is a problem.
So I think my point is, we have got to be able to make the

(09:55):
sale that corporate governance has a critical role to play, it's a meaningful role, and that the board is, you know, up to the challenge. But as I said, some of the criticism is fair, especially with respect to board proficiency. As I said, in the Journal story today, where you have some prominent and seasoned board members express their own frustration with the complexity of AI, that's the essence of the

(10:18):
problem. And we go back to the NACD report, and we'll talk about that more later. It starts off, you know, what's the pathway to effective board governance? Proficiency, proficiency, proficiency. But the system just doesn't work, the checks and balances we need to run an organization will not work, without some board role with respect to AI.

(10:38):
It's just that simple.

SPEAKER_01 (10:40):
So it sounds like the board chair, the chief legal officer, and others may have to play a role in bringing along some of the naysayers and stressing the importance to the rest of the board.

SPEAKER_02 (10:49):
Well, yes, and I think the chief legal officer should get battle pay for engaging in that role. But I think there's a couple of approaches that you take to sell this idea of board oversight to the naysayers. I think first is the argument that there absolutely must be a system of checks and balances with respect to AI deployment in order to make sure that stakeholder interests are

(11:10):
served, that compliance is effected, and that the critical issues of trust and reputation are addressed. And that can't come from the operational side; it must come from folks who have a fiduciary obligation to the organization. You know, without that kind of oversight, the company, the healthcare provider, is flying blind into a thunderstorm of

(11:31):
risk. Second, the argument you make is that in the absence of any, again, as I said, in the absence of any meaningful federal regulation on AI use, at least at this point, it falls on the board and its Caremark obligations to make sure that there's a system of compliance and risk oversight that protects the organization through appropriate policies,

(11:52):
procedures, and practices. Try saying those three P's sometime, Mr. Gerberry, right? And then there's the liability exposure to the board should it fail to exercise some level of oversight. If it should just basically say, we wash our hands, this is too complex, and we're walking away, that would be catastrophic. There's no D&O coverage in the world that's going to cover you

(12:13):
when you just throw up your hands and you say, this is too complex, we'll let others deal with that.

SPEAKER_01 (12:20):
Not to mention, it seems like this responsibility is consistent with best practice.

SPEAKER_02 (12:24):
Well, yeah, that's absolutely true. And listeners to our podcast, Rob, will remember I'm not wild about throwing out the term best practice lightly because, as we know, there's no real practice or process to confirm its existence. I'm not issuing a legal opinion that something is the best practice. But I do kind of evaluate this from the perspective of

(12:46):
who's preparing the analysis, where is it coming from, who's adopting it? And that's why, again, the NACD report from last year, which I think is called Technology Leadership in the Boardroom, is so significant. It came out in October, and it's the byproduct of a really notably diverse blue ribbon commission consisting of leaders from tech, finance, management, military,

(13:08):
consulting, higher ed, insurance, and the law. So you've got the people in place who are saying, here are the aspirational goals. This isn't a consulting firm survey, as valuable as those are. It's the input of industry leaders, and they're providing recommendations as well as a toolkit for moving forward.

(13:29):
So does that constitute best practice? It's the best we have. It is absolutely the best we have, and I think it's a very credible board resource. Is that dodging the question?

SPEAKER_01 (13:41):
I don't think so, because by my read, I think that NACD report, you know, does make it clear that it's an affirmative statement supporting the board's role in this oversight function.

SPEAKER_02 (13:52):
It really does. The report's overarching conclusion, and I don't get any piece of the action for membership in NACD, let's get that straight. I wish there were other statements out there from some of the other thought leadership organizations, but there aren't. It's what we have to deal with. But the report's overarching conclusion is this: in the

(14:12):
current environment, effective corporate governance has a significant impact on whether and how new technologies will drive value creation and will be or won't be accepted by organizations, economies, and societies. Period, end of quote. That is the message that has to be delivered internally.

(14:33):
There is a tremendous value proposition to effective corporate governance oversight.

SPEAKER_01 (14:41):
For our listeners that haven't dived through all the pages of this lengthy report, can you give them a CliffsNotes version?

SPEAKER_02 (14:47):
Yeah, I think it's long and it's detailed, but it offers, at least to me, and of course I may have a different definition of marvelous, a marvelous pathway forward. But in a nutshell, the report focuses on three imperatives. This is the pathway, this is the direction forward, this is the suggestion about how a board can go ahead and implement effective governance.

(15:08):
The first imperative, or the first step, is strengthening oversight. So they say, let's ensure trustworthy technology by aligning it with the organization's purpose and values. That's not touchy-feely stuff. What does your statement of corporate values mean, and how does your AI use apply to that? We'll come back to that maybe in our second episode.

(15:30):
It means upgrading board structures with expertise on technology governance. It means clearly defining the board's role in data oversight. And I think this is particularly important, Rob. It defines decision-making authorities for technology: when do you have to go upstream to the board and management levels? I think the lines of authority are clear.

(15:51):
The second is what they call deepening insight. I call it, you know, proficiency is everything. It says, all right, first we want to establish and maintain the tech proficiency that's critical amongst the board. And that could be a long slog, but it has to happen. You evaluate it periodically. There's an interesting story,

(16:14):
a news story the other day about a corporate executive, the CEO of a major, major company, which basically says, we expect you, the employees, to come up to speed and train yourself and update yourself so you will be useful and AI-knowledgeable and help our company going forward. You know, if you don't, all bets are off.

(16:35):
So it's appropriate that we turn that lens on the board as well. Through our evaluations: have you been learning and training, teaching yourself, and picking up on the education for AI? And then have appropriate and clear metrics for the oversight of technology by the board. You're giving the board a framework, a dashboard,

(16:56):
whatever, on how to measure it. That's the insight factor. The third factor is kind of a threefold one. The board and management agree that tech generally is a core element of the long-term strategy of the organization, and why. Then you enable exploratory board and management technology

(17:18):
discussions. People talk together, they get together, they share ideas and thoughts. Then, and this is kind of an interesting one that I like, you design board calendars and agendas. You just kind of hardwire it into the system to make sure there's going to be appropriate focus on forward-looking discussions. We build it in, we come back, we remind ourselves this is going to be on the agenda for the next five, ten meetings or whatever.

(17:40):
I think that's the three-pronged approach that NACD takes. It's not the only approach, but again, as we said before, in terms of thought leadership, it's all we've got. But it's substantive.

SPEAKER_01 (17:50):
Well, Michael, that's probably as much AI as our human listeners can handle for one podcast session. So let's go ahead and hit the pause button. We'll return for part two of this series, where we'll press more on how the governance role is structured and what it looks like, and then we'll also look at emerging issues in this space. So, Michael, thank you very much for the kickoff to this podcast series.

SPEAKER_02 (18:11):
Uh, look forward to that.
What a cliffhanger, huh?

SPEAKER_01 (18:14):
That's right.
Thank you.

SPEAKER_00 (18:20):
If you enjoyed this episode, be sure to subscribe to AHLA Speaking of Health Law wherever you get your podcasts. For more information about AHLA and the educational resources available to the health law community, visit americanhealthlaw.org, and stay updated on breaking healthcare industry news from the major media outlets with AHLA's Health Law Daily Podcast, exclusively for AHLA Comprehensive members.

(18:43):
To subscribe and add this private podcast feed to your podcast app, go to americanhealthlaw.org/dailypodcast.