
December 5, 2025 · 26 mins

In this special two-part series, Rob Gerberry, Senior Vice President and Chief Legal Officer, Summa Health, speaks with Michael Peregrine, Partner, McDermott Will & Schulte, about the health care corporate governance oversight of artificial intelligence (AI). In Part Two, they discuss what an AI governance framework might look like, the board/management dynamic, the role of an AI subcommittee, oversight of workforce issues, and whether AI can support board functions.

Watch this episode: https://www.youtube.com/watch?v=frFnd8VMT1g

Watch Part One: https://www.youtube.com/watch?v=kKLPJAv0vGQ

Essential Legal Updates, Now in Audio

AHLA's popular Health Law Daily email newsletter is now a daily podcast, exclusively for AHLA Premium members. Get all your health law news from the major media outlets on this podcast! To subscribe and add this private podcast feed to your podcast app, go to americanhealthlaw.org/dailypodcast.

Stay At the Forefront of Health Legal Education

Learn more about AHLA and the educational resources available to the health law community at https://www.americanhealthlaw.org/.

Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
SPEAKER_00 (00:04):
This episode of AHLA Speaking of Health Law is brought to you by AHLA members and donors like you. For more information, visit americanhealthlaw.org.

SPEAKER_02 (00:16):
Hello, everyone. This is Rob Gerberry. I'm the Chief Legal Officer of Summa Health and the president-elect designate of the American Health Law Association. I'd like to welcome you to the latest in our continuing series of podcasts on corporate governance issues affecting healthcare organizations. Today's topic is the second part of our two-part series on a critically important governance responsibility: the oversight of

(00:39):
artificial intelligence as deployed by healthcare organizations. As we underscored in our first episode, there can be few issues as pressing to the healthcare industry as the use of artificial intelligence to support operations, administration, and patient care services. AI developments offer the potential for blazing new trails

(00:59):
in how healthcare is managed and delivered. It also creates certain risks with its deployment that need to be monitored closely. The rapidity of change and innovation in this field places great focus on the possible role of the board of directors in connection with its oversight and decision-making responsibilities. And it's a role that's not previously been well defined,

(01:20):
but rather has been evolving. In our first episode, we covered the basics of the board's relationship to the governance of AI, the factors prompting it, the barriers to implementing it, the fundamental rationale for it, and a related approach to establishing a governance framework. In this second episode, we're going to focus on what the

(01:40):
governance framework would look like, as well as how AI may play out in assisting the functions of the board, and new issues associated with AI deployment in the board's oversight of human capital. And as always, I'm joined by our AHLA colleague Michael Peregrine, who's also an AHLA Fellow and a Fellow of the American College of Governance Counsel.

(02:02):
So, Michael, when we left off in our last episode, the cliffhanger episode, you described the NACD's report and its recommended practices for enhancing board integration with AI. Can you now share with our audience some of the nuts and bolts of what a governance structure might look like?

SPEAKER_01 (02:21):
Sure, but Rob, first let me make a point that I think needs to be explained before we do that. It's a point which I think underscores some of the complexity we're dealing with here, and it also reflects the dueling interests of management and the board. Over the last several years, we've seen the rise of a concept or organizational function referred to as AI governance.

(02:44):
It's critical, it's vital, it's essential. From my perspective, the way I view it, it's an internal operational framework that's specifically tasked with establishing and implementing policies, procedures, and standards for the proper development and management of AI, and with monitoring risk, compliance, and

(03:07):
related functions to make sure things are working, the right hand and the left hand, essentially the whole operational ball of wax. It's like a management-level executive team. It's a totally laudable arrangement. It works, it's critical, but it shouldn't be confused with or viewed as a substitute for actual governance by the

(03:27):
organization's board of directors. I wish we could use another phrase, because you know I'll get questions: Why do we need a focused governing board effort? We've already got AI governance. Isn't that enough? It surely helps, and the board should ordinarily be expected to rely on much of the work of AI governance. Absolutely, it's a tremendously focused,

(03:50):
coordinated effort. But the board also should have the right to exercise oversight over its portfolio. So again, let's be careful when we use the term AI governance, because in many organizations it means something completely different from the board's oversight and governance of AI.

SPEAKER_02 (04:05):
So I must admit, sometimes there are people that get confused by that reference. As you think, though, about clarifying that and making sure our members know the proper path to building a structure, what would you suggest?

SPEAKER_01 (04:18):
I think you go to the leaders of the AI governance group and say, you know, what's your definition? And then you go back and say, here's what we think. It goes back to what we talked about, Rob, in our first episode. I think the most critical step is the internal sales job: why there needs to be board oversight of AI. And we're dealing oftentimes with people at the management level who are not used to working with the board function,

(04:41):
and don't necessarily see the value of the board function as well as the board or senior management does. So there's got to be communication. We go to them and say, here's what we view our role as at the board. Tell us more of what you believe AI governance entails. Where does this stuff overlap? How can we work together? If the board sees you as some type of super compliance committee, or a compliance committee on steroids, which is

(05:04):
fabulous, great, how do we work together? And are you listening to us and do you understand that we have a role too? A lot of conversation, sometimes facilitated, has to happen.

SPEAKER_02 (05:15):
So as we move past some of that conversation in theory and maybe to some of the practical impact, when a chief legal officer calls you and says, Michael, help me draft some duties in my charter around an AI governance committee, what do you typically think of?

SPEAKER_01 (05:27):
Well, besides, I want $100,000 up front, right away, in cash... no. I think we start with the basic role here, and that is an understanding that what we're talking about here is the application of the board's basic fiduciary duties in support of the research, development, acquisition, application, what

(05:50):
else, of AI by the organization. We're taking the basic duties of care and loyalty and candor, as we talked about in one of our last podcasts, and we're applying them to this function within the organization, the strategy. Supplemental to this would be to support management in the coordination of the various internal touch points, as we

(06:10):
just talked about, the right-hand, left-hand thing. So I think when we're looking at organizational structure, we start off saying we're not bringing in some new, wild system. We're applying the traditional rules and the traditional approach to a non-traditional business function and strategy, and we're coordinating it with those folks who are not necessarily

(06:31):
used to working with the board. And we're clarifying: this is what we're doing, this is what you're doing, here's how we integrate. But we're not trying to pull a fast one. We're simply applying our traditional expectations to this function.

SPEAKER_02 (06:44):
So if the board takes all that on, what's left for management?

SPEAKER_01 (06:48):
Well, let's talk about that. You know, what are the typical duties exercised by the board in this regard? I think they need to be narrow and focused, but they're traditional. Working with management on the development of organizational strategy: the board's not developing the strategy, the board is asking management to develop the strategy. The board then exercises oversight of its implementation. And that, you know, again, goes back to the proficiency

(07:10):
thing, Rob. We have to keep coming back to that. Another task is monitoring the use of AI within the organization. That goes to keeping a finger on the pulse of what we're doing in AI across the health system or across the healthcare company. And then, this is where the tie-in to AI governance comes in. It absolutely has to mean assuring the development of

(07:33):
effective compliance and risk management controls over AI, and basic policy formation and decision making as to, you know, the deal making. The board needs to have a decision tree, I think, set forth as to deals, innovation projects, ventures, investment strategies, whatnot, that relate to AI.

(07:54):
Because, as you know, a lot of big money is involved in that. And so the board says, we've got to know about it and we have to have an approval process for it. But we're not second-guessing management and we're not micromanaging. We're just simply saying, here are basic kinds of oversight structures, and when do AI-related issues come to us for a decision? Management essentially takes everything else.

(08:14):
They're developing the strategy, they're implementing it with board control, they're monitoring the application of organizational controls, they're monitoring human resources, they're hiring and firing people, they're doing all the things that management would ordinarily do. All we're saying is that, as it relates to the

(08:35):
development and deployment of AI within the organization, nothing is too complex, nothing is too fancy, nothing is too new not to be subject to some level of board oversight. Where it gets a little sensitive, Rob, is where the board is exercising oversight over issues of conflicts of interest, compensation, and consistency with corporate

(08:56):
values as those arise in the AI function, because they can be a little different there. But my message to management would be: this is nothing different from the way the board ordinarily extends its control or supervision over operations. It's just that this is a pretty funky kind of operation, and we all need to work more clearly because we're not used to it.

SPEAKER_02 (09:18):
So that takes us to expertise in the subject matter. I noticed in the NACD report and in some of the national media coverage, there's been a lot of noise about how you find board member expertise in this space. And for those of us in the chief legal officer role, sometimes we play the chief recruiter role. We know where to find an audit and compliance committee member or an investment committee member and where those

(09:39):
areas of expertise sit. But where do we find the AI governance committee members?

SPEAKER_01 (09:43):
Not easy. And this comes back to a point I think we're gonna talk about later, about the use of committees to oversee AI. You talk about a seller's market. Holy cow. You know, every corporation in America is looking for AI-specific tech directors. As to how you find them, I think we're gonna have to go to search firms especially.

(10:05):
You're gonna have to really be patient in your search for them. You're gonna need external help. This will increase the pressure on existing board members to have some level of AI proficiency. Again, it all comes back to that. There's just no excuse for evading it. But I think the other thing that's important, and I'll

(10:27):
get on my soapbox for a little bit, as you know: I've done this now for 46 years, and I have long been a proponent of compensating board members, even nonprofit board members. I just don't get the analysis that, oh, you're a charity, you shouldn't be compensating board members. The law allows billion-dollar charities; you ought to be

(10:48):
able to hire and pay and compensate directors nevertheless, and you're gonna have to do that, as you and I were joking. It's kind of like NIL in college football. You're gonna have to do it if you're gonna get the right talent. And why should some archaic rule about being unable to compensate board members restrict you from getting top talent to exercise oversight over a function that meets

(11:12):
people's needs? I don't get it. But again, off the soapbox: proficiency, recruitment, spend money. Those are the big three there.

SPEAKER_02 (11:21):
So, Michael, back to structure. When we think about where this function best reports, you know, should management be reporting to the full board? Should it be reporting to a subset of the board? How do you see the calibration there?

SPEAKER_01 (11:34):
I think that's a real challenge, and I think it's really a facts and circumstances situation. When we're talking about information and reporting flow, I think we consider, you know, the Caremark standard and what its expectation is in some of the case law. Now, I have not seen any cases emerging at this point, and I may have missed them, that speak specifically to

(11:57):
application of Caremark to AI, but, you know, I think that it really depends upon the circumstances. And going to your question, how many board members do we have that are really, truly proficient, and do we have enough? I look at it as a combination of things. Whether you report to the full board or a tech
(12:21):
committee, I think it um youknow in the perfect world,
you're you're reporting to thefull board.
I think though that because ofscarcity of resources, it
probably makes sense.
And NACD clearly recommends uhthat you establish a tech
committee that that carries muchof the weight in terms of the
oversight, and then it in turnreports to the full board.
That's a totally appropriate uhapproach.

(12:44):
It doesn't excuse all board members from gaining AI proficiency; it just acknowledges that you're going to have some board members who are more proficient than others. And it's really, you know, what works best in that particular governance circumstance. Now the real question, I think, as we take it to the next step, is: does that committee have board-designated

(13:05):
authority to make decisions, or is it simply an advisory committee? That's a tough one. And I'm not sure; that's again going to be the byproduct of the level of expertise and experience that's involved. I'd be a little hesitant. I mean, I see arguments for both. I guess I'd start off with the suggestion that, for

(13:27):
a new, challenging, exciting technology with risks and rewards, I would probably be less excited about delegating the board's authority to a committee. But, you know, again, I think that's something that just depends on a conversation between the board and its committees and its advisors.

SPEAKER_02 (13:49):
So, Michael, you mentioned a technology committee. Are you a fan of a technology committee handling this work, or do you think they need a separate AI committee, given the volume of adoption of AI tools?

SPEAKER_01 (13:59):
I don't care what you call it. As I said before, I think ultimately there should be some kind of board-designated committee with authority with respect to technology, data, AI. I don't care what it is. What I don't want to see, Rob, is that it's filled with board members who are serving on three or four other committees. I think in this situation you put people on the committee,

(14:22):
you give really careful thought to what its charter is and, as we said, whether it's advisory or non-advisory and what the scope of its review is, and I think you leave those people alone, in the sense that I'm not gonna ask you, Rob Gerberry, to serve on the AI committee and also the audit committee. That's nuts. So I think the decision on

(14:44):
staffing the committee, as well as the creation of the committee, has to depend upon what our resources are, how big, how broad, how detailed our application of AI within the organization is, how big the board is, and what the board members' capabilities are for getting engaged in complex committee activities.

(15:05):
It's gonna depend. You know, some organizations, very heavy research-oriented AMCs, are probably gonna want a very detailed, sophisticated AI committee. One thing I will say, Rob, in my experience, is that it's the place to be. I think board members really like to sit on these committees. Some like it because they can learn from it; for others, it's where

(15:27):
the action clearly is; it's an exciting area, you're cutting edge, and you're really at the forefront of where healthcare is going. So I think, in one respect to your question, a lot of sitting board members are going to want to be on that committee. I don't think you're gonna want for people ready to serve. The question is where you get the true experts. And one other point on that, Rob, is, again, if you are limited under state law in what you can compensate a board

(15:50):
member for, bring them in as a special advisor to the committee. Go outside; you know, get that expertise however you need to get it. It doesn't have to be a board member if that person is working for the board.
That's great advice.

SPEAKER_02 (16:02):
So you mentioned in our last episode AI and the workforce, the human capital element of this. If the board undertakes that oversight responsibility, how does it balance not micromanaging as it learns more about AI adoption and its impact on the workforce?

SPEAKER_01 (16:17):
Well, I think this is another one of those areas where the board needs to work with management and say, we have a seat at this particular table. We've all seen the news reports over the last days, weeks, and months about huge layoffs at companies across industry sectors arising from AI implementation.

(16:38):
And you know, you talk to the outplacement firms and the consulting firms like Challenger, Gray & Christmas, and they'll say, look, it's not all job loss. A lot of the AI deployment is resulting in job creation. But I think management needs to understand that, and it's the

(17:02):
board's responsibility to help them understand that the board has always had a fiduciary obligation for oversight of human capital. Whether it's exercised or not, I don't know. But it's always been there. Workforce culture is a corporate asset that has to be protected and nurtured, and the board's responsibility

(17:22):
is to do so. What you're seeing now is an interesting combination of law, regulation, best practice, and corporate social responsibility. And let me explain. I think the board's first question is, are we obligated to get involved in these AI situations when it involves

(17:48):
people losing their jobs? Well, you know, we all know that there are some federal laws, SEC reporting requirements and the WARN Act, that basically require a notice if you're going to be terminating a lot of employees. There are even some state regulations. I think New York is the first one; I think New York even has its own state WARN Act.

(18:08):
Again, there's no obligation or law that says to the board, you must monitor AI deployment when it affects the workforce. But there is a best practice, and again, it arises out of the obligation to oversee human capital, and there are many interesting statements of thought

(18:29):
leadership, including from NACD and others like it, that would say it extends to the employment issues affected in technology transition, not just AI: when you are implementing technology decisions and strategies, and that is going to necessarily cause workers to be replaced by

(18:50):
machines, the board needs to be involved. And I don't think that's an issue at all. It just simply has to have a seat at the table, not to block management's decisions, but to be part of them, and frankly, to make sure the corporate values are supported. And the final thing, and people might say, we don't need to hear

(19:10):
from our lawyers about moral responsibility, about corporate responsibility, but boards and management need to be aware that there is a lot of emerging discussion about whether or not there's a moral obligation for the board and the corporation. We know that Pope Leo, and I'm Episcopalian, I'm not

(19:32):
Catholic, so I'm not obligated to do what he says, but you know, Pope Leo has been very honest about concerns about where AI is going and the importance of protecting the dignity of workers in this area. You even have people like the former Chief Justice of the Delaware Supreme Court, Justice Strine, speaking out very,

(19:53):
very authoritatively on the moral obligations with respect to supporting the rights and dignity of workers as they face displacement by AI. So that's a long-winded answer, but if I was gonna predict, Rob, today, November 12th, what is going to be an issue at the

(20:13):
forefront of AI discussions over the next year, it's gonna be this one. And it's gonna be an issue that I think management is gonna push back on, understandably so. So again, it comes back to a discussion of why the board needs to be involved in this. There are expectations, not necessarily legal, but

(20:37):
governance principles and corporate social responsibility principles, as well as the principles of fiduciary duty, that say we need to be involved and work with you to make sure that this is not abusive, that we are respectful of the workforce, because that's what our corporate mission statement says we do.

SPEAKER_02 (20:58):
So, Michael, as a good Catholic, you made my day by citing the Pope as one of our authorities in our podcast.

SPEAKER_01 (21:03):
Well, you know, it's interesting, and this is something I'm sure will fascinate all of our listeners. The Pope, who is a White Sox fan, I would point out, said he took the name Leo because the last Pope named Leo was very active in speaking out during the first industrial revolution, which we'll all

(21:24):
remember from high school, junior year history classes. And he wrote an encyclical on the whole question of the dignity of the workforce. So there's some thought here. This isn't just willy-nilly stuff. And there will be people on the board who will be motivated by or influenced by these issues, rightly or wrongly. It's just not an issue, Rob, that I think is going to go away.

SPEAKER_02 (21:47):
So, Michael, one last question before we leave this topic. I know my board chair and my peers will ask, doesn't this properly reside with the human resources committee? Why should it sit potentially in the AI committee?

SPEAKER_01 (21:57):
Well, I think there has to be coordination. That's a terrific idea. The reason that I would ask the AI committee to take a lead, and again, this is right-hand, left-hand stuff, is that the ability to evaluate the pros and cons of AI deployment that might affect the workforce, I think, needs to

(22:19):
involve board members who understand the strategy. Are those members on the AI committee or the human capital committee? I don't know. I would also note that a lot of the emerging principles over this aspect of fiduciary duty expect the board to hold management accountable for its decisions.

(22:40):
You better be right on this. If we're gonna go ahead and remove a lot of workers, we're gonna really hold you to making sure that we've achieved these benefits that you predict. And so is the human capital committee the right one to do that? You know, I think it has to be a combination of working, a

(23:01):
combination of reach out. And there's nothing wrong with that. They just need to integrate the work. And I think the general counsel is the key person to say, hey, wait a minute, maybe we have combined meetings of these committees to address these issues. No need to be siloed.

SPEAKER_02 (23:16):
So before we wrap up our two-part podcast series, Michael, I've got to ask: AI is pushing all aspects of a healthcare organization to be more efficient. How about the corporate governance function? Can you see ways that AI could support us in our board functions?

SPEAKER_01 (23:30):
You know, Rob, I'm a get-off-my-lawn guy. I don't have warm and fuzzy thoughts about this. I think it's too early. I've seen AI in the boardroom introduced in areas like information flow, actual governance operations, which I think is crazy, risk evaluation, and, of course, minute taking. I know a lot of really knowledgeable consultants are

(23:52):
pushing the pedal on these and other applications. Good for them. You know, at some point it may work out. But right now I'm not seeing the evidence that they really materially improve the quality and efficiency of governance. I've seen how, especially with information flow, they actually can overwhelm the board and paralyze it with information and options and data.

(24:14):
There are just huge trust issues with essentially delegating boardroom duties to AI. I know that there are knowledgeable people out there right now who are breaking their pencils and throwing their computers against the wall and saying, what does this idiot think he's talking about? He doesn't know what he's talking about. You know, on this I do, I'm sorry.

(24:35):
And I'd also point out that AI can't take minutes like a board secretary. Rob, if you're taking minutes of a board meeting, you sense what's going on, you sense the flow of the meeting, you understand what points are emphasized and what's not, you understand the tenor of comments. AI can't do that. It cannot be alert to the emotion. Maybe it can, I don't know, but I'd rather trust Rob Gerberry

(24:57):
than Robby the Robot to be alert to the emotions of various board members and officers. But let's wait a while, let's give it a chance. I don't close the door totally. I'm just saying right now, I don't see the positive evidence. I see the evidence that it overwhelms board members and actually works against what we're trying to do.

SPEAKER_02 (25:16):
Well, Michael, we've thrown a lot at our listeners over these last two podcasts. I want to thank you for continuing to be a thought leader in the corporate governance space, particularly on an emerging issue like AI. So thank you for that. Thank you to our loyal listeners also for hanging in and being a part of this series with us.

SPEAKER_01 (25:31):
And can we let our AI alter ego do that presentation for us?

SPEAKER_02 (25:37):
Only if you can finish with your CD player and allow it to happen.

SPEAKER_01 (25:40):
There you go. All right, well, thank you, Rob, very much.

SPEAKER_02 (25:43):
Thank you.

SPEAKER_00 (25:49):
If you enjoyed this episode, be sure to subscribe to AHLA Speaking of Health Law wherever you get your podcasts. For more information about AHLA and the educational resources available to the health law community, visit americanhealthlaw.org, and stay updated on breaking healthcare industry news from the major media outlets with AHLA's Health Law Daily Podcast, exclusively for AHLA comprehensive members.

(26:12):
To subscribe and add this private podcast feed to your podcast app, go to americanhealthlaw.org/dailypodcast.