Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Patrick Sullivan (00:12):
Hello, you're listening to EPITalk: Behind the Paper, a monthly podcast from the Annals of Epidemiology. I'm Patrick Sullivan, Editor-in-Chief of the journal, and in this series we take you behind the scenes of some of the latest epidemiologic research featured in our journal.
Today we're here with Dr.
(00:34):
Larry Hearld to discuss his article "Integrating Existing and Novel Methods to Understand Organizational Context: A case study of an academic public health department partnership." You can read the full article online in the December 2024 issue of the journal at www.annalsofepidemiology.org.
(00:55):
Our guest today, Dr. Larry Hearld, is a Professor and Director of Research in the Department of Health Services Administration at the University of Alabama at Birmingham. He is an Associate Director in the UAB Center for AIDS Research and the UAB Center for Outcomes and Effectiveness Research and Education, where he serves as the Director of the Dissemination, Implementation and Improvement Sciences Core.
(01:17):
He's currently the Co-Editor of Healthcare Management Review and serves on the editorial boards of Medical Care Research and Review and Health Services Research. His research focuses on the antecedents and consequences of organizational changes in healthcare, with special emphasis on dissemination and implementation science in healthcare.
(01:38):
Dr. Hearld, thank you for joining us today.
Larry Hearld:
My pleasure, happy to be here.
Patrick Sullivan:
So I want to start out by talking a little about implementation science.
Your manuscript is part of a special issue that Annals has been hosting on implementation science, and I think it's something that a lot of epidemiologists are hearing about but may have variable proximity to.
(02:00):
So can you just sort of give us an idea about what implementation science as a field is about, and particularly this idea of organizational context and how that supports the implementation of interventions that have been found to be efficacious, that are evidence-based but need to move into that stage of making it into public health practice?
Larry Hearld (02:22):
Yeah, that's a great question, so I can give you what I would consider a more technical definition and then give you some examples.
I think most people would agree that, you know, implementation science is the systematic examination of efforts to get people and organizations to use medicines, therapies,
(02:44):
technologies and other types of interventions, as you said, that are known to be effective; that is, they have some kind of evidence base behind them. Organizational context, broadly speaking, refers to the social, political and resource conditions within organizations.
Some good and well-known examples of things that I think
(03:06):
people think of as organizational context include things like organizational culture and the decision-making structure of organizations. So, for example, does the organization have a flat decision-making structure, where most decisions are made at the front lines, or is it a more centralized and hierarchical
(03:28):
decision-making structure, where decisions are made by top-level executives?
And the reason that organizational context is so important to consider for implementation science is that many, if not most, evidence-based interventions in healthcare are either directly implemented in healthcare organizations like hospitals or clinics, or are mediated by
(03:53):
professionals within these organizations, such as physicians, nurses or social workers. And efforts to implement an evidence-based intervention in organizations typically entail a change in the organizational context by introducing new organizational routines and work processes or modifying existing ones.
(04:15):
So I think it's really imperative that implementation science studies consider these existing organizational routines and how these routines will need to be changed to support implementation. So that's how I think the field of implementation science and organizational context go in many ways hand in hand.
Patrick Sullivan (04:37):
Great. So this paper is about one particular academic public health department partnership. Can you talk a little bit about what the goals of that partnership were and how that is sort of depicted in your manuscript?
Larry Hearld (04:51):
Yeah, so the overarching purpose of our study, or goals of our study, which in this case was entitled Coastal, are to increase HIV testing in priority populations, decrease the amount of time it takes to link people who test positive to HIV care, and, of course, decrease time to viral
(05:11):
suppression amongst those who are in care. And the way we're approaching this, the way we're trying to do this, is by adapting and, currently, implementing three Centers for Disease Control evidence-based interventions. The first one is this data-driven approach to direct community-based HIV testing in areas with low testing coverage.
(05:35):
The second part of the intervention, Project Connect, is trying to expedite linkage at the time of diagnosis. And then the third component of this bundled intervention is a rapid antiretroviral therapy start program, a rapid start program. So to understand how to adapt and implement that intervention,
(05:55):
or those three components of that bundled intervention, one of the first things we wanted to do was to understand in detail the local conditions, that is, the organizational and community context that may require certain types of adaptations, before rolling out these interventions for full-scale use in our local health department that's partnering with us in
(06:17):
this work. And that is really the focus of the work presented in this paper: how we approached understanding that context, the setting where we were implementing this intervention with our local health department partner.
Patrick Sullivan (06:34):
So can you walk us through the methods you used in this process and just give a little more detail about how you actually went about this work?
Larry Hearld (06:43):
Yeah, absolutely. So we began, in our case, with a two-day site visit to the county health department that was our partner in this project. Six members of the research team participated in this
(07:04):
initial site visit, and during the site visit the research team met with various members of the health department, including the mobile testing team, the community outreach team, including disease intervention specialists and community health workers, as well as the infectious disease clinical team, and really just asked them to describe their respective work responsibilities and workflows and how activities were coordinated across these different parts of
(07:26):
the organization.
So, for example, we asked the disease intervention specialists: when a person with HIV is referred to you, who is your first point of contact in the clinic? Who do they see after that? And the goal was to get this general understanding of the layout of the organization, both physically
(07:47):
and process-wise, and who is involved in different work processes and how these processes were coordinated between people within and across the health department. Following that site visit, multiple team members took different field notes, and
(08:11):
we went back and really consolidated those and used them to construct an initial process map that identified and linked those activities together in a workflow, in a sequence.
Next, another research team member visited the health department several months later and observed, with permission of course, different work activities, such as linkage to care interviews that were being conducted by disease intervention specialists, and observed new patient
(08:35):
appointments for someone who was recently diagnosed with HIV and entering into care for the first time. The research coordinator also conducted more informal observations in different locations, like the clinic waiting room, and was there really to note things like patient flow in the clinic and observe conversations between
(08:55):
staff, as well as between staff and patients.
But after that we conducted semi-structured interviews, 13 interviews in this case, via Zoom, and these interviews were really informed by that provisional process map that we made after the initial site visits, and we specifically selected members of the health department teams
(09:15):
that were involved in testing, linkage and HIV treatment, as well as some of the health department administrators. The final step in that process was a virtual collaborative process mapping session between the research team and members of the health department implementation team, and this entailed a 60-minute virtual think-aloud session
(09:39):
where we presented our initial provisional process map, which was based on the initial site visit, those participant observations and interviews, and we essentially asked participants to tell us what we got wrong by correcting things on the process map in real time. So we used a virtual whiteboard, in this case a Mural
(10:00):
board. Participants could make those changes themselves or instruct the facilitator to do it, and the end product was this finalized set of processes that describes how a community member may interact with and flow through the health department. So that's all in the manuscript, where we describe the process of getting to that process map as well as
(10:22):
an illustration of that process map.
Patrick Sullivan (10:25):
Yeah, thanks for that level of detail about how you actually did this. And you sort of mentioned at the end that the key product was this map, you know, that talks about how people might move through services. In developing that, what were the key takeaways or lessons learned? Some of your partners are people who live in these environments, right? So you're helping them to think
(10:47):
about things maybe in a different way. So what were some of the key takeaways that came out of that process that you think might shape how the services are delivered, or the impact of them?
Larry Hearld (11:04):
Yeah, Patrick, great question. I think there are really maybe two takeaways. One, as you point out, from the perspective of the participants, that is, the health department members: I think being able to document and observe how a patient, or how a client, moves through that process and
(11:25):
moves through their organization is really illuminating. It sort of requires them to think about, as well as question, some of the processes that exist in the organization, and so I think there's that benefit that exists directly for, in this case, our health department partner.
(11:46):
For me, and I think for us as a research team, one of the key takeaways from this study, and I think this is true of a lot of my work in recent years, is a recognition of the importance of getting close to your subjects, in this case the health department. I was formally trained as a management and organization
(12:08):
scientist. So in my training, I cut my teeth, so to speak, using large secondary data sets to ask and answer questions about how things like the design and structure of healthcare organizations may affect performance, so things like quality of care and cost of care, but at a very macro, organizational level, right? And that type of work
(12:32):
frequently didn't require me to directly interact with organizations. So something I've come to appreciate more and more as my research career has evolved, including and especially as I've gotten more closely involved in implementation science research, is the importance of getting your hands dirty. If you're really going to be effective at implementing things
(12:55):
in organizations and studying these things in organizations, you really need to get under the hood of these organizations. It's not enough to know the make and model, so to speak. You need to understand how it is engineered to run. And this study was fun because it represented one of the most in-depth explorations of an organization, where we were able
(13:17):
to apply a lot of different methods to understand how and why things were done in a particular way. So I think that's, for me, really one of the main takeaways from this study, as well as this type of work.
Patrick Sullivan (13:30):
Great, and you touched on this a bit, but with this idea of the multi-method integrated approach and these different ways of trying to understand the organizational function, do you have any other thoughts in general about the strengths or challenges of that kind of method as you described it?
Larry Hearld (13:51):
Yeah, certainly. I think the real strength of this type of integrated approach is that it enables researchers to get a more comprehensive picture of the organizational conditions that may support or hinder efforts to implement innovations by combining
(14:11):
multiple perspectives and multiple types of data, much like any mixed- or multiple-method approach. In this case, however, it was one focused very specifically on understanding the organizational context, which, as I mentioned, I think is particularly important when trying to understand how to
(14:31):
adapt interventions to those settings and how to implement them. I also think a strength of this approach is that it was very iterative and cumulative; that is, we were able to incorporate things we learned in previous interactions and data collection activities to inform future interactions with our partners,
(14:53):
and I'd like to think that that made the final product a more accurate depiction of what things actually looked like in the organization. Now, the challenge to this approach, of course, is the time and resources required to conduct such an iterative, multi-method approach to understanding organizational context.
(15:22):
So we were fortunate in some ways that we were only doing this with one organization. I'm not sure we would have been able to do it at scale, so I think that's something others who are trying to apply this would have to consider: how to balance that need to go deep when you have to take that kind of approach across many different organizations.
Patrick Sullivan (15:37):
Yes, and I think that challenge of sort of breadth and depth is one that's common to many kinds of studies or inquiries that we do, but it maybe falls into both the opportunities and the challenges of this kind of approach.
Now let's move a little bit to a part that we call Behind the Paper, and
(16:03):
it's really meant to understand, just for us as people, as scientists, as folks who want to improve health, what the personal experience is of doing the different kinds of analyses and work that we do. And here I think the real highlight is the partnership with the county health department. So can you just talk a little bit about your experience in that partnership with the health department, and how is that the
(16:24):
same or different than collaborative models that we might encounter in other kinds of research?
Larry Hearld (16:44):
Yeah, I think that's a great question, and it's fitting, because I think a lot of the work in this space, both implementation science but also specifically understanding how to diagnose and prevent or treat HIV, really does embrace this kind of collaborative model or collaborative approach. And, as I think many people who do this work would say or recognize, this model of co-production can be really incredibly fruitful, in that it does a better job of incorporating the perspectives
(17:09):
of the end users, which of course is critical in implementation science. Ultimately, we are trying to support the use of things that we know to be effective by people who will benefit from their use. But it takes time. You have to cultivate and constantly curate relationships with your partners, which means you have to have a certain level
(17:31):
of patience, which isn't always easy when you're doing funded research and you have clear milestones and expectations that you need to meet from a funder perspective. But you really can't rush the relationship-building process. So because of that, you also have to be proactive in your planning when you're doing this type of research.
(17:53):
So, because if other research activities are contingent upon establishing these relationships, you need to hit the ground running in a study like this. Or, more realistically, I think these relationships exist prior to implementing the study, perhaps cultivated in the proposal development stage or in previous
(18:13):
research studies; it's difficult to imagine otherwise. Where we've run into problems with this type of research is when you're trying to build those relationships while also trying to implement a study. That can be really challenging. So in many ways, I think you need to begin a study like this with those relationships already either intact or in development.
Patrick Sullivan (18:36):
I appreciate that. And I think the whole piece of the relationship here is exactly that each partner is bringing a different set of tools and experiences. Organizations that are providing services are understandably focused on the best ways to provide those services to their clients, and so the implementation science process,
(18:58):
or we might say a research process, or this process, is in some tension, right? Because it's actually coming in and trying to be a little more systematic and analytical about what folks are doing. Sometimes, I think, for people in the organizations it can feel, if not critical, at least like that sense of being observed.
(19:19):
And so I think that this level of comfort, and the shared goals, and the reassurance that what comes out of the end of this is meant to help, you know, is meant to reflect back a little bit and help think about how the process might be optimized to promote the organization and the clients: that has to be clear, and you have to have that trust throughout, or I think it can otherwise be a little bit awkward because
(19:40):
of the different perspectives that people are bringing. So I really appreciate you talking through how you handle that.
So I really appreciate this manuscript, which is a different kind of manuscript than we sometimes get at Annals of Epidemiology, and I really appreciate you linking it into the idea of how interventions are implemented to
(20:02):
make public health impact. I think that's a critical phase that a lot of the articles we see published need to move through, and so I just want to point out to people who are interested in this topic of implementation science that we currently have an open special issue on the website; there's a special section for it. You can see other manuscripts that relate to implementation
(20:24):
science and how they integrate with epidemiology on the website. So I'd encourage folks who want to learn a little more to look at what else is in the special issue, and I think there are going to be some things that are of interest. That brings us to the end of this episode.
Thank you again, Dr. Hearld, for joining us today. It was such a pleasure to have you on the podcast.
Larry Hearld (20:44):
My pleasure.
Thank you for having me.
Patrick Sullivan (20:50):
I'm your host, Patrick Sullivan. Thanks for tuning in to this episode, and see you next time on EPITalk, brought to you by Annals of Epidemiology, the official journal of the American College of Epidemiology. For a transcript of this podcast, or to read the article featured on this episode and more from the journal, you can visit us online at
(21:11):
www.annalsofepidemiology.org. Thank you.