Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
(00:07):
In industry, root cause analyses, or RCAs, are a widely used collection of approaches and tools. They're meant to be a rigorous process for digging deep after an adverse event, identifying contributing factors and crafting solutions that will prevent recurrences.
The expectation is that an RCA leads to real, tangible
(00:28):
improvements. But what does the latest
research actually show about the effectiveness of RCAs? Are RCAs consistently delivering on their promises to make our
workplaces safer? Or are we sometimes just going
through the motions without seeing the desired impact?
Good day, everyone. I'm Ben Hutchinson. This is Safe As, a podcast dedicated to the thrifty analysis of safety,
(00:51):
risk, and performance research. Visit safetyinsights.org for more
research. Today's study is from Martin-Delgado et al. 2020, entitled How Much of Root Cause Analysis Translates into Improved Patient Safety? A Systematic Review, published in Medical Principles and Practice.
(01:11):
It's a systematic review that explored whether root cause analyses are effective at reducing the recurrence of avoidable adverse events in healthcare, or adverse events as they're otherwise called.
21 studies were included: nine of moderate quality, five of considerable quality, and seven of high quality.
It's another healthcare study. As you may have picked up, it's worth
(01:33):
highlighting that RCA is an umbrella term.
It's not some monolithic tool, but rather includes many
different tools and methods. Because of the study designs and the overall limited evidence, they couldn't sort the studies into a comparison of the specific RCA tools.
So what did they find? Overall, they found the
(01:55):
effectiveness of RCAs is unclear and inconsistent. According to the paper, it's not clear if root cause analysis is effective in preventing the recurrence of adverse events. And although early studies suggested that RCAs are effective in promoting ideas for preventing recurrence, more
recent studies don't confirm these findings.
(02:17):
Put simply, we're not seeing strong or consistent evidence that RCA processes actually help prevent future harm. Digging further into the details, in only two studies could it be established that RCAs contributed to the improvement of patient care to some extent. And these two studies were themselves limited by the number of RCAs reviewed, really weakening the evidence overall.
(02:41):
Also, they found that in 50% of the cases the recommendations from the RCAs were quite weak, which didn't lead to a reduction of adverse events. So half of the time, the outputs from the RCAs aren't even strong enough to really drive any tangible improvements in the things they're supposed to actually help improve. They also found that the action
(03:01):
plans, so the corrective actions, are really poorly designed and untested. One study found that action
plans didn't follow any controlled implementation
pattern, so no link could be established between the plans
and actual tangible operational improvements.
Put simply, the fixes that result from these RCA processes
may be too flawed or too poorly implemented to really help
(03:23):
improve operational outcomes, and maybe they can even create some new problems. Also, they found that most of
the recommendations ignored deeper systemic issues.
For instance, most of the proposed recommendations focused
on active errors from people and neglected latent contributing
factors. So the focus on people provides
(03:45):
maybe short term solutions, but really only partially helps to
avoid future problems. It's not really improving the
conditions that people work in. So instead of addressing system
flaws, these RCA processes, according to this research,
often result in blaming frontline staff or in really shallow solutions. They also found there was a strong lack of follow-up and verification of the
(04:07):
improvements. So the RCA processes didn't
really require any checks to see whether the improvements were actually carried out. This disconnect undermines their usefulness. Without follow-up, even good
recommendations can fail. Not surprisingly, they also found that
there was quite low involvement from those closest to the
incident. So managers and personnel
(04:28):
involved in the actual adverse events had low participation
rates in the investigation teams.
This limited not only their insight into what was actually
happening, but also reduced potential psychological recovery
for second victims and those emotionally affected by the
incident. Put another way, leaving out
frontline staff leads to weaker analysis and missed chances for
(04:49):
healing and improving. And they found that blame
culture really discourages reporting and RCA participation.
So they found cultures that were really focused on searching for the guilty party, those responsible. And this created tensions in the work environments that really
challenged interprofessional relationships.
(05:09):
This fear and lack of trust led professionals to refuse to even
participate in the incident processes.
Simply put, if people fear punishment, they won't speak up.
So in conclusion, RCAs may well be useful for some things and can lead to remote and immediate fixes, but they don't seem to be very effective for long-term or effective
(05:32):
implementation of measures to prevent the recurrence of incidents. And although some studies have demonstrated the usefulness of RCAs and their recommendations, most of the published studies, at least in this healthcare review, found that just over half of the recommendations
weren't even useful enough to prevent the same incidents from
recurring in the future. They were disconnected from the
(05:54):
things that they were supposed to be focusing on and fixing.
So they conclude that RCA approaches can potentially help
understand some contributing factors, but they often fail to help with the fixes or fix the issues. Instead, the fixes tend to be really weak, shallow, or even ill-suited to improving these complex environments like
(06:14):
healthcare. So what do we make of these
findings? Well, again, it's not one tool.
RCA covers a lot of ground. In any case, I think we really
need to be clear on what we want RCA processes to achieve, what they can achieve and should achieve, and evaluate whether our
processes are configured for those results.
(06:35):
Do we have any unfair expectations or unrealistic
expectations around what we want them to be able to achieve?
Do we even have the right resources and expertise?
Maybe a shake-up is needed. Do we need fewer investigations
and maybe more proactive learning?
Do we need better feedback and engagement channels with workers
or design-level interventions? What about different teams being
(06:55):
involved or trying different tools and approaches?
And maybe, instead of building a shopping list of contributing factors and detailed timelines that will benefit few, we could spend more time learning about daily work and its constraints, and designing in better, safer work.
Importantly, work that I covered previously from Lundberg
(07:17):
suggested that investigations, probably like most processes in
organisations, are affected by a host of sociopolitical factors separate to the incident itself, but which still significantly impact what investigations find, what they construct, and what they ultimately try to fix. Maybe, to a degree, some of these constraints are just fundamental limits within organisations.
(07:39):
So for limitations, there were a few, but I think the main one really is that there's a relatively limited body of evidence on RCAs and what they actually change in occupational settings. Also, and I sound like a broken record here, but there's also the issue with
using statistically rare events like incidents.
It's a huge limitation. It's just really difficult to
(08:02):
connect upstream factors with downstream outcomes.
However, a number of other healthcare RCA studies using
different measures like corrective action quality also
find widespread and systemic issues with the RCA approaches.
That's it on Safe As. I'm Ben Hutchinson. Please
(08:22):
help share, rate, and review, and check out safetyinsights.org
for more research. Finally, feel free to support
Safe As by shouting a coffee; link in the show notes.