
July 13, 2025 11 mins

Are safety myths, such as the belief that most accidents are the result of human error, holding back genuine improvement in safety?


Can myths like these actually hamper learning, and increase operational risk?


Today's article is from Besnard, D., & Hollnagel, E. (2014). I want to believe: Some myths about the management of industrial safety. Cognition, Technology & Work, 16, 13-23.


Feel free to shout me a coffee to support my site & podcasts: https://buymeacoffee.com/benhutchinson


More research at SafetyInsights.Org

 

Intro/Outro: "Dark Synth Wave" by ElephantGreen (Pixabay.com)


Make sure to subscribe to Safe AF on Spotify/Apple, and if you find it useful, please help share the news and leave a rating and review on your podcast app.
I also have a Safe AF LinkedIn group if you want to stay up to date on releases.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
(00:07):
Are we stuck in a safety trap, constantly pointing fingers at
human error or adding more layers of ineffective
protection? What if the very beliefs guiding
our safety efforts are holding us back?
Good day everyone. I'm Ben Hutchinson, and this is
Safe AF, a podcast dedicated to a thrifty analysis of safety, risk
and performance research. Visit safetyinsights.org for

(00:31):
more research. Today's paper is from Besnard
and Hollnagel (2014), titled I Want to Believe: Some Myths about the
Management of Industrial Safety, in Cognition, Technology and
Work. The authors argue that many
industrial safety practices are littered with fragile beliefs,
rendering safety management flawed and ineffectual.

(00:54):
They define a myth not merely as an assumption, but as an idea or
story that many people believe, but which is not true.
These myths are deeply rooted in our culture, widely shared, and
profoundly influence decisions and actions.
The authors contend that acknowledging these myths is a
crucial first step towards genuinely improving industrial

(01:16):
safety. Let's explore six myths.
Myth 1. Human error is the largest
single cause of accidents and incidents.
This belief is pervasive. For instance, a 2010 report
declared that human error is involved in over 90% of all
accidents and injuries in the workplace.
This idea has a long history. It remains a fundamental part of

(01:39):
many investigation methods, often marking the deepest point
of analysis. However, the authors argue that
this is a simplistic and counterproductive view.
They highlight that labelling something human error is a
judgement often made with hindsight bias.
It implies wrongdoing in seeking a culprit.

(02:00):
Crucially, it typically focuses only on sharp end operators, the
people directly involved with the process, ignoring the
broader context and working conditions imposed by managers
and the organisation. And if human error is the cause
of an event going wrong, what about the countless times human
actions make things go right? They propose a revised

(02:23):
statement. Human error is an artefact of a
traditional engineering mindset that treats humans like fallible
machines, failing to consider the vital role working
conditions play in shaping performance.
Myth 2. Systems will be safe if people comply with the
procedures. We often assume that following a
procedure not only gets the job done, but gets it done safely,

(02:46):
and any deviation from procedures automatically creates
risk. This treats people as if they
were machines. But this isn't helpful.
Procedures are inherently incomplete.
They can't cover every possible situation or fully describe
the right reaction. Humans constantly interpret and
adapt procedures based on the situation and their experience.

(03:08):
The authors draw on a major accident to highlight that
rigid, blind compliance can actually be detrimental to
safety and efficiency. Human flexibility is essential
to compensate for the brittleness of procedures and
actually contribute to safety. A revised statement is: actual
working situations usually differ from what the procedures

(03:30):
assume, and strict compliance may be detrimental in practice.
Procedures should be used carefully and intelligently.
Myth 3. Safety can be improved by
barriers and protection. More layers of protection
result in higher safety. This seems intuitive, reflecting
the defence in depth approach. However, the relationship

(03:51):
between protection and risk is not straightforward.
One reason is psychological. People often adjust their
behaviour based on perceived risk.
The authors tie this point to risk homeostasis, suggesting
people maintain a certain level of comfortable risk.
For example, some limited studies suggested that taxi
drivers with ABS braking systems drove more aggressively in

(04:15):
curves and actually had a slightly higher accident rate than
those without ABS. However, this argument hasn't
aged too well because the risk homeostasis hypothesis as
advanced by Gerald Wilde isn't well empirically substantiated.
But behavioural adaptation is well supported, meaning people

(04:35):
can and do change their behaviour in response to
interventions, designs, and so on.
The second reason is technical. Adding protection inherently
increases the system's complexity.
More components and more couplings not only introduce new failure
points, but also exponentially increase the number of

(04:56):
combinations that can lead to unwanted outcomes.
A revised statement is: technology is not neutral.
Additional protection changes behaviour, so the intended
safety improvements might not be obtained.
Myth 4. Accidents have root causes, and
root causes can be found. Root cause analysis is a common
umbrella term for a number of different techniques and

(05:18):
approaches, assuming that system parts are causally related and
effects propagate in an orderly fashion, allowing us to trace problems
back to their origin. However, this method relies on
assumptions that don't hold for complex systems: that
events repeat, that outcomes are strictly bimodal, that is,
correct or incorrect, and that cause-effect relations can be fully

(05:41):
described. Human performance isn't bimodal,
it varies. It rarely fails completely, and
humans can recover from failures.
When an analysis points to a human error as the root cause,
it often overlooks how the same human flexibility makes things
go right most of the time. The paper suggests that the
preference for simple root cause explanations is due to a

(06:04):
psychological desire for comfort and satisfaction in tracing
something unfamiliar back to something familiar.
Instead, the authors argue that unwanted outcomes in complex
systems don't necessarily have clear, identifiable causes.
A revised statement is: human performance cannot be described
as if it were bimodal. In complex systems, things that

(06:27):
go wrong happen largely in the same ways that things go right.
Myth 5. Accident Investigation is a
rational process. While investigations are serious
undertakings, they are rarely purely rational.
Practical constraints like deadlines and resource
limitations often dictate the depth of analysis and the

(06:47):
methods used, becoming a trade off between efficiency and
thoroughness. Moreover, every investigation
method embodies certain assumptions about how accidents
happen. Perhaps most challenging is the
need to establish responsibilities.
This need can heavily bias investigations, making finding a

(07:07):
culprit more important than understanding the contributing
or causal factors. As Woods et al. put it,
attributing error is fundamentally a social and
psychological process and not an objective, technical one.
The paper suggests that accident investigation is a social
process where causes are constructed rather than found.

(07:28):
Indeed, investigators operate on a what you look for is what you
find principle, meaning their chosen method and their existing
world views direct what they see and don't see.
To overcome this, it's essential to pursue what they call second
stories, deeper analyses that go beyond the simplified first
stories of apparent causes. A revised statement is:

(07:50):
accident investigation is a social process where causes are
constructed rather than found. Myth 6. Safety always comes
first. Statements like safety always has the highest priority
and will never be compromised are often made for communication
purposes, expressing noble values and goals.
The paper cites a 2004 assessment of the BP Texas City

(08:13):
refinery accident, which had a low injury rate but experienced
a major explosion, killing fifteen people. Before the
explosion, employees ranked making money, cost and budget,
and production as their top priorities, with major incidents
coming in fifth. Safety has undeniable financial
implications. Safety costs are immediate and

(08:36):
tangible, while their benefits are often potential and distant.
This leads to trade offs where safety is often as high as
affordable. Safety budgets, like any other
constraint, are limited and decisions involve prioritisation
and feasibility, often trading safety against economy.
A revised statement is: safety will be as high as affordable

(08:59):
from a financial, risk, acceptability and ethical
perspective. Some takeaways: these
myths, shared across all layers of organisations and society,
lead to flawed safety practices. They are resistant to change
because they are rarely questioned.
The authors propose a radical shift.
Instead of defining safety as a property that a system has,

(09:22):
safety should be seen as a process, something a company
does. It's dynamic, constantly
negotiated, and varies in response to changing conditions.
More importantly, the goal of safety should shift from
focusing on what goes wrong to also understanding and enabling
what goes right. We spend immense efforts

(09:43):
preventing unsafe functioning, but relatively
speaking, hardly any effort is directed towards bringing about
safe and reliable functioning. Measuring safety solely by a low
number of negative outcomes is insufficient.
It should be tied to indicators of an organisation's dynamic
stability, its ability to succeed under varying

(10:05):
conditions, its capacities to respond, monitor, anticipate
and learn. In a complex world with multiple
interacting constraints, operating perfectly is
impossible. The safety myths covered here
support an unrealistic ideal of safety management.
To successfully operate increasingly complex systems, we

(10:26):
need to abandon these myths and adopt more sensible and
sustainable assumptions about safety.
Hence, complex systems work because people learn to identify
and overcome design flaws and functional glitches.
Quoting Dekker: people finish the design in
practice. People can adjust their
performance to the current conditions.
People interpret and apply procedures to match the

(10:48):
situation. People can detect when something
is about to go wrong and intervene before the situation
becomes seriously worsened. This means that systems work
because people are flexible and adaptive, rather than because
the systems have been perfectly thought out and designed.
But of course, these same positive capabilities can also

(11:09):
be our road to ruin, because people may adapt in ways that
are locally optimised but more hazardous at a global,
organisational level. That's it on Safe AF. I'm Ben
Hutchinson. Please help share, rate and
review, and check out safetyinsights.org for more
research. Finally, feel free to support

(11:30):
Safe AF by shouting me a coffee; link in the show notes.