
Safety models & accident models

Eric Marsden
<eric.marsden@risk-engineering.org>
Mental models

▷ A safety model is a set of beliefs or hypotheses (often implicit) about the features and conditions that contribute to the safety of a system

▷ An accident model is a set of beliefs about the way in which accidents occur in a system

▷ Mental models are important because they impact system design, operational decisions and behaviours
Accidents as “acts of God”

▷ Fatalism: “you can’t escape your fate”

▷ Defensive attitude: accidents occur due to circumstances “beyond our control”

▷ Notion that appeared in Roman law: reasons that could exclude a person from absolute liability
  • e.g. violent storms & pirates exempted a captain from responsibility for his cargo
Simple sequential accident model

H. Heinrich’s domino model (1930)

Assumptions:
▷ Accidents arise from a quasi-mechanical sequence of events or circumstances that occur in a well-defined order

▷ An accident can be prevented by removing one of the “dominos” in the causal sequence
Simple sequential accident model

The “safety pyramid” or “accident triangle” (H. Heinrich, 1930 and F. Bird, 1970)

Assumptions:
▷ Each incident is an “embryo” of an accident (the mechanisms which cause minor incidents are the same as those that create major accidents)

▷ Reducing the frequency of minor incidents will reduce the probability of a major accident

▷ Accidents can be prevented by identifying and eliminating possible causes
Simple sequential accident model

According to this model, safety is improved by identifying and eliminating “rotten apples”:
▷ front-line staff who generate “human errors”
▷ whose negligent attitude might propagate to other staff

Some accidents (in particular in high-risk systems) have more complicated origins…
On “human error”

“For a long time people were saying most accidents were due to human error and this is true in a sense, but it’s not very helpful. It’s a bit like saying that falls are due to gravity…”
— Trevor Kletz

A useful alternative concept to human error is performance variability.
Is it relevant to count errors?

▷ Counting errors produces a quantitative assessment of the “safety level” of a system

▷ Allows inter-comparison of systems

▷ Can constitute the point of departure for a search for the underlying causes of incidents

[Figure: an assumed inverse relationship between the number of errors (a quantity) and the safety level (a quality)]

This simplistic model is heavily criticized.


Is counting errors relevant?

Who is more dangerous?

▷ 700 000 doctors in the USA
▷ between 44 000 and 98 000 people die each year from a medical error
→ between 0.063 and 0.14 accidental deaths per doctor per year

▷ 80 million firearm owners in the USA
▷ responsible for ≈ 1 500 accidental deaths per year
→ 0.000019 accidental deaths per firearm owner per year

The probability that the human error of a doctor kills someone is 7500 times higher than for a firearm owner. [S. Dekker]
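The slide’s arithmetic can be re-checked in a few lines. This is a minimal sketch in Python, using only the estimates quoted above as inputs:

```python
# Re-computation of the figures quoted on this slide; all inputs are the
# slide's own estimates, not fresh data.
doctors = 700_000                          # physicians in the USA
deaths_low, deaths_high = 44_000, 98_000   # annual deaths from medical error

rate_low = deaths_low / doctors            # ≈ 0.063 deaths per doctor per year
rate_high = deaths_high / doctors          # ≈ 0.14 deaths per doctor per year

firearm_owners = 80_000_000
firearm_deaths = 1_500                     # accidental deaths per year
firearm_rate = firearm_deaths / firearm_owners  # ≈ 0.000019

print(f"per doctor: {rate_low:.3f} to {rate_high:.2f}")
print(f"per firearm owner: {firearm_rate:.6f}")
print(f"ratio (high estimate): {rate_high / firearm_rate:.0f}")  # ≈ 7500
```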
Epidemiological accident model

[Figure: James Reason’s Swiss cheese model, from Human Error (J. Reason): an event passes through aligned holes in successive defensive layers (technical barriers, procedures, sharp-end workers, systems and safety management, cooperation) before becoming an incident, then an accident.]
Assumption: accidents are produced by a combination of active errors (poor safety behaviours) and latent conditions (environmental factors)

Consequences: prevent accidents by reinforcing barriers. Safety management requires monitoring via performance indicators.
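As a rough numerical caricature (our illustration, not from the slide): if each defensive layer is assumed to fail independently on demand, the probability that an event traverses every layer is the product of the per-layer failure probabilities, and a latent condition that weakens one layer inflates the whole product. All figures below are hypothetical:

```python
# Caricature of the Swiss cheese intuition: an accident requires the holes
# in every defensive layer to line up. Assuming (illustratively) independent
# layers, P(accident) is the product of per-layer failure probabilities.
from math import prod

barriers = {                    # hypothetical per-demand failure probabilities
    "technical barriers": 0.01,
    "procedures":         0.05,
    "sharp-end workers":  0.10,
    "safety management":  0.10,
}

print(f"P(all layers breached) = {prod(barriers.values()):.1e}")   # 5.0e-06

# A latent condition weakens a layer: degrade "procedures" tenfold
barriers["procedures"] = 0.5
print(f"with latent degradation: {prod(barriers.values()):.1e}")   # 5.0e-05
```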
Bow-tie model

[Figure: bow-tie diagram. Causes on the left pass through preventive barriers to the top event; protective barriers on the right limit the impacts. The left-hand side is a fault tree, the right-hand side an event tree.]

Bow-tie: example

[Figure: example fault tree for the top event “no flow to receiver”, caused by “no flow from component B”; this arises either because component B blocks flow or because there is no flow into component B, which in turn requires no flow from both components A1 and A2, each caused by its source (source1, source2) failing or the component blocking flow.]
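The fault-tree logic of this example can be written out as boolean expressions. In the sketch below, the gate types are our reading of the diagram (components A1 and A2 are assumed to feed B in parallel, so flow into B is lost only if both branches fail; each branch fails if its source fails or the component blocks):

```python
# Boolean evaluation of the example fault tree. Gate types (AND for the
# parallel A1/A2 branches, OR elsewhere) are our reading of the diagram.
def no_flow_to_receiver(source1_fails: bool, a1_blocks: bool,
                        source2_fails: bool, a2_blocks: bool,
                        b_blocks: bool) -> bool:
    no_flow_from_a1 = source1_fails or a1_blocks
    no_flow_from_a2 = source2_fails or a2_blocks
    no_flow_into_b = no_flow_from_a1 and no_flow_from_a2  # both branches lost
    return no_flow_into_b or b_blocks                     # top event

print(no_flow_to_receiver(True, False, False, False, False))   # False: one branch lost
print(no_flow_to_receiver(False, False, False, False, True))   # True: B blocks flow
```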
Loss of control accident model

The model depicts accidents as occurring when a system destabilizes beyond the point from which recovery is possible.

[Figure: an accident trajectory passes a destabilization point; prevention acts before destabilization, recovery between destabilization and the accident, and mitigation after the accident. Figure source: French BEA]


Drift into failure

Human behaviour in any large system is shaped by constraints: profitable operations, safe operations, feasible workload. Actors experiment within the space of possibilities formed by these constraints.

Management will provide a “cost gradient” which pushes activity towards economic efficiency.

Workers will seek to maximize the efficiency of their work, with a gradient in the direction of reduced workload (least effort).

[Figure: the space of possibilities is bounded by economic failure, unsafe operation and unacceptable workload; management pressure for efficiency and the gradient towards least effort push activity within this space]

Figure adapted from Risk management in a dynamic society, J. Rasmussen, Safety Science, 1997:27(2)
Drift into failure

These pressures push work to migrate towards the limits of acceptable (safe) performance. Accidents occur when the system’s activity crosses the boundary into the unsafe zone.

A process of “normalization of deviance” means that deviations from the safety procedures established during system design progressively become acceptable, then standard ways of working.

[Figure: drift towards failure, with activity migrating across the boundary of unsafe performance]

Figure adapted from Risk management in a dynamic society, J. Rasmussen, Safety Science, 1997:27(2)
Drift into failure

Mature high-hazard systems apply the defence in depth design principle and implement multiple independent safety barriers. They also put in place programmes aimed at reinforcing people’s questioning attitude and their chronic unease, making them more sensitive to safety issues.

These shift the perceived boundary of safe performance to the right. The difference between the minimally acceptable level of safe performance and the boundary at which safety barriers are triggered is the safety margin.

[Figure: effect of a “questioning attitude”: the perceived safety boundary shifts, creating a safety margin]

Figure adapted from Risk management in a dynamic society, J. Rasmussen, Safety Science, 1997:27(2)
Non-linear accident model

Systemic models:
▷ FRAM (Hollnagel, 2000)
▷ STAMP (Leveson, 2004)

Assumption: accidents result from an unexpected combination and resonance of normal variations in performance

Consequences: preventing accidents means understanding and monitoring performance variations. Safety requires the ability to anticipate future events and react appropriately.
Image credits:
▷ Sodom and Gomorrah burning (slide 26): Picu Pătruţ, public domain, via Wikimedia Commons
▷ Dominos (slide 27): H. Heinrich, Industrial Accident Prevention: A Scientific Approach, 1931

For more free content on risk engineering, visit risk-engineering.org

Feedback welcome!

This presentation is distributed under the terms of the Creative Commons Attribution – Share Alike licence.

@LearnRiskEng
fb.me/RiskEngineering

Was some of the content unclear? Which parts were most useful to you? Your comments to feedback@risk-engineering.org (email) or @LearnRiskEng (Twitter) will help us to improve these materials. Thanks!
