
insight

To err is human... but organisations make mistakes too. By understanding the cause of errors, accident, incident and near-miss investigations can be more effective. Ronny Lardner and Mark Fleming tell us more.

EARLIER THIS year the Health and Safety Commission issued a discussion document on proposals to introduce a new duty on employers to investigate the cause of workplace accidents. There was support for goal-setting legislation, further guidance, and promotion of root-cause investigation of the fundamental causes of accidents and incidents.

So what are the implications for chemical engineers? The 80:20 accident causation rule of thumb (80% human and organisational causes; 20% technical causes) suggests hard-won technical knowledge will need to be supplemented with an insight into how human and organisational factors can cause or contribute to accidents or unplanned incidents. Although this short article cannot provide all the answers, it seeks to whet the appetite for this complex field of knowledge, and point interested readers in the right direction.

Box 1: Underlying causes of accidents

A NUCLEAR industry study identified the following underlying organisational, team and individual causes of accidents:

Underlying cause                          % of accidents
Deficient procedures or documentation     43
Lack of knowledge or training             18
Failure to follow procedures              16
Deficient planning or scheduling          10
Miscommunication                           6
Deficient supervision                      3
Policy problems                            2
Other                                      2

Models of human and organisational error

The basic problem is that we all can and do make errors, regardless of how well-trained and motivated we are. This human fallibility exists throughout the organisation, and extends beyond front-line workers involved at the time of an incident to include management responsible for work organisation, equipment design, organisational priorities and the like.

The HSE discussion document on accident investigation notes that when accident investigations are carried out, they frequently concentrate on determining the immediate technical causes, and often stop when there is someone to blame for the event (often those closely involved with the circumstances), failing to get to the underlying causes.

So what determines when a halt is called in the search for root causes? It seems people have stopping rules in their search for explanations, which are guided by the underlying model or paradigm of human behaviour and error they hold. Four major error paradigms have been distinguished¹ (see Box 2). Which paradigm(s) do you subscribe to?

All of these error paradigms have some merit. However, the uncritical adoption of one paradigm to the exclusion of others is likely to result in human and organisational root causes of accidents and incidents being missed, as in the example in Box 3².

Box 2: Error paradigms

1. Engineering error
   Basic assumptions: people are an unreliable component in the system.
   Solutions: remove people from the system via automation; improve human reliability through good workplace and interface design.

2. Individual error
   Basic assumptions: poorly motivated people commit unsafe acts, break rules and procedures.
   Solutions: discipline those involved or reward safe behaviour; reduce organisational pressures to violate rules and procedures.

3. Cognitive error
   Basic assumptions: human error occurs due to a mismatch between individual capabilities and the demands of the job.
   Solutions: match people with the demands of the job; ensure the job and workload are do-able.

4. Organisational error
   Basic assumptions: poor management decisions create conditions which influence the likelihood of error.
   Solutions: examine the adequacy of management; audit safety management systems.

Useful tools

To help the non-specialist systematically uncover root human and organisational causes, and learn from such incidents, help is available. Larger organisations may employ or retain human factors specialists. Generic, commercially-available human error root-cause analysis techniques exist, as do versions tailored to specific industries. For those without such resources, or wishing an introduction to the topic, the following may be useful.

HSE's Successful health and safety management guidance³, Appendix 5, contains an approach to analysing immediate and underlying causes of accidents, including human and organisational contributions such as individual behaviour, control, cooperation, communication,

18 the chemical engineer 7 October 1999



competence, and arrangements for monitoring and review.

More detail on how human factors (job, individual and organisational) can influence health and safety is provided in a recent HSE publication, Reducing error and influencing behaviour⁴. Significantly for the accident investigator, this publication includes a detailed account of the human contribution to accidents, and examples of solutions implemented to address human factors aspects influencing accidents, near-misses or potential health problems. A human factors checklist which can be used preventatively, without waiting for an accident to occur, is also included on pages 54-55.

Box 3: Two ways of investigating an accident

TWO fitters slackened the bolts on the wrong flange of a double-bodied valve. This caused the valve to fall apart and release flammable material. An initial investigation revealed the fitters were unfamiliar with this type of valve, and recommended a verbal reprimand to the fitters to take more care.

Question: what error paradigm was operating during this initial investigation?

Further investigation indicated the fitters had been retrained as part of a multi-skilling exercise, and their original expertise was in electrical engineering. This more in-depth investigation identified the need to examine:
- the adequacy of training;
- the way jobs were allocated, so they were only given to those who were competent;
- the operation of the permit-to-work system, which should have ensured the pipework was isolated and drained before work started.

Question: what type of error paradigm was operating during the subsequent investigation, and how might this influence the likelihood of a recurrence?

During the second investigation, a less restricted set of stopping rules meant the emphasis on blame was replaced by learning how to avoid similar incidents in future.

Case study: the application of human factors root cause analysis

THREE very similar incidents occurred over a two-year period in a petrochemicals process plant. On each occasion a series of valves had been wrongly aligned, resulting in contamination of pipework and unintended cross-contamination of raw materials, finished product and waste by-products.

The first two incidents had resulted in a series of procedural, training and plant design recommendations. Following the occurrence of the third very similar incident, management suspected an unknown factor was contributing. The present author reviewed what was known about all three incidents, and visited the work site.

All three incidents involved the erroneous operation of two pairs of two valves, which were replicated at two nearby locations. These eight valves were designed as shown in Figure 1. Can you spot the potential for error?

[Figure 1: An error designed to happen. Location 1: pipeline 1 product valve, pipeline 1 waste valve; pipeline 2 waste valve, pipeline 2 product valve. Location 2: pipeline 3 waste valve, pipeline 3 product valve; pipeline 4 product valve, pipeline 4 waste valve.]

At location 1, pipeline 1, the product valve is on the left and the waste valve is on the right. This layout is transposed for the adjacent pipeline 2. Furthermore, at the nearby location 2, the valve layout is the exact opposite of location 1. It is therefore very easy for operators to apply the right action (open/close a valve) to the wrong object (product valve instead of waste valve, or vice versa) due to the very confusing valve layout. Human error theory predicts this is most likely to happen when our attention is diverted, or we are preoccupied by other things. Indeed, this is what happened in two of the incidents described. Many readers will recognise the everyday nature of this phenomenon. How many of us have, when preoccupied or distracted at home, placed the milk in the oven instead of the fridge, or poured hot water into the sugar bowl instead of the teapot, or similar?

Experienced operators and plant managers were not aware of the confusing layout, and had not noticed the potential for error.

Having identified the missing human factors root cause in the three incidents, recommendations could be made to address each of the error paradigms described in Box 2:
- in the short term, warn operators of the poor design and its likely consequences (individual/cognitive error paradigm);
- improve the valve layout to remove confusion, and physically prevent problematic valve line-ups (engineering error paradigm);
- include human factors expertise in future accident investigation teams, specify human factors input to the design of new plant, and raise awareness of human factors issues amongst senior managers (organisational error paradigm).
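A layout flaw like the one in Figure 1 can even be caught mechanically once the arrangement is written down as data. The short Python sketch below is purely illustrative (the encoding and function names are this sketch's own assumptions, not a tool used in the investigation): it transcribes the Figure 1 valve order for each pipeline and flags the transpositions that set operators up to fail.

```python
# Illustrative encoding of the Figure 1 layout: for each pipeline, the
# valve functions in left-to-right order as an operator would see them.
layouts = {
    ("location 1", "pipeline 1"): ("product", "waste"),
    ("location 1", "pipeline 2"): ("waste", "product"),
    ("location 2", "pipeline 3"): ("waste", "product"),
    ("location 2", "pipeline 4"): ("product", "waste"),
}

def consistent(layouts):
    """True only if every pipeline presents its valves in the same
    left-to-right order -- the property whose absence made Figure 1
    'an error designed to happen'."""
    return len(set(layouts.values())) == 1

def transposed_pipelines(layouts):
    """List the pipelines whose order differs from the first pipeline's,
    i.e. the places where the right action can meet the wrong valve."""
    reference = next(iter(layouts.values()))
    return sorted(key for key, order in layouts.items() if order != reference)

print(consistent(layouts))          # False: the layout invites error
print(transposed_pipelines(layouts))
```

A corrected layout, with every pipeline ordered the same way, passes the check: essentially the engineering error paradigm fix recommended above, removing the confusion physically rather than asking operators to remember it.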




Learning from near-misses

Another useful human factors technique for uncovering root causes is near-miss (or near-hit) reporting schemes. Rather than passively waiting for an accident or incident to happen, near-misses are reported to a confidential, trusted source for analysis and identification of the steps necessary to prevent a recurrence. The best-known examples are in the aviation and aerospace industries. These techniques rely on individuals' willingness to report incidents they may have been personally involved in, so will not thrive where a blame culture exists.

Investigation of near-misses must overcome any limiting error paradigms used not just by investigators, but also by the person filing the report. In our case study, the person responsible for turning the valve in the third incident believed they were entirely to blame, and had considerable difficulty accepting they were essentially programmed to fail by those who had designed the system years before. Thus an early task in implementing near-miss reporting is to widen everyone's conception of accident causation.

Forthcoming HSE root cause analysis tool

PREVIOUS work by HSE has developed both a model for health and safety management (HSG 65) and a methodology for costing accidents (HSG 96). Current HSE research has made the link between these two publications by developing and testing a refined and simplified method of costing accidents and a new root cause analysis tool. The root cause analysis tool is based on an organisational model, which leads the accident investigator back to failures in high-level elements of how the organisation is managed. This work will be published by HSE in early 2000.

Ronny Lardner and Mark Fleming can be contacted at The Keil Centre, 5 South Lauder Road, Edinburgh EH9 2LJ, UK, tel +44 131 667 8059, fax +44 131 667 7946, email ronny@keilcentre.co.uk

References
1. Lucas, D., 'The causes of human error', in Redmill, F. and Rajan, J. (1997) Human Factors in Safety-Critical Systems, Oxford: Butterworth Heinemann
2. Adapted from A New Duty to Investigate Accidents: Health and Safety Commission Discussion Document, HSE Books (1999)
3. Successful Health and Safety Management (1997), Suffolk: HSE Books
4. Reducing Error and Influencing Behaviour, HSG48, Suffolk: HSE Books, £11.50

HSE publications can be obtained by contacting HSE Books on tel +44 1787 881165 and fax +44 1787 313995

