
6 | Loss Prevention Bulletin 227

October 2012

Trevor Kletz anniversary

Missed opportunities in accident investigation


Trevor Kletz
There is an old story about two manufacturers of shoes who sent representatives to an undeveloped country to look into the opportunities for sales there. Both of them soon sent reports back. One said, "No business here. People don't wear shoes." The other said, "Great opportunities here. People don't wear shoes." Both representatives had the same data, but their interpretations of it were different and told us more about them than about the opportunities for sales.

Similarly, different accident investigators can draw different conclusions from the same evidence and propose different actions. The collection of evidence is usually adequate, at least in the oil and chemical industries, but the interpretations of the evidence vary and we often miss opportunities to learn from it. Some accident reports tell us more about the interests, experiences and beliefs of the investigators than about the best ways of preventing it happening again. In a paper presented at the Hazards XVI Conference in 2001 [1] I described a number of learning opportunities that are frequently missed, during or following the preparation of a report. Having paid the tuition fee, we should learn the lessons. This paper summarises the 2001 paper, with a few changes, and illustrates some of the missed opportunities by different examples.

Accident investigations are often superficial

Even when we find more than one cause, we often find only the immediate causes. We should look beyond them for ways of avoiding the hazards, such as inherently safer design, and for weaknesses in the management system. For example, consider the official report [2] on the explosion at Flixborough in 1974, the worst accident in the UK chemical industry, if we exclude explosives. The rupture of a temporary pipe caused a large leak of hot flammable liquids, which vaporised and exploded, destroying the plant and killing 28 people. The report identified the lack of any procedure for the management of change and the lack of mechanical engineering expertise on the plant at the time. It also described in great detail the events that led to the rise in pressure that ruptured the temporary pipe. Though of interest, this was of minor importance, as the plant could have withstood the pressure change before the temporary pipe was installed.

All plants undergo pressure changes from time to time for many reasons and are fitted with relief systems to prevent damage by pressures greater than design. The rise in pressure that ruptured the pipe did not reach the set point of the relief valve. Nevertheless, the reason for the rise in pressure has been debated ever since [3]. The most important lesson to be learned from the explosion was missed by the inquiry and by most commentators: the leak and explosion were so big because the plant contained about 400 tonnes of hot flammable liquid under pressure. It contained so much because the conversion was low, about 6%, so 94% of the feed got a free ride and had to be recovered and recycled many times. If we could increase the conversion then the inventory in the plant, and the maximum size of leak, would be lower. The plant would be inherently safer. One company did start to develop a more efficient process but abandoned the research when they realised that they would need no new plants in the foreseeable future. Nevertheless, Flixborough drew attention to the value of inherently safer designs and much progress has been made in developing such processes and equipment.

Accident investigations often find only a single cause

Many accident reports identify only a single cause, though many people, from the designers down to the last link in the chain, the operator who closed the wrong valve, had an opportunity to prevent the accident. The single cause identified is usually this last link in the chain of events that led to the accident. Just as we are blind to all but one of the octaves in the electromagnetic spectrum, so we are blind to many of the opportunities that we have to prevent an accident. But just as we have found ways of making the rest of the spectrum visible, so we need to make all the ways of preventing an accident visible.

Accident investigations list human error as a cause

Human error is far too vague a term to be useful. We should ask, "What sort of error?", because different sorts of error require different actions if we are going to prevent the errors happening again [4]. Was the error a mistake due to poor training or instructions, so that the intention was wrong? If so, we need to improve the training and instructions and, if possible, simplify the task. Was the error due to a violation or non-compliance, that is, a deliberate decision not to follow instructions or recognised good practice? If so, we need to explain the reasons for them, as we do not live in a society in which people will uncritically do what they are told. We should simplify the task if possible, because it is difficult to persuade everyone to use the correct method if an incorrect method is easier to use; we should check from time to time to see that instructions are being

Institution of Chemical Engineers 0260-9576/12/$17.63 + 0.00


followed and never turn a blind eye. Remember also that many, perhaps most, violations occur when people think they have found a better way of doing a job. Was the task beyond the ability of the person asked to do it, perhaps beyond anyone's ability? If so, we need to redesign the task. Was it a slip or lapse of attention? In contrast to mistakes, the intention may have been correct but it was not fulfilled. It is no use telling people to be more careful, as no one is deliberately careless. We should remove opportunities for error by changing the design or method of working.

Designers, supervisors and managers make errors of all these types, though slips and lapses of attention by designers and managers are rare, as they usually have time to check their work. Errors by designers produce traps into which operators fall; that is, they produce situations in which slips or lapses of attention, inevitable from time to time, result in accidents. Errors by managers are signposts pointing in the wrong directions.
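The taxonomy above pairs each sort of error with a different corrective action. As a rough illustration only (the category names and the wording of the actions are my shorthand, not Kletz's), it can be written as a small lookup table, so that an investigation form cannot record an error type without also recording the matching kind of action:

```python
# Illustrative sketch of the error taxonomy described in the text.
# The category names below are my own shorthand, not from the paper.
ERROR_RESPONSES = {
    "mistake":   "Improve training and instructions; simplify the task.",
    "violation": "Explain the reasons for the rules; simplify the task; "
                 "check from time to time that instructions are followed.",
    "mismatch":  "Redesign the task; it exceeds the person's ability.",
    "slip":      "Change the design or method of working to remove the "
                 "opportunity for error; telling people to take care is useless.",
}

def recommended_action(error_type: str) -> str:
    """Look up the corrective action for a classified error type."""
    if error_type not in ERROR_RESPONSES:
        # 'human error' on its own is deliberately not accepted
        raise ValueError(f"unclassified error type: {error_type!r}")
    return ERROR_RESPONSES[error_type]
```

Note that the bare label "human error" is deliberately rejected: the point of the section is that an unclassified error tells the investigator nothing about what to fix.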

Accident reports look for people to blame


The gut reaction of many people, including the press, after an accident is to ask, "Who is to blame?" However, blame is only relevant when the error is a violation, and even then most violations occur because a blind eye has been turned to them in the past or because the people concerned thought they had found a better way of carrying out the job. If the instructions are wrong or do not cover all situations, violations can prevent an accident. However, blaming human error for an accident diverts attention from what can be done by better design or better methods of operation.

In recent years the tendency to blame operators has decreased, but there is now a greater willingness to blame managers. The press and politicians argue that accidents occur because managers put costs and output before safety. The vast majority do not do so. Managers, even those at the top, are not superhuman. Like everyone else they make errors because they lack knowledge, do not realise they could do more, cannot do everything at once, and so on.

We change procedures rather than designs


There are several different actions we can take after we have identified a hazard as a result of an accident (or in some other way) to prevent it causing another accident or to mitigate the consequences if it does.

Our first choice should be to remove the hazard by inherently safer design. For example, can we use a safer material instead of a toxic or flammable one? Even if we cannot change the existing plant we should note the change for possible use on the next plant. If we cannot remove the hazard then our next choice should be to keep it under control by adding passive protective equipment, that is, equipment that does not have to be switched on or does not contain moving parts. The third choice is active protective equipment, that is, equipment switched on automatically; unfortunately the equipment may be neglected and fail to work, or it may be disarmed. The fourth choice is reliance on actions by people, such as switching on protective equipment; unfortunately the person concerned may fail to act due to forgetfulness, ignorance,

distraction, poor instructions or, after an accident, because he or she has been injured. Finally, we can use the techniques of behavioural science to improve the extent to which people follow procedures and accepted good practice. By listing this as the last resort I do not intend to diminish its value. Safety by design should always be our aim but may be impossible, and experience shows that behavioural methods can bring about substantial improvement in the everyday types of accident that make up most of the lost-time and minor accident rates. However, the technique has little effect on process safety. Behavioural methods should not be used as an alternative to the improvement of plant design or methods of working when these are reasonably practicable.

To make these various ways of preventing incidents clearer, let us consider a simple but common cause of injury and even death in the home: falls on the stairs. The inherently safer solution is to avoid the use of stairs by building a single-storey building or using ramps instead of stairs. If that is not reasonably practicable, a passive solution is to install intermediate landings so that people cannot fall very far, or to avoid particular types of stair, such as spiral staircases, which make falls more likely. An active solution is to install a lift. Like most active solutions it is expensive and involves complex equipment that is liable to fail, expensive to maintain and easy to neglect. The procedural solution is to instruct people to always use the handrails, never to run on the stairs, to keep them free from junk and so on. This can be backed up by behavioural techniques: specially trained fellow workers (or parents in the home) look out for people who behave unsafely and tactfully draw their attention to the action.
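The order of preference just described can be encoded as a ranked list, so that when a review team compares two proposed safeguards the higher level of the hierarchy wins by default. This is a hedged sketch only; the level names and the helper function are my own, not from the article:

```python
# Hedged sketch: the hierarchy of controls described in the text,
# from most preferred (top) to least preferred (bottom).
CONTROL_HIERARCHY = [
    "inherently safer design",  # remove the hazard altogether
    "passive protection",       # nothing to switch on, no moving parts
    "active protection",        # automatic equipment; may fail or be disarmed
    "procedural",               # reliance on actions by people
    "behavioural",              # coaching people to follow procedures
]

def more_preferred(a: str, b: str) -> str:
    """Return whichever of two proposed controls sits higher in the hierarchy."""
    return min(a, b, key=CONTROL_HIERARCHY.index)
```

For the stairs example, `more_preferred("procedural", "passive protection")` picks the intermediate landings over the "always use the handrail" instruction, which is exactly the ordering the text argues for.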
In some companies the default action after an accident is to start at the wrong end of this list of alternatives and recommend a change in procedures, or better observation of procedures, often without asking why the procedures were not followed. Were they, for example, too complex or unclear, or have supervisors and managers turned a blind eye in the past? Changing procedures is, of course, usually quicker, cheaper and easier than changing the design, but it is least effective.

For example, rainwater and spray passed through the intake of a ventilation fan and fell on to a switchboard. This occurred on a ship but could have occurred elsewhere. A short circuit started a fire; it was soon extinguished but all power was lost. The ship had to ask for help and was towed back to port. The water entered through the ventilation intake as the louvres in it had been installed so that they directed spray and rain into the engine room rather than away from it. (The report [5] said that they had been installed upside-down, but the author must have meant back-to-front.) The report's first recommendation was that louvres should be checked to make sure they are fitted correctly. The author did not realise that they should also be designed so that they could not be fitted wrongly, or so that it was obvious if they were, or at least so that the inside and outside, top and bottom, were clearly labelled. The report did, however, recommend that switchboards should be covered to prevent water entering from above.

Computers provide many other examples of an unrealistic attitude to human error. A control room operator was asked


Figure 1: the louvres were installed so that they directed rainwater and spray through them. (Reprinted from Still Going Wrong, Trevor Kletz, p. 73, copyright 2003, with permission from Elsevier. The two panels show the louvres installed incorrectly and correctly.)

to switch a spare transformer on line in place of the working one. He inadvertently isolated the working transformer before switching on the spare one. He realised his error almost immediately and the supply was restored within a minute. The report on the incident blamed distraction, as the control room was used as a thoroughfare for people moving about the building. The report also suggested greater formality in preparing and following instructions when equipment is changed over. However, it did not recommend that when the computer is asked to isolate a transformer it should display a warning message such as, "Are you sure you want to shut down the electricity supply?" We get such messages on our computers when we wish to delete a file; there is no need for control programs to be less user-friendly than word processors. Notice again that the default action of the people who wrote the report was to describe ways of changing someone's behaviour rather than to look for ways of changing designs or methods of working.

Operators provide the last line of defence against errors by designers and managers. It is a bad strategy to rely on the last line of defence and neglect the outer ones. Good loss prevention starts far from the top event, in the early stages of design. Blaming users is a camouflage for poor design.
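The missing warning message is cheap to provide. A minimal sketch, with entirely hypothetical names (nothing here comes from the incident report): the destructive command is refused unless explicit confirmation is given, just as a word processor confirms before deleting a file.

```python
# Minimal sketch of a confirm-before-destructive-action guard.
# Function name, prompt text and behaviour are hypothetical.
def isolate_transformer(name: str, confirm) -> bool:
    """Isolate transformer `name` only after explicit confirmation.

    `confirm` is a callable given the prompt text and returning True or
    False; in a real control program it would raise a dialog box, and
    here it is injected so the guard can be exercised in a test.
    """
    prompt = (f"Isolating {name} will shut down the electricity supply. "
              "Are you sure?")
    if not confirm(prompt):
        return False  # command refused; the supply stays up
    # ... the real isolation command would be issued here ...
    return True
```

Injecting the confirmation step rather than hard-coding it also means the guard cannot quietly be bypassed by a later change to the user interface.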

Accident reports list causes that are difficult or impossible to remove

For example, a source of ignition is often listed as the cause of a fire or explosion. But it is impossible on the industrial scale to eliminate all sources of ignition with 100% certainty. While we try to remove as many as possible, it is more important to prevent the formation of flammable mixtures. Which is the more dangerous action on a plant that handles flammable liquids: to bring in a box of matches or to bring in a bucket? Many people would say that it is more dangerous to bring in the matches, but nobody would knowingly strike them in the presence of a leak, and in a well-run plant leaks are small and infrequent. If a bucket is allowed in, however, it may be used for collecting drips or taking samples. A flammable mixture will be present above the surface of the liquid and may be ignited by a stray source of ignition. Of the two causes of the subsequent fire, the bucket is the easier to avoid. I am not, of course, suggesting that we allow unrestricted use of matches on our plants, but I do suggest that we keep out open containers as thoroughly as we keep out matches. (Editor's note: see page 28 for a description of an incident of this type.)

As we have already seen, listing human error as a cause leads people to think only of ways of reducing human error, often by simply telling them to take more care. It does not encourage them to look for the improvements in design that can reduce the opportunities for human error. Instead of listing causes we should list the actions needed to prevent a recurrence. This forces people to ask if and how each so-called cause can be prevented in future. Some people list a wrong attitude as the cause of an accident. But attitude is a generalisation deduced from people's actions. Instead of trying to change attitudes directly we should help people to change their actions, by better training and all the other ways previously suggested. Others will then say that they have changed their attitude.

We may go too far

Sometimes after an accident people go too far and spend time and money on making sure that nothing similar could possibly happen again, even though the probability is low. If the accident was a serious one it may be necessary to do this to reassure employees and the public, but otherwise we should remember that if we gold-plate one unit there are fewer resources available to silver-plate the others.

Here is a simple example: while a welder was burning a hole in a pipe, 150 mm diameter with walls 12 mm thick, in a workshop, a sudden noise made him jerk. His gun touched the pool of molten metal and a splash of metal hit him on the forehead. If the wall thickness had been above 12 mm a hole would have been drilled in the pipe first, and the accident report said that this technique should be used in future for all pipes. However, this would be troublesome, as a crane would be needed to move the pipe to the drilling bay and back. In practice, nothing was done. The welder had cut thousands of holes before without incident, the chance of injury was small and it may have been reasonable to do nothing and accept the slight risk of another incident. However, the foreman and his engineer were not willing to say so, and instead they recommended action that they had little or no intention of enforcing.

UK law does not require everything possible to prevent an accident, only what is "reasonably practicable". This phrase implies that the size of a risk should be compared with the cost of removing or reducing it, in money, time and trouble. When there is a gross disproportion between them it is not necessary to remove or reduce the risk. In recent years HSE has provided detailed advice on the levels of risk they consider tolerable and the costs they consider disproportionate [6].



We do not let others learn from our experience


Many companies restrict the circulation of incident reports, as they do not want everyone, even everyone in the company, to know that they have blundered, but this will not prevent the incident happening again. We should circulate the essential messages widely, in the company and elsewhere, so that others can learn from them, for several reasons:

Moral: if we have information that might prevent another accident we have a duty to pass it on.

Pragmatic: if we tell other organisations about our accidents they may tell us about theirs.

Economic: we would like our competitors to spend as much as we do on safety. The industry is one: every accident affects its reputation.

When information is published people do not always learn from it. A belief that our problems are different is a common failing. When reports are published they often lack impact, as the essential message is obscured by detail of only local interest. Those who spread safety messages are, like teachers, parents and tax collectors, giving information that many of their listeners would rather not hear, so the messages have to be clear.

References
1. Kletz, T.A., "Accident Investigation: Missed Opportunities", Hazards XVI: Analysing the Past, Planning the Future, Symposium Series No 148, Institution of Chemical Engineers, Rugby, 2001, pp. 1-11. Reprinted in Process Safety and Environmental Protection, 80B(1):3-8, Jan 2002.
2. Parker, R.J., The Flixborough Disaster: Report of the Court of Inquiry, HMSO, London, 1975.
3. See the letters pages of The Chemical Engineer in the issues dated 20 April, 11 May, 25 May, 22 June and 14 Dec (article) 2000, and Feb 2001. See also Venart, J.E.S., "Froude Modeling of the Flixborough By-Pass Pipe", Hazards XV: The Process, its Safety and the Environment, Symposium Series No 147, Institution of Chemical Engineers, Rugby, 2000, pp. 139-153.
4. Kletz, T.A., An Engineer's View of Human Error, 3rd edition, Institution of Chemical Engineers, Rugby, 2001.
5. Anon., "Upside down louvres", Safety Digest: Lessons from Marine Accident Reports, No. 2/2001, Marine Accident Investigation Branch of the UK Department for Transport, Local Government and the Regions, London, 2001, p. 40.
6. Health and Safety Executive, Reducing Risks, Protecting People: HSE's Decision Making Process, HSE Books, Sudbury, 2001.
7. Kletz, T.A., Lessons from Disaster: How Organisations Have No Memory and Accidents Recur, Institution of Chemical Engineers, Rugby, 1993.

We read or receive only overviews


Lacking the time to read accident reports in detail, senior managers consume pre-digested summaries of them, full of generalisations such as, "There has been an increase in accidents due to inadequate training." They describe this as taking a helicopter view. However, from a helicopter we see only forests. To understand the causes of accidents we need to land the helicopter and look at individual trees, or even twigs and leaves.

As already mentioned, the identification of underlying causes can be very subjective and is influenced by people's experience, interests, blind spots and prejudices. One factory manager had realised that failure to learn from past experience had been a major cause of accidents and was looking for ways to improve matters. But none of the individual reports, nor the annual summary of them, referred to this at all. All managers should read at least some of the accident reports written by their subordinates to see if they agree with the assignment of underlying causes.

Loss Prevention Panel comment


In his globally recognised and distinguished career Professor Trevor Kletz has written many papers, but this one in particular offers us the opportunity to reflect on what went wrong and on the lessons we should learn in order to prevent incidents. Generally speaking, people often leave organisations without ever leaving their experience behind so that others may learn and benefit. As such, organisations and those running them have a duty and moral responsibility to capture and retain within organisational memory the lessons learnt, so that others may make better use of them to prevent accidents from happening again. If we do not do this, sadly the cost will be a very heavy burden and, above all, reputations lost may be very difficult to regain. Major accidents such as Flixborough, Chernobyl, Piper Alpha, Bhopal, Texas City, Deepwater Horizon and, more recently, the Fukushima nuclear accident in Japan are there as examples for us to reflect on. As Trevor asserts, "Unless we take action to improve the learning from experience, nothing will happen except a repetition of the accident. The main purpose of an accident investigation is to prevent it happening again. If it is allowed to, the investigation was a waste of time."

Iqbal Essa, HSE & Chairman of the Loss Prevention Panel

We forget the lessons learned and allow the accident to happen again

Even when we prepare a good report and circulate it widely, all too often it is read, filed and forgotten. Organisations have no memory [7]. Only people have memories, and after a few years they move on, taking their memories with them. Procedures introduced after an accident are allowed to lapse, and ten years later the accident happens again, even on the plant where it happened before. If, by good fortune, the results of an accident are not serious, the lessons are forgotten even more quickly. Unless we take action to improve the learning from experience, nothing will happen except a repetition of the accident. The main purpose of an accident investigation is to prevent it happening again. If it is allowed to, the investigation was a waste of time.

