October 2012
followed and never turn a blind eye. Remember also that many, perhaps most, violations occur when people think they have found a better way of doing a job. Was the task beyond the ability of the person asked to do it, perhaps beyond anyone's ability? If so, we need to redesign the task. Was it a slip or lapse of attention? In contrast to mistakes, the intention may have been correct but it was not fulfilled. It is no use telling people to be more careful, as no one is deliberately careless. We should remove opportunities for error by changing the design or method of working. Designers, supervisors and managers make errors of all these types, though slips and lapses of attention by designers and managers are rare, as they usually have time to check their work. Errors by designers produce traps into which operators fall; that is, they produce situations in which slips or lapses of attention, inevitable from time to time, result in accidents. Errors by managers are signposts pointing in the wrong directions.
distraction, poor instructions or, after an accident, because he or she has been injured. Finally, we can use the techniques of behavioural science to improve the extent to which people follow procedures and accepted good practice. By listing this as the last resort I do not intend to diminish its value. Safety by design should always be our aim but may be impossible, and experience shows that behavioural methods can bring about substantial improvement in the everyday types of accident that make up most of the lost-time and minor accident rates. However, the technique has little effect on process safety. Behavioural methods should not be used as an alternative to the improvement of plant design or methods of working when these are reasonably practicable. To make these various ways of preventing incidents clearer, let us consider a simple but common cause of injury and even death in the home: falls on the stairs. The inherently safer solution is to avoid the use of stairs by building a single-storey building or using ramps instead of stairs. If that is not reasonably practicable, a passive solution is to install intermediate landings so that people cannot fall very far, or to avoid particular types of stair, such as spiral staircases, which make falls more likely. An active solution is to install a lift. Like most active solutions it is expensive and involves complex equipment that is liable to fail, expensive to maintain and easy to neglect. The procedural solution is to instruct people to always use the handrails, never to run on the stairs, to keep them free from junk and so on. This can be backed up by behavioural techniques: specially trained fellow workers (or parents in the home) look out for people who behave unsafely and tactfully draw their attention to the action.
In some companies the default action after an accident is to start at the wrong end of the list of alternatives and recommend a change in procedures or better observation of procedures, often without asking why the procedures were not followed. Were they, for example, too complex or unclear, or have supervisors and managers turned a blind eye in the past? Changing procedures is, of course, usually quicker, cheaper and easier than changing the design, but is least effective. For example, rainwater and spray passed through the intake of a ventilation fan and fell on to a switchboard. This occurred on a ship but could have occurred elsewhere. A short circuit started a fire; it was soon extinguished but all power was lost. The ship had to ask for help and was towed back to port. The water entered through the ventilation intake as the louvres in it had been installed so that they directed spray and rain into the engine room rather than away from it. (The report5 said that they had been installed upside-down, but the author must have meant back-to-front.) The report's first recommendation was that louvres should be checked to make sure they are fitted correctly. The author did not realise that they should also be designed so that they could not be fitted wrongly, or so that it was obvious if they were, or at least so that the inside and outside, top and bottom were clearly labelled. The report did, however, recommend that switchboards should be covered to prevent water entering from above. Computers provide many other examples of an unrealistic attitude to human error. A control room operator was asked
[Figure 1: the louvres were installed so that they directed rainwater and spray through them. Reprinted from Still Going Wrong, Trevor Kletz, p. 73, copyright 2003, with permission from Elsevier.]

to switch a spare transformer on line in place of the working one. He inadvertently isolated the working transformer before switching on the spare one. He realised his error almost immediately and the supply was restored within a minute. The report on the incident blamed distraction, as the control room was used as a thoroughfare for people moving about the building. The report also suggested greater formality in preparing and following instructions when equipment is changed over. However, it did not recommend that when the computer is asked to isolate a transformer it should display a warning message such as, 'Are you sure you want to shut down the electricity supply?' We get such messages on our computers when we wish to delete a file; there is no need for control programs to be less user-friendly than word processors. Notice again that the default action of the people who wrote the report was to describe ways of changing someone's behaviour rather than to look for ways of changing designs or methods of working. Operators provide the last line of defence against errors by designers and managers. It is a bad strategy to rely on the last line of defence and neglect the outer ones. Good loss prevention starts far from the top event, in the early stages of design. Blaming users is a camouflage for poor design.
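The warning-message idea can be sketched in a few lines of code. The following is a minimal, hypothetical illustration, not taken from the incident report: all names here (`isolate_transformer`, `working`, `confirm`) are invented, and a real control program would be far more elaborate. It shows the design principle the report missed: the hazardous command asks for confirmation rather than relying on the operator never to slip.

```python
def isolate_transformer(name, working, confirm):
    """Isolate transformer `name`, warning first if it is the only working supply.

    `working` is the set of transformers currently on line; `confirm` is a
    function that puts a question to the operator and returns True or False.
    All of this is a hypothetical sketch, not a real control-system API.
    """
    if working == {name}:
        # The hazardous case: this command would shut down the whole supply,
        # so ask for confirmation instead of acting silently.
        if not confirm(f"Isolating {name} will shut down the electricity "
                       f"supply. Are you sure?"):
            return False  # operator declined; nothing is switched
    working.discard(name)
    return True

# Usage: an operator who answers 'no' leaves the supply intact.
supplies = {"T1"}
isolate_transformer("T1", supplies, confirm=lambda msg: False)
assert supplies == {"T1"}
```

The design choice is the same one word processors made long ago: a command whose consequences are severe and hard to reverse should cost one extra keystroke to confirm.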
a bucket? Many people would say that it is more dangerous to bring in the matches, but nobody would knowingly strike them in the presence of a leak, and in a well-run plant leaks are small and infrequent. If a bucket is allowed in, however, it may be used for collecting drips or taking samples. A flammable mixture will be present above the surface of the liquid and may be ignited by a stray source of ignition. Of the two causes of the subsequent fire, the bucket is the easier to avoid. I am not, of course, suggesting that we allow unrestricted use of matches on our plants, but I do suggest that we keep out open containers as thoroughly as we keep out matches. (Editor's note: See page 28 for a description of an incident of this type.) As we have already seen, listing human error as a cause leads people to think only of ways of reducing human error, often by simply telling them to take more care. It does not encourage them to look for the improvements in design that can reduce the opportunities for human error. Instead of listing causes we should list the actions needed to prevent a recurrence. This forces people to ask if and how each so-called cause can be prevented in future. Some people list a wrong attitude as the cause of an accident. But attitude is a generalisation deduced from people's actions. Instead of trying to change attitudes directly we should help people to change their actions, by better training and all the other ways previously suggested. Others will then say that they have changed their attitude.
References
1. Kletz, T.A., 'Accident Investigation: Missed Opportunities', Hazards XVI: Analysing the Past, Planning the Future, Symposium Series No 148, Institution of Chemical Engineers, Rugby, 2001, pp. 1–11. Reprinted in Process Safety and Environmental Protection, 80B(1):3–8, Jan 2002.
2. Parker, R.J., The Flixborough Disaster: Report of the Court of Inquiry, HMSO, London, 1975.
3. See the letters pages of The Chemical Engineer in the issues dated 20 April, 11 May, 25 May, 22 June and 14 Dec (article), 2000, and Feb 2001. See also Venart, J.E.S., 'Froude Modeling of the Flixborough By-Pass Pipe', Hazards XV: The Process, its Safety and the Environment, Symposium Series No 147, Institution of Chemical Engineers, Rugby, 2000, pp. 139–153.
4. Kletz, T.A., An Engineer's View of Human Error, 3rd edition, Institution of Chemical Engineers, Rugby, 2001.
5. Anon., 'Upside down louvres', Safety Digest: Lessons from Marine Accident Reports, No. 2/2001, Marine Accident Investigation Branch of the UK Department for Transport, Local Government and the Regions, London, 2001, p. 40.
6. Health and Safety Executive, Reducing Risks, Protecting People: HSE's Decision-Making Process, HSE Books, Sudbury, 2001.
7. Kletz, T.A., Lessons from Disaster: How Organisations Have No Memory and Accidents Recur, Institution of Chemical Engineers, Rugby, 1993.
We forget the lessons learned and allow the accident to happen again
Even when we prepare a good report and circulate it widely, all too often it is read, filed and forgotten. Organisations have no memory7. Only people have memories, and after a few years they move on, taking their memories with them. Procedures introduced after an accident are allowed to lapse, and ten years later the accident happens again, even on the plant where it happened before. If, by good fortune, the results of an accident are not serious, the lessons are forgotten even more quickly. Unless we take action to improve the learning from experience, nothing will happen except a repetition of the accident. The main purpose of an accident investigation is to prevent it happening again. If it is allowed to happen again, the investigation was a waste of time.