Richard Devon
Engineering Design Program ▪ School of Engineering Design, Technology, and Professional Programs ▪ College of Engineering ▪ The Pennsylvania State University
Defining Risk
• Informal Approaches
• Formal Theories
The Law of Unintended Consequences
• Almost all human actions have at least one unintended consequence, caused by:
• Ignorance: it is impossible to anticipate everything.
• Error: incomplete analysis of the problem, or following habits that worked in the past but may not apply to the current situation. All decisions are about the future but are based on past experience, unless simulation models are used. “…the actor fails to recognize that procedures which have been successful in certain circumstances need not be so under any and all conditions.”
• Immediate interest may override long-term interests.
• Basic values may require or prohibit certain actions, even if the long-term result might be unfavorable.
• Self-defeating prophecy: fear of some consequence drives people to find solutions before the problem occurs, so the non-occurrence of the problem is unanticipated (the chance it won’t happen is ignored).
Edward Tenner, in Why Things Bite Back: Technology and the Revenge of Unintended Consequences
Design for X, where X is Unknown
• In Design for X terms, unintended consequences could be viewed as those Xs
that are unknown. They represent unknown tradeoffs that would often
change the design if they were known.
• Design Axiom A: It is never possible to know all Xs, but it is possible to
take unknown Xs as undesirable and hunt them down as far as possible.
• Whereas Tenner and the opponents of many technologies promote technophobia, the dreams of technophiliacs* are even more common:
• X-rays (good for health – cure for the common cold, everlasting life, and
ESP), radio (promotes privacy by replacing telephones and nosy operators,
and promotes safety by allowing emergency calls from individuals – some
truth to both claims actually), skyscraper cities (clean, efficient, ordered,
beautiful), atomic power (powering ships, planes, cars, and providing
electricity too cheap to meter).
*Joseph Corn (ed.), Imagining Tomorrow: History, Technology, and the American Future, MIT Press, 1986. “The future is not what it used to be.”
Dreams are Inaccurate
• "There is no reason anyone would want a computer in their home."
--Ken Olsen, president, chairman and founder of Digital Equipment Corp., 1977
• "It is not too much to expect that our children will enjoy in their homes
electrical energy too cheap to meter; will know of great periodic regional
famines in the world only as matters of history; will travel effortlessly over the
seas and under them and through the air with a minimum of danger and at
great speeds, and will experience a lifespan far longer than ours, as disease
yields and man comes to understand what causes him to age. This is the forecast of an age of peace." --Lewis L. Strauss, chairman of the Atomic Energy Commission, 1955
• Design Axiom B: For engineering designers the lesson is clear: beware of those who convert unknowns into dreams or nightmares.
Managing Risk in Complex Systems
• Design Axiom C: There will be unintended consequences.
• Murphy was wrong: some will be good.
• Murphy was wrong: those that are bad will occur infrequently, if the design is well done.
• But they will occur: every normal curve has two far ends (see the sketch after this list).
• The design of complex systems must include the design of operations, and the operational design must be adaptive and resilient: High Reliability Organizations (HROs)
• Some features and consequences of complex systems are to be
avoided as far as possible: Normal Accident Theory (NAT)
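The point that bad consequences "will occur" can be made with a little arithmetic. Below is a minimal sketch (Python; the per-operation probability p and the operation counts are illustrative numbers, not from the lecture) of how even a small, non-zero p accumulates over repeated operations:

def prob_at_least_one(p: float, n: int) -> float:
    """Probability that an event with per-trial probability p
    occurs at least once in n independent trials."""
    return 1.0 - (1.0 - p) ** n

p = 1e-4  # assumed per-operation hazard probability (illustrative)
for n in (1_000, 10_000, 100_000):
    print(f"n={n:>7,}: P(at least one) = {prob_at_least_one(p, n):.3f}")

With p = 0.0001, the chance of at least one bad outcome is about 10% after 1,000 operations and about 63% after 10,000: a well-done design makes bad consequences infrequent, not impossible.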
[Image slide: Unintended]
Defining Risk
• Informal Approaches
• Formal Theories
Accidents are Normal
Charles Perrow: Normal Accidents
• Perrow is a sociologist who served on the Three Mile Island Commission and published Normal Accidents in 1984 (updated edition, 1999)
Schematic of PWR cooling systems
Condensate Polisher System
Deep-bed condensate polishers use mixed-bed resin to filter out corrosion products. (USFilter)
Condensate treatment is widely used in central fossil- and nuclear-fueled stations, and invariably within the fossil-fueled thermal cycle with supercritical boilers, where high purity must be strictly maintained. Deep-bed, externally regenerated condensate polisher systems use mixed-bed resin to filter out corrosion products generated in the thermal cycle and to remove any trace dissolved minerals that may enter the feedwater cycle.
Three Mile Island: The First 13 seconds
• The core needed to be cooled and the emergency feedwater pumps came on, but the valves had been left closed for reasons never determined, so no water actually circulated.
• There were two indicators on TMI’s huge control panel that showed the valves were closed instead of open, one of which was obscured by a repair tag. But no one was looking until the problem had developed much further. Some experts questioned whether this was really a problem, since worse ones were developing and the emergency water system was of limited capacity.
• The secondary-system steam generator boiled dry, and the reactor scrammed (dropped the control rods into the core), but this was not enough to stop the heat and pressure from building.
• The pilot-operated relief valve (PORV) was an automatic safety device (ASD) used to vent the pressure in the core. Expected to fail once in every 50 uses, it is rarely operated. This time it opened but did not close again, and 32,000 gallons of radioactive water came out (one third of core capacity). The PORV had failed twice before. It was closed when found open after two hours.
• The safety indicator for a failed PORV also failed: it reported a proper signal, not a proper event. A signal had been sent to close the PORV (see the sketch below).
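The "proper signal, not a proper event" failure is a general instrumentation design flaw worth pausing on. Here is a hedged sketch (Python; the class and function names are invented for illustration, and this is an analogy rather than a model of the actual TMI instrumentation) of the difference between an indicator driven by the command sent and one driven by the measured valve position:

from dataclasses import dataclass

@dataclass
class ReliefValve:
    commanded_open: bool = False  # what the control system asked for
    actually_open: bool = False   # physical state (needs, e.g., a limit switch)

    def command_close(self) -> None:
        self.commanded_open = False
        # Fault mode: the valve sticks, so actually_open stays True.

def indicator_from_signal(v: ReliefValve) -> str:
    # TMI-style indicator: reflects the command, not the event.
    return "OPEN" if v.commanded_open else "CLOSED"

def indicator_from_position(v: ReliefValve) -> str:
    # Safer design: reflects the measured physical position.
    return "OPEN" if v.actually_open else "CLOSED"

valve = ReliefValve(commanded_open=True, actually_open=True)
valve.command_close()                  # close signal sent, but the valve sticks open
print(indicator_from_signal(valve))    # CLOSED -- what the operators saw
print(indicator_from_position(valve))  # OPEN   -- what was actually true

The design lesson: instrument events (measured states), not signals (commands), wherever a stuck actuator is credible.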
Three Mile Island -3
• The operators could not reasonably have been aware of these failures, which were highly interdependent (tightly coupled) and not in a linear or direct operational sequence like a production line (i.e., it is a complex system).
• The essence of a normal accident: the interaction of multiple failures that are not in a direct operational sequence. Impossible to plan/design for? What fail-safe design method would anticipate this? http://www.cede.psu.edu/~rdevon/Design100/Design100SafeDesign.htm
• The High Pressure Injection (HPI) system is another ASD that
introduces a lot of cold water into the core very rapidly.
Three Mile Island -4
• After 2 minutes the HPI came on. After 2 more minutes it was drastically reduced: an operator error that meant not replacing the water being lost through the open PORV. HPI does have its own risks of cracking the core, however, and there are training issues about not letting the pressurizer go “solid” (water only).
• This led to a steady uncovering of the core, and it is considered the event that made this a major accident.
• Two pressure indicators gave opposite readings, so the operators did not know whether they had a LOCA (loss-of-coolant accident). They did.
• There were further problems, like the hydrogen bubble, but importantly, the analyses of what happened at Three Mile Island were characterized by many disagreements among the experts and many questions that they simply could not answer.
Summarizing Normal Accident Theory
• Presumption that p, the probability of the hazard H, is non-zero
• Presumption that H could be unacceptable
• Categorizes accidents and victims
• Views complexity and tight coupling as undesirable
features in systems
• Provides criteria for defining complexity and coupling
• Categorizes major socio-technical (S-T) systems in terms of complexity and coupling
• Incorporates distorting effects of perception on risk
assessment
Complexity, Coupling, and Catastrophe - 1
• Victims:
– 1st party – operators
– 2nd party – related system workers
– 3rd party – innocent bystanders
– 4th party – unborn generations
• Accidents
– Incidents – damage to units only
– Accidents – damage to subsystems or the whole system
– Component failure accidents – linked failures in an expected sequence
– System accidents – unanticipated interactions of multiple failures
Complexity, Coupling, and Catastrophe - 2
• Complex and Linear Interactions
• Coping with hidden interactions. For example, will we know when parts are
wearing out?
• Perceptions, experience, complexity, and hidden interactions: experience makes systems look simpler, but it also builds blind spots
• Complex systems have (see the checklist sketch after this list):
– Proximity of parts not in a production sequence (PS)
– Many common mode connections between components not in a PS
– Unfamiliar or unintended feedback loops
– Many control parameters with potential interactions
– Indirect or inferential information sources
– Limited understanding of some processes
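A minimal sketch (Python; the field names paraphrase the bullets above, and the idea of a numeric score is my illustration, not Perrow's method) of turning this list into a rough design-review checklist:

from dataclasses import dataclass, fields

@dataclass
class ComplexityIndicators:
    proximity_outside_production_sequence: bool
    common_mode_connections: bool
    unfamiliar_feedback_loops: bool
    interacting_control_parameters: bool
    indirect_information_sources: bool
    poorly_understood_processes: bool

def complexity_score(ind: ComplexityIndicators) -> int:
    """Count how many of the complexity indicators a system shows."""
    return sum(getattr(ind, f.name) for f in fields(ind))

plant = ComplexityIndicators(True, True, True, True, True, True)
print(complexity_score(plant))  # 6 of 6: high interactive complexity

A checklist like this cannot prove a system is safe, but it gives designers a structured way to hunt for the hidden interactions noted above.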
Complexity and Coupling
Uncertainty
Complex systems contain unknowns even for the experts who design and
study them.
Even 20+ years later, what triggered the TMI accident is still not known.
Perception of Risk
People’s perception of risk deviates from statistical realities in predictable ways. The most important factor, sometimes labeled “dread risk,” is associated with the following (a toy scoring sketch follows at the end of this slide):
• lack of control over the activity
• fatal consequences if there were a mishap of some sort
• high catastrophic potential
• reactions of dread
• inequitable distribution of risks and benefits (including the transfer of risks
to future generations)
• the belief that risks are increasing and not easily reducible.
• High on this “dread risk” factor were nuclear weapons, nuclear power,
warfare, nerve gas, terrorism, crime, and “national defense.” Note that nuclear
weapons, nuclear power, and military activities in general are systems Perrow
classifies as complex and tightly coupled.
• Design Axiom H: The need to manage risk perception among operators, local officials, and the public should be considered during design.
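A hedged sketch (Python) of treating "dread risk" as a score over the six factors listed above. The factor names come from this slide, but the 0-to-1 ratings and the simple averaging are invented for illustration; the underlying research derives the factor from factor analysis of survey data, not from a formula like this:

DREAD_FACTORS = [
    "lack of control",
    "fatal consequences",
    "catastrophic potential",
    "reactions of dread",
    "inequitable distribution of risks and benefits",
    "risks increasing / not easily reducible",
]

def dread_score(ratings: dict) -> float:
    """Average 0-1 ratings over the six dread factors."""
    return sum(ratings.get(f, 0.0) for f in DREAD_FACTORS) / len(DREAD_FACTORS)

# Illustrative ratings only, not data from the cited studies.
nuclear_power = {f: 0.9 for f in DREAD_FACTORS}
hang_gliding = {f: 0.2 for f in DREAD_FACTORS}
print(dread_score(nuclear_power))  # ~0.9
print(dread_score(hang_gliding))   # ~0.2

The point for Axiom H is that a score like this tracks perceived risk, which can diverge sharply from the statistical expectation p × H.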
Tight and Loose Coupling Examples
• Tightly coupled:
– dams, chemical plants, power grids, bakeries, recombinant
DNA technologies
• Loosely coupled:
– manufacturing / assembly line, mining, R&D firms,
universities, Internet
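A minimal sketch (Python) of Perrow's two-dimensional categorization, using examples from this slide. The placements on the interaction (linear/complex) axis follow Perrow's well-known chart as I recall it; treat them as coarse judgments rather than measurements, and note that not every example above is placed:

PERROW_GRID = {
    # (interaction, coupling): example systems
    ("linear", "tight"): ["dams", "power grids"],
    ("complex", "tight"): ["chemical plants", "recombinant DNA technologies"],
    ("linear", "loose"): ["manufacturing / assembly line"],
    ("complex", "loose"): ["mining", "R&D firms", "universities"],
}

def quadrant(system: str):
    """Return the (interaction, coupling) quadrant for a known system."""
    for key, systems in PERROW_GRID.items():
        if system in systems:
            return key
    return None  # not categorized here

print(quadrant("chemical plants"))  # ('complex', 'tight') -- NAT's danger zone
print(quadrant("universities"))     # ('complex', 'loose')

Systems in the complex/tight quadrant are the ones Normal Accident Theory flags as prone to system accidents.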
Further Study: Nine Theories
Case Study: The Challenger
• Vaughan, Diane. The Challenger Launch Decision. University of Chicago Press, Chicago, 1997. An extremely well-reviewed book by a sociologist, but controversial because she rejects the “mistake” answer: the idea of managerial wrongdoing, that someone is to blame. She states that “mistake, mishap, and disasters are socially organized and systematically produced by social structures. No extraordinary actions by individuals explain what happened, no intentional managerial wrongdoing, no rule violations, no conspiracy.”
Vaughan (Cont’d)
• It reads like a statement of normal accident theory, but it is also a statement that sociology is important and that individual actions are largely socially determined; hence individual responsibility is correspondingly diminished.
• There is an old debate as to whether human behavior is ultimately reducible to irreducible sociological propositions (ISPs) or irreducible psychological propositions (IPPs). Cf. Nietzsche: “All philosophy is biography” suggests (an IPP) that Vaughan only said what she said because of who she is: a sociologist, or someone phobic about guilt.
Vaughan (Cont’d)
• At this point I part company with Vaughan and wonder whether she thinks that Eichmann was merely a transportation expert. In fact, she describes the root cause as the “banality of organizational life,” and some influential thinkers have also referred to the banality of evil, such as Hannah Arendt (Eichmann in Jerusalem) and Stanley Milgram (Obedience to Authority, the electric-shock experiments), and Vaughan does refer to this literature in her conclusions (p. 407).
Vaughan (Cont’d)
• There is more than a little truth to this. I have written
several articles on the need for social ethics,* but we
also have to have moral values and act on them. Arendt
and Milgram condemned what they described.
• Organizations are designed with the idea that people will do their jobs properly, and there should be consequences if they do not.
• *Studying the ethical implications of the social arrangements we make for making decisions. But these arrangements are made by people and can be changed (Dewey)
Vaughan (Cont’d)
• Columbia revealed a similar behavioral pattern, with engineers who identified p and H being overruled, which confirms what Vaughan said while making her conclusion that no one is to blame even more suspect. Had she made the point that the structure and culture at NASA must change, maybe it would have made a difference.
• Her book is nevertheless excellent scholarship and a great reference text for a variety of subjects very relevant to the management of risk.
The Challenger: p and H were Known
Engineers said “don’t fly,” for the right and fateful reasons, but Morton Thiokol and NASA managers overruled them. So what happened? The Law of Unintended Consequences:
• Ignorance: it is impossible to anticipate everything.
• Error: incomplete analysis of the problem, or following habits that worked in the past but may not apply to the current situation. All decisions are about the future but are based on past experience, unless simulation models are used. “…the actor fails to recognize that procedures which have been successful in certain circumstances need not be so under any and all conditions.”
• Immediate interest may override long-term interests.
• Basic values may require or prohibit certain actions, even if the long-term result might be unfavorable.
Design Axioms
• Design Axiom A: It is never possible to know all Xs, but it is possible to take unknown Xs as undesirable and hunt them down as far as possible.
• Design Axiom B: For engineering designers the lesson is clear: beware of those who convert unknowns into dreams or nightmares.
• Design Axiom C: There will be unintended consequences.
• Design Axiom D: p, the probability of H (the hazard), is never zero.
• Design Axiom E: Uncertainty about p, and ignorance of what H is, increase with complexity, even among experts (see the sketch after this list).
• Design Axioms F & G: Minimize tight coupling and complexity in the design of complex systems, since both raise the values of both p and H, often in unknown ways. Complexity can create unforeseeable simultaneous failures.
• Design Axiom H: The need to manage risk perception among operators, local officials, and the public should be considered in design.
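A minimal sketch (Python; the numbers are illustrative, not from the lecture) tying Axioms D and E together: risk as the expected loss p × H, with the uncertainty in p carried through to the estimate:

def expected_loss(p: float, hazard_cost: float) -> float:
    """Point estimate of risk: probability times consequence."""
    return p * hazard_cost

def loss_interval(p_low: float, p_high: float, hazard_cost: float):
    """Axiom E: propagate the uncertainty in p into the risk estimate."""
    return (p_low * hazard_cost, p_high * hazard_cost)

H = 1e9  # assumed cost of the hazard, in dollars (illustrative)
print(expected_loss(1e-5, H))        # 10000.0
print(loss_interval(1e-6, 1e-4, H))  # (1000.0, 100000.0): two orders of magnitude

When the uncertainty band on p spans orders of magnitude, as Axiom E warns it will for complex systems, the point estimate alone is a poor basis for design decisions.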
When p is non-zero, and the design is good
The Social Management of Risk: Exposures
• Self-hazardous behavior: smoking, drinking and other drug use, hang gliding, cheerleading
• Behavior that is hazardous to others: crime, drunken driving,
child abuse, smoking in public
• Cogeneration: e.g., a workplace where both parties agree to a behavior but only one party fully understands the risks
• Production externalities: air or water pollution, toxic waste
• Natural processes: earthquakes, floods, droughts, epidemics,
storms and tornadoes
• Economic processes: induced conditions such as poverty and unemployment
• Government action: war
The Social Management of Risk: Effects
• Fatalities and injuries
• Deferred injuries and fatalities: e.g., cancer
• Genetic effects on unborn generations
• Loss of job
• Loss of property
• Economic losses
• Environmental degradation
• Stress impact: loss of dear ones, fear of repetition
The Social Management of Risk: Agencies
• Safe design
• Costs of failure: product liability, product recalls
• Safe behavior/operations/management
• Insurance
• Standards
• Design and Manufacturing Codes, ISO and national
• Professional codes
• Unions
• Government regulations
• Legislation of statutes
• Enforcement of codes and regulations
Reference
Merkhofer, Miley W. Decision Science and Social Risk Management. Boston: Reidel, 1987.
Circa 1986: “In the last one hundred years eight events have occurred in the US in which one thousand or more were killed (excluding wars and epidemics). None since 1928. …Prior to 1928 such an event occurred roughly every eight years.” (p. 5)
Despite increased longevity and the reduction in the frequency of disasters, public concern about risk is increasing (Harris poll, 1980).