
Michigan 2009 1

Risk Calculus
Risk Calculus – Index
Must Ignore Low Probability – 1AC....................................................................................................................................3
Must Reject Multiple Internal Links – 1AC.........................................................................................................................4
Policy Paralysis – 1AC.........................................................................................................................................................5
Tyranny of Survival – 1AC..................................................................................................................................................6
Extensions - Must ignore low probability Impacts...............................................................................................................7
Extensions - Must ignore low probability Impacts...............................................................................................................8
Extensions - Must ignore low probability Impacts...............................................................................................................9
Extensions - Must ignore multiple internal link Impacts....................................................................................................11
Extend - Policy Paralysis....................................................................................................................................................12
Extend - Policy Paralysis....................................................................................................................................................13
Alternative – Prioritize Uniqueness....................................................................................................................................14
Alternative – Expert Opinion..............................................................................................................................................15
Alternative – Prioritize Probability.....................................................................................................................................16
Alternative – Prioritize Probability.....................................................................................................................................17
Alternative – Threshold Probability...................................................................................................................................18
They Say “Risk = Probability times impact = infinity”......................................................................................................19
They Say “Risk = Probability times impact = infinity”......................................................................................................20
They Say “Risk = Probability times impact = infinity”......................................................................................................21
They Say “Insurance Principle”..........................................................................................................................................22
They Say “Magnitude is the Most Important”....................................................................................................................23
They Say “Presumption”....................................................................................................................................................24
They Say “Probability is Impossible to Calculate”............................................................................................................25
They Say “Probability is Impossible to Calculate”............................................................................................................26
They Say “Even Improbable Impacts sometimes Occur”..................................................................................................27
They Say “Focusing on Risk Calculus Distracts Education”.............................................................................................28
They Say “Focusing on Risk Calculus Distracts Education”.............................................................................................29
No Nuclear War impacts....................................................................................................................................................30
No Environmental Collapse................................................................................................................................................31
No Biodiversity Extinction.................................................................................................................................................32
No Ozone Depletion Extinction..........................................................................................................................................33
No HIV/AIDS Extinction...................................................................................................................................................34
No Terror Attack................................................................................................................................................................35
No Bioterror Attack............................................................................................................................................................36
No Bioterror Attack............................................................................................................................................................37
No Iran Impacts..................................................................................................................................................................38
No Iran Impacts..................................................................................................................................................................39
Only Nuclear Impacts are Existential.................................................................................................................................40
Bostrom Indicts...................................................................................................................................................................41
Extensions – Nuclear Threat Rhetoric Bad........................................................................................................................42
Extensions – Nuclear Threat Rhetoric Bad........................................................................................................................43
Extensions – Nuclear Threat Rhetoric Bad........................................................................................................................44
Extensions – Nuclear Threat Rhetoric Bad........................................................................................................................45
Extensions – Nuclear Threat Rhetoric Bad........................................................................................................................46
Extensions – Nuclear Threat Rhetoric Bad........................................................................................................................47
They Say “Ignore Low Probability”...................................................................................................................................48
They Say “Ignore Low Probability”...................................................................................................................................49
They Say “Ignore Low Probability”...................................................................................................................................50
They Say “Ignore Low Probability”...................................................................................................................................51
They Say “Ignore Low Probability”...................................................................................................................................52
They Say “Ignore Low Probability”...................................................................................................................................53
They Say “Tyranny of Survival”........................................................................................................................54
They Say “Resource Wars Improbable”.............................................................................................................................55
They Say “Resource Wars Improbable”.............................................................................................................................56
They Say “Economic Collapse = Nuclear War Improbable”.............................................................................................57
They Say “Economic Collapse = Nuclear War Improbable”.............................................................................................58
The Method Lab
Must Ignore Low Probability – 1AC
[ ] Mini-max arguments are flawed – they overemphasize pessimism, they misinterpret
probability, they rest their persuasiveness on novelty, and they distribute values
undemocratically
David Berube, Associate Professor of Speech Communication at the University of South Carolina, 2000 [Director of
Debate “Debunking Mini-max Reasoning: The Limits Of Extended Causal Chains In Contest Debating,”
Contemporary Argumentation and Debate, Volume 21, Available Online at
http://www.cedadebate.org/CAD/2000_berube.pdf, Accessed 04-05-2008, p. 64-69]

Vohra warned: "There are many inherent uncertainties in the quantitative assessment of accident probability. These
uncertainties include lack of sufficient data, the basic limitations of the probabilistic methods used, and insufficient
information about the physical phenomena . . . relating to the potential accident situation" (211). Why then, do we
accept claims associated with these probability assessments? The answer lies in the seductiveness of the mini-max principle: Act
to minimize the risk of maximum disaster. According to Kavka, under the mini-max principle, "benefits and probabilities are disregarded, and
that option is considered best which promises the least bad (or most good) outcome" (46-47). This is similar to what Kavka called the disaster
avoidance principle: "When choosing between potential disasters under two-dimensional uncertainty, it is rational to select the alternative that
minimizes the probability of disaster occurrence" (50), and what Luce and Raiffa called the maximization-of-security-level theory (278-281). As
a number of authors have noted, the mini-max principle is fraught with difficulties. I will recount four particular pitfalls in
this article. First, mini-max reasoning is grounded in ultrapessimism, or "disregarding a relevant experiment regardless
of its cost" (Parmigiani 250). "The mini-max principle is founded on ultra-pessimism, [in] that it demands that the
actor assume the world to be in the worst possible state" (Savage, "Statistical Decisions" 63). Savage concluded:
"The mini-max rule based on negative income is ultrapessimistic and can lead to the ignoring of even extensive
evidence and hence is utterly untenable for statistics" (Foundations 200). Furthermore, Parmigiani found that "no form of the mini-
max principle is generally superior to the other in guarding against ultrapessimism. . . . [I]t is not possible to concoct a standardization method
that makes the mini-max principle safe from ultrapessimism" (243, 249). Second, mini-max reasoning is confounded by incorrect
probability assessments. "Applying mini-max means ignoring the probabilities of various outcomes" (Finnis 221).
One of the reasons for incorrect decisions is grounded in politics. Proponents of a mini-max claim may misrepresent
the probabilities. "The group mini-max rule is also objectionable in some contexts, because, if one were to try to
apply it in a real situation, the members of the group might well lie about their true probability judgments, in order
to influence the decision generated by the mini-max rule in the direction each considers correct" (Savage, Foundations
175). This problem is worsened as proponents incorporate lay source material into their extended arguments. Several studies have noted that lay
estimates of low probability hazards tend to be substantially higher than expert probability estimates. . . . Is it that people sensitive to risk
consequences, and unwilling to accept the risk or risk management or both strategies, might systematically exaggerate the magnitude of
consequences while those in the opposite camp might systematically underplay the consequential danger involved? This implies the hypothesis
that acceptance is an a priori condition, and becomes a driver of likelihood and consequence assessments, at least in some instances, while threat
probabilities become the key causal factor in acceptance in still other instances. (Nehvevajsa 522) The third fault with mini-max
reasoning is that it is "flagrantly undemocratic. In particular, the influence of an opinion, under the group mini-max
rule, is altogether independent of how many people in the group hold that opinion" (Savage, Foundations 175). In
other words, singular experts make mini-max estimations. Quasi-experts or secondary experts make some of the
most bizarre extended arguments. In addition, there is an elitist sense to the process. The reasoning of the "expert" is
presumptive over the opinion of individuals who are less educated, less affluent, or even less white. What happens when the
elite are wrong? The arrogance of elitism is hardly more evident in any other setting. Deference to authority is an important co-requisite of
extended mini-max claims in contest debates. There is an insipid maxim associated with it: "Don't understand? Don't worry. We do the thinking
so you won't have to!" This problem is amplified when an exceptional source in a mini-max argument cannot be corroborated. Making a decision
based on a sole opinion grossly inflates the qualifications of the source to make the claim. Consider how this issue worsens as well when the
source is nameless or institutional, such as a press service. The final pitfall of mini-max reasoning is that the persuasiveness of any such argument
is a function of contingent variables, in particular, its novelty. Consider this simple illustration: A single large outcome appears to pose a greater
risk than does the sum of multiple small outcomes. "It is always observed that society is risk averse with respect to a single event of large
consequence as opposed to several small events giving the same total number of fatalities in the same time period. Hence 10,000 deaths
once in 10,000 years is perceived to be different from 1 death each year during 10,000 years" (Niehaus, de Leon &
Cullingfort 93). Niehaus, de Leon, and Cullingford extended their analysis with a review of nuclear power plant
safety. "The Reactor Safety Study similarly postulated that the public appears to accept more readily a much greater
social impact from many small accidents than it does from the more severe, less frequent occurrences that have a
similar society impact" (93). Theorists in many different settings have described this phenomenon. Wilson, for
instance, devised a way to examine the impact of low-probability high-consequence events that more clearly
portrayed societal estimates of such events: "A risk involving N people simultaneously is N² (not N) times as
important as an accident involving one person. Thus a bus or aeroplane accident involving 100 people is as serious
as 10,000, not merely 100, automobile accidents killing one person" (274-275).
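The mini-max rule Berube criticizes – pick the option whose worst outcome is least bad, with probabilities disregarded – can be contrasted with expected-value reasoning in a short sketch. The options and payoff numbers below are hypothetical, chosen only to illustrate how the two rules diverge:

```python
# Mini-max vs. expected-value reasoning, per the Berube evidence above.
# Each option is a list of (probability, loss) outcome pairs (hypothetical numbers).

options = {
    "A": [(0.999, 1), (0.001, 10_000)],  # tiny chance of a large disaster
    "B": [(1.0, 50)],                    # certain moderate loss
}

def worst_case(outcomes):
    """Mini-max security level: the largest possible loss, probabilities ignored."""
    return max(loss for _, loss in outcomes)

def expected_loss(outcomes):
    """Expected-value reasoning: probability-weighted loss."""
    return sum(p * loss for p, loss in outcomes)

# Mini-max chooses the option with the smallest worst case ...
minimax_choice = min(options, key=lambda o: worst_case(options[o]))
# ... while expected-value reasoning chooses the smallest average loss.
ev_choice = min(options, key=lambda o: expected_loss(options[o]))

print(minimax_choice)  # "B" -- avoids the 10,000-loss outcome entirely
print(ev_choice)       # "A" -- expected loss 10.999 vs. B's certain 50
```

This is the "ultrapessimism" pitfall in miniature: mini-max takes option B no matter how small the probability of A's disaster becomes, which is why Savage calls the rule "utterly untenable for statistics."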
Must Reject Multiple Internal Links – 1AC
[ ] Long chains of internal links make risk assessment impossible because they distort
probabilities
Hansson, 2006; professor in philosophy at the Royal Institute of Technology (Sven Ove; May 23, 2006; “The
Epistemology of Technological Risk”; http://scholar.lib.vt.edu/ejournals/SPT/v9n2/hansson.html#bondi)

Even after many years' experience of a technology there may be insufficient data to determine the frequencies of
unusual types of accidents or failures. As one example of this, there have (fortunately) been too few severe
accidents in nuclear reactors to make it possible to estimate their probabilities. In particular, most of the reactor
types in use have never been involved in any serious accident. It is therefore not possible to determine the risk
(probability) of a severe accident in a specified type of reactor. One common way to evade these difficulties is to
calculate the probability of major failures by means of a careful investigation of the various chains of events that
may lead to such failures. By combining the probabilities of various subevents in such a chain, a total probability
of a serious accident can be calculated. Such calculations were in vogue in the 1970's and 1980's, but today there is a
growing skepticism against them, due to several difficult problems with this methodology. One such problem is
that accidents can happen in more ways than we can think of beforehand. There is no method by which we can
identify all chains of events that may lead to a major accident in a nuclear reactor, or any other complex
technological system. Another problem with this methodology is that the probability of a chain of events can be
very difficult to determine even if we know the probability of each individual event. Suppose for instance that an
accident will happen if two safety valves both fail. Furthermore suppose that we have experience showing that the
probability is 1 in 500 that a valve of this construction will fail during a period of one year. Can we then conclude that
the probability that both will fail in that period is 1/500 x 1/500, i.e. 1/250,000? Unfortunately not, since this calculation is
based on the assumption that failures in the two valves are completely independent events. It is easy to think of ways in
which they may be dependent: Faulty maintenance may affect them both. They may both fail at high temperatures or in
other extreme conditions caused by a failure in some other component, etc. It is in practice impossible to identify all
such dependencies and determine their effects on the combined event-chain.
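Hansson's valve example can be made concrete with a short sketch. The 1-in-500 annual failure rate per valve follows the evidence; the common-cause failure rate is a hypothetical assumption added only to show how badly the independence shortcut can miss:

```python
# Why "multiply the probabilities" fails for dependent failures (Hansson's
# valve example). Per-valve rate follows the text; common-cause rate is
# a hypothetical illustration.

p_valve = 1 / 500  # annual failure probability of one valve

# Independence assumption: both valves fail together only by coincidence.
p_both_independent = p_valve * p_valve  # 1/250,000 = 4e-06

# Common-cause dependence: suppose (hypothetically) that once in 5,000 years
# a shared condition -- faulty maintenance, extreme temperature -- disables
# both valves at once, on top of their independent coincidental failures.
p_common_cause = 1 / 5_000
p_both_dependent = p_common_cause + (1 - p_common_cause) * p_valve * p_valve

print(p_both_dependent / p_both_independent)  # ~51x the "independent" estimate
```

Even a modest common-cause term dominates the product of the individual probabilities, which is Hansson's point: the chain-of-events calculation is only as good as the dependence assumptions behind it.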

Policy Paralysis – 1AC
[ ] If you focus on minute probabilities, all actions can have catastrophic consequences –
absolute risk avoidance would stifle all action – even if we cannot draw a perfect line, we
should still act on probabilities
Hansson, 2006; professor in philosophy at the Royal Institute of Technology (Sven Ove; May 23, 2006; “The Epistemology of
Technological Risk”; http://scholar.lib.vt.edu/ejournals/SPT/v9n2/hansson.html#bondi)

However, it would not be feasible to take such possibilities into account in all decisions that we make. In a sense, any
decision may have catastrophic unforeseen consequences. If far-reaching indirect effects are taken into account, then –
given the unpredictable nature of actual causation – almost any decision may lead to a disaster. In order to be able to
decide and act, we therefore have to disregard many of the more remote possibilities. Cases can also easily be found in
which it was an advantage that far-fetched dangers were not taken seriously. One case in point is the false alarm on so-
called polywater, an alleged polymeric form of water. In 1969, the prestigious scientific journal Nature printed a letter
that warned against producing polywater. The substance might "grow at the expense of normal water under any
conditions found in the environment," thus replacing all natural water on earth and destroying all life on this planet.
(Donahoe 1969) Soon afterwards, it was shown that polywater is a non-existent entity. If the warning had been heeded,
then no attempts would have been made to replicate the polywater experiments, and we might still not have known that
polywater does not exist. In cases like this, appeals to the possibility of unknown dangers may stop investigations and
thus prevent scientific and technological progress. We therefore need criteria to determine when the possibility of
unknown dangers should be taken seriously and when it can be neglected. This problem cannot be solved with
probability calculus or other exact mathematical methods. The best that we can hope for is a set of informal criteria that
can be used to support intuitive judgment. The following list of four criteria has been proposed for this purpose.
(Hansson 1996) 1. Asymmetry of uncertainty: Possibly, a decision to build a second bridge between Sweden and
Denmark will lead through some unforeseeable causal chain to a nuclear war. Possibly, it is the other way around so
that a decision not to build such a bridge will lead to a nuclear war. We have no reason why one or the other of these
two causal chains should be more probable, or otherwise more worthy of our attention, than the other. On the other
hand, the introduction of a new species of earthworm is connected with much more uncertainty than the option not to
introduce the new species. Such asymmetry is a necessary but insufficient condition for taking the issue of unknown
dangers into serious consideration. 2. Novelty: Unknown dangers come mainly from new and untested phenomena. The
emission of a new substance into the stratosphere constitutes a qualitative novelty, whereas the construction of a new
bridge does not. An interesting example of the novelty factor can be found in particle physics. Before new and more
powerful particle accelerators have been built, physicists have sometimes feared that the new levels of energy might
generate a new phase of matter that accretes every atom of the earth. The decision to regard these and similar fears as
groundless has been based on observations showing that the earth is already under constant bombardment from outer
space of particles with the same or higher energies. (Ruthen 1993) 3. Spatial and temporal limitations: If the effects of a
proposed measure are known to be limited in space or time, then these limitations reduce the urgency of the possible
unknown effects associated with the measure. The absence of such limitations contributes to the severity of many
ecological problems, such as global emissions and the spread of chemically stable pesticides. 4. Interference with
complex systems in balance: Complex systems such as ecosystems and the atmospheric system are known to have
reached some type of balance, which may be impossible to restore after a major disturbance. Due to this irreversibility,
uncontrolled interference with such systems is connected with a high degree of uncertainty. (Arguably, the same can be
said of uncontrolled interference with economic systems; this is an argument for piecemeal rather than drastic economic
reforms.) It might be argued that we do not know that these systems can resist even minor perturbations. If causation is
chaotic, then for all that we know, a minor modification of the liturgy of the Church of England may trigger a major
ecological disaster in Africa. If we assume that all cause-effect relationships are chaotic, then the very idea of planning
and taking precautions seems to lose its meaning. However, such a world-view would leave us entirely without
guidance, even in situations when we consider ourselves well-informed. Fortunately, experience does not bear out this
pessimistic worldview. Accumulated experience and theoretical reflection strongly indicate that certain types of
influences on ecological systems can be withstood, whereas others cannot. The same applies to technological,
economic, social, and political systems, although our knowledge about their resilience towards various disturbances has
not been sufficiently systematized.
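Hansson's four informal criteria can be read as a screening checklist. The sketch below encodes them with a hypothetical combination rule (Hansson offers the criteria as support for intuitive judgment, not as a formula, so the requirement of "asymmetry plus at least one aggravating factor" is our illustrative reading of "necessary but insufficient"):

```python
# Hansson's four criteria for when unknown dangers deserve attention,
# encoded as a checklist. Criterion names follow the text; the combination
# rule is a hypothetical illustration.

from dataclasses import dataclass

@dataclass
class UnknownDangerScreen:
    asymmetry_of_uncertainty: bool    # is one option clearly more uncertain?
    novelty: bool                     # qualitatively new, untested phenomenon?
    unlimited_in_space_or_time: bool  # effects unbounded in space or time?
    disturbs_balanced_system: bool    # interferes with a system in balance?

    def take_seriously(self) -> bool:
        # Asymmetry is "necessary but insufficient" per the text; here we
        # additionally require at least one aggravating factor.
        aggravating = [self.novelty, self.unlimited_in_space_or_time,
                       self.disturbs_balanced_system]
        return self.asymmetry_of_uncertainty and any(aggravating)

# Hansson's own contrast: building a new bridge vs. emitting a new
# substance into the stratosphere.
bridge = UnknownDangerScreen(False, False, False, False)
emission = UnknownDangerScreen(True, True, True, True)
print(bridge.take_seriously())    # False
print(emission.take_seriously())  # True
```

The point of the sketch is only that the criteria sort cases without any probability calculus, which is exactly the role Hansson assigns them.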

Tyranny of Survival – 1AC
[ ] Voting on a minute risk of nuclear war imposes the Tyranny of Survival – which
ensures oppression and self-destruction.

Daniel Callahan, prof of philosophy at Harvard, 1973 [Co-founder and former director of The
Hastings Institute, “The Tyranny of Survival,” p. 91-93]

The value of survival could not be so readily abused were it not for its evocative power. But abused it has been. In the
name of survival, all manner of social and political evils have been committed against the
rights of individuals, including the right to life. The purported threat of Communist domination has for
rights of individuals, including the right to life. The purported threat of Communist domination has for
over two decades fueled the drive of militarists for ever-larger defense budgets, no matter what the cost to other social
needs. During World War II, native Japanese-Americans were herded, without due process of law, to detention camps.
This policy was later upheld by the Supreme Court in Korematsu v. United States (1944) in the general context that a
threat to national security can justify acts otherwise blatantly unjustifiable. The survival of the Aryan race was one of
the official legitimations of Nazism. Under the banner of survival, the government of South Africa
imposes a ruthless apartheid, heedless of the most elementary human rights. The Vietnamese war
has seen one of the greatest of the many absurdities tolerated in the name of survival: the destruction of villages in order
to save them. But it is not only in a political setting that survival has been evoked as a final and unarguable value. The
main rationale B. F. Skinner offers in Beyond Freedom and Dignity for the controlled and conditioned society is the
need for survival. For Jacques Monod, in Chance and Necessity, survival requires that we overthrow almost every
known religious, ethical and political system. In genetics, the survival of the gene pool has been put forward as
sufficient grounds for a forceful prohibition of bearers of offensive genetic traits from marrying and bearing children.
Some have even suggested that we do the cause of survival no good by our misguided medical efforts to find means by
which those suffering from such common genetically based diseases as diabetes can live a normal life, and thus
procreate even more diabetics. In the field of population and environment, one can do no better than to cite Paul Ehrlich,
whose works have shown a high dedication to survival, and in its holy name a willingness to contemplate
governmentally enforced abortions and a denial of food to surviving populations of nations which have not enacted
population-control policies. For all these reasons it is possible to counterpoise over against the need for
survival a "tyranny of survival." There seems to be no imaginable evil which some group is
not willing to inflict on another for sake of survival, no rights, liberties or dignities which it is
not ready to suppress. It is easy, of course, to recognize the danger when survival is falsely and manipulatively
invoked. Dictators never talk about their aggressions, but only about the need to defend the fatherland to save it from
destruction at the hands of its enemies. But my point goes deeper than that. It is directed even at a legitimate
concern for survival, when that concern is allowed to reach an intensity which would ignore,
suppress or destroy other fundamental human rights and values. The potential tyranny of
survival as a value is that it is capable, if not treated sanely, of wiping out all other values.
Survival can become an obsession and a disease, provoking a destructive singlemindedness
that will stop at nothing. We come here to the fundamental moral dilemma. If, both biologically and
psychologically, the need for survival is basic to man, and if survival is the precondition for any and all human
achievements, and if no other rights make much sense without the premise of a right to life—then how will it be
possible to honor and act upon the need for survival without, in the process, destroying everything in human beings
which makes them worthy of survival. To put it more strongly, if the price of survival is human
degradation, then there is no moral reason why an effort should be made to ensure that
survival. It would be the Pyrrhic victory to end all Pyrrhic victories. Yet it would be the defeat of all defeats if,
because human beings could not properly manage their need to survive, they succeeded in not doing so.

Extensions - Must ignore low probability Impacts
[ ] Risk assessment must prioritize probability – low-probability outcomes should be
ignored regardless of how significant their consequences are, because small probabilities bias decision-making
Vertzberger, 1995, Professor at the Department of International Relations, the Hebrew University of Jerusalem, Israel
(Yaacov Y. I. Vertzberger, June 1995, “Rethinking and Reconceptualizing Risk in Foreign Policy Decision-Making: A
Sociocognitive Approach”, Political Psychology, Vol. 16, No. 2, pp. 347-380, http://www.jstor.org/stable/3791835)

The confidence attached by decision-makers to their assessments of both outcomes and the probability
distribution of outcomes may range from total confidence to very low confidence. The less valid their estimates,
the higher the chance that inappropriate choices will be made that will result in costs and losses. The level of risk,
from the decision-maker's vantage point, will thus be defined by the answers to the following three questions in this order of presentation: (1)What are
the gains/losses associated with each known outcome? (2)What is the probability of each outcome? (3)How valid are outcome probability and
gain/loss estimates? Risk, then, is defined as the likelihood of the materialization of validly predictable direct and
indirect consequences with potentially adverse values, arising from events, self-behavior, environmental
constraints, or the reaction of an opponent or third party. Accordingly, risk estimates have three dimensions:
outcome-values (desired or undesired), the probability of outcomes,3 and the confidence with which these
estimates of outcome-values and probabilities are held by the decision-maker. Although the three components of actual risk
are in reality independent of one another, that is not the case with subjective risk estimates. Payoffs affect probabilities (Einhorn and Hogarth, 1986),
and probabilities influence the weight of payoffs in the process of choice. Experimental evidence shows that in a choice between two gambles, when
the probability of winning is greater than the probability of losing, choice is most closely associated with the amounts involved in each gamble; but
when the probability of losing is greater than the probability of winning, choice is most closely associated with the probabilities. This indicates that
choice is not independent of the structure of the alternatives available to the decision-maker (Payne, 1975). (For
discursive purposes only we shall continue to treat the three components separately, although they are inherently intertwined.) The validity
dimension affects the weight that decision-makers will attribute to probability and payoff estimates in their risk
evaluation, and in general will bias choice toward preference of decisions with known risks over alternatives with
unknown risks4 (Curley et al., 1986; Ellsberg, 1961; Fischhoff, 1983; Gardenfors & Sahlin, 1982; Heath & Tversky, 1991). This is to a
large degree a reflection of a human need for certainty regarding value and probability estimates. [Footnote 3:] In probabilistic
terms, perceived risk increases: (a) when the probability of undesirable conse-quences increases and the probability of desirable consequences
decreases, and (b) when the variance of probabilities is greater, that is, when the probability of extreme outcomes increases. In utilities terms (a) the
more undesirable the consequences the greater the perceived risk, and (b) as the distribution of possible consequences is more slanted toward
undesirable consequences, perceived risk increases. In terms of the interaction of probabilities and utilities, an alternative with higher negative SEU
will be rated as riskier than one with lower negative SEU (Milburn & Billings, 1976). 4People have little confidence in intermediate-
level subjective probability estimates and tend not to use them in decision-making. They also have trouble
understanding and interpreting information about low-probability events, so that small probabilities are
particularly prone to biasing (Freuden-burg, 1988; March & Shapira, 1987; Peterson & Lawson, 1989; Sj6berg, 1979; Ster, 1991, p. 106).
These biases result in low-probability outcomes, being ignored regardless of how significant the consequences of
these outcomes are. These biases also result in paying more attention to physical certainties, concrete events, and
well-specified causal relationships at the expense of the less tangible dimensions of a problem. 351 Vertzberger in risky
situations. In making value and probability estimates, decision-makers require a certain threshold of confidence
before they will consider the risk as something worth worrying about. As confidence in high-probability, high-
cost outcomes increases so does the perception of risk, and when confidence declines so does risk perception. This
implies that similar value and probability estimates may result in dissimilar overall risk perceptions and related anxiety by different individuals, since
the probability and/or value estimates may be held with dissimilar levels of confidence. In addition, individuals vary in their minimal confidence
threshold requirements and thus in the degree to which anxiety will increase in response to the same incremental rise in the validity of probability and
value estimates. For some people in risky situations, a small increase in confidence levels of threatening information produces a large leap in anxiety;
for others only a large increase in confidence will cause a significant increase in anxiety. The need for veridicality is most prominent in
decisions involving high stakes, where the costs of errors, including challenges to the legitimacy of decisions,
could prove critical. Validityi s thus a particularlyi mportantc onsider-ation in the formation of risk assessments
where decision-makers have reason to doubt the information available to them.5 In foreign policy this is very
often the case because of the vague or ambiguous nature of foreign policy-related informa-tion

The Method Lab


Extensions - Must ignore low probability Impacts
[ ] The Negative’s impact analysis relies on the Tyranny of Illusory Precision – the
refusal to assign zero risk to low probability impacts is based on a false belief in objectivity
and a tendency to compromise. This destroys the value of probability and divorces debate
from the real world
Dale Herbeck, Professor of Communication at Boston College, 1992 [Director of the Fulton Debating Society at Boston
College, “The Use and Abuse of Risk Analysis in Policy Debate,” Paper Presented at the 78th Annual Meeting of the
Speech Communication Association (Chicago, IL), October 29th-November 1st, Available Online via ERIC Number
ED354559, p. 10-12]

Those of us who judge debate with some frequency know that while it is difficult to quantify probabilities,
risk analysis forces us to assign probabilities to all arguments in a debate. As a result, we may come under what
John Holdren would call "the tyranny of illusory precision." This phenomenon occurs whenever we take
qualitative judgments, decouple them from their context, and then use these judgments to assign a
probability which is used to justify conclusions. Even if we resist the temptation to assign unwarranted risks,
a related problem is that decision makers often fall prey to the fallacy of the golden mean. According to Edward
Damer, this "fallacy consists in assuming that the mean or middle view between two extremes must be the best or right one simply because it is
the middle view." [12] In other words, rather than assess zero probability to an impact, a judge might assume that the
probability necessarily lies somewhere between the two positions advocated in the debate. Recognizing this
tendency, advocates have become quite adept at framing their arguments to justify the attribution of some
amount of probability. Consider, for example, the following quotation from Umberto Saffiotti of the National Cancer Institute: "The most
'prudent' policy is to consider all agents, for which the evidence is not clearly negative under accepted minimum conditions of observation as if
they were positive." [13] Of course, the implication is that we must assess some probability of carcinogenicity absent proof to the contrary.
Evidence such as this, when invoked in debate, is often used to justify the claim that there must be some risk of the impact. The "Zero-
Infinity Problem." [14] A second problem with risk analysis is that the magnitude of the impact has come to
dominate questions of probability. The result, according to Ehrlich and Ehrlich, is the "zero-infinity
problem." Although the probability of some events is infinitesimally small, the impacts may be so grave that
the risk becomes significant. To illustrate this point, the Ehrlichs cite the example of pancreatic cancer. Although the probability of
getting this form of cancer is extremely small, it is almost always fatal. Accordingly, the fear of contracting pancreatic cancer might be sufficient
to warrant measures which would be unlikely to decrease the incidence of this deadly disease. It is easy to translate the zero-infinity problem to
the debate context. Consider the following risks:
probability     impact             risk
99 in 100       100,000 lives      = 99,000 lives
1 in 100        10,000,000 lives   = 100,000 lives
Of course, the conclusion that can be drawn from the above example is that a low probability/high impact
argument would generally outweigh a high probability/low impact argument. Being perceptive by nature, debaters
are well aware of this fact. It is, therefore, not surprising that the vast majority of all debate arguments eventually culminate in a nuclear war. By
offering the penultimate of impacts, the skilled advocate can effectively moot the importance of probability.
For the purpose of illustration, assume that a nuclear war would kill exactly one billion people, which may in fact be a conservative assessment. The
incredible argumentative power of this staggering impact is evident in the following statement of risks:
probability                 impact           risk
1 in 100 (.01)              1,000,000,000    = 10,000,000 lives
1 in 1000 (.001)            1,000,000,000    = 1,000,000 lives
1 in 10,000 (.0001)         1,000,000,000    = 100,000 lives
1 in 100,000 (.00001)       1,000,000,000    = 10,000 lives
1 in 1,000,000 (.000001)    1,000,000,000    = 1,000 lives
In other words, a 1 in 10,000 chance of a disadvantage culminating in nuclear war would be the equivalent of an affirmative saving 100,000 lives.
Not surprisingly, low probability/high impact arguments have come to dominate contemporary debate. Indeed, if
a stranger should hear a debate upon this year's intercollegiate policy topic, they would probably conclude that
any change in development policy, no matter how small, is likely to culminate in a nuclear war. As Star Muir has
observed: This takes form in two disturbing tendencies: an unwillingness to examine more real world impacts of
policies, and a jaded view of global devastation. The first is apparent in the unwillingness of the debaters to argue that a recession, per
se, is bad; that a regional war, not escalating to superpower conflict, would be a horrible thing. A global recession would probably not cause a
nuclear war, but it doubtless would cause untold suffering and human anguish. A regional war in Africa could kill hundreds of
thousands of lives, easily enough to outweigh a properly mitigated set of case scenarios. The problem is that
debaters won't tell these stories, but they will take the easy way out and read a blurb on World War III.' [15]
Of course, the problem with such argumentation is that it frequently borders on the absurd.
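The expected-value arithmetic driving Herbeck's tables can be made explicit in a short sketch. This is only a minimal illustration: the figures are Herbeck's own illustrative numbers, and the function name is mine.

```python
# Naive risk analysis treats risk as probability times impact, so a
# vanishingly small probability of a huge impact can "outweigh" a
# near-certain smaller one. Figures are Herbeck's illustrative numbers.

NUCLEAR_WAR_DEATHS = 1_000_000_000

def expected_loss(probability: float, impact: int) -> float:
    """Expected lives lost: probability multiplied by impact."""
    return probability * impact

# High probability, modest impact: 99 in 100 chance of 100,000 deaths.
high_prob = expected_loss(0.99, 100_000)              # ~99,000 lives

# Low probability, catastrophic impact: 1 in 10,000 chance of nuclear war.
low_prob = expected_loss(0.0001, NUCLEAR_WAR_DEATHS)  # ~100,000 lives

# The low-probability scenario "outweighs" despite its implausibility,
# which is exactly the zero-infinity problem.
```

On these numbers the 1-in-10,000 nuclear-war scenario edges out a near-certain loss of 100,000 lives, which is why magnitude swamps probability once such impacts are admitted into the calculus.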

Extensions - Must ignore low probability Impacts
[ ] Low-probability high-consequence arguments use mini-max reasoning which distorts
the amount of risk we should assign to impacts
Berube, 2000, Associate Professor of Speech Communication and Director of Debate at the University of South
Carolina (David M. Berube, 2000, “Debunking mini-max reasoning: the limits of extended causal chains in contest
debating” http://www.cedadebate.org/CAD/2000_berube.pdf, pages 53-73)

The lifeblood of contemporary contest debating may be the extended argument. An extended argument is any argument
requiring two or more distinct causal or correlational steps between initial data and ending claim. We find it associated
with advantages to comparative advantage cases, with counterplan advantages, with disadvantages, permutation and
impact turnarounds, some kritik implications, and even probabilistic topicality arguments. In practice, these often are
not only extended arguments. they are causal arguments using mini-max reasoning. Mini-max reasoning is defined as an
extended argument in which an infinitesimally probable event of high consequence is assumed to present a highly
consequential risk. Such arguments, also known as low-probability high-consequence arguments, are commonly
associated with "risk analysis." The opening statement from Schell represents a quintessential mini-max argument.
Schell asked his readers to ignore probability assessment and focus exclusively on the impact of his claim. While Schell
gave very specific reasons why probability is less important than impact in resolving this claim, his arguments are not
impervious to rebuttal. What was a knotty piece of evidence in the 1980s kick-started a practice in contest debating
which currently is evident in the ubiquitous political capital disadvantage code-named "Clinton." Here is an example of
the Clinton disadvantage. In theory, plan action causes some tradeoff (real or imaginary) that either increases or
decreases the President's ability to execute a particular agenda. Debaters have argued the following: Clinton (soon to be
Gore or Bush) needs to focus on foreign affairs. A recent agreement between Barak and Assad needs presidential
stewardship. The affirmative plan shifts presidential focus to Nigeria that trades off with focus on the Middle East. As a
result, the deal for the return of the Golan Heights to Syria fails. Violence and conflict ensues as Hizbollah terrorists
launch guerrilla attacks into northern Israel from Lebanon. Israel strikes back. Hizbollah incursions increase. Chemical
terrorism ensues and Israel attacks Hizbollah strongholds in southern Lebanon with tactical nuclear weapons. Iran
launches chemical weapons against Tel Aviv. Iraq allies with Iran. The United States is drawn in. Superpower
miscalculation results in all-out nuclear war culminating in a nuclear winter and the end of all life on the planet. This
low-probability high-consequence event argument is an extended argument using mini-max reasoning. The appeal of
mini-max risk arguments has heightened with the onset of on-line text retrieval services and the World Wide Web, both
of which allow debaters to search for particular words or word strings with relative ease. Extended arguments are
fabricated by linking evidence in which a word or word string serves as the common denominator, much in the fashion
of the sorites (stacked syllogism): A→B, B→C, C→D, therefore A→D. Prior to computerized search engines, a contest
debater's search for segments that could be woven together into an extended argument was incredibly time consuming.
The dead ends checked the authenticity of the extended claims by debunking especially fanciful hypotheses. Text
retrieval services may have changed that. While text retrieval services include some refereed published materials, they
also incorporate transcripts and wire releases that are less vigilantly checked for accuracy. The World Wide Web allows
virtually anyone to set up a site and post anything at that site regardless of its veracity. Sophisticated super search
engines, such as Savvy Search® help contest debaters track down particular words and phrases. Searches on text
retrieval services such as Lexis-Nexis Universe® and Congressional Universe® locate words and word strings within n
words of each other. Search results are collated and loomed into an extended argument. Often, evidence collected in this
manner is linked together to reach a conclusion of nearly infinite impact, such as the ever-present specter of global
thermonuclear war. Furthermore, too much evidence from online text retrieval services is unqualified or
underqualified. Since anyone can post a web page and since transcripts and releases are seldom checked as factual,
pseudo-experts abound and are at the core of the most egregious claims in extended arguments using mini-max reasoning. In
nearly every episode of fear mongering . . . people with fancy titles appeared. . . [F]or some species of scares . . .
secondary scholars are standard fixtures. . . . Statements of alarm by newscasters and glorification of wannabe experts
are two telltales tricks of the fear mongers' trade. . . : the use of poignant anecdotes in place of scientific evidence, the
christening of isolated incidents as trends, depictions of entire categories of people as innately dangerous. . . . (Glassner
206, 208) Hence, any warrant by authority of this ilk further complicates probability estimates in extended arguments
using mini-max reasoning. Often the link and internal link story is the machination of the debater making the claim
rather than the sources cited in the linkage. The links in the chain may be claims with different, if not inconsistent,
warrants. As a result, contextual considerations can be mostly moot.

Extensions - Must ignore multiple internal link Impacts
[ ] Multiple internal link calculations fail – different conditions make determining
probability impossible
Berube, 2000, Associate Professor of Speech Communication and Director of Debate at the University of South
Carolina (David M. Berube, 2000, “Debunking mini-max reasoning: the limits of extended causal chains in contest
debating” http://www.cedadebate.org/CAD/2000_berube.pdf, pages 53-73)

The complex probabilities of extended arguments are problematic. For example, too much reliance is given an extended
link story when each step in the link exhibits a probability that is geometrically self-effacing. According to the
traditional multiplication theorem, if a story is drawn from A→B→C→D, the probabilities of A→B and B→C and C→D are
multiplied. "The probability that two subsequent events will both happen is a ratio compounded of the probability of the
1st, and the probability of the 2nd in the supposition the 1st happens" (Bayes 299). If the probability of A→B is .10 and
the probability of B→C is also .10, then the probability of A→C is .01. If the probability of C→D is also .10, then the
probability of A→D is .001. If all we had to do to determine probability involved multiplying fractions, calculating
probabilities would be easy. Unfortunately, such is not the case. An interesting caveat involves conditional probability.
"Its expositors hold that we should not concern ourselves with absolute probabilities, which have no relevance to things
as they are, but with conditional probabilities - the chances that some event will occur when some set of previous
conditions exists" (Krause 67). Conditional probabilities are most often associated with calculations involving variables
that may be even remotely associated, such as phenomena in international relations. If one considers the probability of
many separate events occurring, one must also consider whether or not they are correlated - that is, whether or not they
are truly independent. If they are correlated, simply multiplying individual probabilities will not give you the correct
estimate, and the final probability may actually be much larger than one will predict if one makes this error. For
example, the probability that I will utter an obscenity at any given instance may be small (although it is certainly not
zero). The probability that I will hit my funny bone at any given instant is also small. However, the probability that I
will hit my funny bone and then utter an obscenity is not equal to the product of the probabilities, since the probability
of swearing at a given instant is correlated to the probability of hurting myself at a given instant. (Krause 67) Hence, "if
we calculate a priori the probability of the occurred event and the probability of an event composed of that one and a
second one which is expected, the second probability divided by the first will be the probability of the event expected,
drawn from the observed event" (Laplace 15). Another complication of extended causal chains is the corroboration
principle. "There are cases in which each testimony seems unreliable (i.e., has less than 0.5 probability) on its own, even
though the combination of the two testimonies is rather persuasive. . . . [I]f both testimonies are genuinely independent
and fully agree with one another, we are surely going to be inclined to accept them" (Cohen 72). When we are uncertain
about a probability, we might try to engage multiple sources making the same or same-like claim. We feel it is less
likely that two or more sources are incorrect than that a single source will be. While corroboration seems valid, it is a
persuasive pipe-dream. If we use this calculus to draw our claims, errors are likely to be shared and replicated. Witness
some of the problems associated with realism in international relations literature.
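Berube's multiplication-theorem point, and the conditional-probability caveat, can be sketched in a few lines. The link probabilities are made up for illustration, and the function name is mine.

```python
from math import prod

def chain_probability(link_probs):
    """Joint probability that every link in a causal chain holds,
    assuming the links are genuinely independent."""
    return prod(link_probs)

# Three links, each generously granted a 10% chance of holding:
p_chain = chain_probability([0.10, 0.10, 0.10])  # ~0.001

# Conditional probability breaks the simple product: if link B is more
# likely given A (say P(B | A) = 0.5 rather than P(B) = 0.1), the joint
# probability of A-then-B must use the conditional value:
p_joint = 0.10 * 0.50  # 0.05, five times the naive 0.10 * 0.10 = 0.01
```

Even charitable link probabilities drive the chain geometrically toward zero, while correlated links mean the naive product can badly misstate the true joint probability.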

Extend - Policy Paralysis
[ ] Probability below certain thresholds should be ignored – otherwise it would paralyze
policymaking.
Nicholas Rescher is an American philosopher, University Professor of Philosophy and Chairman of the Center for
Philosophy of Science at University of Pittsburgh. 1983. [“Risk: A Philosophical Introduction to the Theory of Risk
Evaluation and Management”, p35-36. 1983.]

A probability is a number between zero and one. Now numbers between zero and one can get to be very small
indeed: as N gets bigger, 1/N will grow very, very small. What, then, is one to do about extremely small
probabilities in the rational management of risks? On this issue there is a systemic disagreement between
probabilists working in mathematics or natural science and decision theorists who work on issues relating to human
affairs. The former take the line that small numbers are small numbers and must be taken into account as
such. The latter tend to take the view that small probabilities represent extremely remote prospects and can be
written off. (De minimis non curat lex, as the old precept has it: there is no need to bother with trifles.) When
something is about as probable as it is that a thousand fair dice when tossed a thousand times will all come up sixes,
then, so it is held, we can pretty well forget about it as a worthy concern. As a matter of practical policy we operate
with probabilities on the principle that when x < e, then x ≈ 0. Where human dealings in real-life situations are
concerned, sufficiently remote possibilities can – for all sensible purposes – be viewed as being of probability zero,
and the possibilities with which they are associated set aside. In "the real world" people are prepared to treat certain
probabilities as effectively zero, taking certain sufficiently improbable eventualities as no longer representing real
possibilities. In such a case our handling of the probabilities at issue is essentially a matter of fiat, of deciding as a
matter of policy that a certain level of sufficiently low probability can be taken as a cut-off point below which we
are no longer dealing with "real possibilities" and with "genuine risks." In real-life deliberations, in the law
(especially in the context of negligence), and indeed throughout the setting of our practical affairs, it is necessary to
distinguish between real and unreal (or "merely theoretical") possibilities. Once the probability of an event gets to
be small enough, the event at issue may be seen as no longer a real possibility (theoretically possible though it may
be). Such an event is something we can simply write off as being "outside the range of appropriate concern,"
something we can dismiss for "all practical purposes." As one writer on insurance puts it: "[P]eople... refuse to worry
about losses whose probability is below some threshold; probabilities below the threshold are treated as though they
were zero."
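Rescher's "when x < e, then x ≈ 0" principle amounts to a simple thresholding rule, sketched below. The function name is mine, and the one-in-a-million default is only an illustrative cutoff, chosen to echo the natural-hazard baseline discussed in this literature.

```python
def effective_probability(p: float, threshold: float = 1e-6) -> float:
    """De minimis policy: probabilities below the cutoff are treated as
    exactly zero for practical deliberation. The cutoff is a matter of
    policy, not an objective fact, and may shrink as the stakes grow."""
    return 0.0 if p < threshold else p

p_negligible = effective_probability(3e-8)  # written off as zero
p_real = effective_probability(0.01)        # a "real possibility", kept
```

The threshold itself does the philosophical work: choosing where to set it is precisely the "matter of fiat" Rescher describes.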

Extend - Policy Paralysis
[ ] Some probabilities are effectively zero – if we don’t ignore them, we would be
paralyzed
Nicholas Rescher is an American philosopher, University Professor of Philosophy and Chairman of the Center for
Philosophy of Science at University of Pittsburgh. “Risk: A Philosophical Introduction to the Theory of Risk
Evaluation and Management”, p35-36. 1983.

No doubt, events of such probability can happen in some sense of the term, but this "can" functions somewhat
figuratively – it is no longer something that presents a realistic prospect. To be sure, this recourse to effective zero-hood
does not represent a strictly objective, ontological circumstance. It reflects a matter of choice or decision, namely the
practical step of treating certain theoretically extant possibilities as unreal – as not worth bothering about, as
meriting being set at zero, as being literally negligible. Of course, the question remains: How small is small enough
for being "effectively zero"? With what value of x does x ≈ 0 begin: just exactly where does the threshold of
effective zerohood lie? This is clearly not something that can be resolved in a once-and-for-all manner. It may vary
from individual to individual, changing with the "cautiousness" of the person involved, representing an aspect of
an individual's stance towards the whole risk-taking process. And it may also vary with the magnitude of the stake at
issue. For it seems plausible to allow the threshold of effective zerohood to be readjusted with the magnitude of the
threat at issue, taking lower values as the magnitude of the threat at issue increases. (Such a policy seems in smooth accord
with the fundamental principle of risk management that greater potential losses should be risked only when their
chances for realization are less.) In deliberating about risks to human life, for example, there is some tendency to
take as a baseline the chance of death by natural disasters (or "acts of God"), roughly 1/1,000,000 per annum in the
USA. This would be seen as something akin to the "noise level" of a physical system; fatality probabilities
significantly smaller than this would thus be seen as negligible. Such an approach seems to underlie the Food and
Drug Administration's proposed standards of "1 in a million over a lifetime." People's stance in the face of the
probability that when embarking on a commercial airplane trip they will end up as an aviation fatality (which stands
at roughly 3 × 10^-8 for the U.S.A.) also illustrates this perspective. (Most neither worry – nor insure unless "the
company pays.") But an important point must be noted in this connection. The probability values that we treat as
effectively zero must be values of which, in themselves, we are very sure indeed. But real-life probability values are
seldom all that precise. And so in general there will be considerable difficulty in sustaining the judgment that a
certain probability indeed is effectively zero. A striking instance is afforded by the Atomic Energy Commission-
sponsored "Rasmussen report" (named after Norman C. Rasmussen, the study director) on the accident risks of
nuclear power plants: From the viewpoint of a person living in the general vicinity of a reactor, the likelihood of
being killed in any one year in a reactor accident is one chance in 300,000,000 and the likelihood of being injured in
any one year in a reactor accident is one chance in 150,000,000. The theoretical calculations that sustain such a
finding invoke so many assumptions regarding facts, circumstances, and operating principles that such probability
estimates are extremely shaky. Outside the domain of purely theoretical science we are too readily plunged below
the threshold of experimental error, and will thus confront great difficulties in supporting minute probability
distinctions in the sphere of technological and social applications. Statistical probabilities can be very problematic in
this regard, in particular since statistical data are often deficient or unavailable in the case of very rare events.
Personal probabilities too are very vulnerable in this context of assessing very low probabilities. For example, the
flood victims interviewed by the geographer R. W. Kates flatly denied that floods could ever recur in their area,
erroneously attributing previous floods to freak combinations of circumstances that were extremely unlikely to
recur. [16] One recent writer has asserted, not without reason, that in safety-engineering contexts it simply is not
possible to construct sufficiently convincing arguments to support very small probabilities (below 10^-5). [17] And
indeed a diversified literature has been devoted to describing the ways in which the estimation of very low
probabilities can go astray.

Alternative – Prioritize Uniqueness
[ ] Prioritizing Uniqueness as an absolute take out is essential to avoid becoming
enslaved to infinite risk
Dale Herbeck, Professor of Communication at Boston College, 1992 [Director of the Fulton Debating Society at Boston
College, “The Use and Abuse of Risk Analysis in Policy Debate,” Paper Presented at the 78th Annual Meeting of the
Speech Communication Association (Chicago, IL), October 29th-November 1st, Available Online via ERIC Number
ED354559, p. 10-12]

Third, we must not allow ourselves to become enslaved to large impacts. The fact that the impact is grave, does not, in and
of itself, mean that there is any probability associated with the outcome. Consider, for example, a disadvantage which
posited that the plan would increase the risk of species extinction. While it is true that species extinction would have
serious consequences, this fact should not force us to mindlessly reject any policy that might cause species extinction.
Further, we should take care in assessing evidence purporting to prove that a prudent policy maker should reject any action that risks the impact. In
other words, evidence claiming that species extinction is the ultimate of all evils is not sufficient to prove that the
affirmative case should be summarily rejected. Finally, we must rehabilitate the importance of uniqueness arguments in
debate. When arguing the position is not unique, an advocate is arguing that the disadvantage should already have
occurred or will inevitably occur in the status quo. For example, when arguing uniqueness against a budget
disadvantage, an affirmative would argue that the President and/or Congress have routinely increased spending.
Therefore, such spending should cause the disadvantage. The problem in debate today is that judges consistently assign
some level of risk to disadvantages even when the affirmative presents uniqueness arguments which have a greater link
to the disadvantage than the affirmative plan. Consider the following example. Suppose an affirmative team advocated a plan
which provided for increased military training of Bangladesh's army under the International Military Education and Training
Program (IMET). Against this plan, suppose the negative advocated a disadvantage claiming that increased U.S. influence
in Bangladesh would cause a loss of Indian influence in Bangladesh, causing them to lash-out as a way of reclaiming
their influence in South Asia. Given the fact that the United States has given Bangladesh over 3 billion dollars over the
past 20 years, [19] and given the fact that U.S. influence in South Asia is vastly increasing due to the virtual collapse of Soviet influence in the
region, [20] it would be ludicrous to assume that there is any unique risk of India fearing a minimal expansion of the IMET
program to Bangladesh. In this example, where the uniqueness arguments prove a greater increase in U.S. influence
than will ever occur under the affirmative plan, a judge should conclude that there is zero risk to adopting the
affirmative plan. Unfortunately, many judges in this situation would irrationally assign some minimal risks to the
disadvantage. They would reason that there is always some risk, albeit small, to adopting the affirmative plan. Yet, such
reasoning makes a mockery of the concept of uniqueness arguments. If a uniqueness argument proves that the status
quo actions will be larger than the affirmative's link to the disadvantage, then it has sufficiently demonstrated that there
is no unique risk to adopting the affirmative plan. Under these circumstances, the judge should assign zero risk to the
disadvantage.

Alternative – Expert Opinion
[ ] Expert opinion is critical to assessing probabilities for risk calculus
Vertzberger, 1995, Professor at the Department of International Relations, the Hebrew University of Jerusalem, Israel
(Yaacov Y. I. Vertzberger, June 1995, “Rethinking and Reconceptualizing Risk in Foreign Policy Decision-Making: A
Sociocognitive Approach”, Political Psychology, Vol. 16, No. 2, pp. 347-380, http://www.jstor.org/stable/3791835)

issues, such as environmental hazards, where they perceive these criteria as relevant. In
person-based validation, confidence is rooted in the individual who is the source of the knowledge. The user does not care about how
knowledge was generated, but who is disseminating it. Confidence in a particular person may derive from several sources: innate
qualities (like charisma); affective qualities (such as liking); past experience (the person has proven himself in the past to be
credible); an established relationship (a long-time friend), and other reasons. The third source of confidence is belief-based
validation. In this case, knowledge that conforms or is congruent with important beliefs of the user will be considered as valid, even
if the knowledge is methodologically flawed. In this case, the identity of the person delivering the knowledge is of little consequence
to the user. Ideologues, such as former President Ronald Reagan, are inclined to use this validation criterion. Thus Reagan's
confidence in the exaggerated assessments that there was a high probability that a Marxist regime in Grenada would pose a high
threat to the United States was belief-based. It stemmed from his intense belief in the evil motives driving such regimes and their
unavoidable subordination to the interest of the Soviet Union. The fourth and least important source is situation-based validation.
Here context determines the reliability of knowledge, which is applied when the observer distrusts his or her information sources.
Situation-based validation is based on the premise that in particular situations the information either cannot be manipulated and
therefore can be trusted or the source of information has no incentive to manipulate information because the costs are too high or the
gains are marginal. Being cognitive misers, people tend to devise a hierarchy of their most preferred to least preferred validation
criteria. Judgment of the reliability of value and probability assessments will relate to this hierarchy. Decision-makers will start by
applying their most preferred criterion. If the most preferred criterion cannot be applied in their judgment of reliability, they will
proceed to the next level in the hierarchy, and so on, moving down the list of preferences, each step representing a lower level of
confidence. Preference for one source of validation over another has important implications for the increase or decrease of confidence
levels over time. Epistemic-based validation has built-in rules for discrediting or falsifying currently held assessments. Person-based
validation will change when there is diminished trust in a particular person or when a highly regarded person provides invalidating
information. Belief-based validation is the most difficult to discredit because beliefs, especially core beliefs, change very slowly.
Practically the only quick way of convincing a decision-maker relying on belief-based validation to change is by reframing the
information in a manner that will convince the decision-maker that it is not congruent any longer with his or her beliefs, or that the
preference for reliance on belief-based validation was an error.

[ ] Expert opinion is critical to assessing probabilities for risk calculus


Hansson, 2006; professor in philosophy at the Royal Institute of Technology (Sven Ove; May 23, 2006; “The Epistemology of
Technological Risk”; http://scholar.lib.vt.edu/ejournals/SPT/v9n2/hansson.html#bondi)

Technological risks depend not only on the behaviour of technological components, but also on human behaviour.
Hence, the risks in a nuclear reactor depend on how the personnel act both under normal and extreme conditions.
Similarly, risks in road traffic depend to a large degree on the behaviour of motorists. It would indeed be difficult to find
an example of a technological system in which failure rates depend exclusively on its technical components, with no
influence from human agency. The risks associated with one and the same technology can differ drastically between
organizations with different attitudes to risk and safety. This is one of the reasons why human behaviour is even more
difficult to predict than the behaviour of technical components. Humans come into probability estimates in one more
way: It is humans who make these estimates. Unfortunately, psychological studies indicate that we have a tendency to
be overly confident in our own probability estimates. In other words, we tend to neglect the possibility that our own
estimates may be incorrect. It is not known what influence this bias may have on risk analysis, but it certainly has a
potential to create errors in the analysis. (Lichtenstein, 1982, 331) It is essential, for this and other reasons, to make a
clear distinction between those probability values that originate in experts' estimates and those that are known through
observed frequencies. There are occasions when decisions are influenced by worries about possible dangers although
we have only a vague idea about what these dangers might look like. Recent debates on biotechnology are an example
of this. Specific, identified risks have had only a limited role in these debates. Instead, the focus has been on vague or
unknown dangers such as the creation of new life-forms with unforeseeable properties.

The Method Lab


Alternative – Prioritize Probability
[ ] Risk is based on the knowledge of the probability of an outcome
Vertzberger, 1995, Professor at the Department of International Relations, the Hebrew University of Jerusalem, Israel
(Yaacov Y. I. Vertzberger, June 1995, “Rethinking and Reconceptualizing Risk in Foreign Policy Decision-Making: A
Sociocognitive Approach”, Political Psychology, Vol. 16, No. 2, pp. 347-380, http://www.jstor.org/stable/3791835)

What is risk? As a real-life construct of human behavior, risk has to be viewed as a compendium that represents a complex
interface among a particular set of behaviors and outcome expectations, taking into account the environmental
context in which these behaviors take place. (For a review of various social science approaches to the concept of risk, see Bradbury,
1989, and Renn, 1992.) Risk must be approached in a nontechnical manner, and hence the common distinction
between risk and uncertainty is neither realistic nor practical when applied to the analysis of nonquantifiable
and ill-defined problems, such as those posed by important politico-military issues. The classical distinction
found in economics between risk and uncertainty postulates that risk exists when decision-makers have perfect
knowledge of all possible outcomes associated with an event and the probability distribution of their occurrence;
whereas uncertainty exists when a decision-maker has neither the knowledge of nor the objective probabilities
distribution of the outcomes associated with an event (Kobrin, 1979, p. 70; 1982, pp. 41-43). These definitions tend to overlook
outcome values. Yet the term "risk" in everyday language and as commonly understood by decision-makers has a
utility-oriented connotation. "[T]he word risk now means danger, high risk means a lot of danger.... The
language of risk is reserved as a specialized lexical register for political talk about undesirable outcomes" (Douglas,
1990, p. 3; also March & Shapira, 1987). Uncertainty, on the other hand, is information-oriented and connotes a state of incomplete information. At
the most extreme case, uncertainty entails even lack of information regarding what dimensions are relevant to the description of the risk-set associated
with a particular case of intervention. This is defined as descriptive uncertainty. It is possible, however, that although the relevant problem dimensions
are known, their values are not; this is defined as measurement uncertainty (Rowe, 1977, pp. 17-18).

[ ] Risk assessment must focus on probability – this mirrors the real world and is the
most precise method
Hansson, 2006; professor in philosophy at the Royal Institute of Technology (Sven Ove; May 23, 2006; “The
Epistemology of Technological Risk”; http://scholar.lib.vt.edu/ejournals/SPT/v9n2/hansson.html#bondi)

Beginning in the late 1960's, growing public concern with new technologies gave rise to a new field of applied science: the study of risk. Researchers
from the technological, natural, behavioral, and social sciences joined to create a new interdisciplinary subject, risk analysis. (Otway 1987) The aim of
this discipline is to produce as exact knowledge as possible about risks. Such knowledge is much in demand, not least among managers, regulators,
and others who decide on the choice and use of technologies. But to what extent can risks be known? Risks are always connected to lack
of knowledge. If we know for certain that there will be an explosion in a factory, then there is no reason for us to
talk about that explosion as a risk. Similarly, if we know that no explosion will take place, then there is no reason
either to talk about risk. What we refer to as a risk of explosion is a situation in which it is not known whether or
not an explosion will take place. In this sense, knowledge about risk is knowledge about the unknown. It is therefore
a quite problematic type of knowledge. Although many have tried to make the concept of risk as objective as possible, on a fundamental level it is an
essentially value-laden concept. More precisely, it is negatively value-laden. Risks are unwanted phenomena. The tourist who hopes for a
sunny week talks about the "risk" of rain, but the farmer whose crops are threatened by drought will refer to the possibility of rain as a "chance" rather
than a "risk." There are more distinctions to be made. The word "risk" is used in many senses that are often not sufficiently
distinguished between. Let us consider four of the most common ones. (1) risk = an unwanted event which may or may not
occur. This is how we use the word "risk" when we say that lung cancer is one of the major risks that affect smokers, or that aeroembolism is a
serious risk in deep-sea diving. However, we may also describe smoking as a (health) risk or tell a diver that going below 30 meters is a risk not worth
taking. We then use the word "risk" to denote the event that caused the unwanted event, rather than the unwanted event itself: (2) risk = the
cause of an unwanted event which may or may not occur. In addition to identifying the risks, i.e. finding out what they
are, we also want to determine how big they are. This is usually done by assigning to each risk a numerical value
that indicates its size or seriousness. The numerical value most often used for this purpose is the probability of
the event in question. Indeed, risks are so strongly associated with probabilities that we often use the word "risk"
to denote the probability of an unwanted event rather than that event itself. This terminology is particularly common in
engineering applications. When a bridge-builder discusses the risk that a bridge will collapse, or an electrical engineer investigates the risk of power
failure, they are almost sure to use "risk" as a synonym of probability.

[ ] Probability is necessary for the quantification of risk
Hansson, 2006; professor in philosophy at the Royal Institute of Technology (Sven Ove; May 23, 2006; “The Epistemology of
Technological Risk”; http://scholar.lib.vt.edu/ejournals/SPT/v9n2/hansson.html#bondi)

From a decision-maker's point of view, it is useful to have risks quantified so that they can easily be compared and
prioritized. But to what extent can quantification be achieved without distorting the true nature of the risks
involved? As we have just seen, probabilities are needed for both of the common types of quantification of risk (the third
and fourth senses of risk). Without probabilities, established practices for quantifying risks cannot be used. Therefore,
the crucial issue in the quantification of risk is the determination of probabilities.

[ ] Probability is the most important aspect of an argument


Berube, 2000, Associate Professor of Speech Communication and Director of Debate at the University of South
Carolina (David M. Berube, 2000, “Debunking mini-max reasoning: the limits of extended causal chains in contest
debating” http://www.cedadebate.org/CAD/2000_berube.pdf, pages 53-73)

The strength of the relationship between the claims in extended arguments rests on the probability of the
causation between and among the simple claims. The relationship between each claim in an extended argument
is moderated by its probability. Probability is challenging to define. Many scientists and members of the risk assessment community "have
not as yet come to grips with the foundational issue about the meaning of probability and the various interpretations that can be attached to the term
probability. This is extremely important, for it is how one views probability that determines one attitude toward a statistical procedure" (Singpurwalla
182). We employ the notion of probability when we do not know a thing with certainty. But our uncertainty is
either purely subjective (we do not know what will take place, but someone else may know) or objective (no one
knows, and no one can know). Subjective probability is a compass for an informational disability. . . . Probability is, so to speak, a cane for a
blind man; he uses it to feel his way. If he could see, he would not need the cane, and if I knew which horse was the fastest, I would not need
probability theory. (Lem 142) In simple arguments, "risks are simply the product of probability and consequence" (Thompson &
Parkinson 552). Thompson and Parkinson found a difficulty in risk assessment associated with mini-max arguments that they identified as the
problem of risk tails. "Risk tails are the segments of the standard risk curve which approach the probability and consequence axes. The tails represent
high-consequence low-probability risk and low-consequence high-probability risk" (552). This region, especially the high-consequence low-
probability tail, is the site of mini-max computation.
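Thompson and Parkinson's simple-argument formula, and the "risk tails" Berube cites, can be sketched numerically. The cutoff values in the sketch below are illustrative assumptions, not figures from the card:

```python
# Simple-argument risk as the product of probability and consequence,
# with labels for the two "risk tails" of the standard risk curve.
# The tail cutoffs (p_cut, c_cut) are illustrative assumptions only.

def risk(probability, consequence):
    """Risk as the product of probability and consequence."""
    return probability * consequence

def classify(probability, consequence, p_cut=0.01, c_cut=100.0):
    """Locate a risk on the standard risk curve."""
    if probability <= p_cut and consequence >= c_cut:
        return "high-consequence / low-probability tail"  # mini-max territory
    if probability >= 1 - p_cut and consequence <= c_cut:
        return "low-consequence / high-probability tail"
    return "body of the curve"

# A mini-max impact scenario sits in the first tail:
print(risk(0.001, 1_000_000))      # a large product despite a tiny probability
print(classify(0.001, 1_000_000))  # high-consequence / low-probability tail
```

The point of the sketch is that multiplication alone cannot distinguish a tail risk from an ordinary one: the product can be large even when the probability is negligible.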

Alternative – Threshold Probability
[ ] The alternative to infinite risk is threshold probability – we should ignore risks that
fall below a minimal level of probability
Dale Herbeck, Professor of Communication at Boston College, 1992 [Director of the Fulton Debating Society at Boston
College, “The Use and Abuse of Risk Analysis in Policy Debate,” Paper Presented at the 78th Annual Meeting of the Speech
Communication Association (Chicago, IL), October 29th-November 1st, Available Online via ERIC Number ED354559, p. 10-
12]

First, and foremost, we need to realize that some risks are so trivial that they are simply not meaningful. This is not to argue that
all low probability/high impact arguments should be ignored, but rather to suggest that there is a point beneath which
probabilities are meaningless. The problem with low probability arguments in debate is that they have taken on a life of their
own. Debate judges routinely accept minimal risks which would be summarily dismissed by business and political leaders. While
it has been argued that our leaders should take these risks more seriously, we believe that many judges err in assessing any
weight to such speculative arguments. The solution, of course, is to recognize that there is a line beyond which probability is not
meaningfully evaluated. We do not believe it is possible to conclude, given current evidence and formats of debate, that a plan
might cause a 1 in 10,000 increase in the risk of nuclear conflagration.17 Further, even if it were possible, we need to recognize
that at some point a risk becomes so small that it should be ignored. As the Chicago Tribune aptly noted, we routinely dismiss the
probability of grave impacts because they are not meaningful: It begins as soon as we awake. Turn on the light, and we risk
electrocution; several hundred people are killed each year in accidents involving home wiring or appliances. Start downstairs to
put on the coffee, and you're really asking for it; about 7,000 Americans die in home falls each year. Brush your teeth, and you
may get cancer from the tap water. And so it goes throughout the day -- commuting to work, breathing the air, working, having
lunch, coming home, indulging in leisure time, going back to bed.18 Just as we ignore these risks in our own lives, we should be
willing to ignore minimal risks in debates. Second, we must consider the increment of the risk. All too often, disadvantages claim
that the plan will dramatically increase the risk of nuclear war. This might be true, and still not be compelling, if the original risk
was itself insignificant. For example, it means little to double the probability of nuclear war if the original probability was only 1
in one million. To avoid this temptation, advocates should focus on the initial probability, and not on the marginal doubling of the
risk claimed by the negative.
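Herbeck's two suggestions — a de minimis line below which probabilities are ignored, and attention to the initial probability rather than its marginal doubling — can be sketched as a simple filter. The threshold value is an assumption for illustration; Herbeck argues only that some such line exists:

```python
# Threshold ("de minimis") risk assessment: a risk whose probability falls
# below the cutoff is given zero weight, however large its impact.
# The cutoff of 1 in 10,000 is an illustrative assumption.

DE_MINIMIS = 1e-4

def weighted_risk(probability, impact, threshold=DE_MINIMIS):
    """Return probability * impact, or 0.0 if the probability is negligible."""
    if probability < threshold:
        return 0.0
    return probability * impact

# Doubling a trivial baseline (1 in one million) still leaves a trivial risk:
baseline = 1e-6
doubled = 2 * baseline
print(weighted_risk(doubled, 10**9))  # 0.0 -- still below the line
```

This captures Herbeck's second point as well: what matters is whether the doubled probability clears the line, not the fact of the doubling itself.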

[ ] The alternative to mini-max arguments is to disregard low probabilities – this restores rationality to decision-making
David Berube, Associate Professor of Speech Communication at the University of South Carolina, 2000 [Director of
Debate “Debunking Mini-max Reasoning: The Limits Of Extended Causal Chains In Contest Debating,” Contemporary
Argumentation and Debate, Volume 21, Available Online at http://www.cedadebate.org/CAD/2000_berube.pdf, Accessed 04-05-
2008, p. 64-69]

If extended arguments using mini-max reasoning are so indefensible, what can we do? Surprisingly, the answer is quite a lot. As a
starting point, we need to reject the notion that contest debating would be impossible without [mini-max debating] them. We
could demand a greater responsibility on the part of arguers making mini-max claims (a subject approached below). Debaters
could use their plans and counterplans to stipulate the internal link and uniqueness stories for their extended arguments,
consequently focusing the debate on probability assessment and away from exaggerated impacts. Alternatively, debaters may
select to discuss ideas as we have seen in the recent trend toward kritik debating. In addition, we need to understand that burdens
of proof associated with extended arguments involving mini-max reasoning are not always extraordinary. Here is one rationale
why it might be imprudent to reject all instances involving mini-max claims. Consider these two questions. Should we decide to
forego a civil rights initiative in the U.S. because it may lead to a war in the Middle East? Should we refrain from building a
plutonium reprocessing plant nearby to avoid the heightened incidence of cancer? We might accept the second more regularly
than the first. The reason the second extended argument should be more presumptive is simply because interceding variables that
might preclude the consequence are less reliable than in the first scenario because they would be derivative. In other words, the
fix would need to be designed by agents similarly motivated. Just like "realist" foreign policy theorists may think too much alike,
so do agents who are acting within the same agency. Unlike the second scenario, agents able to intercede between civil rights
legislation and U.S.-Israeli foreign relations come from different disciplines and worldviews (different directions) and are less
likely to share motivations which might prevent their capability to interpose end stops into a particular series of occurrences.
With these caveats out of the way and assuming some mini-max extended arguments are more reliable than others, I propose a
number of tests by which the strength of particular mini-max extended arguments might be adduced. The tests fall into three
general categories: probability and confidence, scenario construction, and perceptual bias. I offer these tests merely as
suggestions and in full awareness of the fact that they hardly exhaust the potential checks on extended arguments using mini-max
reasoning.

They Say “Risk = Probability times impact = infinity”
[ ] Basing risk assessment on probability times magnitude makes assessment too indeterminate – it
ignores subjectivities in how to measure and evaluate expectations
Hansson, 2006; professor in philosophy at the Royal Institute of Technology (Sven Ove; May 23, 2006; “The Epistemology of
Technological Risk”; http://scholar.lib.vt.edu/ejournals/SPT/v9n2/hansson.html#bondi)

In risk-benefit analysis, i.e. the systematic comparison of risks with benefits, risks are measured in terms of their
expectation values. This is also the standard meaning of risk in several branches of risk research. Hence, in studies of
risk perception the standard approach is to compare the degree of severity that subjects assign to different risk factors
("subjective risk") to the expectation values that have been calculated for the same risk factors ("objective risk"). The
underlying assumption is that there is an objective, knowable risk level that can be calculated with the expectation value
method. However, this measure of risk is problematic for at least two fundamental reasons. First, probability-weighing
is normatively controversial. A risk of 1 in 1000 that 1000 persons will die is very different from a risk of 1 in 10 that
10 persons will die. Although the expectation values are the same, moral reasons can be given to regard one of these
two situations as more serious than the other. In particular, proponents of a precautionary approach maintain that
prevention against large but improbable accidents should be given a higher priority than what would ensue from an
expectation value analysis. (O'Riordan and Cameron 1994, O'Riordan et al 2001) The other problem with the
expectation value approach is that it assesses risks only according to their probabilities and the severity of their
consequences. Most people's appraisals of risks are influenced by factors other than these. In particular, the expectation
value method treats risks as impersonal entities, and pays no attention to how risks and benefits are distributed or
connected. In contrast, the relations between the persons affected by risks and benefits are important in most people's
appraisals of risks. It makes a difference if it is myself or someone else that I expose to a certain danger in order to earn
myself a fortune. If the expectation value is the same in both cases, we can arguably say that the size of the risk is the
same in both cases. It does not follow, however, that the risk is equally acceptable in the two cases. More generally
speaking, if we use expectation values to measure the size of risks, this must be done with the reservation that the size
of a risk is not all that we need to in order to judge whether or not it can be accepted. Additional information about its
social context is also needed.
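Hansson's first objection is arithmetic: the two accidents he compares have identical expectation values even though a precautionary approach would rank them differently. The sketch below reproduces that arithmetic:

```python
# Expectation value of a risk: probability multiplied by the number of deaths.
# Hansson's two cases are numerically indistinguishable on this measure.

def expectation_value(probability, deaths):
    return probability * deaths

large_improbable = expectation_value(1 / 1000, 1000)  # rare but catastrophic
small_probable = expectation_value(1 / 10, 10)        # common but limited

print(large_improbable, small_probable)  # equal: the measure cannot tell them apart
```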

[ ] Indeterminate risk assessment is counterproductive – it creates the illusion of objectivity and masks dangers
Hansson, 2006; professor in philosophy at the Royal Institute of Technology (Sven Ove; May 23, 2006; “The Epistemology of
Technological Risk”; http://scholar.lib.vt.edu/ejournals/SPT/v9n2/hansson.html#bondi)

In real life we are seldom in a situation like that at the roulette table, when all probabilities are known with certainty (or
at least beyond reasonable doubt). Most of the time we have to deal with dangers without knowing their probabilities,
and often we do not even know what dangers we have ahead of us. This is true not least in the development of new
technologies. The social and environmental effects of a new technology can seldom be fully grasped beforehand, and
there is often considerable uncertainty with respect to the dangers that it may give rise to. (Porter 1991) Risk analysis
has, however, a strong tendency towards quantification. Risk analysts often exhibit the tuxedo syndrome: they proceed
as if decisions on technologies were made under conditions analogous to gambling at the roulette table. In actual fact,
however, these decisions have more in common with entering an unexplored jungle. The tuxedo syndrome is dangerous
since it may lead to an illusion of control. Risk analysts and those whom they advise may believe that they know what
the risks are and how big they are, when in fact they do not. When there is statistically sufficient experience of an event-
type, such as a machine failure, then we can determine its probability by collecting and analysing that experience.
Hence, if we want to know the probability that the airbag in a certain make of car fails to release in a collision, we
should collect statistics from the accidents in which such cars were involved.

[ ] Probability times magnitude does not take real world event into account and does not
work in decision making – it denies decision makers crucial information and flexibility
Hansson, 2007; professor in philosophy at the Royal Institute of Technology [Hélène Hermansson & Sven Ove Hansson, Risk
Management, Basingstoke: Jul 2007, Vol. 9, Iss. 3, pg. 129, 16 pgs, JSTOR]

In other words, risk is defined as the product of probability and severity. Probabilistic risk analysis (PRA) is a highly
useful tool that provides risk managers with important information. One of its advantages is its commensurability
with economic analysis, with which it can be combined in the form of risk-benefit analysis. However, neither PRA
nor risk-benefit analysis provides decision-makers with all the information that they need in order to make risk
management decisions. In particular, important ethical aspects are not covered in these forms of risk analysis. The
ethics of risk-taking and risk imposition concerns problems of agency and interpersonal relationships that cannot be
adequately expressed in a framework that operates exclusively with the probabilities and severities of outcomes.
Risks do not just "exist" as free-floating entities; they are taken or imposed. In order to appraise an action of risk-
taking or risk imposition from a moral point of view, it is not sufficient to know the values and probabilities of its
possible outcomes. We also need to know who performs the action and with what intentions. For instance, it makes
a moral difference if someone risks his/her own life or that of somebody else in order to earn a fortune for him/herself. It
also makes a difference if risk-taking is freely chosen by the affected person, that is, that s/he has full information
about the risk and is capable of making the decision him/herself, if it is accepted by him/her in order to obtain some benefit
otherwise not obtainable, or imposed against his/her will (Hansson, 2003; Hermansson, 2005).

[ ] The use of magnitude times probability doesn't work – it isn't real world and is
impossible to determine
David Berube, Associate Professor of Speech Communication at the University of South Carolina, 2000 [Director of Debate
“Debunking Mini-max Reasoning: The Limits Of Extended Causal Chains In Contest Debating,” Contemporary Argumentation and Debate, Volume 21, Available Online at
http://www.cedadebate.org/CAD/2000_berube.pdf, Accessed 04-05-2008, p. 64-69]

The complex probabilities of extended arguments are problematic. For example, too much reliance is given an extended link story when each step in the link exhibits a probability that is geometrically
self-effacing. According to the traditional multiplication theorem, if a story is drawn from A→B→C→D, the probabilities of A→B and B→C and C→D are multiplied. "The probability that two subsequent
events will both happen is a ratio compounded of the probability of the 1st, and the probability of the 2nd in the supposition the 1st happens" (Bayes 299). If the probability of A→B is .10 and the
probability of B→C is also .10, then the probability of A→C is .01. If the probability of C→D is also .10, then the probability of A→D is .001.
If all we had to do to determine probability involved multiplying fractions, calculating probabilities would be easy.
Unfortunately, such is not the case. An interesting caveat involves conditional probability. "Its expositors hold that we should
not concern ourselves with absolute probabilities, which have no relevance to things as they are, but with conditional probabilities - the chances
that some event will occur when some set of previous conditions exists" (Krause 67). Conditional probabilities are most
often associated with calculations involving variables that may be even remotely associated, such as phenomena in
international relations. If one considers the probability of many separate events occurring, one must also consider whether or not they are
correlated - that is, whether or not they are truly independent. If they are correlated, simply multiplying individual probabilities will not
give you the correct estimate, and the final probability may actually be much larger than one will predict if one makes
this error. For example, the probability that I will utter an obscenity at any given instance may be small (although it is
certainly not zero). The probability that I will hit my funny bone at any given instant is also small. However, the
probability that I will hit my funny bone and then utter an obscenity is not equal to the product of the probabilities, since
the probability of swearing at a given instant is correlated to the probability of hurting myself at a given instant. (Krause
67) Hence, "if we calculate a priori the probability of the occurred event and the probability of an event composed of that one and a second one which
is expected, the second probability divided by the first will be the probability of the event expected, drawn from the observed event" (Laplace 15). Another complication of extended causal chains is
the corroboration principle. "There are cases in which each testimony seems unreliable (i.e., has less than 0.5 probability) on its own, even though the combination of the two testimonies is rather
persuasive. . . . [I]f both testimonies are genuinely independent and fully agree with one another, we are surely going to be inclined to accept them" (Cohen 72). When we are uncertain about a
probability, we might try to engage multiple sources making the same or same-like claim. We feel it is less likely that two or more sources are incorrect than that a single source will be. While
corroboration seems valid, it is a persuasive pipe-dream. If we use this calculus to draw our claims, errors are likely to be shared and replicated. Witness some of the problems associated with realism
in international relations literature.
As such, the multiplication theorem has been subverted by conditional probabilities and undercut by corroboration, but
contest debaters and policy makers have not risen to the challenge. While contest debating has borrowed heavily from
policymaking and systems analysis, it has not resolved the causality issues any better than have policy studies experts.
The grand calculus used in systems analysis is as simplistic as it is in contest debating. Lichtman and Rohrer described
what happens to systems analysis in a contest debate two decades ago. "To determine the level of net benefits achieved
by a policy system when multiple outcomes are considered, policy makers simply sum, for all anticipated results, the
product of their probabilities and values" (238).
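Berube's two caveats — the multiplication theorem for independent links, and its failure under correlation — can be sketched as follows. The correlation example uses a standard conditional-probability identity; the numbers are illustrative, not from the card:

```python
import math

# Multiplication theorem: for an independent chain A -> B -> C -> D, the
# probability of the whole story is the product of the step probabilities.
def chain_probability(steps):
    return math.prod(steps)

# Three links of 0.10 each yield Berube's figure of 0.001:
print(chain_probability([0.10, 0.10, 0.10]))

# Caveat: with correlated steps, multiplying marginal probabilities is wrong.
# If P(B) = 0.10 on its own but P(B | A) = 0.50, then
# P(A and B) = P(A) * P(B | A), five times the naive product.
p_a = 0.10
p_b_given_a = 0.50
print(p_a * p_b_given_a)  # joint probability under correlation
print(p_a * 0.10)         # naive independent product, an underestimate
```

This is exactly Berube's funny-bone example: when events share a cause, the joint probability can be much larger than what the multiplication theorem predicts.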

[ ] Viewing risk as probability times impact does not improve decision making – it relies
on the illusion of objective numbers
Dale Herbeck, Professor of Communication at Boston College, 1992 [Director of the Fulton Debating Society at Boston
College, “The Use and Abuse of Risk Analysis in Policy Debate,” Paper Presented at the 78th Annual Meeting of the
Speech Communication Association (Chicago, IL), October 29th-November 1st, Available Online via ERIC Number
ED354559, p. 10-12]

Since risk analysis is drawn from the literature on policy making, it seems appropriate to rely on that literature to define the terms. William Rowe
defines risk as "the potential for harm."5 Definitions such as this are really suggesting that risk is nothing more than the probability of encountering
negatively-valued events.6 Since negatively-valued events can differ drastically in consequence, it is necessary to add some means for valuing them to
our equation. Recognizing this fact, Paul and Anne Ehrlich define risk as follows: Risk is sometimes used as a synonym for
"probability" in insurance policies--the risk of a loss. In analyzing issues like those discussed in this chapter, "risk"
means the consequences of an event multiplied by its probability (or frequency) of occurrence--or, to put it another way,
risk is a probability of a bad consequence.7 For the purposes of this paper, the Ehrlichs' definition of risk can be
expressed in the following equation: RISK = PROBABILITY X IMPACT The utility of such a conception of risk
should be obvious. In a debate, advocates attempt to prove the desirability of policies by demonstrating risks. The
affirmative argues that the plan will prevent risks, while the negative argues that the plan will cause risks. At the end of
the debate, the decision maker compares risk and opts for the policy that produces either the greatest benefit, or perhaps,
the policy that produces the lesser of evils. The Abuse of Risk Analysis At first glance, it might appear that the
introduction of risk analysis into academic debate would improve the quality of decisions rendered. After all, risk
analysis provides a seemingly objective way of comparing the impact of competing policy options. Further, risk analysis
would seem to devalue the subjectivity inherent in the communication of information. Unfortunately, we believe that the illusion of objectivity masks
several serious problems with risk analysis as it is presently utilized in academic debate. While a number of problems might be identified, in this
paper we argue first that risk analysis artificially assigns probability to arguments and second, that risk analysis overvalues arguments with large
impacts. The "Tyranny of Illusory Precision."8 At the outset, it should be noted that it is extremely difficult to assess probability. In an article on risk
which appeared in Psychology Today, Dr. J. Frank Yates observed that "the average person has problems identifying potential risks and deciding how
likely they are to occur."9 In addition, Yates suggests that most of us overestimate the value of our own judgment in matters of common
knowledge.10 It might, of course, be argued that debate judges trained in the use of risk analysis are better able to assess
risks. This, however, is frequently not the case. In one of the few articles on risk analysis in the forensic literature,
Vincent Follert offered this example drawn from the final round of the 1978 National Debate Tournament: After
exposure to the same information, each judge reached what appeared to be substantially different estimates of the
probability that the plan would prompt government cuts in the biomedical research budget: "Biomedical research would
probably be cut as a result of the plan," "the affirmative goes a long way towards eliminating the risks . . . of cuts," "I
am left with a substantial risk," and "the risk of cuts seems less significant than the case."11
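As an illustration of the probability-times-impact calculus the card describes (all numbers here are hypothetical, chosen only to show the mechanism Herbeck criticizes):

```python
# A minimal sketch of the RISK = PROBABILITY X IMPACT rule.
# The probability and impact figures below are invented for illustration.

def risk(probability, impact):
    """Expected-value risk: probability of the bad event times its magnitude."""
    return probability * impact

plan_risk = risk(probability=0.5, impact=100)           # even odds of a modest harm
disad_risk = risk(probability=0.001, impact=1_000_000)  # remote chance of catastrophe

# The big-impact argument wins the comparison even though its probability
# figure is an artificial assignment carrying only illusory precision.
print(disad_risk > plan_risk)  # True
```

The comparison shows the abuse the card identifies: whoever controls the probability estimate for the catastrophic scenario controls the outcome of the calculation, since any nonzero guess multiplied by a near-infinite impact dominates.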

The Method Lab


Michigan 2009 22
Risk Calculus
They Say “Insurance Principle”
[ ] The Insurance Principle doesn’t apply – insurance compensates victims after a crisis
– it doesn’t justify any preventive measures
Richard Ericson, Professor, All Souls College, University of Oxford, 2004 [Catastrophe risk, insurance and terrorism;
Richard V. Ericson and Aaron Doyle;
http://docserver.ingentaconnect.com/deliver/connect/routledg/03085147/v33n2s1.pdf?
expires=1245869868&id=50923937&titleid=737&accname=University+of+Michigan+At+Ann+Arbor&checksum=
C2B09B34425421708F626ECAC89173E9]

Insurers play key but often hidden roles in establishing preventive security and loss prevention infrastructures,
whether based on environmental design, electronic surveillance technologies, or private security operatives (ibid.).
Their initiatives in this regard are increasingly within the precautionary principle, which emphasizes that low frequency but high severity risks
must be addressed through extraordinary control measures that reflect ‘zero tolerance’ and aspire to ‘zero risk’ (Ewald 2002). O’Malley criticizes
Beck for claiming that the insurance industry influences the control of risks, and asserts that insurance is only a system for compensating loss.
But in what sense does insurance control threats? Insurance is a means of distributing risks in order to provide
compensation after the event. It does not ‘control’ the source of the risk any more than do such venerable methods as
‘state intervention’ in the form of disaster relief funding, which distribute the impact of risks through taxation and
related ‘spreading devices’. (O’Malley 2003: 276)

[ ] Extremely low probability moots the insurance principle


Nicholas Rescher, American philosopher, University Professor of Philosophy and Chairman of the Center for
Philosophy of Science at the University of Pittsburgh, 1983 [“Risk: A Philosophical Introduction to the Theory of Risk
Evaluation and Management”, p. 35-36]

To be sure, one important qualification must be made: If E is small enough to be seen as "effectively zero" – if the
untoward eventuation at issue is categorized as not representing a real possibility – then insurance becomes
unnecessary. In a game-situation exercise conducted by Paul Slovic and his associates (see Table 1), some 65% of the
participants were unwilling to insure at $500 against a .002 probability of a loss of around $250,000, while only some
25% were unwilling to insure at this same cost against a loss of some $2000 with a .25 probability (despite the fact
that the former loss represented a "catastrophe" that would have put them out of the game).11 In game situations, as
in real life, people incline to dismiss small-probability eventuations as unreal – as something that can be omitted from
the range of one's practical concerns.
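The force of Rescher's Slovic example is arithmetical: the two gambles carry identical expected losses, yet subjects insured only against the high-probability one. Using the card's own figures, with exact rational arithmetic to avoid floating-point noise:

```python
from fractions import Fraction

# Expected loss of each gamble in Slovic's experiment (figures from the card).
catastrophic = Fraction(2, 1000) * 250_000  # .002 chance of losing ~$250,000
routine      = Fraction(25, 100) * 2_000    # .25 chance of losing $2,000

# Both work out to $500, the same as the offered insurance premium, so a pure
# probability-times-impact calculus treats them identically; subjects did not.
print(catastrophic, routine)  # 500 500
```

Since each expected loss equals the $500 premium, the two insurance offers are actuarially indistinguishable, which is what makes the divergent willingness to insure evidence that people dismiss small-probability eventuations as unreal.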

[ ] The Insurance Principle is vague in its application to political debates


Richard Ericson, Professor, All Souls College, University of Oxford, 2004 [Catastrophe risk, insurance and terrorism;
Richard V. Ericson and Aaron Doyle;
http://docserver.ingentaconnect.com/deliver/connect/routledg/03085147/v33n2s1.pdf?
expires=1245869868&id=50923937&titleid=737&accname=University+of+Michigan+At+Ann+Arbor&checksum=
C2B09B34425421708F626ECAC89173E9]

Third, Beck, as well as other participants in debates about risk society, makes his assertions about insurability
without empirical evidence regarding how the insurance industry actually operates in conditions of uncertainty. As
Bougen observes, ‘Just how the catastrophic underwriter arrives at key decisions has proved a source of fascination for many years, yet literature
on the topic is particularly scarce’ (2003: 258). Similar comments have been made in other fields, such as political science: ‘How and why
the insurers and risk managers exercise such power over outcomes and with what consequences for the world market
economy and for the allocation of values among social groups, national economies and business enterprises is a
fundamental question for contemporary international political economy. For fifteen years I have waited, in vain, for
someone to write a definitive account’ (Strange 1996: 123). In debates on risk society, this lacuna is especially
remarkable because much of the debate hinges on how insurers make decisions as gatekeepers of risk and
uncertainty at the frontiers of risk society.



They Say “Magnitude is the Most Important”
[ ] Policy decision-makers should never rely on magnitude alone, but rather focus on the
likelihood and proximity of a situation – magnitude alone oversimplifies complex situations
Vertzberger, 1995, Professor at the Department of International Relations, the Hebrew University of Jerusalem, Israel
(Yaacov Y. I. Vertzberger, June 1995, “Rethinking and Reconceptualizing Risk in Foreign Policy Decision-Making: A
Sociocognitive Approach”, Political Psychology, Vol. 16, No. 2, pp. 347-380, http://www.jstor.org/stable/3791835)

Risk, and social risk in particular, is a complex phenomenon with many attributes. This has distinctive effects on the
way in which decision-makers compare risks, the biases that affect risk perceptions, and decision-makers' risk
preferences. To explain these, we introduce in this section the concepts of the texture of risk and the taste for risk.
Comparing risks is an essential task in the process of choice between risky options. The most obvious approach to the
task is to compare the magnitudes of risk represented by the competing options. But along with the problem of
producing a unified single measurement of risk in real-life political-strategic contexts, which are mostly ill-defined and
contain hard-to-quantify variables, this approach poses another difficulty. Although people may make a choice based on
magnitude alone, especially when the options are clearly differentiated into high-and low-risk ones, this usually is not
the case when the risk variance between options is moderate or low. Decision-makers rely therefore on a more complex
and differentiated comparison based on the texture of risk, which is a set of risk-defining attributes. What, then, are the
attributes that define the texture of risk? 1. Risk transparency. How ambiguous or well understood are the risky
consequences of a decision? Debates among experts and policy advisers are likely to increase doubts among decision-
makers regarding whether the risks of a particular policy are really understood. This is apt to diminish perceived validity
of value and probability assessments and increase risk aversiveness. 2. Risk severity. How serious and damaging are the
perceived consequences of a decision or situation? 3. Risk certainty. How certain is it that any particular adverse
outcome will materialize? If risks are unanticipated, the level of perceived risk will be much higher because of surprise
and the lack of available resources required to cope with risks as they emerge. This implies that similar situations will
represent dissimilar levels of risk depending on prior anticipation and preparation. 4. Risk horizon. How close in time
are the adverse consequences? The closer in time they are, the more vivid and salient they will seem and the more
weight they will be given. Distant negative consequences are underweighed and perceived as less likely to occur, and
therefore have only a minor impact on decisions (Bjorkman, 1984; Milburn & Billings, 1976). 5. Risk complexity.
Complexity can be assessed using four criteria: (a) The measurability of risk. The less quantitatively measurable risk
dimensions are more elusive and difficult to assess. (b) Variability of issue dimensions, namely, the range of issue-areas
affected by risk dimensions (e.g., economic, military, political). (c) Multiplicity of time dimensions, that is, whether or
not all risky effects are expected to occur within the same time frame (e.g., whether they are all short- or long-term
consequences, or both). (d) Interactivity of dimensions. The more interactive the various risk dimensions, that is, the
more change in one risk dimension affects the level of risk in other dimensions, the more complex becomes the risk
calculus. The more complex the risk calculus, the more likely is the decision-maker to ignore most or many of the
dimensions and concentrate on one or a few of the most immediate and salient. Moreover, the complexity of the choice
reduces the decision-maker's comprehension of the task and becomes a source of anxiety because of the increased
probability of making the wrong decision. 6. Risk reversibility. Are risky decisions reversible once they are made, and
at what cost? 7. Risk controllability and containability. Even when risky decisions are irreversible, it makes a difference
if the risks generated by the decision are controllable and containable or not. For example, irreversible risky decisions
that would result in controllable risk are more acceptable than similar decisions that result in uncontainable risks. 8.
Risk accountability. Will decision-makers be held responsible by the public for adverse consequences? If so, what is the
magnitude of personal-political cost that they will have to bear? "As a general rule, the more directly accountable a
decision-maker is to the public, the more likely it is that public perceptions will receive consideration in priority setting"
(Kasper, 1980, p. 79). Therefore, choice preferences are often shaped by the criterion of which policy options are easier
to explain and justify (Ranyard, 1976; Tversky, 1972), although this is not the only driving motive of choice. Although
for analytic purposes, these risk attributes are presented as mutually exclusive, in reality such clear-cut distinctions are
rare, and the relations among attributes are interactive. For example, complexity could affect transparency; severity
could affect accountability. Any combination of these attributes will affect risk preference either directly or indirectly
through their impact on any single or all three constituent components of risk, namely: value, probability, and validity.
As a rule, the more transparent, severe, proximate, certain, irreversible, and uncontrollable risks are, the less complex
the related risk calculus is. Also the more accountable decision-makers are, the more likely are the risks to be
recognized in time, considered as relevant, and given higher importance in the shaping of decisions.
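Vertzberger's eight texture-of-risk attributes can be sketched as a simple checklist. The encoding below is an assumption made for illustration (Vertzberger treats the attributes qualitatively and stresses that they interact); the 0-to-1 scale, the 0.7 threshold, and the example scores are all hypothetical:

```python
from dataclasses import dataclass

# A hypothetical encoding of the eight "texture of risk" attributes.
# Field names follow the card; the numeric scale is an illustrative assumption.

@dataclass
class RiskTexture:
    transparency: float     # how well understood the risky consequences are
    severity: float         # how serious and damaging the consequences are
    certainty: float        # how certain an adverse outcome is to materialize
    horizon: float          # proximity in time (1.0 = imminent)
    complexity: float       # measurability, issue/time dimensions, interactivity
    reversibility: float    # whether the decision can be undone, and at what cost
    controllability: float  # whether generated risks can be contained
    accountability: float   # public responsibility for adverse consequences

    def salient_attributes(self, threshold=0.7):
        """Attributes a decision-maker is likely to fixate on (assumed rule,
        echoing the card's point that complexity drives focus to a salient few)."""
        return [name for name, value in vars(self).items() if value >= threshold]

# Hypothetical scoring of a military-intervention decision.
intervention = RiskTexture(0.4, 0.9, 0.3, 0.8, 0.9, 0.2, 0.5, 0.9)
print(intervention.salient_attributes())
# ['severity', 'horizon', 'complexity', 'accountability']
```

The point of the sketch is structural, not numerical: two options with the same overall "magnitude" can present different attribute profiles, which is why magnitude-only comparison oversimplifies.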



They Say “Presumption”
[ ] Risk calculus cannot skew toward the status quo – similar risks are inherent in
inaction
Vertzberger, 1995, Professor at the Department of International Relations, the Hebrew University of Jerusalem, Israel
(Yaacov Y. I. Vertzberger, June 1995, “Rethinking and Reconceptualizing Risk in Foreign Policy Decision-Making: A
Sociocognitive Approach”, Political Psychology, Vol. 16, No. 2, pp. 347-380, http://www.jstor.org/stable/3791835)

incorrect identification of risk-taking with active policy choices. A passive,
"stick in the mud" (Mandel, 1987) policy may also entail risk-taking by attempting to preserve the status quo and
ignoring environmental signals that indicate a need for initiative and change. In other words, there are no risk-free
choices, and that includes the decision not to decide. Even though decision-makers often equate passivity with risk
avoidance, avoiding active decisions may in some cases actually entail more risk than making an active choice. It
follows that in the case of intervention, risk-taking is not necessarily congruent with decisions that increase the chance
of violence by armed intervention. A broader view of risk should take into account that risk avoidance in the short run
(such as refusing to deploy troops) may sometimes turn out in the long run to be a very risky decision. Consider, for
example, the options the United States faced with respect to military intervention in the Gulf. Intervention, of course,
involved the risk of military hostilities with Iraq. But nonintervention meant the risk of Iraq's controlling the lion's
share of the world's oil supply, or of a future confrontation with a nuclear Iraq. In such cases the choice is not between
risk-taking and risk aversion, but between different types of risk.



They Say “Probability is Impossible to Calculate”
[ ] Risk assessment and probability is essential when making policy-relevant analysis,
even if it cannot be precisely defined
Vertzberger, 1995, professor at the Department of International Relations, the Hebrew University of Jerusalem, Israel
(Yaacov Y. I. Vertzberger, June 1995, “Rethinking and Reconceptualizing Risk in Foreign Policy Decision-Making: A
Sociocognitive Approach”, Political Psychology, Vol. 16, No. 2, pp. 347-380, http://www.jstor.org/stable/3791835)

Decisions involving risk in politics are different from similar decisions in business, an area that has generated
extensive research on risk-taking. Decision-makers, and foreign policy decision-makers in particular,
are not accustomed to defining precisely and systematically their level of acceptable risk prior to making a
decision. This is not the case, for example, in stock market gambles, where investors tend to clearly define their acceptable risk levels. The main
reason for this difference is that in business there exist shared consensual norms of what are reasonable business practices. In politics, on the other
hand, a consensual normative framework that distinguishes the gambler from the astute responsible statesman is
yet to emerge, let alone become an integral part of the political decision-making culture. This is manifested in the
ongoing debate on such questions as, What constitutes good judgment in political issues? Are there criteria that can clearly distinguish good decisions
from poor decisions? What differentiates reasonable risk-taking from adventurous risk-taking? In a word, our current state of
knowledge of risk in international politics is incomplete and is lacking in systematic conceptualization. But what
exactly do we mean by "risky decisions"? How can we define risk scientifically as an analytic construct, rather than as a gut feeling of danger or
discomfort? Is risk a cognitive, affective, or behavioral construct? Is it a uniform construct? And are risks in all areas of human life similar in nature?
It is surprising that while the construct of risk and its behavioral implications have been singled out for extensive
study in most areas of current social science research, relating to a broad range of human activities such as
medicine, economics, industry, technology, environmental studies, and others, it has been practically ignored in
that domain of human affairs where risk is perennial and has a most critical relevance – international politics and,
specifically, international security issues. Many aspects of risk that are of interest to students and practitioners
of foreign policy fall outside the purview of the theoretical knowledge of risk accumulated in other disciplines.
The few studies that emphasize risk in the field of international relations (e.g., Adomeit, 1973, 1982; Bueno de
Mesquita, 1981; Lamborn, 1991; Huth et al., 1992) adopt and implement the classic technical definition of risk
that views risk as a product of the probability and consequences of a potentially adverse event. This definition
draws its inspiration from the gambling metaphor, is probability oriented, and is inadequate for capturing the
essence of risk in international politics.

[ ] Even without exact probability thresholds, we can make risk assessments – objectivity
can never replace the need for subjective decision making
William Ruckelshaus Former Director of the EPA, 1983 [Science, Risk, and Public Policy; Source: Science, New
Series, Vol. 221, No. 4615 pp. 1026-1028; Published by: Sep. 9,; jstor]

In the future, this being an imperfect world, the rigor and thoroughness of our risk analyses will undoubtedly be
affected by many factors, including the toxicity of the substances examined, the populations exposed, the
pressure of the regulatory timetable, and the resources available. Despite these often conflicting pressures, risk
assessment at EPA must be based only on scientific evidence and scientific consensus. Nothing will erode public
confidence faster than the suspicion that policy considerations have been allowed to influence the assessment of
risk. Risk Management Although there is an objective way to assess risk, there is, of course, no purely objective way
to manage it, nor can we ignore the subjective perception of risk in the ultimate management of a particular
substance. To do so would be to place too much credence in our objective data and ignore the possibility that
occasionally one's intuition is right. No amount of data is a substitute for judgment. Further, we must search for
ways to describe risk in terms that the average citizen can comprehend.



They Say “Probability is Impossible to Calculate”
[ ] Statistical quantification of risk is not possible but underlying conceptual principles
can act as guidelines for decision
Vertzberger, 1995, Professor at the Department of International Relations, the Hebrew University of Jerusalem, Israel
(Yaacov Y. I. Vertzberger, June 1995, “Rethinking and Reconceptualizing Risk in Foreign Policy Decision-Making: A
Sociocognitive Approach”, Political Psychology, Vol. 16, No. 2, pp. 347-380, http://www.jstor.org/stable/3791835)

These multiple attributes, which characterize social and political risk-taking, make it extremely difficult to compare
risks because these attributes are not easily translated into common measures. Attempts to quantify political and social
risks are therefore subject to suspicion. "This is not to say that quantitative risk assessments are not useful or
illuminating, only that it pays to be somewhat skeptical of the quantitative results of risk assessments and to recognize
that the appearance of great accuracy that precise numbers in such analyses carry with them is spurious" (Kasper, 1980,
p. 74). It is for this reason that risk assessment in politico-military decisions can rarely make effective use of statistical
decision theory in its formal mathematical guise. The problems involved are hard to model and do not have a
sufficiently well-defined and quantifiable structure to be adequately represented by such abstract mathematical models.
Still, the conceptual principles that underlie statistical decision theory are valid and when interpreted qualitatively may
act as guidelines for risk assessment even when quantification is not possible (Dror, 1986; Lowrance, 1980; Strauch,
1971; 1980). This set of risk attributes can be found in every risky situation. They provide the coordinates that describe
the situation and allow for comparisons with other risk situations. These attributes also explain differences and
similarities in decision-makers' responses across risky situations, because perceptions of these attributes and the tastes
for risk may be dissimilar across decision-makers so that their mapping of the same situation will result in different
definitions of the situation and risk assessments. The accuracy of decision-makers' perceptions of these risk attributes
determines a number of important issues: (1) their sensitivity to the completeness or incompleteness of their information
about the problem, and their ability to understand the risks involved; (2) the likelihood that they will be open-minded
and imaginative about those aspects of the problem about which hard data are unavailable; (3) their motivation to
analyze and deal with those aspects whose treatment is awkward or unpleasant. The key, then, to the accuracy and
comprehensiveness of risk assessment processes, especially for ill-defined and hard-to-model problems, can be found in
three determinants: information, imagination, and motivation. Information without imagination and motivation is not
likely to yield an astute risk assessment. Imagination unconstrained by a broad and valid information base may yield
paranoia, self-serving assessment, and hallucination. Motivation without information will produce wishful thinking – or
its counterpart, a worst-case analysis of the situation. (A detailed discussion of the quality of risk judgments is beyond
the scope of this study. The interested reader can consult the following literature: Steinberger, 1993; Sternberg, 1990;
and Tetlock, 1992.) Closely related to the concept of risk texture is the concept of the taste for risk. We argue that
decision-makers, like consumers in economics, have individual tastes for risk. These tastes reflect a preference for
particular risks over others even if the magnitudes of the compared risks are similar. These tastes express a preference
for a particular texture of risk, one that consists of a particular combination of or emphasis on the risk attributes that
were specified above. Risks that are of the same magnitude (as measured, for example, in dollar terms), but represent
different combinations of risk attributes, are not perceived by decision-makers as similar risks. The taste for risk, by
emphasizing that each risky problem has a unique texture, explains the cross-decisional diversity of risk preferences.



They Say “Even Improbable Impacts sometimes Occur”
[ ] Even if highly improbable events do occur, they cannot be prioritized in risk calculus,
because they distort the process
David Berube, Associate Professor of Speech Communication at the University of South Carolina, 2000 [Director of
Debate “Debunking Mini-max Reasoning: The Limits Of Extended Causal Chains In Contest Debating,”
Contemporary Argumentation and Debate, Volume 21, Available Online at
http://www.cedadebate.org/CAD/2000_berube.pdf, Accessed 04-05-2008, p. 64-69]

Tooley posed an even more intriguing question: "Does our world, then, simply contain an enormous number of highly
unlikely accidents?" (105). The proponent of a mini-max disadvantage would want you to believe that such is true.
Recently, we have learned highly unlikely accidents in a chaotic system are ordered (Butz). If so, the extended mini-max
argument might be one such ordering. Unfortunately for proponents of extended mini-max arguments, once any system is
dominated by highly unlikely accidents, the logic of the extended argument corrodes. Predicting unpredictability is
paradoxical. We expect the critic in a contest debate to assess the strength of an extended argument and resolve its disposition. However,
when the consequence is nearly infinite, it makes such a probability calculation thorny. Debaters seldom provide
critics with a discussion of multiplicational versus correlational probability assessments, and often substitute simple
corroboration for probability assessments.
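Berube's contrast between multiplicational probability assessment and simple corroboration can be shown with a short calculation. If every link in an extended disadvantage must hold for the impact to occur, the joint probability is the product of the link probabilities; the figures below are hypothetical:

```python
# Hypothetical link probabilities for a five-step disadvantage chain.
# Each individual link is more likely than not, or close to it.
links = [0.8, 0.7, 0.6, 0.5, 0.4]

# Multiplicational assessment: every link must hold, so probabilities multiply.
joint = 1.0
for p in links:
    joint *= p

# Even a chain of individually plausible links is jointly improbable.
print(round(joint, 4))  # 0.0672
```

Corroborating each link in isolation ("every step has a card") obscures this compounding, which is why mini-max chains with near-infinite impacts and uncalculated joint probabilities distort the risk calculus.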



They Say “Focusing on Risk Calculus Distracts Education”
[ ] Infinite risk arguments undermine the educational value of debate because they make
it impossible for the real world to take policy debate seriously
David Berube, Associate Professor of Speech Communication at the University of South Carolina, 2000 [Director of
Debate “Debunking Mini-max Reasoning: The Limits Of Extended Causal Chains In Contest Debating,”
Contemporary Argumentation and Debate, Volume 21, Available Online at
http://www.cedadebate.org/CAD/2000_berube.pdf, Accessed 04-05-2008, p. 64-69]

Outsiders to contest debating have remarked simply that too many policy debate arguments end in all-out nuclear war:
consequently, they categorize the activity as foolish. How many times have educators had contest debaters in a
classroom discussion who strung out an extended mini-max argument to the jeers and guffaws of their classmates? They
cannot all be wrong. Frighteningly enough, most of us agree. We should not ignore Charles Richet's adage: "The stupid
man is not the one who does not understand something - but the man who understands it well enough yet acts as if he
didn't" (Tabori 6). Regrettably, mini-max arguments are not the exclusive domain of contest debating. "Policies driven
by the consideration of low risk probabilities will, on the whole, lead to low investment strategies to prevent a hazard
from being realized or to mitigate the hazard's consequences. By comparison, policies driven by the consideration of
high consequences, despite low probabilities, will lead to high levels of public investment" (Nehnevajsa 521).
Regardless of their persuasiveness, Bashor and others have discovered that mini-max claims are not useful in resolving
complex issues. For example, in his assessment of low-probability, potentially high-consequence events such as terrorist
use of weapons of mass destruction, Bashor found simple estimates of potential losses added little to contingency
planning. While adding little to policy analysis, extended arguments using mini-max reasoning remain powerful
determinants of resource allocation. As such, they need to be debunked. Experts agree. For example, Slovic advocates a
better understanding of all risk analysis since it drives much of our public policy. "Whoever controls the definition of
risk controls the rational solution to the problem at hand. If risk is defined one way, then one option will rise to the top
as the most cost-effective or the safest or the best. If it is defined another way, perhaps incorporating qualitative
characteristics or other contextual factors, one will likely get a different ordering of action solutions. Defining risk is
thus an exercise in power" (699). When probability assessments are eliminated from risk calculi, as is the case in
mini-max risk arguments, it is a political act, and all political acts need to be scrutinized with a critical lens.

[ ] Debating risk calculation is key to making the policy debate process educational
Dale Herbeck, Professor of Communication at Boston College, 1992 [Director of the Fulton Debating Society at Boston
College, “The Use and Abuse of Risk Analysis in Policy Debate,” Paper Presented at the 78th Annual Meeting of the
Speech Communication Association (Chicago, IL), October 29th-November 1st, Available Online via ERIC Number
ED354559, p. 10-12]

While the four suggestions offered in this paper may seem simplistic, we believe that they could dramatically improve
the use of risk analysis in policy debate. Rehabilitating risk analysis, to our way of thinking, is decidedly superior to
alternative methods for comparing policies. Admittedly, none of these suggestions is a perfect solution. Advocates will still use risk
analysis to their advantage, and judges will still be faced with the difficult task of assigning probabilities and assessing impacts. Still, we hope that the
alternatives can serve as useful guides for judges in policy debates. It is sometimes argued that debate is a laboratory for testing
argumentation. Critics of the laboratory metaphor have argued that we have failed as scientists, for we have produced
little of consequence in our lab. Perhaps our experience with risk analysis in debate can inform our understanding of the crisis rhetoric which
we confront on an almost daily basis. The best check on such preposterous claims, it seems to us, is an appreciation of the nature of risk analysis and
how it functions in argumentation. If we understand this tool, we will be well-armed in our battle with the bogeymen of our age.21



They Say “Focusing on Risk Calculus Distracts Education”
[ ] Risk management must account for ethical issues to maintain public support
Hansson, Professor in Philosophy, Royal Institute of Technology, 2007 [Hélène Hermansson and Sven Ove Hansson, Risk
Management, Basingstoke, Jul 2007, Vol. 9, Iss. 3, pg. 129, 16 pgs, JSTOR]

Ethical issues are prominent in many risk-related social conflicts. In such conflicts, expert assessments based on
quantitative risk and risk-benefit analysis have often failed to convince the public. This has often been depicted as
failures in the communication between experts and laypersons. Alternatively, it can be described as resulting from
failures of an established analytical framework that excludes legitimate normative issues to which members of the
public attach great importance (Hansson, 2005a). In order to deal with these issues, traditional quantitative analysis
of risk needs to be supplemented with a systematic characterization of the ethical aspects of risk, including issues
such as voluntariness, consent, intent and justice. A major reason why this is seldom done is the lack of adequate
tools for ethical risk analysis. It is the purpose of the present contribution to introduce a model for ethical risk
analysis. In the following section, we present the outline of a three-party model of the ethical aspects of risk, and
identify seven crucial questions that can be used to characterize the ethical aspects of a risk management problem. In
the next section, we comment on each of these seven questions and on its relevance in risk management. In the
penultimate section, we present three examples that show how our model can be used to clarify the nature of a risk
management problem. Some general conclusions are drawn in the last section.



No Nuclear War impacts
[ ] No risk of global nuclear war – China and Russia won’t launch full scale attacks due
to geography and defense spending
Charles Peña, a senior fellow with the Coalition for a Realistic Foreign Policy, 2006 [February 16, “More Defense
Spending, Less Security,” http://www.antiwar.com/pena/?articleid=8546]

Unsound fiscal practices aside, the real question is whether such large defense spending is necessary for U.S. security
(to provide some perspective, current U.S. defense spending is more in real terms than during any of the Reagan years
and surpassed only by spending at the end of World War Two in 1945 and 1946 and during the Korean War in 1952).
The answer is "no." The United States is in a unique geostrategic position. We have friendly neighbors to the
north and south, and vast moats to the east and west. Given that no other country in the world has significant global
power projection capability, America is relatively safe from a military invasion. And the vast U.S. strategic nuclear
arsenal is a powerful deterrent against any country with nuclear weapons – even against so-called rogue states if
they eventually acquire long-range ballistic missiles. With the demise of the Soviet Union, the United States no
longer faces a serious military challenger or global hegemonic threat. And the U.S. military is, by far and away, the
most dominant military force on the planet. Russia comes closest to having the capability to be a military threat to the
United States, but instead of being a threat to Europe, it now has observer status with NATO (North Atlantic Treaty
Organization) and, despite having more main battle tanks than the U.S. Army, is no longer considered a threat to sweep
through the Fulda Gap to occupy Western Europe. Even if Russia were to change course and adopt a more hostile
position, it is not in a position to challenge the United States – either economically or militarily. In 2003, Russia’s
gross domestic product (GDP) was a little more than a tenth of U.S. GDP ($1.3 trillion vs. $10.9 trillion). And although
a larger share of Russia’s GDP was for defense expenditures (4.9 percent vs. 3.7 percent), in absolute terms the United
States outspent Russia by more than 6-to-1. So Russia would have to devote more than 20 percent of its GDP to defense
– which would exceed what the Soviet Union spent during the height of the Cold War during the 1980s – to equal the
United States. Certainly, Chinese military developments bear watching, and although many see China as the next great
threat, even if China modernizes and expands its strategic nuclear force (as many military experts predict it will),
the United States will retain a credible nuclear deterrent with an overwhelming advantage in warheads,
launchers, and variety of delivery vehicles. Moreover, China does not possess the sea- or airlift to be able to project
its military power and threaten the U.S. homeland. And like Russia, China may not have the wherewithal to compete
with and challenge the United States. In 2003, U.S. GDP was almost eight times more than China's ($10.9 trillion vs.
$1.4 trillion). China spent fractionally more of its GDP on defense than the United States (3.9 percent vs. 3.7 percent),
but in absolute terms the U.S. defense expenditures were seven times that of China's ($404.9 billion vs. $55.9 billion).
So China would have to devote one-quarter of its GDP to defense to equal the United States. If the Russian and
Chinese militaries are not serious threats to the United States, so-called rogue states – such as North Korea, Iran,
Syria, and Cuba – are even less of a threat. Though these countries are unfriendly to the United States, none have any
real military capability to threaten or challenge vital American security interests or the U.S. homeland. In economic
terms, the GDP of these four countries was $590.3 billion in 2003 compared to a U.S. GDP of $10.9 trillion, or less than
5.5 percent of U.S. GDP.
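The card's spending comparisons reduce to simple arithmetic. The sketch below recomputes them from the 2003 GDP and defense-share figures quoted in the card; note that recomputing from the rounded shares gives slightly different dollar totals than the card's quoted figures ($403bn vs. $404.9bn for the U.S.).

```python
# Check the card's 2003 defense-spending arithmetic (all inputs from the card).
us_gdp, ru_gdp, cn_gdp = 10.9e12, 1.3e12, 1.4e12    # 2003 GDP, USD
us_share, ru_share, cn_share = 0.037, 0.049, 0.039  # defense share of GDP

us_def = us_gdp * us_share   # ~$403bn
ru_def = ru_gdp * ru_share   # ~$64bn
cn_def = cn_gdp * cn_share   # ~$55bn

# Share of GDP Russia or China would need to devote to match U.S. outlays.
ru_needed = us_def / ru_gdp
cn_needed = us_def / cn_gdp
print(f"US outspends Russia {us_def / ru_def:.1f}-to-1, China {us_def / cn_def:.1f}-to-1")
print(f"Russia would need {ru_needed:.0%} of GDP; China {cn_needed:.0%}")
```

The recomputed shares (roughly 31 per cent for Russia, 29 per cent for China) are consistent with the card's hedged claims of "more than 20 percent" and "one-quarter."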

No Environmental Collapse
[ ] Environmental collapse won’t cause war – deterrence and interdependence
Daniel Deudney, Asst Prof of Poli Sci at Johns Hopkins, 1999 [Contested Grounds: Security and Conflict in the New
Environmental Politics, available at NetLibrary.com, p. 213]

In summary, the case for thinking environmental degradation will cause interstate violence is much weaker than
commonly thought. In part this is because of features of the international system unrelated to environmental issues.
Although many conflict scenarios draw analogies from historical experience, they fail to take into account the
important ways in which the contemporary interstate system differs from earlier ones. Military capability
sufficient to make aggression prohibitively costly has become widely distributed, making even large shifts in the
relative power potential of states less likely to cause war. Interstate violence seems to be poorly matched as a means to
resolve many of the conflicts that might arise from environmental degradation. The vitality of the international trading
system and the more general phenomenon of complex interdependence also militate against violent interstate outcomes.
The result is a world system with considerable resiliency and enough "rattle room" to weather significant
environmental disruption without large-scale violent interstate conflict. Conclusions The degradation of the natural
environment upon which human well-being depends is a problem with far-reaching significance for all human societies.
But this problem has little to do with the national-security-from-violence problem that continues to afflict politics. Not
only is there little in common between the causes and solutions to these two problems, but the nationalist and
militarist mindsets closely associated with "national security" thinking directly conflict with the core of the
emerging environmentalist worldview. Harnessing these sentiments for a "war on pollution" is a dangerous and
probably self-defeating enterprise. And fortunately, the prospects for resource and pollution wars are not as great as
often conjured by environmentalists.

No Biodiversity Extinction
[ ] Species loss won’t cause extinction – it is impossible to identify keystone species
Sharman, 2003 (Martin, European Commission Research DG DI-4 Biodiversity and Ecosystems, “Ecological value
of biodiversity,” http://www.nbu.ac.uk/biota/Archive_Biotic/3806.htm)

Can we use the ecological value of biodiversity as an argument to encourage the development of policies that
seek to protect biodiversity? Probably not, because the evidence before our eyes is that progressively
simplified systems persist - until they abruptly collapse. John Hutcheson believes that "there is a scientifically
demonstrated ecological value of biodiversity". I would be flying in the face of reason (and evidence) to deny that
life creates and maintains conditions suitable for life or that life buffers physical parameters. This is not my
argument. I suspect that our disagreement may stem from his perception of biodiversity as 'life' and my perception
of biological diversity as 'a characteristic of life'. But I feel that there is another point here, to do with governance
and what reasons we give in support of our belief that we must develop human survival systems that conserve
biodiversity. What concerns me is that we can take an ecosystem (for example, a forest) and progressively
simplify it by removing species, or reducing biodiversity in some other way, perhaps by reducing the
populations of the most common species, or removing certain phenotypes. For a long time, the buffering that
John talks about means that the forest will maintain some kind of ecological integrity as this reduction in
biodiversity continues. We probably all know at least one forest - or other ecosystem, the ocean being the most
glaringly obvious one - that is going through this slow transformation. The loss of each single species may bring
about a small cascade of other losses, but can we say that the diversity itself (as distinct from any of its components)
has scientifically demonstrated ecological value? At some point, perhaps, we will remove one species too many, and
an ecological catastrophe will follow and the forest (or fertile ocean) will be transformed into something else. At that
cusp, nobody could possibly deny that biodiversity has ecological value. Unfortunately we will almost always
discover the phase space location of that cusp too late. And this is my worry about such a functional reason for
defending and cherishing biodiversity - we can go on pulling away the pillars from under the pier for a long time,
without anything terribly dramatic happening. Anyone who tries to warn against this kind of behaviour is discounted
as a doomsayer. But when the pier does abruptly collapse, what can we do except watch? I feel that nobody should
end their contribution to this conference without stating what strategically important research should be done, in
support of good governance, to investigate the concerns that they raise. In that belief, here is my (partial) list of
"science for good governance" for this issue: Understand how to define and assess ecosystem quality. Develop ways
of managing endangered or threatened ecosystems, marginal or relict habitats, and those with low resilience.
Understand how to evaluate the minimum area that an ecosystem must cover if it is to persist under probable
scenarios of climatic and anthropogenic change. Understand the ecology of the deep ocean and the benthos and its
response to drivers of biodiversity change. Develop effective low-cost methods to rehabilitate threatened species and
restore degraded ecosystems. Last, but by no means least, better understand how to conserve biodiversity while
ensuring sustainable livelihoods.
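Sharman's pier metaphor describes a threshold dynamic: redundancy buffers ecosystem function as components are removed, so the cusp is invisible until it is crossed. A toy model (every number in it is invented for illustration, not drawn from ecological data) makes the point:

```python
# Toy "pier" model of Sharman's argument (all parameters illustrative):
# redundancy buffers ecosystem function as species are removed, until the
# count of remaining "pillars" drops below a critical threshold.
def ecosystem_function(species_remaining: int, threshold: int = 20) -> float:
    """Return relative ecosystem function in [0, 1]."""
    if species_remaining < threshold:
        return 0.0  # abrupt collapse past the cusp
    # Above the threshold, redundancy buffers the loss almost completely.
    return 1.0 - 0.001 * (100 - species_remaining)

history = [(n, ecosystem_function(n)) for n in range(100, 0, -5)]
for n, f in history:
    print(f"{n:3d} species -> function {f:.2f}")
# An observer watching the gentle decline from 1.00 to ~0.92 gets no
# warning that removing a few more species drops function to 0.00.
```

This is exactly why, as the card argues, "we will almost always discover the phase space location of that cusp too late": nothing in the observable trend identifies the threshold beforehand.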

No Ozone Depletion Extinction
[ ] Massive impacts from ozone depletion are denied: the ozone hole varies year to year,
which means there are no short-term impacts, AND the ozone layer will begin to heal in the
long term.
The Australian 2007 [Leigh Dayton, science writer, October 5, “Ozone breach at a five-year low,” Lexis]

THE ozone hole over Antarctica has shrunk by 30 percent, and is now the smallest it has been since 2002.
According to satellite measurements released yesterday by the European Space Agency, the hole covers 24.7 million
square kilometres, roughly three times the size of Australia and twice the size of Antarctica itself. Last year, the hole
covered 28 million square kilometres. ESA scientists estimated that this year 27.7 million tonnes of ozone were lost,
compared with last year's record loss of 40 million tonnes. According to atmospheric scientist Paul Fraser of the
CSIRO's Marine and Atmospheric Research unit in Melbourne, the 30 per cent drop is not surprising. ''The size
of the ozone hole varies from year to year, largely depending on the temperature of the upper atmosphere, the
stratosphere, which is controlled by stratospheric dynamics,'' he said. ''The very small hole in 2002 broke up very
early. Since then, we've had big ones.'' The hole -- first identified in 1985 -- is a funnel-shaped region of ozone-
depleted air in the stratosphere that occurs in winter, 15-16km above the South Pole. The ozone layer girdles the
planet, shielding it from solar radiation. Thinning of the protective layer is caused by the presence of gases such as
chlorine and bromine, which originate in man-made products such as chlorofluorocarbons (CFCs). When the
stratosphere is colder than about -78C, stratospheric clouds form. They serve as ''platforms'' for ozone-depleting
chemical reactions caused by CFCs and other compounds. Typically, the ozone hole persists until November or
December, when winds surrounding the South Pole weaken, allowing in warmer ozone-rich air. Such ozone-
depleting compounds were outlawed in 1987, and Dr Fraser said they had been dropping by about 1 per cent a year
since then. ''The long-term prediction is that by the middle of this century we'll have healing (of the layer),'' Dr
Fraser said. ''But it's hard to tease out the long-term trend, driven by reduction in compounds, from the year-to-year
variations.''
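A quick check reconciles the card's two sets of figures: the "30 per cent" drop tracks the tonnage of ozone lost (27.7 vs. 40 million tonnes), while the hole's surface area shrank by only about 12 per cent. All numbers below are the ones quoted in the card.

```python
# Reconcile the ESA figures quoted in the card: the ~30 per cent shrink
# tracks ozone mass lost, not the hole's surface area.
area_2007, area_2006 = 24.7, 28.0  # million square kilometres
mass_2007, mass_2006 = 27.7, 40.0  # million tonnes of ozone lost

area_drop = 1 - area_2007 / area_2006
mass_drop = 1 - mass_2007 / mass_2006
print(f"area shrank {area_drop:.0%}; ozone loss fell {mass_drop:.0%}")
```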

[ ] Ozone depletion won’t cause extinction – predictions are media hype.
Ben Lieberman, senior policy analyst at the Heritage Foundation, 2007 [“Ozone: The hole truth,” The Washington
Times, Section A17, Sept 19, Lexis]

The international treaty to protect the ozone layer turns 20 this year. But is there really much reason to celebrate?
Environmentalists have made many apocalyptic predictions over the last several decades. Virtually none has
come to pass. Yet each time, the greens and their political allies proclaim victory, arguing their preventive
prescriptions averted disaster. Such is the case with the 1987 Montreal Protocol On Substances That Deplete The
Ozone Layer (Montreal Protocol). The lurid predictions of ozone depletion-induced skin cancer epidemics,
ecosystem destruction and others haven't come true, for which Montreal Protocol proponents congratulate
themselves. But in retrospect, the evidence shows ozone depletion was an exaggerated threat in the first place.
As the treaty parties return to Montreal for their 20th anniversary meeting it should be cause for reflection, not
celebration, especially for those who hope to repeat this "success story" in the context of global warming. The treaty
came about over legitimate but overstated concerns that chlorofluorocarbons (CFCs, a then-widely used class of
refrigerants) and other compounds were rising to the stratosphere and destroying ozone molecules. These molecules,
collectively known as the ozone layer, shield the Earth from excessive ultraviolet-B radiation (UVB) from the sun.
The Montreal Protocol's provisions were tightened in 1990 and again in 1992, culminating with a CFC ban in most
developed nations by 1996. So what do we know now? As far as ozone depletion is concerned, the thinning of the
ozone layer that occurred throughout the 1980s apparently stopped in the early 1990s, too soon to credit the
Montreal Protocol. A 1998 World Meteorological Organization (WMO) report said: "Since 1991, the linear
[downward] trend observed during the 1980s has not continued, but rather total column ozone has been almost
constant." However, the same report noted that the stratospheric concentrations of the offending compounds were
still increasing through 1998.

No HIV/AIDS Extinction
[ ] HIV/AIDS will not lead to extinction – current models are based on false assumptions,
and proper statistical analysis disproves them.
Alexander Preker, Human Development Network, 2004 [“Addressing HIV/AIDS in East Asia and the Pacific,”
http://siteresources.worldbank.org/INTEAPREGTOPHIVAIDS/PublicationsandReports/20282986/Regional_Paper.pdf
p. 71, Human Development Network, Health, Nutrition, and Population Series, a World Bank study]

All socioeconomic models of the impact of HIV/AIDS depend on epidemiological predictions on the future
course of the epidemic. Many economic models rely on simplistic assumptions that other parts of the world will
follow the trajectory of Sub-Saharan Africa. This analogy reflects a profound misunderstanding of the
epidemiology of HIV. To understand this one must develop a theoretical model of the growth of infectious diseases.
Infectious diseases do not grow indefinitely—there must be some natural limit. In fact, when one views an
epidemic over time, it has an initial phase of growth and then it tapers off and reaches a steady state before it
eventually disappears. The classic case of this type of infection is influenza or SARS. HIV/AIDS is similar, but the
process is much slower because of the long interval between infection with HIV and the development of AIDS. In
addition, there is a long period of infectiousness, often years, which makes it very different from classical
infectious diseases such as influenza. The figure on page 72 shows the course of an HIV/AIDS infection in a
population.18 The key to understanding the dynamics of an infectious disease is the reproductive rate (R0) common
to all organisms. As can be seen, in the initial period of R0 > 1, the prevalence increases exponentially. However, as
the number of people susceptible to the disease are “used up” by the epidemic, the reproductive rate begins to fall. If
no new susceptible groups enter the population, then the infection will fade away. Given the long duration of
HIV/AIDS, there is continuing growth in new susceptible populations, making extinction unlikely.
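The dynamic Preker describes – exponential growth while the effective reproductive rate exceeds 1, then a peak and decline as susceptibles are "used up" – can be sketched with a minimal discrete-time SIR-style model. The parameter values below are purely illustrative, not calibrated to HIV data.

```python
# Minimal SIR-style sketch of the epidemic curve Preker describes:
# prevalence grows exponentially while the effective reproductive rate
# exceeds 1, then peaks and tapers off as susceptibles are depleted.
# Parameters are illustrative only, not calibrated to HIV.
def simulate(beta=0.3, gamma=0.05, s0=0.99, i0=0.01, steps=400):
    s, i = s0, i0
    prevalence = []
    for _ in range(steps):
        new_infections = beta * s * i  # transmission slows as s shrinks
        recoveries = gamma * i         # removal (long duration => small gamma)
        s -= new_infections
        i += new_infections - recoveries
        prevalence.append(i)
    return prevalence

curve = simulate()
print(f"peak prevalence {max(curve):.2f}; final prevalence {curve[-1]:.4f}")
# The curve rises, peaks, and fades -- it does not grow without limit.
```

The qualitative shape, not the specific numbers, is the point: no infectious disease model grows indefinitely, which is why extrapolating Sub-Saharan trajectories to the whole world is unwarranted.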

[ ] HIV/AIDS and risks like it do not pose a risk of extinction – historically, plagues have
not threatened species survival
Bostrom 2002 (Nick, Professor of Philosophy at Oxford, Journal of Evolution and Technology, “Existential risks,”
Vol. 9 March http://www.nickbostrom.com/existential/risks.html)

Risks in this sixth category are a recent phenomenon. This is part of the reason why it is useful to distinguish them
from other risks. We have not evolved mechanisms, either biologically or culturally, for managing such risks. Our
intuitions and coping strategies have been shaped by our long experience with risks such as dangerous animals,
hostile individuals or tribes, poisonous foods, automobile accidents, Chernobyl, Bhopal, volcano eruptions,
earthquakes, droughts, World War I, World War II, epidemics of influenza, smallpox, black plague, and AIDS.
These types of disasters have occurred many times and our cultural attitudes towards risk have been shaped by trial-
and-error in managing such hazards. But tragic as such events are to the people immediately affected, in the big
picture of things – from the perspective of humankind as a whole – even the worst of these catastrophes are mere
ripples on the surface of the great sea of life. They haven’t significantly affected the total amount of human
suffering or happiness or determined the long-term fate of our species. With the exception of a species-
destroying comet or asteroid impact (an extremely rare occurrence), there were probably no significant existential
risks in human history until the mid-twentieth century, and certainly none that it was within our power to do
something about.

No Terror Attack
[ ] Terrorism risks are exaggerated to justify any federal government action
John Mueller, prof of Political Science at Ohio State, 2006 [Foreign Affairs, “Is There Still a Terrorist Threat?”
September/October, http://www.walkeronline.net/Mueller%20%20Is%20There%20Still%20A%20Terrorist%20Threat.pdf
(Accessed at Michigan University on June 24th, 2009)]

Those attacks demonstrated, of course, that al Qaeda or at least 19 of its members still possessed some fight. And
none of this is to deny that more terrorist attacks on the United States are still possible. Nor is it to suggest that al
Qaeda is anything other than a murderous movement. Moreover, after the ill-considered U.S. venture in Iraq is over,
freelance jihadists trained there may seek to continue their operations elsewhere although they are more likely to
focus on places such as Chechnya than on the United States. A unilateral American military attack against Iran could
cause that country to retaliate, probably with very wide support within the Muslim world, by aiding anti-American
insurgencies in Afghanistan and Iraq and inflicting damage on Israel and on American interests worldwide. But
while keeping such potential dangers in mind, it is worth remembering that the total number of people killed since
9/11 by al Qaeda or al Qaeda-like operatives outside of Afghanistan and Iraq is not much higher than the
number who drown in bathtubs in the United States in a single year, and that the lifetime chance of an
American being killed by international terrorism is about one in 80,000, about the same chance of being killed
by a comet or a meteor. Even if there were a 9/11-scale attack every three months for the next five years, the
likelihood that an individual American would number among the dead would be two hundredths of a percent (or one
in 5,000). Although it remains heretical to say so, the evidence so far suggests that fears of the omnipotent terrorist
reminiscent of those inspired by images of the 20-foot-tall Japanese after Pearl Harbor or the 20-foot-tall Communists
at various points in the Cold War (particularly after Sputnik) may have been overblown, the threat presented within
the United States by al Qaeda greatly exaggerated. The massive and expensive homeland security apparatus
erected since 9/11 may be persecuting some, spying on many, inconveniencing most, and taxing all to defend
the United States against an enemy that scarcely exists.
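Mueller's worst-case figure can be reproduced with straightforward arithmetic. The per-attack death toll (~3,000) and the U.S. population (~300 million) are rough assumptions implied by his scenario, not numbers stated in the article.

```python
# Reproduce Mueller's worst-case arithmetic: a 9/11-scale attack
# (~3,000 deaths) every three months for five years.  Per-attack deaths
# and population are the rough figures his scenario implies.
deaths_per_attack = 3_000
attacks = 5 * 4                  # one per quarter for five years
population = 300_000_000         # approximate U.S. population

total_deaths = deaths_per_attack * attacks
risk = total_deaths / population
print(f"{total_deaths:,} deaths -> individual risk {risk:.2%} "
      f"(1 in {population // total_deaths:,})")
```

The result matches the card's "two hundredths of a percent (or one in 5,000)" exactly.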

No Bioterror Attack
[ ] Empirically, bioterrorism has had limited impacts – even the most successful
attacks have killed fewer than a hundred people – media hype drives fears of WMD
Brendan O’Neill, Editor of Spiked, 2004 [Thursday 19 August 2004,
http://www.spiked-online.com/index.php/site/article/2263/ “Weapons of Minimum Destruction”]

The most effective WMD-attack by a non-state group, from a military perspective, was carried out by the Tamil Tigers
of Sri Lanka in 1990. They used chlorine gas against Sri Lankan soldiers guarding a fort, injuring over 60 soldiers but
killing none. The Tamil Tigers’ use of chemicals angered their support base, when some of the chlorine drifted back
into Tamil territory - confirming Rapoport’s view that one problem with using unpredictable and unwieldy chemical
and biological weapons over conventional weapons is that the cost can be as great ‘to the attacker as to the attacked’.
The Tigers have not used WMD since. The most infamous use of WMD by terrorists was in March 1995, when 10
members of Aum Shinryko, the strange Japanese religious cult, released sarin gas on the Tokyo Underground. The
homemade gas was placed in plastic bags wrapped in newspapers. The cult members started the attack by puncturing
the bags with umbrellas. Twelve people were killed; over 1,000 were hospitalised, 40 of whom were seriously injured.
The Tokyo gas attack is seen as the most audacious use of WMD by terrorists to date; it is often namechecked as an
example of what might happen if al-Qaeda types were to use WMD on the London Underground or on the New York
Subway. Yet, as Rapoport points out, while the Aum Shinryko attack certainly had tragic consequences, it also
showed us the limitations of WMD attacks in terms of causing casualties or destruction. He says that even though
Aum Shinryko had ‘extraordinary cover for a long time’ - meaning that the Japanese authorities were nervous about
monitoring the group on the grounds that it was a religious outfit - and despite the fact that it had ‘20 members with
graduate degrees in science, significant laboratories and assets of over a billion dollars’, it still did not succeed in its aim
of taking hundreds or thousands of casualties, of causing mass destruction. For Rapoport this shows that such weapons
are far from easy to use, especially when the groups using them must move around quickly, ‘as all terrorists must do’.
According to Rapoport, the most striking thing about the Aum Shinryko attack is that no one died from inhaling the
sarin gas itself - in every fatal case, the individual had made contact with the liquid. He cites Parachini again, who says
that the individuals killed by Aum Shinryko are the only people to have lost their lives as a result of a WMD attack by a
terrorist group over the past 25 years. (There were also five deaths as a result of anthrax attacks post-9/11, but Parachini
doesn’t include those because the individual responsible and the motivation for those attacks remain unknown.) ‘When
you think that fewer than 15 people have been killed by known terrorist use of chemical and biological weapons,
and contrast that to the thousands who were killed on 9/11 and in conventional bombings in Madrid or Bali or
Istanbul, it’s quite remarkable that we are so obsessed with WMD’, says Rapoport. So why are we so obsessed with
WMD? Why do we continue to fret over weapons which, by all accounts, do not cause as much mass destruction as
conventional weapons, which have only rarely been used by terrorists (and not very successfully at that), and which
we’re not even certain that today’s terrorists, specifically al-Qaeda, have got access to? Rapoport says that’s a good
question - but a difficult one to answer. He thinks the reasons are complex; he argues that it isn’t only government and
media who have ratcheted up fear about WMD, but that ‘economic interests’ have, too - those in business, government
and research institutions who stand to make financial gain from public concern about WMD and from public demands
for more protective measures against such weapons. No doubt there is some truth in that. But the disparity between
the facts about WMD and our fears of WMD also reveals something more about today’s terror-obsession. It
shows up the gap between the reality of terrorism - which over the past three years has largely consisted of scrappy
bomb attacks by small nihilistic groups - and the fear of terrorism as something that might bring down civilisation
as we know it, or, in the words of President Bush, inflict ‘hundreds of thousands of casualties’. It suggests that our
concern about terrorism is not entirely shaped by the real threat posed by terrorism, but by a broader sense of
fear and insecurity at home. That might explain why so much of the terror discussion, particularly in relation to
WMD, is anticipatory and speculative, always conjuring up worst-case scenarios - because it comes from within, from
our own nightmares and imaginations, rather than from without. In this sense, chemical and biological weapons - the
nightmare notion of silent, invisible killer poisons being released into our water systems or on to crowded public
transport - are the perfect metaphor for the West’s own sense of vulnerability. What we could really do with is a
heavy dose of reality.

No Bioterror Attack
[ ] Biological and chemical weapons have historically caused far less damage than
conventional weapons – these are weapons of minimal destruction. Any claim to the contrary
is just fear mongering.
Brendan O’Neill, Editor of Spiked, 2004 [Thursday 19 August 2004,
http://www.spiked-online.com/index.php/site/article/2263/ “Weapons of Minimum Destruction”]

‘Believe it or not, what we refer to as “weapons of mass destruction” are actually not very destructive.’ David C
Rapoport, professor of political science at University of California, Los Angeles and editor of the Journal of Terrorism
and Political Violence, has examined what he calls ‘easily available evidence’ relating to the historic use of chemical
and biological weapons. He found something surprising - such weapons do not cause mass destruction. Indeed,
whether used by states, terror groups or dispersed in industrial accidents, they tend to be far less destructive than
conventional weapons. ‘If we stopped speculating about things that might happen in the future and looked
instead at what has happened in the past, we’d see that our fears about WMD are misplaced’, he says. Yet such
fears remain widespread. Post-9/11, American and British leaders have issued dire warnings about terrorists getting
hold of WMD and causing mass murder and mayhem. President George W Bush has spoken of terrorists who, ‘if they
ever gained weapons of mass destruction’, would ‘kill hundreds of thousands, without hesitation and without mercy’
(1). The British government has spent £28million on stockpiling millions of smallpox vaccines, even though there’s no
evidence that terrorists have got access to smallpox, which was eradicated as a natural disease in the 1970s and now
exists only in two high-security labs in America and Russia (2). In 2002, British nurses became the first in the world to
get training in how to deal with the victims of bioterrorism (3). The UK Home Office’s 22-page pamphlet on how to
survive a terror attack, published last month, included tips on what to do in the event of a ‘chemical, biological or
radiological attack’ (‘Move away from the immediate source of danger’, it usefully advised). Spine-chilling books such
as Plague Wars: A True Story of Biological Warfare, The New Face of Terrorism: Threats From Weapons of Mass
Destruction and The Survival Guide: What to Do in a Biological, Chemical or Nuclear Emergency speculate over what
kind of horrors WMD might wreak. TV docudramas, meanwhile, explore how Britain might cope with a smallpox
assault and what would happen if London were ‘dirty nuked’ (4).

No Iran Impacts
[ ] Iranian prolif won’t cause Middle East war – Iran would be constrained by US power
Barry Posen, Professor of Political Science at MIT, 2006 [will become director of the MIT Security Studies Program
in July, “We Can Live With a Nuclear Iran,” http://web.mit.edu/cis/oped_posen_3_2_06.html MIT Center for
International Studies Op-Eds]

The final concern is that a nuclear Iran would simply feel less constrained from other kinds of adventurism, including
subversion or outright conventional aggression. But the Gulf states can counter Iranian subversion, regardless of
Iran's nuclear status, with domestic reforms and improvements in their police and intelligence operations --
measures these states are, or should be, undertaking in any case. As for aggression, the fear is that Iran could rely on a
diffuse threat of nuclear escalation to deter others from attacking it, even in response to Iranian belligerence. But while
it's possible that Iranian leaders would think this way, it's equally possible that they would be more cautious. Tehran
could not rule out the possibility that others with more and better nuclear weapons would strike Iran first,
should it provoke a crisis or war. Judging from cold war history, if the Iranians so much as appeared to be
readying their nuclear forces for use, the United States might consider a preemptive nuclear strike. Israel might
adopt a similar doctrine in the face of an Iranian nuclear arsenal. These are not developments to be wished for, but
they are risks that a nuclear Iran must take into account. Nor are such calculations all that should counsel caution. Iran's
military is large, but its conventional weapons are obsolete. Today the Iranian military could impose considerable costs
on an American invasion or occupation force within Iran, but only with vast and extraordinarily expensive
improvements could it defeat the American military if it were sent to defend the Gulf states from Iranian
aggression.

[ ] Iranian prolif won’t cause regional arms races – Israel is already nuclear and Egypt
depends on US Aid
Barry Posen, Professor of Political Science at MIT, 2006 [will become director of the MIT Security Studies Program
in July, “We Can Live With a Nuclear Iran,” http://web.mit.edu/cis/oped_posen_3_2_06.html MIT Center for
International Studies Op-Eds]

The intense concern about Iran's nuclear energy program reflects the judgment that, should it turn to the
production of weapons, an Iran with nuclear arms would gravely endanger the United States and the world. An
Iranian nuclear arsenal, policymakers fear, could touch off a regional arms race while emboldening Tehran to undertake
aggressive, even reckless, actions. But these outcomes are not inevitable, nor are they beyond the capacity of the
United States and its allies to defuse. Indeed, while it's seldom a positive thing when a new nuclear power
emerges, there is reason to believe that we could readily manage a nuclear Iran. A Middle Eastern arms race is a
frightening thought, but it is improbable. If Iran acquires nuclear weapons, among its neighbors, only Israel, Egypt,
Saudi Arabia and Turkey could conceivably muster the resources to follow suit. Israel is already a nuclear
power. Iranian weapons might coax the Israelis to go public with their arsenal and to draw up plans for the use
of such weapons in the event of an Iranian military threat. And if Israel disclosed its nuclear status, Egypt might
also find it diplomatically difficult to forswear acquiring nuclear weapons. But Cairo depends on foreign
assistance, which would make Egypt vulnerable to the enormous international pressure it would most likely face
to refrain from joining an arms race.

The Method Lab


Michigan 2009 39
Risk Calculus
No Iran Impacts
[ ] There will not be a civil war in Iran – events just aren't adding up to it.
Amir Taheri, Iranian writer based in Europe, 2009 [Civil war unlikely in Iran, Special to Gulf News,
Published: June 23, 2009, 22:56, http://archive.gulfnews.com/opinion/columns/region/10325444.html]

However, there are more differences between the events of 1979 and those of today than there are similarities.
To start with, the ruling establishment under the Shah remained reasonably united until the very end. Even after the
Shah had left the country, none of the key figures of the regime switched sides. Today, however, the ruling elite is
split down the middle. Almost as many members of the regime have sided with Mousavi as have backed
Ahmadinejad. In 1979, the people looked to the Shiite clergy for leadership. This time the clergy is being
pushed into the background. The 'moral references' of Iranian society are no longer clerics. They are intellectuals,
academics, lawyers and leaders of independent trade unions. Another difference is that in 1979 the ruling elite had
little stomach for a fight. Many of its members had homes and investments abroad and thus were not forced to fight
with their backs to the wall. Thousands of them just packed up and left. Now, however, the overwhelming majority
of the ruling elite has no fallback position. There is yet another difference. In 1979, a majority of Iranians would
probably have voted for the Shah had there been elections. However, few of them were prepared to fight for him in
the streets. This time, the regime may well lose a free and fair election, but is still capable of fielding large numbers
of supporters who are ready to kill and die for it. The perception that the Shah was weak and unwilling to hit back
played a crucial role in disheartening his supporters and encouraging his opponents. That perception was one reason
so many of the Shah's closest aides simply fled the country at the first opportunity. Is Iran heading for a civil war?
My answer is a cautious no. Iran's long history, spanning more than 2,500 years, contains only three events that
could be described as civil wars: in the fifth century BC, in the sixth century AD and in 1911. The reason for this is
that as a power struggle develops, Iranians are adept at distinguishing which side is going to win. Once they
have identified the winner, they all rally to his side. No one is left on the other side to provoke a civil war. Call
it opportunism if you like, but this is a part of the template of Iranian politics. I only hope that the side that
realises it is losing does not go into denial, but rather bows out without provoking a prolonged and bloody conflict.

Only Nuclear Impacts are Existential
[ ] The only impacts discussed that are absolute threats of extinction are nuclear impacts –
the rest are endurable, even if they are horrible
Nick Bostrom, 2001 prof of Philosophy, Oxford University [Journal of Evolution and Technology, Vol. 9, March
2002. First version: 2001 March, JStor]

1.2 Existential risks In this paper we shall discuss risks of the sixth category, the one marked with an
X. This is the category of global, terminal risks. I shall call these existential risks. Existential risks are distinct
from global endurable risks. Examples of the latter kind include: threats to the biodiversity of Earth’s ecosphere,
moderate global warming, global economic recessions (even major ones), and possibly stifling cultural or religious
eras such as the “dark ages”, even if they encompass the whole global community, provided they are transitory
(though see the section on “Shrieks” below). To say that a particular global risk is endurable is evidently not to say
that it is acceptable or not very serious. A world war fought with conventional weapons or a Nazi-style Reich lasting
for a decade would be extremely horrible events even though they would fall under the rubric of endurable global
risks since humanity could eventually recover. (On the other hand, they could be a local terminal risk for many
individuals and for persecuted ethnic groups.) I shall use the following definition of existential risks:
Existential risk – One where an adverse outcome would either annihilate Earth-originating intelligent life or
permanently and drastically curtail its potential. An existential risk is one where humankind as a whole is imperiled.
Existential disasters have major adverse consequences for the course of human civilization for all time to come.
2 The unique challenge of existential risks Risks in this sixth category are a recent phenomenon. This is part of the reason why it is
useful to distinguish them from other risks. We have not evolved mechanisms, either biologically or culturally, for managing such risks. Our
intuitions and coping strategies have been shaped by our long experience with risks such as dangerous animals, hostile individuals or tribes,
poisonous foods, automobile accidents, Chernobyl, Bhopal, volcano eruptions, earthquakes, draughts, World War I, World War II, epidemics of
influenza, smallpox, black plague, and AIDS. These types of disasters have occurred many times and our cultural attitudes towards risk have been
shaped by trial-and-error in managing such hazards. But tragic as such events are to the people immediately affected, in the big picture of things –
from the perspective of humankind as a whole – even the worst of these catastrophes are mere ripples on the surface
of the great sea of life. They haven’t significantly affected the total amount of human suffering or happiness or
determined the long-term fate of our species. With the exception of a species-destroying comet or asteroid impact (an extremely
rare occurrence), there were probably no significant existential risks in human history until the mid-twentieth century, and certainly none that it
was within our power to do something about. The first manmade existential risk was the inaugural detonation of an
atomic bomb. At the time, there was some concern that the explosion might start a runaway chain-reaction by “igniting” the atmosphere.
Although we now know that such an outcome was physically impossible, it qualifies as an existential risk that was present at the time. For there
to be a risk, given the knowledge and understanding available, it suffices that there is some subjective probability of
an adverse outcome, even if it later turns out that objectively there was no chance of something bad happening. If we
don’t know whether something is objectively risky or not, then it is risky in the subjective sense. The subjective
sense is of course what we must base our decisions on.[2] At any given time we must use our best current subjective
estimate of what the objective risk factors are.[3]

Bostrom Indicts
[ ] Bostrom is a lunatic – he says that we cannot dismiss the possibility that we are living
in the Matrix – the credibility of all of his claims is suspect.
Nick Bostrom, 2001 prof of Philosophy, Oxford University [Journal of Evolution and Technology, Vol. 9, March
2002. First version: 2001 March, JStor]

4.3 We’re living in a simulation and it gets shut down A case can be made that the hypothesis that we are
living in a computer simulation should be given a significant probability [27]. The basic idea behind this so-called “Simulation
argument” is that vast amounts of computing power may become available in the future (see e.g. [28,29]), and that it could be used, among other
things, to run large numbers of fine-grained simulations of past human civilizations. Under some not-too-implausible assumptions,
the result can be that almost all minds like ours are simulated minds, and that we should therefore assign a
significant probability to being such computer-emulated minds rather than the (subjectively indistinguishable) minds
of originally evolved creatures. And if we are, we suffer the risk that the simulation may be shut down at any time.
A decision to terminate our simulation may be prompted by our actions or by exogenous factors. While to some it
may seem frivolous to list such a radical or “philosophical” hypothesis next to the concrete threat of nuclear holocaust, we must seek to base these
evaluations on reasons rather than untutored intuition. Until a refutation appears of the argument presented in [27], it would be intellectually
dishonest to neglect to mention simulation-shutdown as a potential extinction mode.

[ ] Bostrom also says that we cannot discount the risk of Skynet.
Nick Bostrom, 2001 prof of Philosophy, Oxford University [Journal of Evolution and Technology, Vol. 9, March
2002. First version: 2001 March, JStor]

Take-over by a transcending upload. Suppose uploads come before human-level artificial intelligence. An upload is a mind that has been
transferred from a biological brain to a computer that emulates the computational processes that took place in the
original biological neural network [19,33,53,54]. A successful uploading process would preserve the original mind’s
memories, skills, values, and consciousness. Uploading a mind will make it much easier to enhance its intelligence,
by running it faster, adding additional computational resources, or streamlining its architecture. One could imagine
that enhancing an upload beyond a certain point will result in a positive feedback loop, where the enhanced upload is able
to figure out ways of making itself even smarter; and the smarter successor version is in turn even better at designing
an improved version of itself, and so on. If this runaway process is sudden, it could result in one upload reaching
superhuman levels of intelligence while everybody else remains at a roughly human level. Such enormous
intellectual superiority may well give it correspondingly great power. It could rapidly invent new technologies or perfect
nanotechnological designs, for example. If the transcending upload is bent on preventing others from getting the opportunity to upload, it might
do so. The posthuman world may then be a reflection of one particular egoistical upload’s preferences (which in a
worst case scenario would be worse than worthless). Such a world may well be a realization of only a tiny part of
what would have been possible and desirable. This end is a shriek.

Extensions – Nuclear Threat Rhetoric Bad
[ ] Nuclear war is only textual – it has never occurred and cannot be described – it can
only be addressed as a discursive event.
Derrida, 1984 (Jacques, philosopher, Diacritics, "No Apocalypse, Not Now (Full Speed Ahead, Seven Missiles, Seven
Missives)," Vol 14, No 2, Nuclear Criticism, Summer, PS.1
http://www.westga.edu/~pburgey/XIDS/NoApocalypseDerrida.pdf)

Third reason. In our techno-scientifico-militaro-diplomatic incompetence, we may consider ourselves, however, as
competent as others to deal with a phenomenon whose essential feature is that of being fabulously textual,
through and through. Nuclear weaponry depends, more than any weaponry in the past, it seems, upon structures
of information and communication, structures of language, including non-vocalizable language, structures of codes
and graphic decoding. But the phenomenon is fabulously textual also to the extent that, for the moment, a nuclear
war has not taken place: one can only talk and write about it. You will say, perhaps: but it is not the first time; the
other wars, too, so long as they hadn't taken place, were only talked about and written about. And as to the fright of
imaginary anticipation, what might prove that a European in the period following the war of 1870 might not have been
more terrified by the "technological" image of the bombings and exterminations of the Second World War (even
supposing he had been able to form such an image) than we are by the image we can construct for ourselves of a nuclear
war? The logic of this argument is not devoid of value, especially if one is thinking about a limited and "clean" nuclear
war. But it loses its value in the face of the hypothesis of a total nuclear war, which, as a hypothesis, or, if you prefer, as
a fantasy, or phantasm, conditions every discourse and all strategies. Unlike the other wars, which have all been
preceded by wars of more or less the same type in human memory (and gunpowder did not mark a radical break in
this respect), nuclear war has no precedent. It has never occurred, itself; it is a non-event. The explosion of
American bombs in 1945 ended a "classical," conventional war; it did not set off a nuclear war. The terrifying
reality of the nuclear conflict can only be the signified referent, never the real referent (present or past) of a
discourse or a text. At least today apparently. And that sets us to thinking about today, our day, the presence of this
present in and through that fabulous textuality. Better than ever and more than ever. The growing multiplication of the
discourse- indeed, of the literature - on this subject may constitute a process of fearful domestication, the
anticipatory assimilation of that unanticipatable entirely-other. For the moment, today, one may say that a non-
localizable nuclear war has not occurred; it has existence only through what is said of it, only where it is talked
about. Some might call it a fable, then, a pure invention: in the sense in which it is said that a myth, an image, a
fiction, a utopia, a rhetorical figure, a fantasy, a phantasm, are inventions. It may also be called a speculation,
even a fabulous specularization. The breaking of the mirror would be, finally, through an act of language, the
very occurrence of nuclear war. Who can swear that our unconscious is not expecting this? dreaming of it, desiring it?
You will perhaps find it shocking to find the nuclear issue reduced to a fable. But then I haven't said simply that. I
have recalled that a nuclear war is for the time being a fable, that is, something one can only talk about. But who can fail
to recognize the massive "reality" of nuclear weaponry and of the terrifying forces of destruction that are being
stockpiled and capitalized everywhere, that are coming to constitute the very movement of capitalization. One has to
distinguish between this "reality" of the nuclear age and the fiction of war. But, and this would perhaps be the
imperative of a nuclear criticism, one must also be careful to interpret critically this critical or diacritical distinction. For
the "reality" of the nuclear age and the fable of nuclear war are perhaps distinct, but they are not two separate things. It
is the war (in other words the fable) that triggers this fabulous war effort, this senseless capitalization of
sophisticated weaponry, this speed race in search of speed, this crazy precipitation which, through techno-
science, through all the techno-scientific inventiveness that it motivates, structures not only the army, diplomacy,
politics, but the whole of the human socius today, everything that is named by the old words culture, civilization,
schole, paideia. "Reality," let's say the encompassing institution of the nuclear age, is constructed by the fable, on the
basis of an event that has never happened (except in fantasy, and that is not nothing at all), an event of which
one can only speak, an event whose advent remains an invention by men (in all the senses of the word
"invention") or which, rather, remains to be invented. An invention because it depends upon new technical
mechanisms, to be sure, but an invention also because it does not exist and especially because, at whatever point it
should come into existence, it would be a grand premiere appearance.

Extensions – Nuclear Threat Rhetoric Bad
[ ] Focusing on catastrophic risk assessment dehumanizes decisionmaking
Derrida, 1984 (Jacques, philosopher, Diacritics, "No Apocalypse, Not Now (Full Speed Ahead, Seven Missiles, Seven
Missives)," Vol 14, No 2, Nuclear Criticism, Summer, PS.1 http://www.westga.edu/~pburgey/XIDS/NoApocalypseDerrida.pdf)

Reason number one. Let us consider the form of the question itself: is the war of (over, for) speed (with all that it
entails) an irreducibly new phenomenon, an invention linked to a set of inventions of the so-called nuclear age, or is it
rather the brutal acceleration of a movement that has always already been at work? This form of the question perhaps
constitutes the most indispensable formal matrix, the keystone or, if you will, the nuclear question, for any problematics
of the "nuclear criticism" type, in all its aspects. Naturally, I don't have time to demonstrate this. I am offering it,
therefore, as a hasty conclusion, a precipitous assertion, a belief, an opinion-based argument, a doctrine or a dogmatic
weapon. But I was determined to begin with it. I wanted to begin as quickly as possible with a warning in the form of a
dissuasion: watch out, don't go too fast. There is perhaps no invention, no radically new predicate in the situation
known as "the nuclear age." Of all the dimensions of such an "age" we may always say one thing: it is neither
the first time nor the last. The historian's critical vigilance can always help us verify that repetitiveness; and that
historian's patience, that lucidity of memory must always shed their light on "nuclear criticism," must oblige it
to decelerate, dissuade it from rushing to a conclusion on the subject of speed itself. But this dissuasion and
deceleration I am urging carry their own risks: the critical zeal that leads us to recognize precedents, continuities,
and repetitions at every turn can make us look like suicidal sleepwalkers, blind and deaf alongside the unheard-
of; it could make us stand blind and deaf alongside that which cuts through the assimilating resemblance of
discourses (for example of the apocalyptic or bimillenarist type), through the analogy of techno-military situations,
strategic arrangements, with all their wagers, their last-resort calculations, on the "brink," their use of chance
and risk factors, their mimetic resource to upping the ante, and so on- blind and deaf, then, alongside what would
be absolutely unique; and it, this critical zeal, would seek in the stockpile of history (in short, in history itself, which in
this case would have this blinding search as its function) the wherewithal to neutralize invention, to translate the
unknown into a known, to metaphorize, allegorize, domesticate the terror, to circumvent (with the help of
circumlocutions: turns of phrase, tropes and strophes) the inescapable catastrophe, the undeviating precipitation toward
a remainderless cataclysm. The critical slowdown may thus be as critical as the critical acceleration. One may still die
after having spent one's life recognizing, as a lucid historian, to what extent all that was not new, telling oneself
that the inventors of the nuclear age or of nuclear criticism did not invent the wheel, or, as we say in French,
"invent gunpowder." That's the way one always dies, moreover, and the death of what is still now and then
called humanity might well not escape the rule.

[ ] Nuclear rhetoric is a persuasion of intimidation.
Derrida, 1984 (Jacques, philosopher, Diacritics, "No Apocalypse, Not Now (Full Speed Ahead, Seven Missiles, Seven
Missives)," Vol 14, No 2, Nuclear Criticism, Summer, PS.1 http://www.westga.edu/~pburgey/XIDS/NoApocalypseDerrida.pdf)

Reason number two. What is the right speed, then? Given our inability to provide a good answer for that question, we at
least have to recognize gratefully that the nuclear age allows us to think through this aporia of speed (i.e., the need
to move both slowly and quickly); it allows us to confront our predicament starting from the limit constituted by
the absolute acceleration in which the uniqueness of an ultimate event, of a final collision or collusion, the
temporalities called subjective and objective, phenomenological and intra-worldly, authentic and inauthentic,
etc., would end up being merged into one another. But, wishing to address these questions to the participants of a
colloquium on "nuclear criticism," I am also wondering at what speed we have to deal with these aporias: with what
rhetoric, what strategy of implicit connection, what ruses of potentialization and of ellipsis, what weapons of irony? The
"nuclear age" makes for a certain type of colloquium, with its particular technology of information, diffusion
and storage, its rhythm of speech, its demonstration procedures, and thus its arguments and its armaments, its
modes of persuasion or intimidation.

Extensions – Nuclear Threat Rhetoric Bad
[ ] Techno-scientific predictions of nuclear catastrophes fail – they rely on a model of
discourse that privileges expertise over probability
Derrida, 1984 (Jacques, philosopher, Diacritics, "No Apocalypse, Not Now (Full Speed Ahead, Seven Missiles, Seven
Missives)," Vol 14, No 2, Nuclear Criticism, Summer, PS.1
http://www.westga.edu/~pburgey/XIDS/NoApocalypseDerrida.pdf)

Second reason. So we are not experts in strategy, in diplomacy, or in the techno-science known as nuclear science, we
are oriented rather toward what is called not humanity but the humanities, history, literature, languages, philology,
the social sciences, in short all that which in the Kantian university was situated in the inferior class of the philosophy
school, foreign to any exercise of power. We are specialists in discourse and in texts, all sorts of texts. Now I shall
venture to say that in spite of all appearances this specialty is what entitles us, and doubly so, to concern ourselves
seriously with the nuclear issue. And by the same token, if we have not done so before, this entitlement, this
responsibility that we would thus have been neglecting until now, directs us to concern ourselves with the nuclear issue
- first, inasmuch as we are representatives of humanity and of the incompetent humanities which have to think through
as rigorously as possible the problem of competence, given that the stakes of the nuclear question are those of humanity,
of the humanities. How, in the face of the nuclear issue, are we to get speech to circulate not only among the self-styled
competent parties and those who are alleged to be incompetent, but among the competent parties themselves. For we are
more than just suspicious; we are certain that, in this area in particular, there is a multiplicity of dissociated,
heterogeneous competencies. Such knowledge is neither coherent nor totalizable. Moreover, between those whose
competence is technoscientific (those who invent in the sense of unveiling or of "constative" discovery as well as in
the sense of production of new technical or "performing" mechanisms) and those whose competence is politico-
military, those who are empowered to make decisions, the deputies of performance or of the performative, the
frontier is more undecidable than ever, as it is between the good and evil of all nuclear technology. If on the one
hand it is apparently the first time that these competencies are so dangerously and effectively dissociated, on the other
hand and from another point of view, they have never been so terribly accumulated, concentrated, entrusted as in a dice
game to so few hands: the military men are also scientists, and they find themselves inevitably in the position of
participating in the final decision, whatever precautions may be taken in this area. All of them, that is, very few, are in
the position of inventing, inaugurating, improvising procedures and giving orders where no model - we shall talk
about this later on-can help them at all. Among the acts of observing, revealing, knowing, promising, acting,
simulating, giving orders, and so on, the limits have never been so precarious, so undecidable. Today it is on the
basis of that situation - the limit case in which the limit itself is suspended, in which therefore the krinein, crisis,
decision itself, and choice are being subtracted from us, are abandoning us like the remainder of that subtraction - it is
on the basis of that situation that we have to re-think the relations between knowing and acting, between
constative speech acts and performative speech acts, between the invention that finds what was already there and
the one that produces new mechanisms or new spaces. In the undecidable and at the moment of a decision that
has no common ground with any other, we have to reinvent invention or conceive of another "pragmatics."

Extensions – Nuclear Threat Rhetoric Bad
[ ] Nuclear deterrence is a rhetorical act – it can only be viewed as persuasion when
made as a threat. This reexamination allows us to approach nuclear weapons rationally.

Derrida, 1984 (Jacques, philosopher, Diacritics, "No Apocalypse, Not Now (Full Speed Ahead, Seven Missiles, Seven
Missives)," Vol 14, No 2, Nuclear Criticism, Summer, PS.1
http://www.westga.edu/~pburgey/XIDS/NoApocalypseDerrida.pdf)

We can therefore consider ourselves competent because the sophistication of the nuclear strategy can never do
without a sophistry of belief and the rhetorical simulation of a text. First reason. The worldwide organization of
the human socius today hangs by the thread of nuclear rhetoric. This is immediately readable
in the fact that we use the term "strategy of deterrence" or "strategy of dissuasion," as we say in
French, for the overall official logic of nuclear politics. Dissuasion, or deterrence, means "persuasion."
Dissuasion is a negative mode or effect of persuasion. The art of persuasion is, as you know, one of the two axes of
what has been called rhetoric since classical times. To dissuade is certainly a form of persuasion, but it involves not
only persuading someone to think or believe this or that, but persuading someone that something must not be done. We
dissuade when we persuade some-one that it is dangerous, inopportune or wrong to decide to do something. The
rhetoric of dissuasion is a performative apparatus that has other performatives as its intended output. The
anticipation of nuclear war (dreaded as the fantasy, or phantasm, of a remainderless destruction) installs
humanity-and through all sorts of relays even defines the essence of modern humanity- in its
rhetorical condition. To recall this is not to paint with verbose vanity the horror of the nuclear
catastrophe which, according to some, is already degrading our world in its totality, or
improving it by the same token, according to others; it is not to say of this absolute pharmakon that it is
woven with words, as if we were saying "all this horror is nothing but rhetoric." On the contrary, this allows us to
think today, retrospectively, the power and the essence of rhetoric; and even of sophistry,
which has always been connected, at least since the Trojan War, with rhetoric (this is true for the
Greek conception of what we are committed here to naming, Greek style, sophistry, and rhetoric).

[ ] Their rhetoric causes dissociation – shattering – of our perception of reality. That
leads to the very real possibility of the destruction of society.
Derrida, 1984 (Jacques, philosopher, Diacritics, "No Apocalypse, Not Now (Full Speed Ahead, Seven Missiles, Seven Missives)," Vol 14, No 2, Nuclear Criticism, Summer, PS.1
http://www.westga.edu/~pburgey/XIDS/NoApocalypseDerrida.pdf)

Reason number three. Having raised, very rapidly, my question on the subject of speed, I am unilaterally disarming, I am putting my cards on the table. I am announcing that, for want of time-
time for preparation and time for the speech act-I shall not make a real "speech." By which means, you will say, I shall have taken more time than all my partners. I am thus choosing, as you have
already observed, the genre or rhetorical form of tiny atomic nuclei (in the process of fission or division in an uninterruptable chain) which I shall arrange or rather which I shall project toward
you, like tiny inoffensive missiles: in a discontinuous, more or less haphazard fashion. This will be my little strategic and capitalistic calculation, in order to say, potentially, without being too
tedious and as quickly as possible, as many things as possible. Capitalization- or capitalism - always has the structure of a certain potentialization of speed. This has been, in three points, my first
missile, or my first missive, or my first nuclear aphorism: in the beginning there will have been speed, which is always taking on speed, in other words, overtaking or-as we say in French, prendre
de vitesse, doubler, doubling, passing-both the act and the speech. At the beginning was the word; at the beginning was the act. No! At the beginning-faster than the
word or the act-there will have been speed, and a speed race between them. But of course, speed was only a
beginning for my speech, for my speech act, today. For such a feat, we may consider ourselves competent. And for
the reason I have just stated very quickly: because of speed. Indeed: nowhere has the dissociation between the
place where competence is exercised and the place where the stakes are located ever seemed more rigorous,
more dangerous, more catastrophic. Seemed, I said. Is it not apparently the first time that that dissociation,
more unbridgeable than ever for ordinary mortals, has put in the balance the fate of what is still now and
then called humanity as a whole, or even of the earth as a whole, at the very moment when your president is
even thinking about waging war beyond the earth? Doesn't that dissociation (which is dissociation itself, the
division and the dislocation of the socius, of sociality itself) allow us to think the essence of knowledge and
techne itself, as socialization and de-socialization, as the constitution and the deconstruction of the socius?

Extensions – Nuclear Threat Rhetoric Bad
[ ] There is no risk of nuclear annihilation – the claim is preposterous. Arguments of
global nuclear war only serve to further entrench the powers of the state and to excuse any
action that the presenter deems necessary.
Jean Baudrillard, 1981 in “Simulacra and Simulation” p. 32-35, French cultural theorist, sociologist, philosopher,
political commentator, and photographer. Postmodernism and post-structuralism.

The apotheosis of simulation: the nuclear. However, the balance of terror is never anything but the spectacular
slope of a system of deterrence that has insinuated itself from the inside into all the cracks of daily life. Nuclear
suspension only serves to seal the trivialized system of deterrence that is at the heart of the media, of the violence
without consequences that reigns throughout the world, of the aleatory apparatus of all the choices that are made
for us. The most insignificant of our behaviors is regulated by neutralized, indifferent, equivalent signs, by zero-sum
signs like those that regulate the "strategy of games" (but the true equation is elsewhere, and the unknown is precisely
that variable of simulation which makes of the atomic arsenal itself a hyperreal form, a simulacrum that dominates
everything and reduces all "ground-level" events to being nothing but ephemeral scenarios, transforming the life left us
into survival, into a stake without stakes – not even into a life insurance policy: into a policy that already has no value). It
is not the direct threat of atomic destruction that paralyzes our lives, it is deterrence that gives them leukemia.
And this deterrence comes from the fact that even the real atomic clash is precluded-precluded like the
eventuality of the real in a system of signs. The whole world pretends to believe in the reality of this threat (this is
understandable on the part of the military, the gravity of their exercise and the discourse of their "strategy" are at stake),
but it is precisely at this level that there are no strategic stakes. The whole originality of the situation lies in the
improbability of destruction. Deterrence precludes war-the archaic violence of expanding systems. Deterrence itself
is the neutral, implosive violence of metastable systems or systems in involution. There is no longer a subject of
deterrence, nor an adversary nor a strategy-it is a planetary structure of the annihilation of stakes. Atomic war, like the
Trojan War, will not take place. The risk of nuclear annihilation only serves as a pretext, through the
sophistication of weapons (a sophistication that surpasses any possible objective to such an extent that it is itself a
symptom of nullity), for installing a universal security system, a universal lockup and control system whose
deterrent effect is not at all aimed at an atomic clash (which was never in question, except without a doubt in the
very initial stages of the cold war, when one still confused the nuclear apparatus with conventional war) but, rather, at
the much greater probability of any real event, of anything that would be an event in the general system and
upset its balance. The balance of terror is the terror of balance. Deterrence is not a strategy, it circulates and is
exchanged between nuclear protagonists exactly as is international capital in the orbital zone of monetary
speculation whose fluctuations suffice to control all global exchanges. Thus the money of destruction (without any
reference to real destruction, any more than floating capital has a real referent of production) that circulates in nuclear
orbit suffices to control all the violence and potential conflicts around the world. What is hatched in the shadow of this
mechanism with the pretext of a maximal, "objective," threat, and thanks to Damocles' nuclear sword, is the perfection
of the best system of control that has ever existed. And the progressive satellization of the whole planet through this
hypermodel of security. The same goes for peaceful nuclear power stations. Pacification does not distinguish between
the civil and the military: everywhere where irreversible apparatuses of control are elaborated, everywhere the notion
of security becomes omnipotent, everywhere where the norm replaces the old arsenal of laws and violence (including
war), it is the system of deterrence that grows, and around it grows the historical, social, and political desert. A gigantic
involution that makes every conflict, every finality, every confrontation contract in proportion to this blackmail that
interrupts, neutralizes, freezes them all. No longer can any revolt, any story be deployed according to its own logic
because it risks annihilation. No strategy is possible any longer, and escalation is only a puerile game given over to the
military. The political stake is dead, only simulacra of conflicts and carefully circumscribed stakes remain. The "space
race" played exactly the same role as nuclear escalation. This is why the space program was so easily able to replace it
in the 1960s (Kennedy/Khrushchev), or to develop concurrently as a form of "peaceful coexistence." Because what,
ultimately, is the function of the space program, of the conquest of the moon, of the launching of satellites if not the
institution of a model of universal gravitation, of satellization of which the lunar module is the perfect embryo?
Programmed microcosm, where nothing can be left to chance. Trajectory, energy, calculation, physiology, psychology,
environment-nothing can be left to contingencies, this is the total universe of the norm-the Law no longer exists, it is the
operational immanence of every detail that is law. A universe purged of all threat of meaning, in a state of asepsis and weightlessness-it is this very perfection that is fascinating. The exaltation of the crowds was not a response to the event
of landing on the moon or of sending a man into space (this would be, rather, the fulfillment of an earlier dream), rather,
we are dumbfounded by the perfection of the programming and the technical manipulation, by the immanent wonder of the programmed unfolding of events. Fascination with the maximal norm and the mastery of probability. Vertigo
of the model, which unites with the model of death, but without fear or drive. Because if the law, with its aura of
transgression, if order, with its aura of violence, still taps a perverse imaginary, the norm fixes, fascinates,
stupefies, and makes every imaginary involute. One no longer fantasizes about the minutiae of a program. Just
watching it produces vertigo. The vertigo of a world without flaws. Now, it is the same model of programmatic
infallibility, of maximum security and deterrence that today controls the spread of the social. There lies the true nuclear
fallout: the meticulous operation of technology serves as a model for the meticulous operation of the social. Here as
well, nothing will be left to chance, moreover this is the essence of socialization, which began centuries ago, but which
has now entered its accelerated phase, toward a limit that one believed would be explosive (revolution), but which for
the moment is translated by an inverse, implosive, irreversible process: the generalized deterrence of chance, of
accident, of transversality, of finality; of contradiction, rupture, or complexity in a sociality illuminated by the norm,
doomed to the descriptive transparency of mechanisms of information. In fact, the spatial and nuclear models do not
have their own ends: neither the discovery of the moon, nor military and strategic superiority. Their truth is to be the
models of simulation, the model vectors of a system of planetary control (where even the superpowers of this scenario
are not free-the whole world is satellized).

They Say “Ignore Low Probability”
[ ] It is impossible to set a threshold for “too low” probability
Berube, 2000, Associate Professor of Speech Communication and Director of Debate at the University of South
Carolina (David M. Berube, 2000, “Debunking mini-max reasoning: the limits of extended causal chains in contest
debating” http://www.cedadebate.org/CAD/2000_berube.pdf, pages 53-73)

Zarefsky's observation is intriguing. Consider how often critics have voided disadvantages following a uniqueness
response. For example, in response to a Presidential leadership internal link story, a contest debater may claim that
recent Presidential behavior makes the claim not unique. However, uniqueness is not a threshold issue, it is a linear one,
a probabilistic one. While the response reduces the likelihood of the internal link story, uniqueness responses only
reduce the probability of the internal link story. The likeliness a uniqueness response is absolute is very low. Some
uniqueness, or probability, remains after a uniqueness challenge, yet the critics round down and ignore the leadership
disadvantage entirely. On the other hand, many judges round up as well, responding to contest debaters who have
begged the risk question by a final rebuttal appeal to mini-max reasoning. Risk theorists find this false dualism
troubling. For example, de Sousa warns: A pragmatic conception of probability needs something broader than mere acceptance, for acceptance is an on/off matter, and probability has degrees. . . . Because of the lottery paradox, high probabilities can never be a sufficient condition of acceptance. And because of what I call the Lem Paradox, low probability can never be a sufficient condition of rejection." (261)
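Berube's linearity point can be made concrete with a toy calculation: a disadvantage's risk is the product of its link probabilities, so a uniqueness answer scales that risk down rather than zeroing it. All numbers below are invented for illustration.

```python
# Toy sketch of Berube's argument: the risk of an internal-link chain is the
# product of its link probabilities, so a uniqueness answer reduces the risk
# linearly instead of eliminating it. Numbers are invented for illustration.
def chain_risk(link_probs):
    """Probability that every link in an internal-link chain holds."""
    risk = 1.0
    for p in link_probs:
        risk *= p
    return risk

before = chain_risk([0.8, 0.5, 0.4])  # original chain: 0.16
after = chain_risk([0.8, 0.2, 0.4])   # uniqueness answer weakens one link: 0.064
# Judges who "round down" treat the residual 0.064 as zero, and judges who
# "round up" treat it as decisive; Berube argues both moves are unjustified.
```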

[ ] Catastrophes moot the paralysis argument – even with low probabilities, reasonability requires decision makers to avoid extinction consequences
Nicholas Rescher is an American philosopher, University Professor of Philosophy and Chairman of the Center for
Philosophy of Science at University of Pittsburgh. [“Risk: A Philosophical Introduction to the Theory of Risk
Evaluation and Management”, p. 35-36, 1983.]

Disparity of Risk Situations and the Threshold of Relative Unacceptability A disparity of risks arises when there is so
serious an imbalance among alternative eventuations - so great a difference in the relative size of the prospective
negativities at issue - that one alternative can be viewed as simply ineligible relative to another, quite independently of
considerations of probabilistic detail. The prospect of such a negativity is simply unacceptable relative to the gains or
losses otherwise operative in the situation, without reference to any "balance of probabilities." Thus no matter what the
balance of probabilities, the "reasonable man" would not risk loss of life or limb to avert the prospect of some trivial
inconvenience. Nor would he ever risk utter impoverishment to avert the possible loss of a few cents - at any rate as
long as we are not dealing with probabilities that are "effectively zero." The prospective damage of the one alternative is
too great in relation to the potential loss of the other, regardless of the odds. One "just can't take the chance." In this
light consider a choice-situation of the form set out in Figure 1. In a situation of this sort, the possible losses at issue can
prove to be of altogether different orders. The negativity of Y can be so large relative to that of X that they are simply
not in the same league - one would rationally opt for one and shun the other regardless of how the probabilities x and y are adjusted. In the conditions at issue, the Y-risking hazard is simply unacceptable. It is unjustified as well as unrealistic to take the stance that all negativities are essentially comparable and to hold that one can always be balanced off against another by such probabilistic manipulations. To be sure, the customary decision-approach via expected-value comparisons would always enable us to establish a probabilistic proportion between the risks at issue by balancing the expected-value equation.
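Rescher's contrast between expected-value balancing and the disparity of risks can be sketched numerically; the payoffs and odds below are invented for illustration, not drawn from his text.

```python
# Toy illustration (invented numbers): expected value treats all negativities
# as commensurable, so a catastrophic branch can always be "balanced" against
# a trivial one by adjusting the odds.
def expected_value(outcomes):
    """outcomes: list of (probability, payoff) pairs."""
    return sum(p * v for p, v in outcomes)

# Option X: certain small inconvenience.
x = expected_value([(1.0, -1)])
# Option Y: tiny chance of a catastrophic, irreversible loss, otherwise nothing.
y = expected_value([(0.0001, -20_000), (0.9999, 0)])
# On expected value alone the options look comparable (-1 vs -2), which is
# exactly the equivalence Rescher argues the "reasonable man" rejects when
# one branch is catastrophic: one "just can't take the chance."
```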

[ ] Focusing only on probability is impractical – it is impossible to assign accurate
probabilities to risk and there is no alternative to using impacts
Dale Herbeck, Professor of Communication at Boston College, 1992 [Director of the Fulton Debating Society at
Boston College, “The Use and Abuse of Risk Analysis in Policy Debate,” Paper Presented at the 78th Annual
Meeting of the Speech Communication Association (Chicago, IL), October 29th-November 1st, Available Online
via ERIC Number ED354559, p. 10-12]

It is easy to indict the use of risk analysis in policy debate. The more difficult task is to frame an alternative. As a
tool, risk analysis offers a uniquely valuable method of assessing and comparing a variety of competing policy
alternatives. It is, simply put, difficult to imagine how you could evaluate without some notion of risk analysis.
Recognizing this fact, Vincent Follert offered the following guidelines in his article critiquing the use of risk
analysis in debate: Each disputant should justify numerical estimates of the probability and valence of outcomes.
Debaters could give more attention to the analytical assumptions made by their opponents. Such tests may also be
applied to the evidence offered in the debate. Statistical tests may also be applied to the evidence and models used.
Finally, the disputants should argue in favor of a particular perspective which allows the critic to make comparisons
of dissimilar alternatives. 16 While we appreciate the spirit and intent of Follert's suggestions, we question their
workability in the debate setting. How could debaters meaningfully measure the probability associated with some of
the arguments in debate? How can assumptions, seldom expressed in the debate, be factored into the decision? How
would statistical tests be applied? While risk analysis would be improved if these questions were addressed, it seems
impossible to address them meaningfully within current debate formats. At a more fundamental level, we believe that Follert's
guidelines fail to address many of our concerns. Even if a debater could quantify the probability, defend the assumptions
underlying that assessment, and apply statistical tests to prove that the risks were statistically significant, we still
fear that debate would be enslaved to low probability/high impact scenarios. We believe that a better approach to
addressing this problem would be to rehabilitate our notion of probability.

[ ] Examining low probability outcomes is necessary – the dominant weighting functions are non-linear
Abdellaoui, 2000, Research Director and Affiliate professor at HEC (Mohammed Abdellaoui, “Parameter-Free
Elicitation of Utility and Probability Weighting Functions”, Management Science, Vol. 46, No. 11 (Nov., 2000), pp.
1497-1512, http://www.jstor.org/stable/2661664)

In a seminal paper, Kahneman and Tversky (1979) present experimental evidence that preferences between risky
prospects are not linear in probabilities. They propose, as well, a theory of choice under risk, Prospect Theory (PT),
suggesting that a probability weighting function (that maps the unit interval into itself with discontinuities at 0 and 1)
exhibiting over-weighting of small probabilities and underweighting of moderate and high probabilities may explain the
observed nonlinearities. Subsequently, modern generalizations of PT were proposed through Rank- Dependent Expected Utility (RDEU) theory
(Quiggin 1982, Wakker 1994), and Cumulative Prospect Theory (CPT) (Tversky and Kahneman 1992). Their main characteristic, in
decision under risk, consists of allowing not only the transformation of outcomes into utilities, but also the
transformation of decumulative probabilities to obtain decision weights through a probability weighting function. This
innovation, however, has been perceived as a factor complicating utility measurement, and therefore, the elicitation of RDEU and CPT models. A variety of methods have been used to determine the shapes of the utility function and the probability weighting function under RDEU and CPT. The predominant approach prespecifies parametric forms for these functions and then estimates them through standard techniques (e.g., Tversky and Kahneman 1992, Camerer and Ho 1994, Hey and Orme 1994, Tversky and Fox 1995). However, assuming specific functional forms for
the utility function and the probability weighting function makes inference about the shapes of these functions
dependent on the choice of functional forms. Two research strategies can avoid the potential problems of parametric
estimation. The first strategy consists of testing simple preference conditions to obtain information about the shape of either the utility function or
the probability weighting function. The second strategy consists of eliciting the utility and probability weighting functions at the level of individuals,
without any parametric assumption. This approach is more demanding, but, in return, provides direct measurements of both functions.
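The nonlinearity described above can be illustrated with the one-parameter weighting function from Tversky and Kahneman (1992); the curvature value 0.61 is their median estimate for gains, used here purely as an illustration, not as a claim from Abdellaoui's paper.

```python
# Tversky-Kahneman (1992) probability weighting function:
#   w(p) = p^g / (p^g + (1 - p)^g)^(1/g)
# With g < 1 it overweights small probabilities and underweights moderate and
# high ones, matching the pattern the card describes.
def weight(p: float, g: float = 0.61) -> float:
    return p**g / (p**g + (1 - p)**g) ** (1 / g)

# A 1% chance receives a decision weight of roughly 0.055, while a 99% chance
# receives roughly 0.91: small probabilities are inflated, large ones deflated.
low, high = weight(0.01), weight(0.99)
```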

[ ] Prioritizing small probabilities of high risks is psychologically justified – we have
diminished sensitivity to smaller impacts
Abdellaoui, 2000, Research Director and Affiliate professor at HEC (Mohammed Abdellaoui, “Parameter-Free
Elicitation of Utility and Probability Weighting Functions”, Management Science, Vol. 46, No. 11 (Nov., 2000), pp.
1497-1512, http://www.jstor.org/stable/2661664)

A few months after a first version of this paper was completed, Bleichrodt and Pinto (1998) and Gonzalez and Wu (1999) finished two papers proposing two methods to elicit probability weighting functions. The first paper reports experimental results regarding probability weighting in medical decision making (using the tradeoff method). The second investigates individual differences in probability weighting for monetary gains through an alternating least square estimation method. Given that the elicitation of the probability weighting function in this paper needs the construction of the utility function to be carried out first, the first empirical question addressed concerns the shape of the utility function. Its qualitative properties issuing from the psychological principle of diminishing sensitivity are confirmed here, in agreement with other recent findings (Wakker and Deneffe 1996, Fennema and van Assen 1998, Fox and Tversky 1998). Then the question of the shape of the probability weighting function is addressed. The data confirm that individuals transform probabilities consistently with the psychological principle of diminishing sensitivity, with the two end points of the probability interval serving as reference points. Overall, these results are consistent (for gains) with those obtained recently by Tversky and Fox (1995). This paper also elicits probability weighting functions for losses. This allows a straightforward comparison of the treatment of probabilities for gains and losses at the level of individual subjects. Indeed, the data confirm the existence of a significant difference between the probability weighting function for gains and the probability weighting function for losses. Moreover, the data suggest a descriptive superiority of CPT over RDEU. Finally, the hypothesis of linearity of the probability weighting function for moderate probabilities is investigated. This question has received rather contradictory answers in the experimental literature. Camerer (1992), Harless and Camerer (1994), and Abdellaoui and Munier (1998), for instance, obtained results through nonparametric techniques suggesting a linear weighting function for intermediate probabilities (see also Cohen and Jaffray 1988). On the contrary, Wu and Gonzalez (1996), among others, found support for nonlinearity. This paper confirms the latter (i.e., nonlinearity).

[ ] Low probability is not a reason to reject the impact- threshold probability is too
simplistic and risk is linear
David Berube, Associate Professor of Speech Communication at the University of South Carolina, 2000 [Director of
Debate “Debunking Mini-max Reasoning: The Limits Of Extended Causal Chains In Contest Debating,”
Contemporary Argumentation and Debate, Volume 21, Available Online at
http://www.cedadebate.org/CAD/2000_berube.pdf, Accessed 04-05-2008, p. 64-69]

Zarefsky's observation is intriguing. Consider how often critics have voided disadvantages following a uniqueness response. For example, in
response to a Presidential leadership internal link story, a contest debater may claim that recent Presidential behavior makes the claim not unique.
However, uniqueness is not a threshold issue, it is a linear one, a probabilistic one. While the response reduces the likelihood of the
internal link story, uniqueness responses only reduce the probability of the internal link story. The likeliness a
uniqueness response is absolute is very low. Some uniqueness, or probability, remains after a uniqueness challenge,
yet the critics round down and ignore the leadership disadvantage entirely. On the other hand, many judges round up as well,
responding to contest debaters who have begged the risk question by a final rebuttal appeal to mini-max reasoning. Risk theorists find this false
dualism troubling. For example, de Sousa warns: A pragmatic conception of probability needs something broader than mere acceptance, for acceptance is an on/off matter, and probability has degrees. . . . Because of the lottery paradox, high probabilities can never be a sufficient condition of acceptance. And because of what I call the Lem Paradox, low probability can never be a sufficient condition of rejection." (261)

[ ] We cannot ignore even low probability extinction risks – too much is at stake
Nick Bostrom, 2001 prof of Philosophy, Oxford University [ Journal of Evolution and Technology, Vol. 9, March
2002. First version: 2001 March, JSTOR]

In combination, these indirect arguments add important constraints to those we can glean from the direct consideration of various
technological risks, although there is not room here to elaborate on the details. But the balance of evidence is such that it would appear
unreasonable not to assign a substantial probability to the hypothesis that an existential disaster will do us in. My subjective opinion is that setting
this probability lower than 25% would be misguided, and the best estimate may be considerably higher. But even if the probability were
much smaller (say, ~1%) the subject matter would still merit very serious attention because of how much is at stake.

In general, the greatest existential risks on the time-scale of a couple of centuries or less appear to be those that
derive from the activities of advanced technological civilizations. We see this by looking at the various existential
risks we have listed. In each of the four categories, the top risks are engendered by our activities. The only
significant existential risks for which this isn’t true are “simulation gets shut down” (although on some versions of this
hypothesis the shutdown would be prompted by our activities [27]); the catch-all hypotheses (which include both types of scenarios); asteroid or
comet impact (which is a very low probability risk); and getting killed by an extraterrestrial civilization (which would be highly unlikely in the
near future).[19]

[ ] Discussing existential threats is good – it builds public awareness and examines ethical and political dimensions of policy making
Nick Bostrom, 2001 prof of Philosophy, Oxford University [ Journal of Evolution and Technology, Vol. 9, March
2002. First version: 2001 March, Jstor]

9 Implications for policy and ethics Existential risks have a cluster of features that make it useful to
identify them as a special category: the extreme magnitude of the harm that would come from an existential disaster;
the futility of the trial-and-error approach; the lack of evolved biological and cultural coping methods; the fact that existential risk
dilution is a global public good; the shared stakeholdership of all future generations; the international nature of many of the required
countermeasures; the necessarily highly speculative and multidisciplinary nature of the topic; the subtle and diverse methodological
problems involved in assessing the probability of existential risks; and the comparative neglect of the whole area. From our survey of the
most important existential risks and their key attributes, we can extract tentative recommendations for ethics and
policy: 9.1 Raise the profile of existential risks We need more research into existential risks –
detailed studies of particular aspects of specific risks as well as more general investigations of associated ethical,
methodological, security and policy issues. Public awareness should also be built up so that constructive political
debate about possible countermeasures becomes possible. Now, it’s a commonplace that researchers always conclude that
more research needs to be done in their field. But in this instance it is really true. There is more scholarly work on the life-habits of
the dung fly than on existential risks.

[ ] Extinction outweighs because it is irreversible - Irreversible disasters should be
prioritized regardless of probability because we cannot recover
Vertzberger, 1995, Professor at the Department of International Relations, the Hebrew University of Jerusalem, Israel
(Yaacov Y. I. Vertzberger, June 1995, “Rethinking and Reconceptualizing Risk in Foreign Policy Decision-Making: A
Sociocognitive Approach”, Political Psychology, Vol. 16, No. 2, pp. 347-380, http://www.jstor.org/stable/3791835)

A particularly controversial and painful choice is how and whether to discount lives, that is, whether to give deaths far into the future less weight than present deaths. Consequently problems are framed in a way that understates the human-life cost dimension, or, if that is not possible, that avoids an explicit, precise statement of trade-off calculations. This evasive approach allows for misleading ambiguities.12 The more general dilemma is clearly phrased by Morton (1991): The reason why risky choices are so hard, and why different people can reasonably have different attitudes to them, is that we rank some things as incomparably worse than others. Typically there are irreversible disasters. Death is the central example, but there are many others: dishonor, the failure of one's life work, bankruptcy. Once these have gone wrong nothing will make them right again, or compensate for them. The lack of compensation is what matters most. It underlines a reluctance to reason in terms of average long-term outcomes.... So our reluctance to reason probabilistically on some topics and our tendency to value some things incomparably more than others are two sides of the same coin. The problem this poses for decision making is also two-sided, though. On the one hand there are dilemmas about giving a weight to risks of death and other disasters, and on the other hand there is the general problem of how to give less crucial goods any weight at all in comparison. (pp. 109-110) Framing plays an important role not only in the predecisional process but also in midterm policy evaluations, where available feedbacks become a basis for deciding between policy continuity and change. Midterm policy evaluations, especially when outcomes are ambiguous, are based on two types of considerations: the decision-makers' perception of actual policy outcomes and of the outcomes that might have occurred had a different policy been adopted. In the latter case the decision-maker compares actual results with counterfactual results, in order to compare the relative value of the current policy with possible alternatives (Miller et al., 1990). Here, framing plays a role in two basic ways. First, it affects the way in which the actual outcomes are presented, for example, whether the loss or gain dimensions are emphasized, as discussed earlier. Second, it influences the way in which counterfactual results are presented. Since decision-makers deal with events that did not occur and assessments of... [Footnote 12: For a general discussion of trade-off avoidance by decision-makers and the resultant irrational consistency, see Jervis, 1976, pp. 128-142. Decision-makers believe that the policy they favor is better than alternative policies in terms of the risk it poses on several logically independent value dimensions. They focus their attention on one set of risk calculations and then adjust all other risk calculations to fit the conclusions they infer from risks that are the focus of their attention (Jervis, 1982, 1983, pp. 22-23). New information that should have called their attention to other risk sets is then assimilated into the prior calculations through misperception or misinterpretation. The implications for the more specific value-of-human-life-assessment dilemma are discussed in Teuber.]

[ ] Extinction threats must trump other forms of risk calculus – we don’t have the option
to experiment, our existing decision making processes are inadequate, and future generations
magnify the size of the impact infinitely
Nick Bostrom, 2001 prof of Philosophy, Oxford University [ Journal of Evolution and Technology, Vol. 9, March
2002. First version: 2001 March, JSTOR]

A much greater existential risk emerged with the build-up of nuclear arsenals in the US and the USSR. An
all-out nuclear war was a possibility with both a substantial probability and with consequences that might have been
persistent enough to qualify as global and terminal. There was a real worry among those best acquainted with the
information available at the time that a nuclear Armageddon would occur and that it might annihilate our species or
permanently destroy human civilization.[4] Russia and the US retain large nuclear arsenals that could be used in a
future confrontation, either accidentally or deliberately. There is also a risk that other states may one day build up
large nuclear arsenals. Note however that a smaller nuclear exchange, between India and Pakistan for instance, is not
an existential risk, since it would not destroy or thwart humankind’s potential permanently. Such a war might
however be a local terminal risk for the cities most likely to be targeted. Unfortunately, we shall see that nuclear
Armageddon and comet or asteroid strikes are mere preludes to the existential risks that we will encounter in the
21st century. The special nature of the challenges posed by existential risks is illustrated by the following
points: 1. Our approach to existential risks cannot be one of trial-and-error. There is no opportunity to learn from
errors. The reactive approach – see what happens, limit damages, and learn from experience – is unworkable. Rather, we must take a
proactive approach. This requires foresight to anticipate new types of threats and a willingness to take decisive preventive
action and to bear the costs (moral and economic) of such actions. 2. We cannot necessarily rely on the institutions,
moral norms, social attitudes or national security policies that developed from our experience with managing other
sorts of risks. Existential risks are a different kind of beast. We might find it hard to take them as seriously as we
should simply because we have never yet witnessed such disasters.[5] Our collective fear-response is likely ill
calibrated to the magnitude of threat. Reductions in existential risks are global public goods [13] and may therefore
be undersupplied by the market [14]. Existential risks are a menace for everybody and may require acting on the
international plane. Respect for national sovereignty is not a legitimate excuse for failing to take countermeasures
against a major existential risk. If we take into account the welfare of future generations, the harm done by
existential risks is multiplied by another factor, the size of which depends on whether and how much we discount
future benefits [15,16].

They Say “Tyranny of Survival”
[ ] Focusing on infinite risk doesn’t destroy morality – maximizing the potential of
avoiding extinction is moral itself
Nick Bostrom, 2001 prof of Philosophy, Oxford University [ Journal of Evolution and Technology, Vol. 9, March
2002. First version: 2001 March, Jstor]

Previous sections have argued that the combined probability of the existential risks is very substantial. Although there is still a fairly broad
range of differing estimates that responsible thinkers could make, it is nonetheless arguable that because the negative utility of an existential
disaster is so enormous, the objective of reducing existential risks should be a dominant consideration when acting out of
concern for humankind as a whole. It may be useful to adopt the following rule of thumb for moral action; we can
call it Maxipok: Maximize the probability of an okay outcome, where an “okay outcome” is any outcome that avoids
existential disaster. At best, this is a rule of thumb, a prima facie suggestion, rather than a principle of absolute validity, since there
clearly are other moral objectives than preventing terminal global disaster. Its usefulness consists in helping us to get our priorities
straight. Moral action is always at risk to diffuse its efficacy on feel-good projects[24] rather than on serious work
that has the best chance of fixing the worst ills. The cleft between the feel-good projects and what really has the greatest
potential for good is likely to be especially great in regard to existential risk. Since the goal is somewhat abstract and since
existential risks don’t currently cause suffering in any living creature[25], there is less of a feel-good dividend to be derived from efforts that seek
to reduce them. This suggests an offshoot moral project, namely to reshape the popular moral perception so as to give more credit and social
approbation to those who devote their time and resources to benefiting humankind via global safety compared to other philanthropies.

They Say “Resource Wars Improbable”
[ ] Resource wars are very real and probable – studies confirm.
Wayne Nafziger, Professor of Comparative Economics, Kansas State University, and Juha Auvinen, Docent of
International Politics, University of Helsinki, 2002 [Economic Development, Inequality, War, and State Violence,
http://www.ksu.edu/economics/nafwayne/EcDev.doc]

Collier contends (Collier and Hoeffler 1998, pp. 568-69; Collier 2000a, pp. 92-95) that the possession of primary
commodities, especially exports, increases the occurrence and duration of civil war. Mwanasali (2000, p. 145)
indicates the reasons why. “Primary commodity exports present several advantages to the belligerents. Because
they are generic products, rather than brand names, their origin can easily be concealed. They are usually the most
heavily taxable, especially in kind, and their production or marketing does not require the complicated processes, as
is the case of manufactured goods.” Primary goods include both agricultural (usually renewable) and mineral
(largely nonrenewable) commodities. According to De Soysa’s statistical tests (2000, pp. 123-24), “the incidence of
civil war is completely unrelated to the per capita availability of natural resources, defined as the stocks of both
renewable resources . . . and nonrenewables.” But, once De Soysa refines her independent variable to include only
mineral resources, her result is highly significant. She finds that ‘the higher the per capita availability of . . . mineral
wealth, the greater the incidence of conflict’ (ibid., p. 124). The following, based mainly on work by WIDER
researchers (Nafziger et al., 2 vols., 2000), explains why minerals contribute to conflict and state violence. In the
struggle for allies during the cold war, the United States and the Soviet Union provided military and economic aid
for developing countries. Sovereignty provided the opportunity to extract resources from the major powers in
exchange for diplomatic support. Yet aid could provide the basis for supporting a patronage system for either the
state or for insurgents in opposition. When the cold war ended in the early 1990s, nation-states and rebels in the
developing world required different strategies and new sources of funds. Many countries in Africa and Asia
needed control of resources to provide military and police power but only minimal services to control
territory. Indeed with the International Monetary Fund (IMF)/World Bank emphasis on the market and private
enterprise, predatory rulers often undermined their own bureaucracies to build personal power at the expense of
health, education, and agricultural development (Reno, 2000, pp. 231-32; Väyrynen 2000b, pp. 437-79).

They Say “Resource Wars Improbable”
[ ] Resources are often the cause of conflicts – history proves.
Wayne Nafziger, Professor of Comparative Economics, Kansas State University, and Juha Auvinen, Docent of
International Politics, University of Helsinki, 2002 [Economic Development, Inequality, War, and State Violence,
http://www.ksu.edu/economics/nafwayne/EcDev.doc]

The struggle for control over minerals and other natural resources is an important source of conflict. In
Angola, Sierra Leone, Liberia, and Congo - Kinshasa, rulers and warlords used exclusive contracts with foreign
firms for diamonds and other minerals to “regularize” sources of revenue in lieu of a government agency to collect
taxes (Reno, 1996, 1998, 2000). In comparison, however, Tanzania and Togo lacked the tradable resources to
become a predatory society (Väyrynen 2000b, pp. 444-45). After the decline of aid after the cold war, Sierra
Leone was more susceptible to pressures for liberalization and adjustment from the IMF and World Bank. In
1991, the IMF, the Bank, and bilateral creditors offered loans and debt rescheduling worth $625 million, about 80
per cent of GNP, if Sierra Leone reduced government expenditure and employment. In response, Freetown heeded
the World Bank's advice (1994, pp. 22-51) to use private operators to run state services for a profit. But privatization
did not eliminate the pressures of clients demanding payoffs but merely shifted the arena of clientage to the private
sector. Sierra Leone's ruling elites, needing new ways of exercising power, used foreign firms to consolidate
power and stave off threats from political rivals. In the 1990s, Sierra Leonean heads of state have relied on
exclusive contracts with foreign firms for diamond mining to stabilize revenue, foreign mercenaries and advisors to
replace the national army in providing security, and foreign contractors (sometimes the same mining or security
firms) to provide other state services. In the process, rulers have found it advantageous to “destroy state agencies, to
‘cleanse’ them of politically threatening patrimonial hangers-on and use violence to extract resources from people
under their control” (Reno, 1996, pp. 7-8, 12). In Liberia, Charles Taylor used external commercial networks
(foreign firms), some a legacy of the Samuel Doe regime of the late 1980s, to amass power over Liberia, and at
times, the eastern periphery of Sierra Leone. Taylor's territory had its own currency and banking system,
telecommunications network, airfields, export trade (in diamonds, timber, gold, and farm products) to support arms
imports, and (until 1993) a deepwater port. For Taylor, a warlord during most of the 1990s before being elected
Liberia’s president in 1997, controlling territory by building a patronage network was easier than building a state
and its bureaucracy (Reno, 1995, p. 111). Indeed Taylor had access to annual revenues exceeding $100 million, with
an upper limit around $200 million, from 1990 to 1996 (Reno 2000, pp. 243, 252). Even Zaire’s President Mobutu
Sese Seko (1965-1997), like other hard-pressed rulers in weak African states, mimicked the “warlord” approach of
his non-state rivals. But with the shrinking patronage base from foreign aid and investment, to prevent a coup by
newly marginalized groups in the army or bureaucracy, Mobutu, similar to rulers in other retrenching African states,
needed to reconfigure his political authority. In this situation, foreign firms and contractors served as a new source
of patronage networks. Indigenous commercial interests that profit from the new rules are not independent
capitalists with interests distinct from the state’s. As Reno (1996, p. 16) points out, “Those who do not take part in
accumulation on the ruler's terms are punished.” Mobutu weathered the collapse of the state bureaucracy, but fell
because his strategy of milking state assets had reached a limit, seriously weakening the patronage system. In 1997,
his forces fell to the Alliance des Forces Democratiques pour la Liberation (AFDL) of Laurent Kabila, the eventual
president of the Democratic Republic of Congo until assassinated in 2001 (ibid., pp. 9-16; Reno, 1998, pp. 147-81).
State failure, as in Sierra Leone, Liberia, and Zaire, increases vulnerability to war and humanitarian emergencies.
Yet, in a weak or failed state, some rulers, warlords, and traders are more likely to profit from war and violence than
in peacetime. Indeed, as Väyrynen (2000b, p. 442) argues, war, political violence, and state failure do not result
from the incapacity of public institutions but from the fact that rulers, warlords, and their clients benefit from the
harm thereby befalling a substantial share of the population. Relative deprivation also helps explain the increased
violence by belligerents and their clients. An abrupt rush of mineral wealth increases the
expectations of prosperity, not only by the allies of rulers and warlords that control the
resource but also the lure to potential rebels of combat to control the resource. Indeed, as Gurr
(1970, pp. 73, 79) indicates, the intensity of deprivation felt increases with the discrepancy between potential and
actual conditions, and with the length of time the deprivation persists. In Angola, Congo - Kinshasa, and
Sierra Leone, the length and intensity of perceived deprivation were considerable.

They Say “Economic Collapse = Nuclear War Improbable”
[ ] Economic decline leads to a sense of social injustice, sparking violent conflict.
Wayne Nafziger, Professor of Comparative Economics, Kansas State University, and Juha Auvinen, Docent of
International Politics, University of Helsinki, 2002 [Economic Development, Inequality, War, and State Violence,
http://www.ksu.edu/economics/nafwayne/EcDev.doc]

Contemporary emergencies are found in low- and middle-income (that is, developing) countries, suggesting a
threshold above which war and massive state violence do not occur. A disproportional number of these states are
also weak or failing (Holsti, 2000, pp. 243-50), a trait that interacts as cause and effect of their relative poverty.
Moreover, emergencies are more likely to occur in countries experiencing stagnation in real GDP per capita
and a breakdown in law and public services. These phenomena affect relative deprivation, the actors'
perception of social injustice from a discrepancy between goods and conditions they expect and those they can
get or keep. This deprivation often results from vertical (class) or horizontal (regional or communal) inequality
(Stewart 2000, p. 16), where the actors’ income or conditions are related to those of others within society. Relative
deprivation spurs social discontent, which provides motivation for collective violence (Gurr, 1970). Among the
components of emergencies, war and violence have major catalytic roles, adding to social disruption and political
instability, undermining economic activity, spreading hunger and disease, and fueling refugee flows. Tangible and
salient factors such as a marked deterioration of living conditions, especially during a period of high expectations,
are more likely to produce socio-political discontent that may be mobilized into political violence.

[ ] Economic downturns in states with corrupt governments lead to violent acts of
revolution.
Wayne Nafziger, Professor of Comparative Economics, Kansas State University, and Juha Auvinen, Docent of
International Politics, University of Helsinki, 2002 [Economic Development, Inequality, War, and State Violence,
http://www.ksu.edu/economics/nafwayne/EcDev.doc]

Only a portion of violence, however, results from insurgent action. In fact, Holsti (2000) demonstrates that the
policies of governing elites are at the root of most humanitarian emergencies, a fact not recognized in most research
on war (cf. Collier, 2000a and Collier and Hoeffler, 2000a). Slow or negative per-capita growth puts ruling
coalitions on the horns of a dilemma. Ruling elites can expand rent-seeking opportunities for existing political
elites, contributing to further economic stagnation that can threaten the legitimacy of the regime and increase
the probability of regime turnover. To forestall threats to the regime, political elites may use repression to
suppress discontent or capture a greater share of the majority's shrinking surplus. These repressive policies may
entail acts of direct violence against or withholding food and other supplies from politically disobedient
groups, as in Sudan in the 1980s (Keen, 2000, pp. 292-94). Moreover, repression and economic discrimination
may generate relative deprivation and trigger sociopolitical mobilization on the part of the groups affected,
leading to further violence, worsening the humanitarian crisis.

They Say “Economic Collapse = Nuclear War Improbable”
[ ] Unstable governments create economic downturns that inevitably lead to
humanitarian crises.
Wayne Nafziger, Professor of Comparative Economics, Kansas State University, and Juha Auvinen, Docent of
International Politics, University of Helsinki, 2002 [Economic Development, Inequality, War, and State Violence,
http://www.ksu.edu/economics/nafwayne/EcDev.doc]

Since economic deceleration or collapse can disrupt ruling coalitions and exacerbate mass discontent, we
should not be surprised that since 1980 the globe, particularly Africa, has been more vulnerable to humanitarian
emergencies. This increase in intrastate political conflict and humanitarian emergencies in Africa in the last two
decades of the twentieth century is linked to its negative per-capita growth in the 1970s and 1980s and virtual
stagnation in the 1990s. Indeed in Africa, which had the highest death rate from wars,1 GDP per capita was lower in
the late 1990s than it was at the end of the 1960s (World Bank, 2000, p. 1). This stagnation and decline was often
associated with, and exacerbated by, a predatory state, driven by ethnic and regional competition for the
bounties of the state. Predatory rule involves a personalistic regime ruling through coercion, material inducement,
and personality politics, tending to degrade the institutional foundations of the economy and state. Elites extract
immediate rents and transfers rather than providing incentives for economic growth. In some predatory states,
the ruling elite and their clients “use their positions and access to resources to plunder the national economy through
graft, corruption, and extortion, and to participate in private business activities.” (Holsti, 2000, p. 251). Ake (1996,
p. 42) contends that “Instead of being a public force, the state in Africa tends to be privatized, that is, appropriated to
the service of private interests by the dominant faction of the elite.” People use funds at the disposal of the state
for systematic corruption, from petty survival venality at the lower echelons of government to kleptocracy at the
top. Humanitarian crises are more likely to occur in societies where the state is weak and venal, and thus subject to
extensive rent-seeking, “an omnipresent policy to obtain private benefit from public action and resources.” (Väyrynen 2000b, p. 440). Cause and effect between state failure and rent seeking are
not always clear. State failure need not necessarily result from the incapacity of public institutions. Instead, while “state failure can harm a great number of people, it can also benefit others,”
(ibid., p. 442) especially governing elites and their allies. These elites may not benefit from avoiding political decay through nurturing free entry and the rule of law and reducing corruption and
exploitation. Instead political leaders may gain more from extensive unproductive, profit-seeking activities in a political system they control than from long-term efforts to build a well-
functioning state in which economic progress and democratic institutions flourish. These activities tend to be pervasive in countries that have abundant mineral exports (for example, diamonds
and petroleum), such as Sierra Leone, Angola, Congo - Kinshasa, and Liberia (section 4), while predatory economic behavior has a lower pay-off in mineral-export-poor economies such as
Tanzania and Togo. The majority of countries with humanitarian emergencies have experienced several years (or even decades) of negative or stagnant growth, where growth refers to real
growth in GNP or GDP per capita. Widespread negative growth among populations where a majority is close to levels of subsistence increases the vulnerability to humanitarian disasters. From
1980 to 1991, 40 of 58 (69 per cent of) Afro-Asian countries experienced negative growth, according to the World Bank's World Development Report (1993, pp. 238-9). In contrast, from 1960 to
1980, only 9 of 53 had negative economic growth, according to the earlier World Bank annual (1982, pp. 110-1). In addition, the positive growth of Latin America and the Caribbean during the
1960s and 1970s also reversed to negative growth in the 1980s, according to the same World Bank sources. The interrelationships between growth and emergencies suggest that the increased
emergencies in the early 1990s are connected to the developing world's disastrous growth record of the 1980s. This disastrous growth was accompanied by state decay, as ruling elites, facing
limitations in dispersing benefits to a wide-ranging coalition of ethnic communities and economic groups, struggled for control, allied with other strongmen, and strengthened their military
capability to repress potential rebels and dissidents. Econometric and country evidence indicates that, holding other variables constant, slow real GDP growth helps explain humanitarian
emergencies. Humanitarian emergencies also contribute to reduced (often negative) growth (Stewart et al., 1997, pp. 11-41), although, according to econometric tests by Auvinen and Nafziger
(1999), the direction of causation is weaker than from growth to emergencies. Contemporary humanitarian disaster is rarely episodic but is usually the culmination of longer-term politico-
economic decay over a period of a decade or more. Negative per-capita growth interacts with political predation in a downward spiral, a spiral seen in African countries such as Angola, Ethiopia,
Sudan, Somalia, Liberia, Sierra Leone, and Zaire (Congo). Economic stagnation, frequently accompanied by chronic trade deficits and growing external debt, intensifies the need for economic
adjustment and stabilization. A persistent external disequilibrium has costs whether countries adjust or not. But non-adjustment has the greater cost;2 the longer the disequilibrium, the greater is
the social damage and the more painful the adjustment. Most LDCs face frequent international balance-of-payments problems, which reduce the ability of political leaders to maintain control.
But, abundant exports, such as minerals, together with a strong military, can provide the ruler or warlord with a modicum of security. More than a decade of slow growth, rising borrowing costs,
reduced concessional aid, a mounting debt crisis, and the increased economic liberalism of donors and international
financial institutions, compelled LDC (especially African) elites to change their strategies during the 1980s and
1990s. Widespread economic liberalization and adjustment provided chances for challenging existing elites,
threatening their positions, and contributing to increased opportunistic rent-seeking and overt repression. Cuts in
spending reduced the funds to distribute to clients, and required greater military and police support to remain in
power.
