Learning to make more effective decisions: changing beliefs as a prelude to action
Sheldon Friedman
Albertus Magnus College, New Haven, Connecticut, USA
Keywords Decision making, Beliefs, Simulation, Role play, Group thinking, Critical thinking
Abstract Decision-makers in organizations often make what appear to be intuitively obvious and reasonable decisions, which often turn out to yield unintended outcomes. The cause of such ineffective decisions can be a combination of cognitive biases, poor mental models of complex systems, and errors in thinking provoked by anxiety, all of which tend to reinforce currently held belief structures and, in turn, further resistance to change. While Senge has advocated the use of simulations, called microworlds, to overcome such resistance, there are times when such simulations are not available for use or are otherwise infeasible. At these times, alternative methods need to be considered for improving the capacity of managers to learn from experience and improve the quality of their decision-making. Among the alternatives that can be used to improve decision-making are role-play, neuro-linguistic programming, the use of corrective methods related to groupthink, critical thinking skills and failure analysis. A review of the causes of poor decision-making, methods of changing one's beliefs, guiding principles for making better decisions, and a process for improving the quality of lessons learned from experience is presented in this article.
Introduction
The purpose of this article is to explore the relationships between weak decisions and the perceived causal relationships that influence these decisions and the subsequent actions taken. A model is presented which attempts to explain the psychological causes of beliefs and their impacts on counterintuitive decisions. In addition, recommendations for evaluating and improving decisions are offered. During the development of a system dynamics model (Friedman, 2003) it was discovered that decisions were made by managers of a system based on a perceived causative relationship that did not necessarily exist in reality. As in many organizations, those decisions meant to improve the state of the system were actually creating unintended consequences. The cause of this behavior may be explained as being due to stress placed on the decision-maker, and a need for risk avoidance associated with the type of decision being made. It is possible that the combination of these stressors could create an anxiety-induced perseverance of belief, which would be difficult to change. Often, such behavior is related to rigidity induced by threat, or to regression to simpler and familiar ways of doing things rather than the use of more current methods that have been learned. The effect of perseverance of belief is that managers often fail to see the breadth of the array of options that lie before them.
The Learning Organization
Vol. 11 No. 2, 2004, pp. 110-128
© Emerald Group Publishing Limited, ISSN 0969-6474
DOI 10.1108/09696470410521583
There are things that are known and things that are unknown; in between are doors
(Anonymous).
When the original decision process described above was observed, research found that the reasons for the decisions were not supported by statistical data[1]. In organizations, we often find behavior that is based on incomplete mental models of how complex systems actually work and on adherence to untested assumptions. The important question for organizational learning seems to be: why do managers persist in making decisions that create unintended consequences? Is there a way to change their belief systems so as to yield action that is more effective? Potential insights into the possible causes of this type of ineffective decision-making process lie in research on:
. cognitive bias;
. mental models;
. schemata;
. emotions; and
. perseverance of ineffective beliefs.
The actions that humans take, depending on our attributions or cognitive
biases, mental models, etc., often create unplanned or unintended
consequences.
While unintended consequences may not necessarily be undesirable in their outcome, Forrester (1975) found that we often take actions that create effects that are counter in nature to the expected and desired results. These outcomes are often negative, even when those implementing decisions are trying to do their best to act purposively and attain goals. That these untoward outcomes occur with surprising regularity should not come as a surprise to us. Simply, people have difficulty managing in complex systems. Sterman (1994) cites several causes for such phenomena, such as:
. The counterintuitive nature of decisions, as cause and effect may not necessarily be related, but may be coincidental, arising from the dynamics of the system structure.
. Corrective programs that are initiated are often counteracted by other decisions that have been made in isolation. Such individual decisions often nullify one another, and create outcomes that are unintended[2,3].
People tend to misperceive dynamic feedback in organizations. Such misperceptions often involve attributing people's actions to some false underlying cause. This behavior can be explained in terms of what psychologists term attribution theory. Attribution theory looks at people's explanations of cause and effect and the causes to which they attribute various effects. It is based on discovering why something happened. Alternatively, Higgins (1999) has proposed another explanation, which he calls the
aboutness principle. He proposes that all actions that we take are ultimately based on some decision we have made about the cause of a problem, and the perceived outcomes of any actions we take to correct the problem. There are three problems related to the aboutness principle.
(1) People can represent a response as being about something that it is not actually about. (We make decisions based on availability, not accuracy, of information.)
(2) People can mistakenly infer that what they represent their responses as being about is a source of the response. (We can mistake concurrent unrelated events as related: mistaken correlation.)
(3) People can mistakenly infer that what they represent their responses as being about is the source of the response. (There is usually more than one variable that causes an event in a system.)
The core of attribution theory and the aboutness principle seems to be taking something that is perceived as known and making it known in a more concrete way. For example, economists use supply and demand curves to determine the price for products. However, within the field of system dynamics, it is widely believed that we do not know the real amount of demand or supply at any given moment. Fluctuation exists around the point where the two curves meet, yet the concept is used as a model for decision-making. Cognitive causes of the difficulties cited are attributed to the relatively small mental working space, or capacity, that humans have available for dealing with large quantities of complex information (Miller, 1956). In addition, such decision biases as availability of data, primacy of experience, and causalation (attributing cause and effect when none exists) affect cognitive processes. Beyond the negative impacts of these weaknesses of cognition, humans maintain mental models and schemata, which drive the rules that they use for solving problems.
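The causalation bias can be made concrete with a short, purely illustrative sketch (ours, not the article's): two sequences generated with no causal link between them will frequently show a sizable correlation, simply because both drift over time.

```python
import random

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var_x = sum((a - mx) ** 2 for a in x)
    var_y = sum((b - my) ** 2 for b in y)
    return cov / (var_x * var_y) ** 0.5

def random_walk(n, rng):
    """Cumulative sum of n independent +1/-1 steps."""
    pos, path = 0, []
    for _ in range(n):
        pos += rng.choice((-1, 1))
        path.append(pos)
    return path

rng = random.Random(7)
walk_a = random_walk(200, rng)
walk_b = random_walk(200, rng)  # generated with no reference to walk_a

# The walks share no causal link, yet the correlation is often far from zero.
print(f"correlation of two unrelated walks: {pearson(walk_a, walk_b):+.2f}")
```

A manager observing two such trends side by side would be tempted to read one as the cause of the other; the sketch shows why that inference is unsafe.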
Cognitive bias
Managers learn part of their belief systems from exposure to formal educational systems, from fellow managers, and from the generally accepted operating characteristics of an organization's culture. Among the kinds of cognitive biases that can occur in such situations are those due to availability and primacy. Availability implies that people judge the probability of an event according to the ease with which examples are remembered (Tversky and Kahneman, 1973). Asch (1946) found that information that is presented first in a sequence (primacy effect) can have an inordinate influence on decisions. The initial exposure to rules of the practice of management during the process of educating managerial professionals has a major impact on how we analyze information. Information we are exposed to relatively early in our lives may act as an anchor and create a tendency to misinterpret feedback from systems. For example, until recently in the medical community, normal blood
pressure was considered to be 120/80. It has now been discovered that this set of values is slightly high. However, the average person, when asked about their blood pressure of 120/80, will report that they are normal.
Another bias is that solutions to problems are often designed to address symptoms of problems rather than their causes, and attempt to operate through leverage points in systems that have little appreciable effect in yielding desired changes. As one manager put it, "we need to educate the users of the system". This remark is based on the manager's belief that education is a leverage point. However, depending on the system's problem, education alone may not be effective. Education is often based on the assumption that most students will acquire the information, then interpret and evaluate it in the same manner as presented by the instructor (as already pointed out, people often reject new information). In addition, most education is delivered assuming that students learn at the same rate and acquire knowledge within a given time period. This does not allow for different learning styles. Therefore, education does not always have the expected outcome.
Mental models and schemata
Mental models are a means of interpreting cause and effect relationships and are shaped by our values, beliefs, and culture. They reflect an internal, personal view of how the world works and of which behaviors are appropriate for dealing with events in the world. They are "internal representations of the external world" (Cannon-Bowers et al., 1993, p. 225), while Senge (1990, p. 8) notes that they are "deeply ingrained assumptions and generalizations, or even pictures or images" that influence how we understand the world and how we take action. Why do mental models affect us as they do? According to Endsley (1997), it is the relationship to situation awareness that seems important. Situation awareness involves perceiving critical factors in the environment, understanding what those factors mean, how they will affect our goals, and what will happen within a system in the future. While mental models impact our overall view of the world, schemata are situation-specific constructs. Schemata (Neisser, cited in Lipshitz, 1997) are cognitive structures that:
. direct external information search processes;
. specify which available information will be attended to and which ignored;
. organize information in memory;
. direct the retrieval of information from memory; and
. become more differentiated as a function of experience.
Schemata are structures that we hold that determine the way we act in a given environment. They help create our responses to the aboutness questions discussed earlier. To complicate the decision-making process further, there is the gap between information received and information perceived. This gap is
filled with biases and errors, which ultimately weaken the impact of new information on current beliefs (Slusher and Anderson, 1989). Often, even when decision-makers are confronted with information that runs counter to their beliefs, the original belief does not change correspondingly. Cognitive biases and mental models are not the only factors that impact our rational thought processes; emotions also play a role and overlay our everyday decisions.
Emotions
Managers ordinarily view emotions as irrational human experiences that cloud their judgment and distort their reasoning. This view is well entrenched, despite work in both philosophy and psychology that establishes a strong connection between emotion and cognition (Barnes and Thagard, 1996, p. 2). Contrary to conventional thinking, emotions have a direct effect on decision-making (Zey, 1992). The emotions invoked by an event may have a further effect on memory and, hence, on judgments. We tend to remember those events that are similar in some way to a current situation (Hastie and Dawes, 2001). Emotions dictate and constrain which bits of information are used by managers. Some believe (Damasio, cited in Barnes and Thagard, 1996) that once a decision is made, a physical change in body awareness (gut feel) is created and that this state becomes a marker that influences future decisions. After this marker decision is established, one is emotionally primed by the past decision, as if the emotional result of the new decision has already impacted the decision-maker before the results of the new action are known. Pert (cited in Oelklaus, 2003) discovered that 98 percent of our memory is stored outside of our brains; it is chemically bonded in peptide receptors distributed throughout our bodies. These memories are capable of being quickly retrieved, because the neural paths from emotion to consciousness are so well traveled. Emotions become triggered before our more rational thoughts can override them (Oelklaus, 2003). Acute or experienced emotions are immediate visceral reactions triggered by a decision situation. Anticipated emotions are not experienced in the immediate present but are expected to be experienced in the future (Bosman and van Winden, 2001). Further, March and Shapira (cited in Zey, 1992) have described the concept of anticipatory choice where:

it is possible to see individuals and organizations as acting on the basis of some conception of the future consequences of present action for preferences currently held (Zey, 1992, p. 274).

It is as if our future preferences are also known and the emotional response to them occurs now.
Experimental evidence shows that anxious individuals are biased towards low risk/low reward options. Lowenstein (quoted in Bosman and van Winden, 2001), in his survey on emotions and risk, has described many studies that found effects of fear and anxiety on various types of judgment, tending to favor cautious, risk-averse decision making. An emotion or mood state may in and
of itself affect thinking (Forgas, cited in Siepmann, 1995). Simon (1987) notes that the need to allay feelings of guilt, anxiety and embarrassment may lead to behavior that produces temporary personal comfort at the expense of bad long-run decisions. In fact, the repeated use of such comfortable behavior creates a decision-style pattern for many managers. This in itself is not bad, as many routine decisions require expediency. The danger lies in using the same kinds of routines for decisions that require more thoughtfulness and less expediency. The solution to this conundrum appears to be the ability of managers to determine when their own habits should be used or ignored. The effects of anxiety are not limited to the single decision-maker. Organizational creation of anxiety has multiple causes, much of it due to members' own activities (Voyer et al., 1997). When adults are faced with anxiety they may revert to many infantile ways of behaving (Seel, 2001). This reversion is accompanied by incompetence, as fear prevents active involvement in a process. There appears to be a fear to act in the face of the anxiety created, or, if actions are taken, they are less than effective.
Of greater concern are decisions made in a state of mindlessness, a state of reduced attention or situational awareness. Attention is given to limited amounts of information that may not be correct (Dunegan, 1994). Our feedback about global judgments is often flawed. Not only do we selectively remember our successes, we often have little knowledge of our failures, and any knowledge we do have may serve to explain them away (Hastie and Dawes, 2001). Therefore, a manager can think, if their decision analysis system is not working correctly, "I must not be doing enough to make the system act the way that I want it to perform"[3]. According to Siepmann (1995), anxiety may reduce open-mindedness. The process, however, may generate its own feedback and reinforce itself: efforts intended to reduce anxiety may actually reinforce the original belief. It appears that emotions may be capable of creating an internally generated environment that drives decisions that have the primary effect of making us comfortable rather than helping us to be effective. The combination of cognitive biases, mental models, schemata, and emotions tends to create persistence in our beliefs and therefore in our decision process.
Belief persistence
The process of belief persistence occurs when we resist adjusting our beliefs to correspond to how things actually work in practice. At issue is the question of why certain beliefs that are out of accord with effective practices tend to persist in the face of counter-evidence. Self-defeating behavior predicated on honestly (but incorrectly) held beliefs would seem readily amenable to change (Slusher and Anderson, 1989). The reason for this may simply be that we don't accept the fact that we may be wrong! We acquire beliefs from several sources: from classrooms, parents, our personal experiences and during our
socialization into organizations. We develop sets of attributions based on our existing belief set. Once a set of beliefs has been established, it develops high resistance to change. For example, the basis of organizational culture is formed around belief in the continuing effectiveness of routines that worked to solve past problems. Such beliefs are highly resistant to change. It has been shown that feedback about a wrong belief is often ineffective (Anderson et al., 1980), and that new evidence counter to an existing belief can, in some cases, reinforce the current belief (Lord et al., 1979). People do not see their own bias; therefore, why should they change the beliefs they have? Further, Dutton and Jackson (1987) have suggested that, once an issue has been categorized, new information congruent with the category is more likely to be attended to, remembered, used for filling in missing information and distorted to conform to beliefs, in order to avoid ambiguity. In order to improve our decision processes, incorrect beliefs need to be changed. Are there ways to accomplish this?
A process for changing beliefs
How does one go about aligning one's beliefs about how things work in complex systems with the way they really work in practice? Figure 1 represents a model of the relationships of beliefs and feedback to actions taken, and of how these create unintended consequences. In this model, the cognitive biases, mental models, and emotions are all depicted as affecting our situational awareness (SA). As decisions are based on the gap that we believe exists between a desired state and the state of the system as it is, we make an evaluation of the gap for the system. However, we are now making a decision based on the perceived state of the system, using rules to close the perceived gap. As the evaluation of the perceived state is now incorrect, due to our current belief, we tend to apply an incorrect set of rules. We apply Gap P rules to a Gap A situation. The rules we should use are those meant to close the gap between the actual and desired states of the system. Note in the model that the rules and actions that should close Gap A are not enacted. The intended result does not occur. While Gap P is affected, the actual state of the system is not, and it continues to worsen. With some delay, the actual system condition is perceived, and the deterioration creates more anxiety, leading back toward a decrease in SA. We apply an incorrect set of rules, based on an incorrect perception. Therefore, the real cause of the gap in the system is not impacted. The incorrect relations held in the mind of the decision-maker are not only enacted, but become reinforced.
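The loop just described can be sketched numerically. The toy simulation below is our own illustration, not part of the article's model, and every parameter value in it is invented: the manager acts on Gap P (the perceived gap) through a lever with little effect on the real cause, the visible decline feeds anxiety, and anxiety erodes situational awareness, so Gap A keeps growing even as effort rises.

```python
def run(steps, leverage, base_awareness, anxiety_gain):
    """One pass through the feedback loop; all numbers are illustrative."""
    gap_a = 10.0   # Gap A: the actual gap between desired and actual state
    anxiety = 0.0
    for _ in range(steps):
        awareness = max(0.2, base_awareness - anxiety)  # anxiety erodes SA
        gap_p = gap_a * awareness         # Gap P: the gap as perceived
        effort = gap_p                    # rules are applied to the perceived gap
        gap_a += 0.5 - leverage * effort  # drift minus the action's real effect
        if gap_a > 10.0:                  # visible decline feeds further anxiety
            anxiety += anxiety_gain
    return gap_a

# Wrong lever and eroding awareness: effort rises, yet Gap A keeps growing.
misperceived = run(steps=20, leverage=0.04, base_awareness=0.8, anxiety_gain=0.05)
# Right lever and accurate awareness: the same loop closes the gap.
well_perceived = run(steps=20, leverage=0.10, base_awareness=1.0, anxiety_gain=0.0)
print(f"Gap A after 20 steps, Gap P rules on a Gap A problem: {misperceived:.1f}")
print(f"Gap A after 20 steps, accurate perception and lever: {well_perceived:.1f}")
```

Only the second run, where perception matches reality and the action reaches the real cause, reduces Gap A; the first run reinforces the very anxiety that distorted perception in the first place.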
As the system declines, it creates more anxiety and a drive to increase the actions, in hopes of gaining improvement in the system. The beliefs held by managers are especially important, as managers act as the interpreters of events for members of the work force (Isabella, 1990). However, situations arise which:
are often beyond the capacity of managers to sense and act upon. To the extent that managers can be powerless to change external forces and can be trapped in a fixed mind set, they will be forced into a reactive rather than proactive mode (Dunphy and Stace, 1988, p. 319).

This reactive mode may be the realm of habit, which has been defined as "an unreflective, set disposition to engage in actions that have been long practiced" (Camic, 1992, p. 202). Habits are difficult to break! How can we affect and change the beliefs, biases, habits and mental models held by decision-makers?
Improving decision making
It has been suggested that managers can improve their decisions by:
(1) listening to dissent;
(2) converting events into learning opportunities; and
(3) adopting experimental frames of reference (Nystrom and Starbuck,
1984).
Options (1) and (2) seem less likely to have an effect, based on the defenses that are created to protect one's beliefs, as described earlier in this paper. The alteration
Figure 1. Initiation of incorrect actions
of perceptions needs to come from self-awareness and the realization that the perceptions we have may be incorrect and may not accurately reflect reality. Sterman (1994) has argued that learning depends on feedback and has identified the barriers to learning that exist. It has been suggested that one way to improve learning is the use of computer simulations and the development of system models. A formal model has advantages over the informal or mental model: the formal model is explicit and communicable; a system dynamics model exposes its assumptions about a problem to criticism, experimentation, and reformulation; and a formal model handles complexity fairly easily (Pugh and Richardson, 1981). Simulations can present numerous variables for evaluation that the human mind cannot.
When a person's fundamental belief system needs to be changed, it requires a secure situation. It is during this time that one may feel there is no cohesive view of reality, yet one must be able to develop a new set of beliefs (Feather, 1989). Instead of interpreting the belief, a new belief must be self-generated. The use of simulations is based on the expectation that during the simulation a user will have an integrated experience of management not possible with traditional subject-by-subject teaching (Graham and Senge, 1990). Microworlds, as a learning tool, became popularized after the publication of Senge's (1990) text, The Fifth Discipline. The impact of microworlds on learning has been well covered (Morecroft and Sterman, 1994). Attempts at assessing the impacts of microworlds have been reported by Huz et al. (1997), Doyle (1997), Durham (2002, 2003), Cavaleri and Sterman (1997), Cavaleri (2002) and Friedman and Cavaleri (2003). However, simulations are not the only means of changing our interpretation of the world, and in fact may not be available to a manager or organization. The use of counter-attitudinal role-play, methods related to groupthink, NLP, improved critical thinking skills and failure analysis may all be effective alternatives.
Role play
Romme (2003, p. 53) has suggested that learning can arise in six ways. One of these is learning as changing as a person: here, learning involves not just seeing the world differently but seeing one's own perception in the world differently. For role play to be successful, the role player must experience the same context in which they normally function. Characteristics that satisfy the contextual requirement have been enumerated by Binstead and Stuart (cited in Linstead and Harris, 1983). Such role play can be structured as counter-attitudinal, an approach that has been evaluated by Greenwald (1969, 1970). During this process, biases can be overcome through the development of new beliefs fostered during the role play. Verbalization of the belief during play tends to create support for the new belief. The result is that the supporting arguments developed during play may counter the arguments made prior to the role-play. In other words, saying is believing (Higgins, 1999). Role-playing
allows an individual the chance to take on the role of a character or part in a particular situation. The improvisations created during role-play are not unlike the improvisations that are part of a manager's daily life, "(and) require the participants to attend to all forms of feedback available in the role environment" (Feinstein et al., 2002).
When simulations and role play are not available and a manager needs to change their decision-making frame of reference, it becomes necessary to use other methods. Among well-known methods recommended to improve decision-making are those related to groupthink (Janis, 1982), NLP (O'Connor and McDermott, 1996), development of critical thinking skills (Browne and Kelley, 2001) and failure analysis (Carroll and Brenneman, 2003). These methods are aimed at reframing and at identifying the underlying causes of our "abouts". A method that managers can use, and that combines several characteristics of each tool, is presented in this article.
Most of the methods listed above aim at improving the introspective capacity of managers and at engaging in the types of reflection associated with adult learning. The ability to evaluate one's feelings about an issue and the ability to question oneself about the correctness of a mindset are keys to understanding new experiences and evaluating old ones. The problem that managers face when they are about to make a decision can be couched as: how do I know I will be making the most correct decision I can? Note that the question is not: how will I make the correct decision? The issue at hand is, for the circumstances that exist (context): how do we ever know we are right about those circumstances? Uncertainty plagues most of our decisions. Any method used must allow us to reflect on our feelings about, and understanding of, the current circumstances as separate from any other set of circumstances. Using repeated solutions triggered by memories of "this is close enough" can very often lead to ineffective decisions.
The recommended process is aimed at directing the decision-maker's thinking toward the correct gap described in Figure 1. It is aimed at the messages we carry and at reframing our beliefs about what cause and effect relationships exist in our current mental models. The recommended question set for managers, found in the section entitled "Self-diagnostic decision analysis instruction set", incorporates various ideas from the following:
(1) Managing groupthink, including:
. the search for information counter to the information that one has been exposed to;
. an alternate explanation for the believed cause of events; and
. seeking evaluation from others.
(2) Reframing of meaning from NLP, by using our thoughts to clarify how we created the cause and effect relationships we believe to exist. NLP uses the concept that it is easier to restructure a cause and effect
relationship using the same basis as the original relationship but creating a new meaning for the user, or "meaning reframing" (Andreas and Andreas, 1987).
(3) Critical thinking skills, by addressing:
. the conclusions drawn;
. the reasons for these conclusions;
. information used that is assumptive and not supported by data;
. the accuracy of the data;
. suggestions for the use of alternative sources of data that might counter the data used; and
. the possibility of drawing new and different conclusions.
(4) Failure analysis, which puts a focus on what would need to happen for an event to occur. In action it works from effect backward to cause, rather than the direction we normally take, looking for cause then effect.
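The backward direction that failure analysis takes can be sketched as a small program. This is our illustration only, and the cause map below is entirely hypothetical: starting from an observed effect, it enumerates every chain of conditions that would need to hold for the effect to occur.

```python
# Hypothetical cause map for illustration: effect -> possible direct causes.
CAUSES = {
    "missed delivery date": ["late material", "rework"],
    "late material": ["single supplier", "no reorder point"],
    "rework": ["unclear spec", "untrained operator"],
}

def backward_chains(effect, chain=()):
    """Enumerate effect-to-root chains: what would need to happen for
    this effect to occur?  Works from effect backward to cause."""
    chain = chain + (effect,)
    causes = CAUSES.get(effect)
    if not causes:          # a root cause: nothing explains it further
        return [chain]
    chains = []
    for cause in causes:
        chains.extend(backward_chains(cause, chain))
    return chains

for c in backward_chains("missed delivery date"):
    print(" <- ".join(c))
```

Each printed chain reads from effect back to a root cause, which is exactly the reversal of the cause-then-effect direction the article says we normally take.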
Guiding principles
To overcome some of the difficulties of decision-making, the following set of guiding principles is recommended:
. Assume your decision is correct, then ask: what would I need to do to disprove my findings?
. Use multiple sources of data (e.g. the Web, statistical abstracts, research journals, etc.). When using multiple resources, give preference to primary literature.
. Avoid newspapers and trade journals. Often they do not have detailed, accurate data.
. If sources of data are contradictory, assume that your base assumption could be incorrect.
. Always check the latest knowledge base about concepts and their current validity.
. Try to have a real-time information system in place. During development of plans, financial information may lag and decisions may be based on incorrect information.
. Consider using additional staff to research information. Focus on the class of information needed, not outcomes (avoid instructions such as "find information that will prove X").
. Obtain varying opinions from experts in the field related to your problem. Discussion will often lead to questions about a decision which were not initially raised.
. Dialogue with someone who is not familiar with the system. Often, these individuals raise questions not normally thought of, in an attempt to learn about the problem[4]. Knowledge of a system can often blind us to the errors in the system. (Argyris (1992) called this skilled incompetence.)
. Discuss with other managers their experiences in similar situations. How have they managed any uncertainty?
. Try to remember that biases are always present and that interpretation of information is often subjective in nature, based on values and beliefs.
Measuring change
Currently, research into the effect of change on mental models is underway. The evaluation is based on a self-report process, which is statistically analyzed for what has been referred to as gamma change (Friedman and Cavaleri, 2003), used to statistically measure the change in the variables under consideration in a system.

If we think of the verbal model as a set of statements describing a system, then the observed behavior of this model would be to see if those statements would change as the system moves over time or space. It would be clear that the words would be used to describe the system elements that in the mathematical model would be described via symbols (Feinstein et al., 2002).
Conclusion
This article has attempted to develop an understanding of the causes of ineffective decisions based on perseverance of beliefs, emotions, mental models and cognitive bias. While the use of simulations is one method of effecting change, it is not the only method. In some cases, methods may need to be combined, depending on the depth and incorrectness of the belief. Several alternative suggestions have been made in order to provide other techniques where simulations are not available for use, or where it is felt that a combination may be more effective.
A recommended technique for assessment and evaluation of a decision follows.
Self-diagnostic decision analysis instruction set
A set of questions is provided in Figures 2-4.
Step 1. After your comparison of both sets of questions (Figures 2 and 3), think about the following.
(1) Is there a difference in any of your answers?
(2) For each comparison describe your feelings and thoughts.
(3) What new insights related to the problem came up?
(4) Are you still sure that you are correct about the current state of the
system?
Figure 2.
Questions: Part 1
Figure 3.
Questions: Part 2
(5) If a difference exists, define how you will determine what is causing the difference.
(6) How will you go about closing the difference between the answers?
(7) What data will you need?
(8) How accurate is the data?
Figure 4.
Questions: Part 3
(9) How many sources will make you comfortable enough to accept the
validity of the data?
(10) What are you finding out about what you know or thought you knew?
(11) What questions would you like to ask about the current state of the
system?
Step 2. Compare your answers to both sets of questions using the last chart
(Figure 4).
If there are differences or new insights that cause any concerns or doubts, do not make your final decision until these have been clarified.
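The comparison at the heart of Steps 1 and 2 can be sketched in code. This is only an illustrative aid, and the question wording below is a hypothetical stand-in for the article's actual instrument in Figures 2-4:

```python
# Hypothetical question wording; the real instrument appears only in the
# article's figures (Figures 2-4).
QUESTIONS = [
    "What do you believe is the current state of the system?",
    "What data supports that belief?",
    "What outcome do you expect from your decision?",
]

def compare_answer_sets(answers_part1, answers_part2):
    # Step 1(1): flag every question whose two answers differ, so the
    # decision-maker can clarify each difference before deciding.
    return [(q, a1, a2)
            for q, a1, a2 in zip(QUESTIONS, answers_part1, answers_part2)
            if a1.strip().lower() != a2.strip().lower()]

part1 = ["Roads are deteriorating", "Annual inspection reports", "Fewer accidents"]
part2 = ["Roads are deteriorating", "Budget complaints", "Fewer accidents"]

diffs = compare_answer_sets(part1, part2)
for q, a1, a2 in diffs:
    print(f"Clarify before deciding: {q} -> {a1!r} vs {a2!r}")
print("OK to decide" if not diffs else f"{len(diffs)} difference(s) to resolve")
```

Here the mismatch on the evidence question would be flagged for clarification before a final decision is made, mirroring the instruction above.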
Notes
1. Managers of road pavement systems tend to spend large portions of their budgets on high volume roads to prevent accidents. Statements include: "we do the most used roads to keep user costs and complaints down", "I'll do a larger road before I do a secondary road", "we try to do these roads before they go bad". The reasoning behind such decisions is given as "higher volume roads have more accidents", "with higher volumes of use, you have a greater chance of a problem", "it's risk that leads to repairs", "with high volumes you have a 70 percent chance of accidents, with low volumes you have a 30 percent chance, that's my risk, that's when I will do it".
What was found was a situation that did not confirm what managers believe or why they commit resources as they do. Reports that were reviewed indicated that most accidents occur when road conditions are better. By improving the road system, the rate of accidents may actually be increased.
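The 70/30 reasoning quoted above compares raw accident counts rather than exposure-adjusted risk. A back-of-the-envelope calculation, using entirely hypothetical figures, shows how normalizing by traffic volume can reverse the comparison:

```python
# Hypothetical counts: the busy road has more accidents in total, but
# fewer per mile driven; comparing raw counts rather than
# exposure-adjusted rates can therefore mislead resource allocation.
roads = {
    "primary":   {"accidents": 70, "million_vehicle_miles": 140.0},
    "secondary": {"accidents": 30, "million_vehicle_miles": 20.0},
}

def accident_rate(road):
    # Accidents per million vehicle-miles: risk normalized by exposure.
    return road["accidents"] / road["million_vehicle_miles"]

for name, road in roads.items():
    print(f"{name}: {road['accidents']} accidents, "
          f"rate = {accident_rate(road):.2f} per million vehicle-miles")
# With these figures the primary road has 70 percent of the accidents
# yet only one third of the secondary road's accident rate.
```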
2. The demand to maintain high quality systems creates a pressure to spend, but may be
having a negative effect on another performance measure, accidents.
3. "If accident rates are going up, I must not be spending enough to prevent the increase."
4. During consultation with a healthcare delivery system concerning an accounts-receivable issue, the author asked questions of an employee of the firm. The employee was terse and indicated that she knew the system. This author acknowledged her skills, but persisted in the questions. The result was the finding of an error in the computer program that was creating an incorrect monthly account receivable.
References
Anderson, C.A., Lepper, M.R. and Ross, L. (1980), Perseverance of social theories: the role of explanation in the persistence of discredited information, Journal of Personality and Social Psychology, Vol. 39, pp. 1037-49.
Andrea, S. and Andreas, C. (1987), Changing Your Mind and Keeping It Changed, Real People
Press, Moab, UT.
Argyris, C. (1992), On Organizational Learning, Blackwell Publishers, Cambridge, MA.
Asch, S.E. (1946), Forming impressions of personality, Journal of Abnormal and Social
Psychology, Vol. 41, pp. 258-90.
Barnes, A. and Thagard, P. (1996), Emotional decisions, University of Waterloo, Waterloo,
available at: http://cogsci.uwaterloo/Articles/Pages/Emot.Decis.html
Bosman, R. and van Winden, F. (2001), Anticipated and experienced emotions in an investment
experiment, CREED Department of Economics, University of Amsterdam, Amsterdam.
Browne, M.N. and Keeley, S.M. (2001), Asking the Right Questions, Prentice-Hall, Englewood Cliffs, NJ.
Camic, C. (1992), The Matter of habit, in Zey, M. (Ed.), Decision Making, Sage, Thousand Oaks,
CA.
Cannon-Bowers, J.A., Salas, E. and Converse, S. (1993), Shared mental models in expert team decision making, in Castellan, N.J. Jr (Ed.), Individual and Group Decision Making, Ch. 12, Lawrence Erlbaum, Hillsdale, NJ.
Carroll, J. and Brenneman, B. (2003), Leveraging failed performance as learning, paper
presented at the workshop at the Society of Organizational Learning, Boston, MA, June.
Cavaleri, S. (2002), Evaluating the performance efficacy of systems thinking tools, Proceedings of the 20th International Conference of the System Dynamics Society, 2002, Palermo, Italy.
Cavaleri, S. and Sterman, J. (1997), Towards evaluation of systems thinking interventions: a case study, System Dynamics Review, Vol. 13 No. 2, pp. 171-86.
Doyle, J. (1997), The cognitive psychology of systems thinking, System Dynamics Review,
Vol. 13 No. 3, pp. 253-65.
Dunegan, K.J. (1994), Feedback and mindful vs mindless information processing, Advances in
Managerial Cognition and Organizational Information Processing, Vol. 5, pp. 315-36.
Dunphy, D.C. and Stace, D.A. (1988), Transformational and coercive strategies for planned organizational change: beyond the O.D. model, Organization Studies, Vol. 9 No. 3, pp. 317-34.
Durham, J.I.R. (2002), Balanced scorecards, mental models and organizational performance: a simulation experiment, PhD dissertation, University of Texas, Austin, TX.
Durham, J.I.R. (2003), The systematic influence of scorecards on mental models and
organizational performance, Proceedings of the System Dynamics Winter Camp 2003,
The University of Texas, Austin, TX.
Dutton, J.E. and Jackson, S.E. (1987), Categorizing strategic issues: links to organizational
action, Academy of Management Review, Vol. 12 No. 1, pp. 76-90.
Endsley, M.R. (1997), The role of situation awareness in naturalistic decision making, in Zsambok, C.E. and Klein, G. (Eds), Naturalistic Decision Making, Lawrence Erlbaum, Hillsdale, NJ, pp. 269-83.
Feather, N. (1989), Trying and giving up, in Curtis, R.C. (Ed.), Self-Defeating Behaviors, Plenum Press, New York, NY.
Feinstein, A.H., Mann, S. and Corsun, D.L. (2002), Charting the experiential territory: clarifying definitions and uses of computer simulations, games and role play, Journal of Management Development, Vol. 21 No. 10, pp. 732-44.
Forrester, J. (1975), Counterintuitive behavior in social systems, in Forrester, J. (Ed.), Collected Papers of Jay Forrester, Wright-Allen Press, Cambridge, MA.
Friedman, S. (2003), The effects of dynamic decision making on resource allocation: the case of pavement management, dissertation, Worcester Polytechnic Institute, Worcester, MA.
Friedman, S. and Cavaleri, S. (2003), Evaluating changes in systems thinking capacity: a methodology based on Alpha, Beta, Gamma analysis, in Proceedings of the 21st International Conference of the System Dynamics Society, New York, NY.
Graham, A. and Senge, P. (1990), Computer based case study and learning laboratory projects,
System Dynamics Review, Vol. 6, Winter, pp. 100-5.
Greenwald, A.G. (1969), The open-mindedness of the counterattitudinal role player, Journal of
Experimental Social Psychology, Vol. 5, pp. 375-88.
Greenwald, A.G. (1970), When does role playing produce attitude change? Toward an answer, Journal of Personality and Social Psychology, Vol. 16 No. 2, pp. 214-19.
Hastie, R. and Dawes, R.M. (2001), Rational Choice in an Uncertain World, Sage, Thousand Oaks,
CA.
Higgins, E.T. (1999), Saying is believing effects: when sharing reality about something biases knowledge and evaluations, in Thompson, L.L., Levine, J.M. and Messick, D.M. (Eds), Shared Cognition in Organizations, Lawrence Erlbaum, Hillsdale, NJ.
Huz, S., Andersen, D.F., Richardson, G.P. and Boothroyd, R. (1997), A framework for evaluating systems thinking interventions: an experimental approach to mental health systems change, System Dynamics Review, Vol. 13 No. 2, pp. 149-69.
Isabella, L.A. (1990), Evolving interpretations as change unfolds: how managers construct key organizational events, Academy of Management Journal, Vol. 33 No. 1, pp. 7-41.
Janis, I. (1982), Groupthink, Houghton Mifflin, New York, NY.
Linstead, S. and Harris, B. (1983), Reality and role playing: the use of a living case study,
Management Education, pp. 9-16, PR 12.1.
Lipshitz, R. (1997), Schemata and mental models in recognition primed decision making, in
Zsambok, C.E. (Ed.), Naturalistic Decision Making, Lawrence Erlbaum, Hillsdale, NJ.
Lord, C.G., Ross, L. and Lepper, M.R. (1979), Biased assimilation and attitude polarization: the
effects of prior theories on subsequently considered evidence, Journal of Personality and Social Psychology, Vol. 37 No. 11, pp. 2098-109.
Miller, G. (1956), The magical number seven, plus or minus two: some limits on our capacity for processing information, Psychological Review, Vol. 63, pp. 81-96.
Morecroft, J. and Sterman, J. (1994), Modeling for Learning Organizations, Productivity Press,
Portland, OR.
Nystrom, P.C. and Starbuck, W.H. (1984), To avoid organizational crises, unlearn,
Organizational Dynamics, Spring, pp. 53-65.
O'Connor, J. and McDermott, I. (1996), Thorsons Principles of NLP, Thorsons (HarperCollins Publishers), London.
Oelklaus, N. (2003), Eye of the needle: a communication tool, The Systems Thinker, Vol. 14
No. 4.
Pugh, A. and Richardson, G. (1981), Introduction to System Dynamics Modeling with DYNAMO, Productivity Press, Cambridge, MA.
Romme, A.G.L. (2003), Learning outcomes of microworlds for management education, Management Learning, Vol. 34 No. 1, pp. 51-61.
Seel, R. (2001), Anxiety and incompetence in the large group, Journal of Organizational Change Management, Vol. 14 No. 5, pp. 493-503.
Senge, P. (1990), The Fifth Discipline, Doubleday Currency, New York, NY.
Siepmann, M. (1995), Can anxiety reduce open-mindedness?, available at:
www.psyct.com/siepman/papers/AnxOpen.html
Simon, H.A. (1987), Making management decisions: the role of intuition and emotion, Academy of Management Executive, February, pp. 57-64.
Slusher, M. and Anderson, C.A. (1989), Belief perseverance and self defeating behaviors, in Curtis, R. (Ed.), Self-Defeating Behaviors, Plenum Press, New York, NY.
Sterman, J. (1994), Learning in and about complex systems, working paper D-4428, MIT Sloan
School of Management, Cambridge, MA.
Tversky, A. and Kahneman, D. (1973), Availability: a heuristic for judging frequency and probability, Cognitive Psychology, Vol. 5, pp. 207-32.
Voyer, J.J., Gould, J.M. and Ford, D.N. (1997), Systematic creation of organizational anxiety, The Journal of Applied Behavioral Science, December, pp. 471-89.
Zey, M. (1992), Decision Making: An Alternative to Rational Choice Models, Sage Publications,
Newbury Park, CA.
Further reading
Cavaleri, S. and Fearon, D. (1996), Managing in Organizations that Learn, Blackwell Publishing, Cambridge, MA.
Curtis, R.C. (1989), Self-Defeating Behaviors, Plenum Press, New York, NY.