
“Transparent Vulnerabilities”

How we overlook the obvious because it is too clearly there


By Geary W. Sikich

Introduction

Transparent vulnerabilities are so obvious that they are easily overlooked; they are the ones we:
• see when they are pointed out;
• recognize when we are made aware of them;
• often fail to acknowledge — leading to potentially significant consequences when the
vulnerability is realized.

Time matters most when decisions are irreversible

Risk and time are opposite sides of the same coin; if there were no tomorrow, there would be no risk. Time therefore becomes a transparent vulnerability, one we more often than not fail to account for when identifying vulnerabilities. Time transforms risk, and the nature of risk is shaped by our time horizon. To achieve corporate goals and objectives, time horizons are established; a five-year goal, "Vision 2015" and the like appear regularly in corporate mission, vision and value statements.

Deciding what to do when it is uncertain what will happen, because more things can happen than will happen, is a challenge the typical Business Impact Assessment (BIA) cannot answer. The BIA is a picture of what was, not of what could be, nor of what is necessary to achieve corporate goals and objectives if the business is faced with disruption. A transparent vulnerability of the BIA is that the assessment is often limited to the internal logistics of identifying systems, processes, hardware, software and relocation options. The underlying vulnerability is that the BIA fails to address the obvious: "If my company loses a significant portion of its revenue stream (i.e., customers), how will my business survive?"

To act or not to act – is that the question?

Once we act, we forfeit the option of waiting until new information becomes available. As a result, not acting has value. Variables too numerous to be easily identified, transparent vulnerabilities too obvious to be seen, and outliers, the actions of the markets in which we operate, all affect our ability to act decisively.

An action can be right or wrong. Likewise, a decision not to act can prove to be the right action, with positive consequences, or the wrong action, with negative consequences, or vice versa.
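To make the value of waiting concrete, here is a minimal sketch in Python with purely hypothetical payoffs and probabilities (none of these figures come from the article): acting now commits us to an uncertain outcome, while waiting lets us act only once conditions are known, at the cost of the delay.

```python
# Hypothetical illustration of why not acting has value.
# All numbers below are assumptions chosen for illustration only.

p_good = 0.5            # chance conditions turn out favorable
payoff_right = 100.0    # value of acting when conditions are favorable
payoff_wrong = -300.0   # cost of acting when conditions are unfavorable
cost_of_delay = -20.0   # what waiting one period costs us

# Acting now commits us before we know which state we are in.
act_now = p_good * payoff_right + (1 - p_good) * payoff_wrong

# Waiting lets us act only if conditions prove favorable,
# at the price of the delay in either case.
wait = p_good * (payoff_right + cost_of_delay) + (1 - p_good) * cost_of_delay

print(f"Expected value of acting now: {act_now:+.1f}")  # -100.0
print(f"Expected value of waiting:    {wait:+.1f}")     # +30.0
```

With these illustrative numbers, the irreversibility of acting makes waiting worth 130 in expectation, even though the delay itself is costly.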
A traditional planning process based on the BIA generally identifies only limited dimensions of uncertainty; for example, "What do we do if we lose our facility?" These uncertainties are often combined to produce a matrix of response options for the planned uncertainty. Once this framework (the plan) is created, the full richness of constantly evolving trends and uncertainties goes unaddressed, because those trends become transparent to the process of building the plan. As a result, we end up with resource poverty instead of resource affluence. An example is the identification of hardware that would need to be replaced. We identify computers and workstations (desks, chairs, telephones, etc.), yet we fail to ascertain (a transparent vulnerability) whether we can expect the suppliers of these essentials to produce them in the quantity needed, and in the time we need them, for us to stay in business. What we are left with is potential fragmentation instead of the cohesion that the plans are supposedly designed to provide and, yes, let it be said, to assure.

We do not address the transparent vulnerability of resource availability: what it will actually take to move from resource poverty to resource affluence, that is, the availability and real cost of key resources, including energy, the value chain's ability to respond, and the human capital (skill sets) necessary to accomplish what the plan sets forth.

You believe or you do not believe

The present is being described, and the future forecast, through heavy reliance on historical information. When the most important events by far cannot be predicted, can we really speak of the possibility of a theory of continuity? The probability of an event occurring may be 50-50, yet the consequences (a transparent vulnerability) of the two outcomes are different. Probability is determined by evidence and reason; it expresses the degree of belief in, or credibility of, an opinion or insight. The transparent vulnerability is that we cannot comprehend the limits of human knowledge, which is not fully analyzable. Humans influence the development of the BIA, the plan and the execution of the plan, and the assumptions we make about our ability to address complexity create a transparent vulnerability of their own.
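To restate the point about identical odds with different consequences, here is a short numeric illustration; the dollar figures are invented for the example:

```python
# Two events, each with 50-50 odds; the probabilities match,
# the consequences do not. All dollar figures are illustrative assumptions.
p = 0.5

# Event A: modest consequences either way.
ev_a = p * 10_000 + (1 - p) * -10_000         # expected impact: 0

# Event B: the same odds, but a ruinous downside.
ev_b = p * 10_000 + (1 - p) * -5_000_000      # expected impact: -2,495,000

print(f"Event A expected impact: {ev_a:+,.0f}")
print(f"Event B expected impact: {ev_b:+,.0f}")
# Identical probabilities; the exposure is driven by consequence, not odds.
```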
For example, current news from the Centers for Disease Control (CDC) and the World Health Organization (WHO) fails to recognize transparent vulnerabilities associated with the Swine Flu. Yahoo (http://cosmos.bcst.yahoo.com/up/player/popup/?cl=15217123) recently aired a video in which a news report focused on the CDC's expectations for this fall's flu season. The report cites estimates that 60 to 120 million Americans could contract the Swine Flu, with up to 1.8 million hospitalized and deaths ranging up to 1 million people. The current approach is to vaccinate everyone; that is over 300 million people, if the census is correct. The vaccination process takes two shots delivered three weeks apart.
The transparent vulnerabilities are rather obvious, don't you think? When does flu season start? According to the CDC, flu season starts to peak in November and continues through April. Let's see: two shots administered three weeks apart to approximately 300 million people. If we start on 1 September 2009 and want to finish by 31 October 2009 (61 days, for you math wizards), we have to vaccinate approximately 5 million people per day. By the way, I am writing this on 25 August 2009, a mere six days before 1 September. Transparent vulnerabilities? Hmmm, let's see: how about production, distribution and the inoculation process (you need needles and people), for starters. This is further complicated by the fact that it is two shots with a three-week break between them. In fact, that first calculation understates the problem: we would have to inoculate 300 million people before the end of September (that works out to roughly 60 million weekly) and then inoculate the 300 million again three weeks later (by approximately 25 October), before the peak in November. The second round would have to be accomplished in the span of the last week of October, roughly 43 million people per day. It's not as if we have any economic problems that would preclude everyone taking time off from work to be jabbed.
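The arithmetic above is easy to check. The sketch below uses only the assumptions stated in the text (300 million people, two shots three weeks apart, a November peak); note that starting on 1 September, the first-round rate actually works out to about 70 million people per week, even steeper than the 60 million cited.

```python
from datetime import date

population = 300_000_000            # approximate U.S. population, per the text

# Naive reading: one shot per person between 1 Sep and 31 Oct 2009.
start = date(2009, 9, 1)
deadline = date(2009, 10, 31)
days = (deadline - start).days + 1  # 61 days, inclusive
print(f"One shot each over {days} days: "
      f"{population / days / 1e6:.1f} million people/day")     # ~4.9

# Corrected reading: two shots, three weeks apart, both before November.
# Round one must finish by the end of September.
round_one_days = (date(2009, 9, 30) - start).days + 1           # 30 days
round_one_weeks = round_one_days / 7
print(f"Round one: {population / round_one_weeks / 1e6:.0f} "
      f"million people/week")                                   # 70

# Round two lands roughly in the last week of October.
round_two_days = 7
print(f"Round two: {population / round_two_days / 1e6:.0f} "
      f"million people/day")                                    # ~43
```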

A last transparent vulnerability associated with the flu: commodity market impacts. We already know that the food supply will be affected, given the readily recognized limited inventory that grocery stores carry and the potential transportation issues restricting movement over the roads. But have you thought about the impact on commodity markets? Probably not; it is a transparent vulnerability. Why? The answer: agricultural inspectors. Yes, the lowly agricultural inspector with his or her white lab coat and blue ink stamp indicating USDA Approved (United States Department of Agriculture). Talk to any commodities trader at the Chicago Board of Trade, the Chicago Mercantile Exchange, the New York Mercantile Exchange or any other commodities exchange, and you will get the following answer regarding trading risk: "Without the USDA stamp of approval, I will not expose myself to the risk that the product traded is tainted." With no agricultural inspectors, or a greatly degraded workforce (remember, we expect 40 percent to be sick and out of work), the movement of agricultural products may essentially grind to a halt (no pun intended). Oh, and this is also important to know: the bulk of fresh produce spoils over time. Okay, one last jab: containers coming into ports of entry worldwide contain, according to the National Cargo Security Council, a substantial amount (upwards of 80 percent) of food products.

Conclusion

It takes 85 million barrels of oil per day globally, along with millions of tons of coal and billions of cubic feet of natural gas, to enable modern society to operate as it does. We fail to see transparent vulnerabilities because they are all too recognizable and are therefore dismissed all too readily. To escape the trap of transparent vulnerabilities, we need to overcome our natural tendency toward diagnostic bias.

A diagnostic bias arises when four elements combine to create a barrier to effective decision making. Recognizing diagnostic bias before it debilitates decision making can make all the difference in day-to-day operations; in crisis situations it is essential to averting the compounding of initial errors.

The four elements of diagnostic bias are:


• Labeling
• Loss Aversion
• Commitment
• Value Attribution

Labeling creates blinders; it prevents you from seeing what is clearly before you, because all you see is the label. Loss aversion is essentially how far you are willing to go (how long you will continue on a course) to avoid loss. Closely linked to loss aversion, commitment is a powerful force that shapes our thinking and decision making; it takes the form of rigidity and inflexibility of focus. Once we are committed to a course of action, it is very difficult to recognize objective data, because we tend to see what we want to see, casting aside information that conflicts with our vision of reality. Finally, first encounters and initial impressions shape the value we attribute, and therefore our future perception. Once we attribute a certain value, it dramatically alters our perception of subsequent information, even when the value attributed (assigned) is completely arbitrary.

Recognize that we are all swayed by factors that have nothing to do with logic or reason. There is a natural tendency, driven by diagnostic biases, not to see transparent vulnerabilities. We make diagnostic errors when we narrow our field of possibilities and zero in on a single interpretation of a situation or person. While such constructs help us quickly assess a situation and form a temporary hypothesis about how to react (an initial opinion), they are restrictive: they rest on limited exposure and limited data, and they overlook transparent vulnerabilities.

The best strategy for dealing with disoriented thinking is to be mindful (aware) and to observe things for what they are (situational awareness), not for what they appear to be. Accept that your initial impressions could be wrong, and do not rely too heavily on preemptive judgments; they can short-circuit more rational evaluation.

Are we asking the right questions? When was the last time you asked, "What variables (outliers, transparent vulnerabilities) have we overlooked?"

About the Author:

Geary Sikich
Entrepreneur, consultant, author and business lecturer
Contact Information:
E-mail: G.Sikich@att.net or gsikich@logicalmanagement.com
Telephone: (219) 922-7718
Geary Sikich is a Principal with Logical Management Systems, Corp., a consulting and executive education firm focused on enterprise risk management and issues analysis; the firm's web site is www.logicalmanagement.com. Geary is a frequent speaker on business continuity and business performance management issues. He is the author of over 195 published articles and four books, the latest being "Protecting Your Business in a Pandemic," published in June 2008 (available on Amazon.com).
Copyright© Geary W. Sikich 2009. World rights reserved. Published with permission of the author.
