May 2000
Algorithmics Publications
Mark-to-Future
Technical Document
Editor
Judith M. Farvolden
Editorial

Editorial Assistance
Lynn Coulthard, Christine Farmery, Marguerite Martindale

Editorial Board
Andrew Aziz
Judith M. Farvolden
Dan Rosen
Stephen A. Ross
Michael Zerbs

Layout
Tammy McCausland

http://www.mark-to-future.com

© 2000 Algorithmics Incorporated ("Algo"). All rights reserved. Errors and omissions excepted. Permission to make digital/hard copy of part or all of this work for personal or classroom use may be granted without fee provided that no copy is made, used or distributed for any commercial purpose whatsoever and provided that Algo's prior written consent has been obtained, and that Algo's copyright notice, the title of the publication and its date appear on any such copy. To request permission to use part or all of this work contact:

Algorithmics Incorporated
185 Spadina Avenue
Toronto, Ontario
Canada M5T 2C6
Tel: (416) 217-1500
Fax: (416) 971-6100

There are no representations or warranties, express or implied, as to the applicability for any particular purpose of any of the material(s) contained herein, and Algo accepts no liability for any loss or damages, consequential or otherwise, arising from any use of the said material(s).

I suspect that the first trade-off between return and risk occurred when a caveman had to decide whether to spear a mastodon or run. The relevant scenarios probably sped instinctively through the neurons. Stand and throw the spear. If successful, have a fine feast; if not, get trampled. Or, run and go hungry but live to hunt another day.

We naturally deal with risky situations by contemplating the outcomes and musing about possible scenarios. Oddly, though, as we have grown more sophisticated in the world of financial risk control, we have often left this intuitive and common sense approach behind us in favour of technically more formal approaches such as "mean-variance." There is nothing inherently wrong with these formal ways of analyzing risky situations, but it is wrong to think that they have the same universal applicability as the scenario approach. They are, after all, just special cases of the caveman's scenario analysis.

Mark-to-Future is a full-scale realization of the scenario approach. It addresses risk and return by examining possible futures, i.e., scenarios, and by cataloguing their effects on financial positions and portfolios. It is based on the most fundamental of economic models: the state space approach to uncertainty. In this view of the world, a state of nature is a realization of uncertainty, i.e., a possible scenario. In each state of nature, assets and portfolios are marked-to-market along the future scenario, whence Mark-to-Future. Risk and reward analysis consists of tabulating all of the financial consequences of each state of nature and then making decisions by trading off the benefits in some scenarios against the losses in others.

While this is easy to grasp conceptually, it is a Herculean task to make it practical. First and foremost, we have to identify the economic drivers of future valuations. Then we have to generate a meaningful set of scenarios that spans the relevant future possibilities for those drivers. Along each scenario, assets must be priced, and they must be priced with a full understanding of all of their cash flows, of liquidity and trading issues and of the policy decisions that will be made as the scenario unfolds. Fortunately, there is a major simplification—the Mark-to-Future Cube—that allows these computational tasks to be undertaken in an efficient and easily decentralized fashion.

This book, the result of 10 years of effort by the men and women of Algorithmics, describes the practical achievements that have taken their vision, spearheaded by Ron Dembo, and turned it into a working reality that is being implemented in the marketplace today. This work is what risk monitoring and measurement in financial markets should be about and now, thanks to them, it is what risk management can actually be. The next ten years will see this approach become the standard for financial risk monitoring, analysis and regulation.
Stephen A. Ross
Guest Editor
With Thanks
The editors would like to acknowledge the individuals listed here for their contributions. Our sincere thanks for their diligent efforts, which made the Mark-to-Future Technical Document possible.
Internal Referees
Nisso Bucay
Olivier Croissant
Ben De Prisco
Neil Dodgson
Joan den Haan
Alexander Kreinin
Asif Lakhany
Helmut Mausser
Jivantha Mendes
Leonid Merkoulovitch
Savita Verma
Hao Wang
Contributors
Ben De Prisco
Nisso Bucay
Olivier Croissant
Doug Gardner
Alexander Kreinin
Helmut Mausser
Savita Verma
Hao Wang
Preface
Ten years ago, Algorithmics was formed in response to the complex
issues surrounding risk management in banking. At that time, even the
most advanced financial institutions associated risk management with
instantaneous hedging of an options book. We recognized then that to
proactively measure and manage risk and reward, one needed to know
the total exposure of the institution across all of its global activities.
This gave birth to what is today referred to as enterprise risk
management or ERM. Since then, our software has helped to transform
the way in which approximately 100 banks, asset managers, insurance
companies and corporations, in 20 countries, are able to measure their
risk and manage their capital. At the heart of our software solution is
the Mark-to-Future methodology.
We have decided to make Mark-to-Future generally available as a
proposed new standard for risk/reward measurement. This innovative
approach to risk measurement significantly extends current
methodologies. It conforms not only to existing regulatory
requirements, but also to proposed regulatory changes. It responds
perfectly to the calls by regulators for a comprehensive framework and
not “formulaic approaches to risk” (Greenspan 1999). As the first truly
forward-looking risk/reward framework, we believe Mark-to-Future has
the potential to change the industry by profoundly affecting the way
risk management and capital allocation are practised in financial
institutions. Most importantly, since Mark-to-Future is a generic
framework and not based on a single formula or method, we believe it
will evolve with risk management practice.
Mark-to-Future allows for portfolios that may change over time and
under differing scenarios. By taking into account the effects of
changing portfolios at future points in time, a more realistic assessment
of risk is possible. Mark-to-Future provides a natural basis for linking
market, credit and liquidity risk. It also provides a unified framework
(the Put/Call Efficient Frontier) for calculating the risk/reward
trade-off of any combination of these risks.
In a Mark-to-Future world, the fundamental input is scenarios and the
emphasis is on instruments, not portfolios. Since the framework is
additive, the Mark-to-Future of a portfolio is simply the same
combination of each instrument's Mark-to-Future, regardless of the
nature of the security. Accordingly, once a decision has been made
with respect to the future scenarios and the instruments one is likely to
trade, the Mark-to-Future values of these instruments may be
computed well before the actual portfolio composition has been
determined. This makes marginal risk/reward measurement in near
time computationally feasible, even for large institutions. With current
simulation methods this is impossible, since prior to a risk calculation
one needs to assemble all portfolios centrally, which is extremely time
consuming and fraught with difficulty.
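The additivity just described can be sketched in a few lines; all numbers below are hypothetical stand-ins, not output of any Algorithmics product:

```python
import numpy as np

# Hypothetical Mark-to-Future values for 3 instruments under 4 scenarios,
# computed once, before any portfolio composition is known.
instrument_mtf = np.array([
    [101.0,  99.5, 103.2,  97.8],   # instrument A
    [ 50.3,  51.1,  49.0,  50.7],   # instrument B
    [ 10.2,   9.7,  10.9,   9.1],   # instrument C
])

# A portfolio arrives later, as holdings in those instruments.
holdings = np.array([100.0, -40.0, 250.0])

# Because the framework is additive, the portfolio's MtF values are just
# the same linear combination of the instrument MtF values.
portfolio_mtf = holdings @ instrument_mtf   # one value per scenario
print(portfolio_mtf)
```

If the holdings change, only the final matrix-vector product is repeated; the instrument table is never re-simulated.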
What is Mark-to-Future?

A new, adaptable, multi-step, dynamic simulation framework that integrates disparate sources and measures of risk and reward. Algorithmics has proposed Mark-to-Future, or MtF, as a standard for evolving financial risk/reward measurement.

What outstanding problems does MtF address?

• Measures the risk/reward of portfolios that are dynamic in nature.
• Captures the impact of settlement, maturity and trading/investment strategies.
• Provides a framework for computing liquidity risk.
• Captures the interaction of market, credit and liquidity risk and reward.
• Makes simulation-based pre-deal risk analysis a reality for large institutions.
• Computes marginal market and counterparty credit risk in near time.
• Measures accurately risk/reward for all types of financial instruments.
• Addresses the needs of large and small institutions in a consistent manner.
• Breaks the major barrier to risk services—it is not necessary to disclose holdings to outsource risk calculations.

What are the business benefits of MtF?

• Consolidates market risk, credit risk, liquidity risk and ALM functions in one consistent framework for efficient capital allocation and risk management.
• Enables truly distributed risk management.
• Provides a common language for technical and non-technical risk managers.
• Adapts to changes as risk management needs evolve.

Most importantly, we believe that Mark-to-Future will break down significant barriers to the risk service bureau business and make risk management outsourcing a reality, by making it possible for institutions to obtain risk management services without divulging their portfolios. This will level the playing field and put smaller institutions on an equal footing with even the biggest of banks. The same risk management analytics, formerly very expensive and available to only the largest financial institutions, will be easily accessible. Ultimately, this will lead to far more extensive and better risk management worldwide.

Mark-to-Future will also revolutionize application development and risk architecture. The additive nature of Mark-to-Future makes truly decentralized risk architecture possible. Mark-to-Future eliminates the need for monolithic applications; instead, software can be created to meet the specific needs of each and every business unit within financial institutions. These "thin client" applications will be network-enabled and will access pre-computed Mark-to-Future data. Because applications can be built quickly and easily and do not require a large investment, they may be modified quickly to adapt to an ever-changing risk management landscape.

This document and the implementation of the ideas herein are the result of a collaborative undertaking, over a 10-year period, by an extremely knowledgeable, dedicated and talented workforce at Algorithmics. In particular, I would like to thank Bob Boettcher, Alex Kreinin, Savita Verma, Olivier Croissant, Ben De Prisco, Judy Farvolden, Helmut Mausser, Doug Gardner, Michael Durland and David Penny for their contributions to material in this book and to the methodology itself.

An earlier version of this document appeared in Dembo et al. (1999) and in summarized form in Dembo (1998a, 1998b). We have also presented the ideas herein at industrial conferences over the years. Many of our colleagues, both clients and non-clients, have provided invaluable comments on this proposed standard. In particular, I would like to acknowledge the feedback from Carol Alexander, Jean-Louis Bravard, Michael Durland, Robert Fiedler, Andrew Freeman, Arthur Geoffrion, Fabiano Gobbo, André Horovitz, Chuck Lucas, Robert Mark, Gary Nan Tie, Ariel Salama, Charles Smithson, Lawrence Tabb, Debbie Williams and Rudi Zagst.

I would also like to extend my gratitude to Stephen Ross for his advice, insight and detailed comments, which have resulted in significant improvements to this manuscript. Stephen worked tirelessly with the Algorithmics research team to refine the presentation and ensure that the material was consistent, accurate and appropriate for the target audience.

The Mark-to-Future framework is the latest milestone in an incredible journey to understand risk and reward. It is not abstract theory. The ideas presented here have been tried and tested and are available in our products today. We anticipate that this framework will evolve and grow over time and consequently welcome and look forward to your feedback. Updates to the methodology as well as discussions, critiques and commentary will be posted on www.mark-to-future.com.

Ron S. Dembo
Toronto, April 2000
Table of Contents

References 75
Notation 79
ARQ Review 81
We are striving for a framework whose underlying goals and broad strategies can remain
relatively fixed, but within which changes in application can be made as both bankers and
supervisors learn more, as banking practices change, and as individual banks grow and
change their operations and risk-control techniques. Operationally, this means that we
should not view innovations in supervision and regulation as one-off events. Rather, the
underlying framework needs to be flexible and to embody a workable process by which
modest improvements in supervision and regulation at the outset can be adjusted and further
enhanced over time as experience and operational feasibility dictate. In particular, we
should avoid mechanical or formulaic approaches that, whether intentionally or not, effec-
tively ‘lock’ us into particular technologies long after they become outmoded. We should be
planning for the long pull, not developing near-term quick fixes. It is the framework that
we must get right. The application might initially be bare-boned but over time become
more sophisticated.
Alan Greenspan (1999)
Governor Laurence H. Meyer of the US Federal Reserve Board (1999) echoed the views of many regulators who vigorously advocate increasing both the scale and the scope of public disclosure of information for various risk categories, most notably credit exposure and credit concentration. Governor Meyer identified internal bank systems as the first lines of defence in the prevention of undue risk-taking, and was quick to point out that supervision and regulation should not duplicate the efforts of increasingly sophisticated, internal best-practice risk management systems. Notwithstanding these sentiments, coupling minimum capital regulation to the internal risk-profiling system of a bank is a large and daunting task.

Our goal at Algorithmics in writing this book is to describe a standard framework for simulation-based risk management that links together market, credit and liquidity risks. Mark-to-Future is an open, flexible and extensible framework that advances best-practice risk management as advocated by Meyer and systematically satisfies the requirements set out by Greenspan. It is designed to accommodate evolving standards and, as a framework, does not "lock" into any particular technology so as not to become outmoded as technology changes. As Greenspan contends, risk management is all about the fundamentals: "It is the framework that we must get right." In what follows, we introduce the Mark-to-Future methodology and describe how this unifying framework advances the standard of risk management practice and oversight.

Mark-to-Future is the product of over 10 years of research by Algorithmics' financial and software engineers. However, Mark-to-Future is not just a theory—it is the basis of Algorithmics' risk management solution, AlgoSuite. AlgoSuite has helped transform the way more than 100 banks, asset managers and corporations in 18 countries measure their risk and manage their capital. Today, as the leading software provider with the most experienced team in the industry, Algorithmics continues to develop and market enterprise risk management solutions.
A call for a framework
At the heart of every investment decision is the question, "What will be the value of a given portfolio at some future time horizon?" The uncertainty inherent in virtually all investment choices implies that this question can only be effectively addressed by assessing a range of possible future outcomes. The decision to invest in a given security is made on the basis of some trade-off between the security's contribution to overall risk and reward.

Risk and reward are typically quantified by measures computed from the distribution of future portfolio values. For example, risk is often measured by a statistic or descriptor such as variance, Value-at-Risk (VaR), worst case or regret, while reward is often measured by a statistic such as mean return, expected profit or expected upside.

Traditional approaches to quantifying this trade-off are typically constrained by narrow assumptions, implicit or explicit, regarding the shape of the distribution and the choice of appropriate statistics or descriptors that characterize the distribution. In most cases, these assumptions are inextricably bound together in the formulation of the model because a given statistic serves as the key information input. As such, the choice of a particular risk/reward measure invariably precludes arbitrary choices of portfolio realizations.

A case in point is the familiar mean-variance framework, in which the use of standard deviation as a risk measure is tightly bound to the assumption that future portfolio values are normally distributed. Unfortunately, such traditional approaches are awkward at best, and at worst, begin to unravel when the underlying assumptions are violated. For example, the distribution of future values for a "zero-cost" spread position is clearly not normal, and the use of return as a reward measure is meaningless when the current mark-to-market value is zero.

What is needed, then, is an approach that not only deals robustly with the full range of future possibilities, but is also backward compatible with older technologies such as mean-variance approaches.

Mark-to-Future focuses on simulated future scenarios. In a simulation-based framework, scenarios completely define the possible future realizations of value. Only through simulation can the choice of risk/reward measures be fully decoupled from the estimation of future outcomes. Bringing scenarios to the forefront is the key to designing a flexible and extensible risk/reward framework. Scenarios, effectively, become the language of risk and reward.

In addition, scenarios provide a natural linkage between risk factors that traditionally have been treated as separate "islands" in the risk management landscape. For example, the integration of market and credit risk can be achieved by defining the joint evolution of market risk factors and credit drivers. Each individual scenario represents a given realization of the systemic risk factors, while idiosyncratic risks may be modeled as realizations conditionally independent upon the occurrence of that scenario.

What is Mark-to-Future?

At any point in time, the levels of a collection of risk factors completely determine the mark-to-market value of a portfolio. Scenarios on these risk factors determine the distribution of possible mark-to-market values. Scenarios on the evolution of these risk factors determine the distribution of possible Mark-to-Future values (MtF values) through time. Because scenarios capture future uncertainty as time unfolds, and because Mark-to-Future has scenarios as its key information input, the framework enables the calculation of future mark-to-market values that capture future uncertainty across scenarios and time steps.
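This idea can be sketched in a few lines of Python; the risk-factor model and all parameters below are assumptions made for the example, not part of the MtF specification:

```python
import numpy as np

rng = np.random.default_rng(7)

n_scenarios, n_steps = 5000, 4      # illustrative sizes
dt, mu, sigma = 0.25, 0.05, 0.20    # assumed quarterly steps, drift, volatility
s0 = 100.0                          # today's level of the risk factor

# Joint evolution of a single risk factor (e.g., an equity index) over time:
# each row is one scenario path, each column one future time step.
z = rng.standard_normal((n_scenarios, n_steps))
log_paths = np.cumsum((mu - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z, axis=1)
factor = s0 * np.exp(log_paths)

# Mark-to-Future values of a position of 10 units: a distribution of values
# at every future time step, not a single mark-to-market number.
mtf = 10.0 * factor

for t in range(n_steps):
    lo, hi = np.percentile(mtf[:, t], [5, 95])
    print(f"t={t + 1}: 5th pct {lo:8.1f}   95th pct {hi:8.1f}")
```

The distribution widens with the time step, which is exactly the "future uncertainty as time unfolds" that a single mark-to-market value cannot capture.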
Mark-to-Future is essentially about two things:

• Mark-to-Future is a robust and forward-looking framework that integrates disparate sources of risk. By explicitly incorporating the passage of time, the evolution of scenarios over time, and the dynamics of portfolio holdings over time, Mark-to-Future provides a flexible and unifying platform for assessing future uncertainty.

• Mark-to-Future is an extensible risk architecture that can be leveraged within a single organization and across several organizations. Mark-to-Future enables the decoupling of the computationally intensive simulation stage (the risk service) from the post-processing risk/reward assessment stage (multiple risk clients). It also accommodates differing views of what the future may bring, and different models for pricing assets and positions, now and in the future.

A framework for risk and reward

As a risk framework for producing simulation-based risk and reward measures, Mark-to-Future provides the following benefits:

• Mark-to-Future is an intuitive framework. Scenarios are the drivers of all future uncertainty; they are the language of risk. Individual MtF values are produced as a direct function of a given scenario whose characteristics can be explained in a straightforward manner. Individuals of very different levels of sophistication can contribute to risk/reward discussions through the focus on plausible scenarios.

• Scenarios on risk factors define distributions of MtF values. As individual risk factors can evolve jointly (and arbitrarily) over time, Mark-to-Future can capture the implied relationships (including correlations) among disparate risk factors over multiple time steps. Thus, market, credit and liquidity risks may be integrated within a common framework.

• The realization of MtF values over individual scenarios and time steps determines any risk or reward measure. Most importantly, as we will see, the framework is strictly linear across position, scenario and time dimensions. The addition of a new position, scenario or time step requires only the simulation of the new MtF values, which are then appended to the previously computed results. Previously simulated MtF results need not be recalculated; only the calculation of the risk or reward measure need be repeated.

• Mark-to-Future produces risk and reward measures that explicitly capture the passage of time. Thus, challenging issues such as portfolio path dependency, settlement and reinvestment, dynamic rebalancing and thin market effects can be addressed effectively.

• Mark-to-Future enables multiple portfolio regimes (strategies) to be evaluated. In addition to the analysis of static portfolios, the MtF Cube makes it possible to assess the risk/reward of a portfolio with holdings that change dynamically over scenarios and time.

An architecture for risk and reward

As a risk architecture for the efficient delivery of risk and reward measures, Mark-to-Future provides the following benefits:

• MtF values need only be computed once. Thus, Mark-to-Future is efficient and the results can be leveraged to support multiple applications and/or risk clients.

• Mark-to-Future can be implemented globally across an organization. The generation of MtF values serves as the common input to many risk/reward applications at different levels of the organization, from desk-level trader applications to enterprise-wide applications.

• Mark-to-Future decouples the computationally intensive simulation stage from the post-processing stage. A service bureau network can be established over which risk services can be distributed by a service provider to clients who need not divulge their holdings.

• Mark-to-Future is an open and extensible framework. The framework enables the incorporation of new scenario generation techniques, new pricing algorithms and new risk/reward measures.
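The strict linearity across the position, scenario and time dimensions can be illustrated with a toy MtF Cube (all values hypothetical): new instruments or scenarios are simulated once and appended, and only the cheap risk-measure step is re-run.

```python
import numpy as np

# Toy MtF Cube: instruments x scenarios x time steps (hypothetical values).
cube = np.arange(24, dtype=float).reshape(2, 4, 3)

# Adding a new instrument: simulate only its own MtF table
# (4 scenarios x 3 time steps) and append it along the instrument axis.
new_instrument = np.full((1, 4, 3), 5.0)
cube = np.concatenate([cube, new_instrument], axis=0)

# Adding a new scenario: simulate each instrument under that one scenario
# and append along the scenario axis; previous results are untouched.
new_scenario = np.ones((3, 1, 3))
cube = np.concatenate([cube, new_scenario], axis=1)
print(cube.shape)   # (3, 5, 3)

# Only the inexpensive risk-measure step is repeated on the enlarged cube,
# e.g. the portfolio's worst value at the final time step:
holdings = np.array([1.0, 2.0, -1.0])
portfolio = np.tensordot(holdings, cube, axes=(0, 0))  # scenarios x time steps
worst_final = portfolio[:, -1].min()
```

Nothing already simulated is recomputed; the enlarged cube simply contains more slices for the post-processing stage to aggregate.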
In certain applications, a cell of the MtF Cube may contain other measures in addition to its MtF value, such as an instrument's MtF delta or MtF duration. In the general case, each cell of a MtF Cube contains a vector of risk-factor dependent measures for a given instrument under a given scenario and time step. In some applications, the vector may also contain a set of risk-factor dependent MtF cash flows for each scenario and time step. For ease of exposition, however, we focus primarily on the typical case in which each cell contains only the instrument's MtF value. Key to the MtF framework is the premise that a MtF Cube is generated independently of portfolio holdings.

2. Aggregate across dimensions of the portfolio MtF table to produce risk/reward measures.

3. Incorporate portfolio MtF tables into advanced applications.

The simulation of the MtF Cube in Step 1 to Step 3 represents the only computationally intensive stage of the process and, significantly, need be performed only once. These steps represent the pre-Cube stage of MtF processing. In contrast, Step 4 to Step 6 represent post-processing exercises, which can be performed with minimal additional processing (Step 4 and Step 5) or slightly more complex processing (Step 6). These steps represent the post-Cube stage of MtF processing. Figure A.2 provides a schema illustrating the six steps of the MtF methodology.

[Figure A.2: The six steps of the MtF methodology. Pre-Cube — Step 1: Define the scenarios and time steps. Step 2: Define the basis instruments. Step 3: Simulate the instruments over the scenarios and time steps to produce the MtF Cube (instruments x scenarios x time steps). Post-Cube — Step 4: Map the MtF Cube into portfolios/portfolio regimes. Step 5: Aggregate portfolio MtF values to produce risk/reward statistics. Step 6: Incorporate portfolio MtF values into other applications (e.g., portfolio credit loss).]

The decoupling of the post-Cube stage from the pre-Cube stage is the key architectural benefit of the Mark-to-Future framework. A single risk service may generate a MtF Cube (pre-Cube) that can be distributed to multiple risk clients (post-Cube) for a variety of customized business applications. This generates leverage as a common risk/reward framework and can be widely distributed throughout the organization as well as to external organizations for user-specific analyses.

The six steps of Mark-to-Future

The objective of this document is to provide a step-by-step description of the fundamentals of the MtF framework and to demonstrate why it represents a standard for simulation-based risk/reward management. Mark-to-Future is a framework designed not merely to measure risk and reward but, significantly, to manage the trade-off of risk and reward. The remainder of this document is organized as follows.

Step 1 discusses the definition of scenarios. In the MtF framework, scenarios represent the joint evolution of risk factors through time and are, thus, the ultimate determinant of future uncertainty. The explicit choice of scenarios is the key input to any analysis. Accordingly, scenarios directly determine the future distributions of portfolio MtF values, the dynamics of portfolio strategies, the liquidity in the market and the creditworthiness of counterparties and issuers. This step discusses scenarios in risk management, their importance, and various methodologies used to generate them.

Step 2 discusses the definition of basis instruments. Portfolios consist of positions in a number of financial products, both exchange traded and over-the-counter (OTC). The MtF Cube is the package of MtF tables, each corresponding to an individual basis instrument. A basis instrument may represent an actual financial product or an abstract instrument. As the number of OTC products is virtually unlimited, it is often possible to reduce substantially the number of basis instruments required by representing the MtF values of OTC products as a function of the MtF values of the abstract instruments. This step discusses the issues surrounding the selection of basis instruments to be contained in the MtF Cube.

Step 3 discusses the generation of the MtF Cube. The MtF Cube consists of a set of MtF tables, each associated with a given basis instrument. The cells of a MtF table contain the MtF values of that basis instrument as simulated over a set of scenarios and a number of time steps. This step discusses the relationship between risk factors, scenario paths and pricing functions for the simulation of MtF values.

Step 4 discusses the mapping of the MtF Cube into portfolios and portfolio strategies. From the MtF Cube, multiple portfolio MtF tables can be generated as functions of the MtF tables associated with each basis instrument. Key to the MtF framework is the premise that a MtF Cube is generated independently of portfolio holdings. Any portfolio or portfolio regime can be represented by mapping the MtF Cube into static or dynamically changing portfolio holdings. This step discusses how portfolio MtF tables are produced from a single MtF Cube.

Step 5 discusses the estimation of risk/reward measures derived from the distribution of portfolio MtF values. The portfolio MtF table resulting from the mapping of the MtF Cube into a given portfolio or portfolio strategy contains a full description of future uncertainty. Each cell of the portfolio MtF table contains a portfolio MtF value for a given scenario and time step. The actual risk and reward measures chosen to characterize this uncertainty can be arbitrarily defined and incorporated strictly as post-processing functionality in the post-Cube stage.

Step 6 discusses more advanced post-processing applications using the MtF Cube. MtF Cubes may serve as input for applications more complex than calculating simple risk/reward measures. The properties of linearity and conditional independence on each scenario can be used to obtain computationally efficient methodologies. For example, conditional independence within a particular scenario is a powerful tool that allows the MtF framework to incorporate effectively processes such as joint counterparty migration. In addition, portfolio or instrument MtF tables may be used as input to a wide variety of scenario-based risk management and portfolio optimization applications.

The notation used in this book is summarized in the Notation chapter (page 79).
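Steps 4 and 5 can be sketched as pure post-Cube operations on a toy cube; the cube, holdings and statistics below are hypothetical stand-ins for illustration, not Algorithmics' implementation:

```python
import numpy as np

rng = np.random.default_rng(1)

# Pre-Cube output: MtF values for 3 basis instruments under 1000 scenarios
# and 2 time steps (hypothetical values scattered around par).
vols = np.array([5.0, 2.0, 8.0])[:, None, None]
cube = 100.0 + rng.standard_normal((3, 1000, 2)) * vols

# Step 4a: a static portfolio -- the same holdings at every time step.
static = np.array([10.0, 25.0, -5.0])
static_table = np.tensordot(static, cube, axes=(0, 0))   # scenarios x time steps

# Step 4b: a portfolio regime -- holdings that change at the second time step.
regime = np.array([[10.0, 25.0, -5.0],    # holdings over step 1
                   [ 0.0, 30.0,  5.0]])   # rebalanced holdings over step 2
regime_table = np.einsum('ti,ist->st', regime, cube)     # scenarios x time steps

# Step 5: arbitrary post-processing statistics on the portfolio MtF table.
reward = static_table[:, -1].mean()            # expected value at the horizon
risk = np.percentile(static_table[:, -1], 5)   # 5th-percentile "worst case"
```

Both mappings reuse the same pre-computed cube, and swapping the percentile for any other statistic touches only this post-processing step.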
Step 1: Defining Scenarios and Time Steps
Scenarios represent the joint evolution of risk factors through time and are, thus,
comprehensive descriptors of future uncertainty. They directly determine the future
distributions of portfolio MtF values, the dynamics of portfolio strategies, the liquidity
in the market and the creditworthiness of counterparties and issuers. The explicit
choice of scenarios is the key input to a MtF analysis. This chapter discusses scenarios
in risk management, their importance and various methodologies used to generate
them.
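One of the generation techniques considered in this chapter, scenario bootstrapping, can be sketched as follows; the history and factor levels are made up for the example:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical history: 250 daily joint log-returns of two risk factors
# (e.g., an FX rate and an equity index).
history = rng.standard_normal((250, 2)) * np.array([0.006, 0.012])

# Scenario bootstrapping: resample whole days (rows) with replacement so the
# historical co-dependence between the factors is preserved, then chain the
# sampled days into multi-step scenario paths.
n_scenarios, n_steps = 1000, 10
days = rng.integers(0, len(history), size=(n_scenarios, n_steps))
paths = np.cumsum(history[days], axis=1)    # scenarios x steps x factors

# Apply the scenarios to today's factor levels.
levels_today = np.array([1.10, 4500.0])
scenario_levels = levels_today * np.exp(paths)
```

Because entire days are resampled jointly, no correlation between the factors has to be estimated or specified; the co-dependence is carried implicitly by the scenarios.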
The quality of a risk management analysis depends on the ability to generate relevant, forward-looking scenarios that properly represent the future. In the MtF framework, at a point in time, the distribution of underlying risk factors such as interest rates, equity indices, commodity prices, foreign exchange rates or macroeconomic variables is defined by the set of scenarios explicitly chosen. Thus, any desired future risk factor distribution can be incorporated directly into the analysis in a manner that is not constrained by a specific summary statistic or a single scenario generation technique. Future distributions of portfolio outcomes are then ultimately driven by future distributions of underlying risk factors. The transparent integration of various seemingly different risks, such as market, credit and liquidity risk, can be accomplished through the explicit introduction of scenarios which include the factors that are the sources of these risks.

Traditional risk/reward methodologies are based on simplifications, often parametric, of the description of the risk factor evolution that are necessary to achieve mathematical tractability. Simplifying assumptions are made despite the fact that financial risk managers now widely recognize the limitations of such techniques. Scenario-based risk management analyses overcome these limitations, but create a need for relevant scenarios. Accordingly, in the last decade there has been extensive research into techniques for generating realistic and computationally efficient scenarios for risk management. Many of these techniques are largely derived from major advancements in scenario generation in both the physical and economic sciences, starting with the development over half a century ago of tools such as statistical bootstrapping and the Monte Carlo method.

This chapter defines scenarios formally, discusses their merits, provides an overview of the characteristics of scenarios for different risk analyses and, finally, considers the main techniques used to generate scenarios, including history, scenario proxies, scenario bootstrapping, subjective opinions and model-based methods.

What are scenarios?

A scenario is the basic descriptor of the evolution of the state-of-the-world over time. It gives a joint realization of all the relevant
financial and economic risk factors at a discrete In contrast, by generating scenarios on systemic
set of times in the future. The period of the risk factors, the estimation of models is more
analysis is [0,T], where today is time t = 0 and robust and the use of historical simulation is
the time horizon is t = T. A MtF table is straightforward, as is obtaining the MtF value of
constructed by valuing a financial instrument a new instrument that is completely consistent
across each of the j = 1, …, S scenarios over with the prices of the existing instruments. Using
this approach, it is not necessary to define
t = 1, …, T time steps. The MtF value, mijt, of explicitly the correlations among all instruments,
instrument i under scenario j at time step t is a since the co-dependence is defined implicitly by
function f(•) of the levels of specified risk factors the scenarios.
mijt = f ( u 1jt, u 2jt, u 3jt, …, u Kjt ) Scenario selection differs considerably from
forecasting. A forecast is a prediction that a
where u kjt = ( k = 1, …, K ) is the level of risk factor single scenario will occur, and its accuracy is
therefore crucial. However, no one is able to
k under scenario j at time step t. Thus, the
predict accurately specific financial events in the
realization of risk factor levels over time
future. As Nobel Laureate Paul Samuelson
completely defines the future realizations of
observed, “analysts successfully predicted 12 out
instrument and portfolio values through time. In of the last four major recessions”
advanced applications, the scenarios may (Samuelson 1991).
describe only the systemic realizations of
instrument values. This is further described in The goal in selecting scenarios in the MtF
Step 6. framework is to span the range of future possible
events, and not necessarily to forecast that any of
The current state-of-the-world is the point of these events will actually occur. Thus, in a given
departure of every scenario. Today, the current risk analysis, we assume the existence of a set of
market and economic data determine, for scenarios that may occur in the future. If the
example: scenarios set is rich enough, one of these events
will actually occur, but at the start of the period,
• the mark-to-market value of all instruments there is uncertainty as to which one it will be.
in a portfolio
It is also assumed that we have a perception of
• the cash flows to be paid, received and the likelihood of these scenarios. The likelihood,
reinvested or probability, pj , ( j = 1, 2, … , S ) , of a given
• the bid-ask spreads scenario j is not needed to compute MtF values.
Probabilities are required only at a later stage
• the credit quality of all obligors (issuers, when risk/reward statistics are computed. In
counterparties, accounts) in the portfolio. addition, the choice of scenarios need not
necessarily depend on the probability assigned to
In principle, scenarios can be generated directly them. This separation of probabilities from the
on the prices of all securities (accounting for the scenarios themselves allows MtF values to be
sizes of trades), the credit state of given obligors computed once and subsequently used as inputs
and the composition of the portfolio, etc. This into many different risk analyses.
would be impractical, however, for a number of
Scenarios must span a wide range of possible
reasons. For example, to generate the scenarios,
outcomes and they must also extend over a
correlations between all instruments are needed, horizon, or multiple horizons, of appropriate
and each time a new instrument is added to the length. The length of the horizons varies with the
portfolio, its correlation to all other instruments problem. While the appropriate horizon for a
must be computed. Furthermore, it is over- market risk analysis might be one to 10 days (or
restrictive because the characteristics of longer depending on the liquidity of the
instruments change through time and it is position), estimating counterparty exposure
unclear how we would construct their joint profiles may require multiple time horizons over
evolution from previous historical data. 10 years or more.
Step 1: Defining scenarios and time steps 9
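The construction of a MtF table can be sketched in code as a three-dimensional array of instrument values across scenarios and time steps. This is an illustrative toy only, not part of the MtF specification: the two risk factors, their lognormal evolution and the pricing functions are all invented for the example.

```python
import numpy as np

# Sketch of a Mark-to-Future table m[i, j, t]: instrument i is valued under
# scenario j at time step t via a function of the risk factor levels u[k, j, t].
# All factor dynamics and pricing functions below are hypothetical.

rng = np.random.default_rng(0)

S, T, K = 100, 10, 2            # scenarios, time steps, risk factors
dt = 1.0 / 250                  # daily time steps

# Scenario set: risk-factor levels u[k, j, t] (two lognormal factors,
# e.g., an equity index and an FX rate)
u0 = np.array([100.0, 1.50])    # today's levels (the current state-of-the-world)
vol = np.array([0.25, 0.10])
shocks = rng.standard_normal((K, S, T))
u = u0[:, None, None] * np.exp(np.cumsum(vol[:, None, None] * np.sqrt(dt) * shocks, axis=2))

# MtF values m[i, j, t] = f_i(u_1jt, ..., u_Kjt) for each instrument i
instruments = [
    lambda f: f[0],             # one unit of the equity index
    lambda f: 100.0 * f[1],     # 100 units of foreign currency
]
m = np.empty((len(instruments), S, T))
for i, price in enumerate(instruments):
    for j in range(S):
        for t in range(T):
            m[i, j, t] = price(u[:, j, t])

# A portfolio MtF table is a weighted combination of instrument MtF values
weights = np.array([2.0, 1.0])
portfolio = np.tensordot(weights, m, axes=1)   # shape (S, T)
print(portfolio.shape)
```

Note that, as in the text, probabilities play no role at this stage: the cube of MtF values can be computed once and reused as input to many different risk/reward analyses.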
(but not necessarily weights) in the MtF framework. All scenarios are part of the same consistent MtF Cube. A distinction is made only at the post-Cube stage of MtF processing.

The choice of scenarios typically depends upon the risk management application. Table 1.1 presents a summary of scenario requirements (type of scenario, risk factors, horizon and number of time steps) that depend on the risk management area and desired application. These are general guidelines on the characteristics of the scenarios required, and not necessarily the current “best practices.” In most applications today, rigorous scenario-based risk management is still not the norm.

In general, scenarios in market risk applications are described by financial market risk factors that include various zero coupon term structures, equity indices, exchange rates, commodity prices and implied volatilities. Typically, a market risk analysis is performed over a single, short horizon (one to 10 days). Term structures and volatility surfaces are modeled with a large number of factors (e.g., the RiskMetrics dataset contains 14 nodes per term structure). Statistical analysis, such as VaR, requires the description of the joint distribution of the factors at the given horizon. Both historical and Monte Carlo scenarios are widely used. Scenarios for asset management applications are similar but extend over longer horizons, ranging generally from one month to one year, and may include multi-time step simulations. Accordingly, asset managers also customarily use factor models to reduce the factors in their analysis to a more manageable number.

In contrast, asset liability management (ALM) and credit risk analyses require the generation of scenarios over multiple steps spanning long horizons, sometimes extending to the maturity of all outstanding positions. An ALM analysis is concerned mainly with scenarios describing the evolution of interest rate risk factors (and perhaps foreign exchange risk factors) affecting the cash flows of trading and banking positions. Since the values of derivatives are extremely sensitive to market conditions, the analysis of counterparty credit exposures in a derivatives trading book requires scenarios on all market factors affecting their value over time (Aziz and Charupat 1998). Portfolio credit risk models, on the other hand, are concerned with the joint defaults and credit migrations of all obligors, and hence require scenarios on the credit drivers that influence obligor creditworthiness. For example, CreditMetrics (J.P. Morgan 1997) defines country, region and sector indices as credit drivers for the asset value and credit quality of each obligor.

Integration of market and credit risk through scenarios

It is still common practice to treat market and credit risk separately. This separation has a major impact when measuring counterparty exposures, portfolio credit risk and specific risk for bonds. This is further described in Step 6.

Ultimately, a comprehensive framework requires the full integration of market and credit risk. In the Mark-to-Future framework, the integration of market and credit risk is achieved through the scenarios, which explicitly define the joint evolution of market risk factors and credit drivers. Market factors drive the prices of securities, and hence exposures, whereas credit drivers are systemic factors that drive the creditworthiness of obligors in the portfolio. Factors are general and can be microeconomic, macroeconomic, economic and financial.

Consider, for example, the counterparty exposure to a position in a corporate bond. The present value of the bond is a function of the Treasury curve and an additional credit spread that compensates for the default risk of the corporate issuer. The credit spread, in turn, depends on the credit quality of the corporate issuer. If a credit downgrade occurs, cash flows must be valued using a higher credit spread. Market and credit risk are consistently linked by incorporating in the scenarios all the systemic information on the interrelated outcomes for the issuer’s credit rating and for the Treasury curve. An illustration of integrated market and credit scenarios based on the integrated model of Iscoe et al. (1999) is presented on page 68.

Scenarios and liquidity risk

Although the cost of liquidity is usually reflected in bid-offer spreads, liquidity risk can also be
Table 1.1: Scenario requirements (factors, risk analysis, horizon, number of time steps, type and number of scenarios) by risk management area

Market risk
Factors (50–1,000): IR (government, spreads), foreign exchange, equity, commodities
• statistical (VaR, RAROC, risk contributions): horizon 1–10d, 1 time step; historical (100–500 scenarios); MC: normal and fat-tail distributions (1,000–10,000 scenarios)
• stress testing (worst-case, what-if): horizon 1–30d, 1–10 time steps; extreme: historical “crashes”, subjective (5–50 scenarios)
• sensitivity analysis (deltas, etc.): horizon 0–1d, 1 time step; shifts in a small number of factors (10–100 scenarios)

Asset/fund management
Factors (20–500): IR (government, spreads), foreign exchange, equity, commodities
• statistical (VaR, risk contributions, risk-adjusted returns): horizon 30d–1y, 1–10 time steps; historical, bootstrapping (100–500 scenarios); MC: multi-factor processes (1,000–10,000 scenarios)
• stress testing (worst-case, what-if): horizon 30d–1y, 1–10 time steps; extreme: historical “crashes”, subjective (5–20 scenarios)

Asset liability management
Factors (20–100): IR (government, spreads), foreign exchange
• VaR (over time), cash flow at risk, earnings at risk: horizon 6m–30y, 20–100 time steps; historical, bootstrapping (100–500 scenarios); MC: multi-factor processes (500–5,000 scenarios)
• stress testing: horizon 6m–30y, 20–1,000 time steps; extreme: historical “crashes”, subjective (5–20 scenarios)

Counterparty credit exposures
Factors (50–100): IR (government, spreads), foreign exchange, equity, commodities
• statistical (expected and “VaR” exposures): horizon 1–30y, 10–40 time steps; MC: multi-factor processes (1,000–5,000 scenarios)
• stress testing: horizon 1–30y, 10–5,000 time steps; extreme: historical “crashes”, subjective, scenario banding (10–50 scenarios)

Portfolio credit risk
Factors (50–200, systemic market and credit factors): IR (government, spreads), foreign exchange, equity, commodities, macroeconomic factors
• capital (expected losses, Credit VaR, RAROC, risk contributions): horizon 1–10y, 1–30 time steps; MC: multi-factor processes (1,000–5,000 scenarios)
• stress testing: horizon 1–10y, 1–30 time steps; extreme: historical “crashes”, subjective (5–20 scenarios)
modeled indirectly by designing joint scenarios on daily trading volumes and market risk factors such as price and volatility (see, for example, Yung 1999a). In such a model, portfolio positions are liquidated conditional on the simulated outcomes for these risk factors in each scenario. For example, when simulated trading volumes are high, positions are liquidated faster than when trading volumes are low. If portfolio holdings are conditional on scenario outcomes, it must be possible to simulate the value of portfolios that evolve over time. An example of simulated trading volumes is presented on page 44.

Joint scenarios on market and liquidity risk for equity portfolios might include equity indices and specific risk factors, which capture market risk, as well as bid-ask spreads and trading volumes, which determine liquidity risk. Joint scenarios on market, credit and liquidity risk for emerging market bonds, for example, may cover US Treasury rates, credit risk drivers such as an emerging market index, sovereign spreads and bid-ask spreads.

Scenario generation methodologies

There are many ways to generate scenarios for stress testing or statistical risk measurement. Systematic scenario generation methods can be broadly classified into historical and model-based methods. Historical scenario generation methods are sometimes referred to as non-parametric methods, while model-based methods are referred to as parametric methods. In addition to these systematic methods, risk managers also customarily use ad hoc or subjective methods, which are not based directly on a model or history but rather on experience, intuition or a specific need. Examples of these are an opinion that the NASDAQ will plummet 30%, a “what-if” analysis to investigate the impact of a 100 basis point move on a GBP interest rate book, and a sensitivity analysis of a position performed by shocking each node in an interest rate curve separately by one basis point.

Whichever method is chosen, with careful analysis and a broad view on where to seek pertinent information, it is generally possible to generate reasonable, forward-looking scenarios that span the range of possible future states-of-the-world over the time horizon relevant for risk/reward measurement. Ideally, the scenarios in a risk analysis should satisfy three principles:

1. Scenarios must “cover” all relevant past history. As Mark Twain once said: “history does not repeat itself, but it rhymes.”

Historical scenarios satisfy this principle directly. For model-based scenarios, this means that history must be used explicitly to estimate the parameters of the model. Of course, the process of determining the relevant history is largely subjective. The appropriateness of using one or five years of historical data (either in a historical analysis or to calibrate a model) depends on the application and on the analyst’s opinion of what periods of history are relevant. Although theories can be applied to determine the amount of data required for robust estimation or to exclude (include) outliers, in the end there is no simple formula to determine the relevance of history for the future and, hence, the subjective assumptions must be made as explicitly as possible.

2. Scenarios should also account for events that have not occurred in the past, but which may be plausible under the current circumstances.

In the case of historical scenarios, this requires the use of history in creative ways. First, the history of each risk factor can be augmented by including more distant events from the risk factor’s own history than would be part of a standard, fixed calibration period. Second, the history of one risk factor can become a proxy for scenarios on another factor. For model-based scenarios, certain parameters of a model can account for current information that is not rooted in history, such as a new political development or special market conditions.

3. Scenario generation methods must be validated, whenever possible, out of sample.

As an example, in the case of a market risk analysis, the validity of the scenario set generated yesterday can be tested by
comparing today’s mark-to-market values with yesterday’s Mark-to-Future values. By repeating this exercise over an extended period, it is possible to determine the performance of various methods. In the risk management literature, this is commonly referred to as backtesting, and it has become an important regulatory requirement. There are various statistical backtesting methods (Basle Committee on Banking Supervision 1997, Lopez 1999). Note that in the case of longer simulation horizons, out-of-sample testing becomes more difficult, but it is still possible to apply advanced statistical methods (Lopez and Saidenberg 2000) and to perform some checks and balances. Finally, in practice it is clearly not possible to perform out-of-sample validation of extreme scenarios for stress testing in isolation.

In the following sections we review the basic principles of historical and model-based methods.

Historical scenario generation methods

Historical time series over long periods reflect a very wide range of scenarios. Since these events have occurred previously in the market, they are certainly candidates for possible future events. However, significant events occur infrequently, and a long history may be required to produce a set of scenarios that spans the possible range of outcomes. The longer the time series, the wider the range of possible outcomes that can be taken into account.

Neglecting the possibility of a crash scenario is clearly not prudent in a comprehensive risk analysis. Yet, if only recent history is considered, or the model is restricted to normal distributions that do not include fat tails, this possibility will probably be excluded. In the MtF framework, a crash scenario can be incorporated explicitly. Risk analyses can include both historical scenarios to account for typical events and stress test scenarios to account for atypical events.

For example, in the last 84 years, there have been nine crashes in the Dow Jones Industrial Index that exceeded six standard deviations (6-sigma) in a single day (based upon the previous 250 business days). From historical experience, the actual likelihood of a crash of that magnitude occurring on any given day is 0.04%. Does this mean that a 6-sigma crash is a very unlikely historical event?

To answer this question, assume that daily returns are normally distributed and independent. The probability of a single 6-sigma event on any given day is about 10^-9, significantly less than 0.04%. The probability of one event in 84 years is then about 0.002%; the probability of nine 6-sigma events occurring in 84 years is about 10^-43. Whereas according to history 6-sigma events are rare, a simple parametric model grossly underestimates their probability.

In established markets where no fundamental shifts in the underlying structure have recently occurred, a long history may be an adequate source of scenarios describing typical market conditions. However, shifts can and do occur, and the future is sure to contain events outside this range. When history is the sole source of scenarios, it is implicitly assumed that between now and the horizon there will be no fundamental shift in the underlying forces in the market, other than those already observed during the historical period chosen. This precludes the incorporation of new views that could be extracted from other events or by other techniques.

Consider a second example. At the end of March 1998, the Canadian dollar was trading at approximately 0.708 USD. The Canadian dollar had last touched 0.700 USD in 1994. A risk analysis of a portfolio with Canadian dollar exposure based on a standard historical time series would have provided a false sense of security to the holders of that portfolio. In fact, the drop in the Canadian dollar from 0.708 to 0.633 USD that occurred between March and August 1998 represented a three standard deviation move with about a 0.02% likelihood of occurring. (The standard deviation of daily log returns with zero mean is calculated based upon one year of daily observations between March 1997 and March 1998 and scaled to a five-month horizon by the square root of time.)

Beyond that, how would we understand the risk or return of a stock that has not yet been issued, or of the euro before it became a traded currency? How can we measure the risk of Internet stocks when there is insufficient history in this market on which to base historical scenarios?
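The arithmetic behind the 6-sigma argument can be checked directly. The sketch below assumes one-sided 6-sigma crashes and 84 years of 250 trading days; the exact orders of magnitude vary slightly with these conventions, but the conclusion is unchanged.

```python
import math

# Back-of-the-envelope check of the 6-sigma argument, under the simple
# parametric model the text criticizes: daily returns independent and
# normally distributed. Day-count conventions here are assumptions.

p_day = 0.5 * math.erfc(6 / math.sqrt(2))   # P(one-day return below -6 sigma)
days = 84 * 250                             # assumed trading days in 84 years

# Probability of at least one 6-sigma crash over the whole period
p_at_least_one = 1.0 - (1.0 - p_day) ** days

# Probability of exactly nine crashes over the period (binomial)
p_nine = math.comb(days, 9) * p_day**9 * (1.0 - p_day) ** (days - 9)

print(f"one-day probability:   {p_day:.2e}")          # about 1e-9
print(f"P(>=1 crash in 84y):   {p_at_least_one:.2e}") # about 2e-5, i.e., 0.002%
print(f"P(9 crashes in 84y):   {p_nine:.2e}")         # vanishingly small
```

Against the nine 6-sigma crashes actually observed, the model assigns such events an astronomically small probability, which is the text's point: the normal model grossly underestimates tail risk.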
In situations such as these we must go beyond the history of the risk factors in order to generate future events that reasonably span the range of possible outcomes. One way to add to the history of a market is to use the history for similar situations in different markets as a proxy.

During World War II, a team of three eminent professors was asked to make long-range weather forecasts for the Air Force. It did not take them long to realize that they had no idea how to go about this, but one of them remembered that each year the Farmer’s Almanac published detailed and somewhat reliable predictions of the weather for the upcoming year. Unfortunately, when the professors asked how the Almanac made the weather predictions, the publisher refused to say!

Rebuffed but inspired, the professors set about to reverse engineer the process and find out how the Almanac’s predictions were made. After some effort, they were convinced they had figured it out: to generate a forecast for, say, the year 2001, the summer of 1954 was combined with the fall of 1973, the winter of 1937 and the spring of 1982. By combining events that had actually occurred, they were able to generate a plausible forecast for 2001. Bootstrapping was born when numerous such forecasts were generated randomly, creating a scenario set of weather forecasts.

Scenario proxies and bootstrapping

If standard historical methods would not have captured the exchange risk in the Canadian dollar in March 1998, what methods might have been used to determine the six-month risk in the CAD/USD exchange rate? Novel scenarios on future Canadian dollar rates for August and September 1998 could have been extracted from an AUD/USD time series. Using the AUD/USD exchange rate as a proxy for the scenarios on the CAD/USD rate could have been justified on the basis of perceived similarities between the Australian and Canadian macroeconomic conditions at that time. Several times over the last decade, for example in 1987, 1989 and 1993, the exchange rate on the Australian dollar dropped by 10% or more within a few months. Scenarios on the AUD/USD exchange rate would have provided a sensible forward-looking view of future Canadian dollar behaviour, including a Canadian dollar scenario describing a drop to 0.633 USD or below.

A useful way to address the sentiment or economic environment of a given market is to look for history in a different market or region that may have gone through similar situations in the past. We then take these conditions and apply them today as scenarios on the desired market. Such scenarios are generally called proxies.

Consider Brazil as another example. Early in 1999, market risk managers in Brazil asked, “How can we calculate Value-at-Risk figures for our portfolios when we don’t have any historic data?” (Locke 1999). One possibility would have been to consider the market turbulence caused by the devaluation of the ruble in August 1998 as a proxy for how events in Brazil might unfold should the real be allowed to float (Yung 1999b).

Bootstrapping can be performed in more sophisticated ways to obtain conditional scenarios. For example, high volatility scenarios can be generated by sampling repeatedly from a history filtered to exclude periods of low volatility.

Bootstrapping can also be used in conjunction with scenario proxies. Previous markets that have enjoyed exceptional growth, such as Nokia A stock (an emerging industry) and the gold rush in the 1970s, can serve as proxies for upside scenarios to determine the risk in a position in a new Internet stock (Yung 1999c). The 1929 stock market crash that precipitated the Great Depression and Black Monday of October 1987 can serve as proxies for downside scenarios. Bootstrapping can then be used to scramble and rearrange daily returns from the applicable proxy periods. Further randomization can be achieved by randomly selecting the starting point for a scenario series.

Model-based scenario generation

Research in computational finance over the last decade has resulted in powerful models for scenario generation applicable to risk management. By adding theory and structure to
Parallels to Russia

In August 1998, Russia announced the devaluation of its currency and a temporary default on its government debt. Russian stocks fell by more than 35% while the ruble tumbled by more than 50%.

[Figure: Changes in key rates (August 14–28, 1998) — Brady spreads and the US 30-year Treasury rate]
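The proxy-and-bootstrapping approach described above can be sketched as follows. This is a minimal illustration only: the “proxy” return series is randomly generated stand-in data, and the function name and parameters are invented for the example, not taken from any Algorithmics implementation.

```python
import numpy as np

# Sketch of scenario generation by bootstrapping a proxy history: daily
# returns drawn with replacement from a stress period in one market are
# reassembled into forward-looking scenario paths for another market.
# The proxy returns below are stand-in data, not real market history.

rng = np.random.default_rng(42)

# Stand-in for daily log returns observed during a proxy stress period
# (e.g., a crisis window in a comparable market)
proxy_returns = rng.normal(-0.002, 0.03, size=120)

def bootstrap_scenarios(returns, n_scenarios, horizon, spot):
    """Resample daily returns with replacement to build scenario price paths."""
    idx = rng.integers(0, len(returns), size=(n_scenarios, horizon))
    paths = spot * np.exp(np.cumsum(returns[idx], axis=1))
    return paths

# 500 scenarios over 100 days, starting from a spot rate of 0.708
paths = bootstrap_scenarios(proxy_returns, n_scenarios=500, horizon=100, spot=0.708)

# Each row is one forward-looking scenario for the exchange rate
print(f"shape: {paths.shape}, worst terminal level: {paths[:, -1].min():.3f}")
```

Further randomization along the lines described in the text (e.g., random starting points within the proxy window, or filtering the history to keep only high-volatility days) amounts to changing how `idx` is drawn.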
the way scenarios are generated, these models allow us to

• extend what has been observed, using all the historical and “current event” information available
• generate a large number of scenarios in the sample that can lead to reliable statistical risk measures
• move from a mere description of what could happen to an explicit explanation of correlations and causality that explains the sources of risk
• construct tools for forecasting and for the computation of conditional events.

Models of the evolution of financial risk factors are used both for pricing derivative securities and for risk management. However, the models traditionally built for derivatives pricing may not be appropriate for risk management for various reasons:

• Risk management scenarios must use the true probabilities of events and are calibrated using history and current event information. In contrast, pricing models generally use what are commonly called risk-neutral probabilities. This is a mathematical convenience that allows the direct incorporation of a no-arbitrage condition while expressing the price of a derivative as an expectation. These models are calibrated to fit currently available prices of traded securities.

• Risk management scenario generation models are generally high dimensional. Since the emphasis is on understanding the joint behaviour of many instruments in a portfolio, they must incorporate the relationships between all factors. Pricing models, on the other hand, generally use a small number of factors; for example, most implementations of term structure models use one or two factors. Also, since they are used to price single securities, these models usually do not incorporate the relationships to other factors.

• Risk management focuses generally on the tails of portfolio distributions; this is where the action is for a risk manager! Hence, a strong emphasis on modeling tails is required. Pricing derivatives, on the other hand, requires the computation of expectations of the distributions and may be less concerned with extreme events.

As with pricing models, however, risk management scenario generation models must be financially consistent and build on existing mathematical and financial theory. For example, a model for interest rate scenarios that permits negative forward rates would permit static arbitrage and could result in unreasonable MtF values for certain securities.

A scenario generation model consists of three separate parts:

1. The risk-factor evolution model is a mathematical model that describes the joint evolution of the risk factors in the future. The model is chosen for its mathematical tractability, the underlying financial theory, its asymptotic properties and its ability to explain causality, as well as other scientific and aesthetic properties of good modeling.

2. The model calibration method refers to the methodology used to estimate the parameters of the model. Generally, models are calibrated using historical information, incorporating current events and using the experience of the modeler.

3. The sampling methodology is used to obtain random samples from the model. The objective of scenario generation in risk management is not to provide a single forecast of the risk factor levels in the future. Rather, we require methodologies to sample various scenarios, generated according to the evolution model, and to assign probabilities to them. Thus, at the heart of model-based scenarios is the generation of random numbers. Note that, in addition to sampling randomly from a model, it is also possible to sample specific likely scenarios, extreme events and scenarios conditional on the realization of a small number of variables.

The most general method for sampling a large number of random scenarios from a model is the Monte Carlo method. Monte Carlo simulation is
Mark the position to future

The three-year period following each historic market peak serves as the proxy for downside risk, while the one-year period that precedes each historic market peak serves as the proxy for further growth scenarios.

Downside scenarios are bootstrapped using daily returns generated from the downside proxies, upside scenarios are bootstrapped using daily returns generated from the upside proxies, and volatility scenarios are bootstrapped by combining returns from the upside and downside proxies. This process creates a set of 200 scenarios over a horizon of 100 days.

Assuming a hold strategy, the Mark-to-Future values of one share of Amazon.com indicate that the stock can potentially gain 120% in the next 100 days. However, it can also lose more than 85% over the same period. This translates into a loss of more than 587 million USD in just five months on a position of five million shares!

Would a trading strategy mitigate risk? Five million shares represent 10% of Amazon.com’s total market float. The average daily volume for Amazon.com is only seven million shares; it is common for daily trading volume to be as low as 10% of average volume. This lack of liquidity poses additional concerns—instantaneous disposition of the entire holdings is not likely. A stress test model must forecast the risk specific to the Amazon.com shares, as well as the risks of trading in this illiquid market.

The impact of liquidity

To account for a light trading day in which daily volume is limited to 700,000 shares, the strategy is conditioned on the movement of the Amazon.com share price. Whenever Amazon.com falls more than 10% below its current value, the position size is decreased by 250,000 shares. When the price drops more than 20% below its current value, the disposition of shares is increased to 500,000. For simplicity, proceeds from the sale remain as uninvested cash.

The dynamic trading strategy yields benefits. Under the worst downside scenario, the hold strategy forecasts a potential loss of 85% over the next 100 days. By locking in some of the gains, potential losses are limited to 28% under normal liquidity conditions and 38% in a highly illiquid market.
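The liquidity-conditioned strategy in this example can be sketched along one scenario path. The thresholds and disposition sizes follow my reading of the text (sell 250,000 shares a day once the price falls 10% below its starting value, 500,000 a day once it falls 20% below); the price path itself is an invented downside scenario, and the function name is hypothetical.

```python
# Sketch of a dynamic, liquidity-conditioned trading strategy, marked to
# future along one price path. Thresholds and sizes follow the example in
# the text; the declining price path is invented for illustration.

def run_strategy(prices, shares=5_000_000, spot=None):
    """Value a position along one path, selling into weakness."""
    spot = spot if spot is not None else prices[0]
    cash = 0.0
    for p in prices:
        if shares > 0:
            if p < 0.80 * spot:              # more than 20% below start
                sell = min(500_000, shares)
            elif p < 0.90 * spot:            # more than 10% below start
                sell = min(250_000, shares)
            else:
                sell = 0
            shares -= sell
            cash += sell * p                 # proceeds stay as uninvested cash
    return shares * prices[-1] + cash

# An invented 100-day downside scenario: a steady 1% daily decline
path = [100.0 * 0.99**t for t in range(100)]
hold_value = 5_000_000 * path[-1]
dynamic_value = run_strategy(path)
print(f"hold: {hold_value:,.0f}  dynamic: {dynamic_value:,.0f}")
```

Run over the full MtF scenario set rather than a single path, this kind of conditional strategy is what produces the loss reduction quoted in the example; a light-trading-day variant would simply cap the daily `sell` amount further.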
the most widely used methodology but, generally, it is not computationally efficient. For example, it is not uncommon for the portfolio of a bank or an insurance company to contain several hundred thousand positions, including substantial volumes of derivative products such as swaps, caps and floors, swaptions and mortgage-backed securities. A full Monte Carlo simulation of a large and complex portfolio containing such instruments is computationally expensive, and may not even be achievable within a reasonable time.

Various computational methodologies have been developed to make the Monte Carlo method, which uses pseudo-random numbers, more efficient for various applications. These methodologies, commonly referred to as variance reduction methods, include antithetic variables, control variates, importance sampling, stratified sampling and quasi Monte Carlo methods.

Most of the financial literature has focused on the application of variance reduction methods for pricing (e.g., Boyle 1977, Boyle et al. 1997, Ackworth et al. 1997, Broadie and Glasserman 1997, Caflisch et al. 1997, Paskov and Traub 1995, Schoenmakers and Heemink 1997). Some of these tools do not provide advantages for risk management in practice because of the high dimensionality of the problems and the required focus on the tails rather than on the expected value. However, several methods have proven advantageous. Applications of stratified sampling, control variates, importance sampling and quasi Monte Carlo methods have demonstrated performance improvements in a number of risk management applications (e.g., Jamshidian and Zhu 1997, Shaw 1997, Kreinin et al. 1998a, 1998b, Cardenas et al. 1999). For example, Kreinin et al. (1998a, 1998b) show that quasi Monte Carlo methods, in conjunction with principal component analysis, compute VaR an order of magnitude faster than standard Monte Carlo simulation. An example of quasi Monte Carlo methods coupled with principal component analysis is given in the sidebar.

The introduction of computers in the late 1940s stimulated the development of mathematical methods that employ random numbers. A mathematician at Los Alamos, Stanislav Ulam, was one of the first to realize that they would allow computations on a sufficiently large scale to be useful for engineers and scientists designing nuclear bombs. In 1947, Ulam and Princeton mathematician, John von Neumann, proposed the use of computers to apply a statistical approach to physics calculations. The story goes that the name "Monte Carlo" was given because "Ulam had an uncle who would often borrow money from relatives because he just had to go to Monte Carlo to visit their famous roulette wheels" (Peterson 1998). The name is now commonly used to refer to any mathematical method that employs statistical sampling and random numbers.

Risk factor evolution models

If the simulation requires only one step, as may be the case in many market risk and portfolio management applications, then risk factors can be modeled as random variables. The model describes the future distribution of those random variables at the horizon.

A model of the joint distribution of the random variables must realistically describe

• the marginal distribution of each variable
• the co-dependence structure of all variables.

An efficient and flexible method of modeling the joint distribution of a large number of financial variables is to model the marginal distribution of each variable separately and then construct a model of the co-dependence structure (Carillo et al. 2000).

Traditional applications assume that the joint distribution of financial returns is described by a multi-variate normal or log-normal distribution. This means that the marginal distribution of all variables is normal and the co-dependence structure is defined in its entirety by the correlation matrix of all the variables. These distributions are also easy to estimate from historical data. Some advanced estimation methods that have proven quite popular in practice include exponentially weighted moving averages (Longerstaey and Zangari 1996) and GARCH (see, for example, Bollerslev 1986 and Engle and Mezrich 1995).
over the total number of points, over all possible subsets. Sobol points cover the hyper-cube more uniformly than do the randomly generated points. Thus, the discrepancy of the Sobol points is lower than that of the pseudo-random points.

Advantages and disadvantages

The main advantages of standard MC methods are

• they are generally applicable to all problems
• their rate of convergence is independent of the dimensionality of the risk factor space
• they are very popular and their properties are well known
• they yield probabilistic errors and a priori bounds on VaR estimates.

The main disadvantages of MC methods are

• they use pseudo-random number generators that tend to generate clusters of points
• they do not explicitly exploit particular features of the problems
• their rate of convergence is slow.

The advantages of QMC methods are

• they are based on sampling techniques that generate points evenly within the region and avoid the clustering generally associated with MC methods
• they have been well-tested in the non-financial and financial literature (for derivatives pricing).

The main disadvantages of QMC methods are

• their lack of generality when compared to MC methods means that their effectiveness may be largely dependent on the problem, and extensive testing is required
• they do not yield probabilistic errors or a priori bounds on VaR estimates
• their rate of convergence depends on the dimensionality of the risk factor space, d
• they may be inefficient for problems in very large dimensions.
VaR for a US investor using different models

Note that on two occasions the returns from the observation period fall below –5%. Although the MN model captures the tail fairly well, it may still underestimate the probability of an extreme event because the marginal distribution in the tail is still approximately normal. In contrast to the MN model, the N+W model gives about eight times and almost 50 times higher probability to moves of –4% and –5%, respectively. Under the N+W model, a move of –6% (which corresponds to more than …

Tails of the distributions
However, it is widely recognized that, in many cases, the normal distribution does not describe financial returns accurately and that their joint behaviour is also more complex than can be described by the correlations. To address this problem, a number of models have been developed and tested that capture the nature of financial returns better and describe the tails of their distributions more accurately. These models include multi-factor t-distributions, mixtures of normals, lambda distributions, distributions from extreme value theory summed with the normal distribution and non-parametric marginal distributions (see, for example, Hull and White 1998a, Shaw 1997, Carillo et al. 2000, Embrecht et al. 1998). Distribution models that capture fat tails can be effectively incorporated into a MtF simulation framework and lead to more accurate risk measurement and management (see page 28). In summary, these models

• permit more accurate risk measurement and management than commonly used normal distributions
• provide a better statistical description of the tails of the distributions and of extreme events with low probability than pure historical simulation methods.

Most traditional applications use a correlation matrix to describe the co-dependence structure of the risk factors. Correlations fully describe the co-dependence structure of random variables when the distribution is normal, or more generally when the distribution is elliptic. (Another example of an elliptic distribution is the t-distribution.) However, when the distribution is not elliptic, there are better descriptors of the co-dependence, such as rank correlations and copulas (Embrecht et al. 1999). Copulas—probably the most general way of modeling the joint behaviour of random variables—have been well-researched in recent years. The description of these tools is beyond the scope of this document.

In contrast to the single-step problem, a multi-step MtF analysis is required when measuring counterparty credit exposures, for ALM, when analyzing dynamic portfolios, etc. (see Table 1.1 on page 11). In this case, the risk factors are modeled as random processes which describe the evolution of a random variable through time. Models of random processes have been widely used in finance and, specifically, in derivatives pricing. For example, the Black-Scholes model for option pricing (Black and Scholes 1973) assumes that the stock prices follow a process called geometric Brownian motion (GBM).

As with normal distributions in the single-step case, GBM models are commonly used for both pricing and risk management because of their simplicity and mathematical tractability, though they do not describe real processes accurately. To describe the evolution of interest rates, more sophisticated models that account for mean reversion are used. In mean-reverting models, when the level of an interest rate becomes too high or too low, a force pulls it back to its mean, or equilibrium, level. Models with mean reversion have been used widely for pricing interest rate derivatives (e.g., Cox et al. 1985, Hull and White 1990, Black and Karasinski 1991, Black et al. 1990). For risk management purposes, multi-factor models are generally required to capture the joint evolution of term structures (see page 28).

In general, different types of assets have different characteristics and hence a combination of processes is required to model accurately their evolution into the future. For example, in addition to GBM and mean-reverting diffusion processes, one can add processes with jumps and (reflecting) barriers. For example, a complex model is required to simulate electricity spot prices (see page 30).

The models discussed above are generally described in terms of continuous time models and the equations used to characterize them are called stochastic differential equations (e.g., Karatzas and Shreve 1994). There are also discrete time models, mostly from the econometrics literature (e.g., Kim et al. 1999). These models are commonly used to include auto-correlations and other non-Markovian structures.

Continuous time models have been used in finance for the last 30 years (Sundaresan 2000). They are generally mathematically tractable, and
The dependence on transmission lines to convey electricity further limits the available supply of electricity between regions. One region can transfer electricity to another region with which it is connected, but an excess of electricity cannot be rerouted along lines that are already working at capacity. Therefore, electricity markets are regional.

Daily electricity spot prices, February 6, 1997–June 9, 1999

∆(ln(S_t)) = b(t, S_t) (ln(S_∞) − ln(S_t)) ∆t + σ(t, S_t) ∆W_t

… reasonably well.
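The two families of processes described above can be sketched with a simple Euler discretization. The sketch below is illustrative only: the drift, volatility and mean-reversion parameters are invented for the example and do not come from any calibrated model in this document.

```python
import math
import random

def gbm_path(s0, mu, sigma, dt, n_steps, rng):
    # Geometric Brownian motion: the price evolves multiplicatively, so
    # ln S receives normal increments with mean (mu - sigma^2 / 2) * dt.
    path = [s0]
    for _ in range(n_steps):
        z = rng.gauss(0.0, 1.0)
        path.append(path[-1] * math.exp((mu - 0.5 * sigma ** 2) * dt
                                        + sigma * math.sqrt(dt) * z))
    return path

def mean_reverting_path(r0, r_bar, kappa, sigma, dt, n_steps, rng):
    # Mean-reverting dynamics: when r drifts away from its equilibrium
    # level r_bar, the drift term kappa * (r_bar - r) pulls it back.
    path = [r0]
    for _ in range(n_steps):
        z = rng.gauss(0.0, 1.0)
        r = path[-1]
        path.append(r + kappa * (r_bar - r) * dt + sigma * math.sqrt(dt) * z)
    return path

rng = random.Random(0)
stock = gbm_path(100.0, 0.08, 0.25, 1 / 252, 252, rng)       # one year, daily steps
rate = mean_reverting_path(0.07, 0.05, 2.0, 0.01, 1 / 252, 252, rng)
```

In a multi-step MtF simulation, paths such as these supply the risk factor levels u_kjt realized under each scenario and time step.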
Step 2: Defining the basis instruments

The MtF Cube consists of a set of N MtF tables—one for each instrument—generated by simulating a given basis instrument over S scenarios and T time steps. Basis instruments may represent actual financial products such as stocks and bonds, synthetic instruments such as a series of zero coupon bonds or, in some cases, abstract indices such as the inputs to a multi-factor model.

The choice of basis instruments to be simulated in the pre-Cube stage of the MtF framework depends on the overall business application and may involve trade-offs between the magnitude of data storage (the dimensions of the MtF Cube) and pricing accuracy. Nonetheless, the dimensions of the MtF Cube can often be significantly reduced with little or no loss of pricing accuracy by the judicious choice of basis instruments.

In most applications, the basis instruments consist of a mixture of actual products representing exchange-traded securities (or OTC securities held in inventory) and synthetic instruments which may be mapped into a large volume of OTC securities. An understanding of how risk factors have an impact on product valuation aids in the selection of basis instruments.

The MtF value, m_ijt, of basis instrument i, (i = 1,…,N), under scenario j at time step t is a function, f(•), of the levels of specified risk factors and can be represented as

m_ijt = f(u_1jt, u_2jt, u_3jt, …, u_Kjt)

where u_kjt, (k = 1,…,K), is the level of risk factor k under scenario j at time step t.

When the basis instrument represents a synthetic instrument or an abstract index, it is often a function of a single risk factor k, m_ijt = f(u_kjt). In this case, the basis instrument itself can be mapped one-to-one to a risk factor.

The key benefit derived from the use of basis instruments is the ability to pre-compute most of the results of the computations required to calculate portfolio MtF values. The MtF value of a portfolio containing a single financial product is determined by a post-processing function, g(•), applied to the pre-computed MtF values of the basis instruments. In general, when the financial product is itself defined as a basis instrument in
the MtF Cube, the MtF value of the portfolio is simply the MtF value of the basis instrument (and g(•) is simply an identity). In the case when the financial product is not an instrument defined in the MtF Cube, the MtF value of the portfolio is calculated as a function, g(•), of the MtF values of one or more basis instruments.

As illustrated by the schema in Figure 2.1, a mapping sequence defines the functional relationship between risk factors, basis instruments, financial products and portfolios. In order to select the appropriate basis instruments to be contained in the MtF Cube, a mapping sequence for each financial product must be defined. The next sections describe some of the modeling issues involved in defining the mapping sequence from risk factor to financial product. In Step 4, the post-processing functions g(•) are applied to basis instrument MtF values to produce portfolio MtF values.

Modeling considerations

The choice of basis instruments depends upon the mapping sequences that result from the modeling choices made prior to generating the MtF Cube. As an example, consider mapping sequence A, illustrated in Figure 2.2, chosen to determine MtF values for a non-dividend paying stock and a forward contract (expiring at time t) on that stock.

In this mapping sequence, the MtF value of the stock under a scenario j at time step t is simply that of the stock basis instrument. The MtF value of the stock basis instrument is modeled as a simple realization of the stock risk factor under each scenario and time step. The MtF value of the forward contract is a straightforward function of the MtF values of two basis instruments: the stock and a zero coupon bond.

Consider a second mapping sequence, illustrated in Figure 2.3. In this case, the stock is valued by a generic N-factor equity model which determines the MtF value of the stock as a function of its factor loadings with respect to the individual equity factors. For example, one equity factor could be a market index; others could be industry factors.

Under this sequence, the MtF value of the stock is, once again, simply that of the stock basis instrument, but now the MtF value of the basis instrument is based upon the N-factor equity model. The difference in modeling between sequence A and sequence B occurs strictly in the pre-Cube stage.

Finally, consider a third mapping sequence, illustrated in Figure 2.4. In this case, the basis
instruments defining the MtF Cube are the abstract equity factors themselves. The MtF value of the stock must, therefore, be determined in a secondary post-processing step as a function of the MtF values of the equity factor basis instruments.

In this sequence, the MtF value of the stock is determined as a function of the MtF values of the N equity-factor basis instruments. Unlike sequences A and B, sequence C involves a modeling change in the post-Cube stage as well as in the pre-Cube stage. In other words, the post-Cube mapping of basis instruments into portfolios in sequence C requires the mapping of basis instruments into individual financial products in an intermediate step.

The benefit of the intermediate step is that the MtF Cube in this case need never incorporate more than the N MtF tables associated with each of the N basis instruments. This is true because the MtF values of all stocks are functions of the MtF values of the same N basis instruments, regardless of the number of stocks in a portfolio.

While it is interesting to compare alternative modeling techniques, the choice of mapping sequence often involves more pragmatic trade-offs. For example, mapping sequence A may be preferred when full explanatory power is desired for future equity distributions (perhaps based upon historical realizations of stock prices). However, this approach may prove deficient when newly issued securities (that possess no price history) are modeled.

In contrast, mapping sequence C may be preferred for the purpose of minimizing the dimensions of the MtF Cube as well as for valuing equities with no previous price history. However, this approach may be less appropriate when it is important to consider the specific risks associated with individual equities. Note that specific risks can still be addressed in this particular mapping sequence through more complex post-processing that directly incorporates the specific risk of the equity as an additional input. The modeling of specific risk is discussed in Step 5.

Mapping sequence B may emerge as the preferred alternative in this case. Specific risk and newly issued equities may be addressed easily using this approach, since each security is simulated uniquely using the factor model, thereby allowing for the
straightforward incorporation of specific risk. The trade-off is that this sequence has implications for the size of the MtF Cube, which now must incorporate MtF tables for each individual equity.

Settlement and reinvestment considerations

The choice of appropriate basis instruments is also influenced by the consideration of cash flow settlement and reinvestment over time. In order to capture the distribution over time of the total return of a financial product, any cash flow dispersal must also be incorporated in the pre-computed MtF Cube. This can only be achieved by defining an all-in basis instrument that consists of a standard basis instrument and a settlement account in which any cash settled by the basis instrument through time is reinvested. The all-in instrument captures the total return of a financial product through time.

As an example, consider a stock that pays dividends at time steps t = 1 and t = 3. An appropriate risk/reward assessment of this security at the time horizon, T = 3, requires the inclusion of any dividends settled (and reinvested) prior to that time. An all-in stock instrument represents a portfolio consisting of the basis stock instrument along with a settlement account that captures and reinvests dividends paid at time steps t = 1 and t = 3. Table 2.1 summarizes the MtF values of the all-in stock basis instrument across a given scenario path over the period [t = 0, T = 3]. The MtF value of the all-in instrument is the sum of the MtF values of the standard instrument and the cash account.

As a second example, consider an all-in zero coupon instrument consisting of a standard zero coupon bond (with a notional of one and maturity at t = 2) along with a settlement account which captures and reinvests the notional amount paid at the maturity date. Table 2.2 summarizes the MtF values of the all-in basis instrument across a given scenario path for the time horizon, T = 4. The MtF value of the all-in instrument is the sum of the MtF values of the standard instrument and the cash account.

Figure 2.5 provides an illustration of MtF values for a one-unit investment in the all-in zero coupon instrument maturing at t = 2, simulated across S = 100 scenario paths over the time horizon, T = 4.

Figure 2.5: MtF values of all-in t = 2 zero coupon instrument across scenarios

In comparison, Figure 2.6 and Figure 2.7 illustrate the MtF values simulated across S = 100 scenarios over the time horizon, T = 4, of one-unit investments in two additional all-in zero coupon instruments (with maturities of t = 0 and t = 4, respectively).

Note that the t = 4 all-in instrument poses only price risk over the time period 0 ≤ t < 4, during which its MtF value is sensitive to interest rate
Figure 3.1: Generating a Basis MtF Cube (risk factors simulated over S scenarios and T time steps, producing N MtF tables)
Once the scenarios and time steps have been defined, the cell content of a MtF table is generated by a pricing model that is attached to the applicable basis instrument. A pricing model represents a string of pricing functions associated with specified output attributes; pricing functions produce MtF values as well as other measures such as MtF deltas and MtF durations. The simulation of a basis instrument over the defined scenarios and time steps produces MtF measures for the specified output attributes that are functions of the realized risk factor levels.

The MtF table associated with basis instrument i, M_i, is of dimension S x T, where S is the number of scenarios and T is the number of time steps. The cells of M_i are populated by simulating the values of the basis instrument i over each of the j = 1,…,S scenarios and t = 1,…,T time steps. The MtF Cube, M, consists of N MtF tables associated with each of the i = 1,…,N basis instruments. Figure 3.1 illustrates the process for generating a Basis MtF Cube.
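A minimal sketch of this process: each basis instrument's pricing function is applied to the risk factor levels realized under every scenario and time step, producing one S x T table per instrument. The pricing functions and scenario values below are hypothetical, chosen only to make the shapes concrete.

```python
def generate_mtf_cube(pricing_fns, risk_factors):
    # risk_factors[j][t] holds the K risk factor levels under scenario j
    # at time step t; pricing_fns[i] is the pricing model attached to
    # basis instrument i, mapping factor levels to an MtF value m_ijt.
    cube = []
    for f in pricing_fns:  # one S x T MtF table per basis instrument
        table = [[f(u) for u in scenario] for scenario in risk_factors]
        cube.append(table)
    return cube

# Hypothetical example: two basis instruments priced off a single equity
# risk factor u[0]: the stock itself, and a stylized forward-like payoff.
pricing_fns = [
    lambda u: u[0],          # stock: mapped one-to-one to its risk factor
    lambda u: u[0] - 100.0,  # stylized payoff against a fixed level of 100
]
risk_factors = [
    [[100.0], [105.0], [110.0]],   # scenario 1: three time steps
    [[100.0], [95.0], [90.0]],     # scenario 2
]
cube = generate_mtf_cube(pricing_fns, risk_factors)
```

Here `cube[i][j][t]` plays the role of m_ijt; a real implementation would attach full pricing models and produce other output attributes (MtF deltas, MtF durations) in the same pass.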
Step 4: Producing the Portfolio
MtF Table
Key to the MtF framework is the premise that a MtF Cube is generated independently
of portfolio holdings. Any portfolio or portfolio regime (strategy) can be represented by
mapping the MtF Cube into static or dynamically changing portfolio holdings. The
regimes may be predetermined, such as a buy-and-hold strategy, or they may be
conditional upon the scenario-based MtF values, such as a delta-hedging strategy.
Conditional regimes are used to capture liquidity risk. This chapter discusses how
multiple portfolio MtF tables are produced from a single MtF Cube.
In Step 4 of the MtF framework, the MtF Cube is mapped into multiple portfolios or portfolio regimes over time. The result of this mapping is the creation of a portfolio MtF table containing the MtF values of a portfolio regime simulated across scenarios and time steps. The mapping incorporates the holdings in the instruments, which may be dynamic across scenarios and through time. Thus, a portfolio MtF table captures the MtF values of a portfolio regime—such as a roll-over strategy or an immunization strategy—in a manner similar to that used to capture the MtF values of a static portfolio. This mapping is completely independent of the generation of the MtF Cube and occurs strictly as a post-processing step in the post-Cube stage.

A portfolio MtF table, M_R, associated with portfolio regime R has dimensions S x T, where S is the number of scenarios and T is the number of time steps. Each cell in the table contains the MtF value, m_jt^R, of portfolio regime R under scenario j at time step t. A value in the portfolio MtF table is a function g(•) of the values in the instrument MtF tables contained in the MtF Cube. In the case where actual financial products are contained in the MtF Cube, the values in the portfolio MtF table are a linear combination of the values in the instrument MtF tables, weighted by portfolio holdings in each scenario and time step. In this case, the portfolio MtF value in scenario j and time step t is

m_jt^R = Σ(i = 1 to N) x_ijt^R · m_ijt

where x_ijt^R is the holding of instrument i under portfolio regime R for scenario j at time step t.

The positions, x_ijt^R, are only constant for all scenarios and time steps for a static buy-and-hold portfolio regime. In the general case, position size may be time and scenario dependent, thus enabling the risk/reward assessment of dynamically changing portfolios. The fourth step of the MtF framework maps basis instruments into the portfolio regimes, as illustrated by the mapping sequence in Figure 4.1. For now, we assume the basis instruments are actual financial products.
It is useful to identify two general types of strategies for mapping the MtF Cube into portfolio holdings: predetermined regimes and conditional regimes.

Predetermined regimes

In a predetermined regime, portfolio strategies are independent of the contents of the MtF Cube, but may, nonetheless, change over time in a predetermined fashion. Under these regimes, the values in a portfolio MtF table are a straightforward linear combination of fixed quantities of the values of financial products. A static buy-and-hold strategy is simply a special case of a predetermined regime describing a schedule of portfolio holdings that remains constant for all scenarios and time steps.

As a slightly more complex example, consider the modeling of liquidity risk. In a standard approach, the change in mark-to-market value—a portfolio's unrealized profit and loss—serves as a proxy for its realized profit and loss or the actual liquidation of portfolio holdings. A liquidity-adjusted risk measure is often estimated by applying an ad hoc addition to a non-adjusted risk measure. The liquidity risk measure assessed in a standard risk framework does not incorporate the explicit passage of time nor allow for dynamically changing portfolio holdings. In contrast, the MtF framework allows for the explicit liquidation of holdings through the specification of a regime that liquidates the portfolio over a given time horizon.

For example, consider a position in a single bond. Differing portfolio liquidation assumptions can be assessed by the application of different predetermined portfolio regimes. The portfolio mapping sequence associated with two bond liquidation regimes is illustrated in Figure 4.2.

First, assume that the given bond position may be fully liquidated in a single day. In this case, the distribution of portfolio MtF values over the first time step represents the one-day market risk of the bond position, assuming complete liquidation on that date. A predetermined portfolio regime captures this liquidation strategy through a schedule that reduces the holdings in the bond from 100% to 0% in one day.

Next, assume that the bond position must be liquidated proportionately over a 10-day time horizon. The effect of illiquidity is to lock in exposure to market risk until the position is closed. A full 10-day distribution of MtF values must be considered in order to capture this impact. A predetermined portfolio regime captures this liquidation strategy through a schedule that reduces the holdings in the bond at a rate of 10% each day for 10 days (see page 43). Note that by dynamically mapping the individual bonds into appropriate liquidation regimes for a common 10-day time horizon, the risk of both liquid and illiquid bonds may now be assessed within a consistent framework.

A liquidation risk measure must often account for the reinvestment of any cash settlement in
Figure 4.2: Liquidation regimes (Regime A: liquid; Regime B: illiquid)
The fall of Gold Fields (Share price = 100 on April 14, 1997)
• Infinite Liquidity: Market risk only. No impact on liquidity risk. Scenario: Market Risk. Liquidation: Instantaneous.
• Average Liquidity: Market risk with a reasonable amount of liquidity in the market. Scenario: Market Risk. Liquidation: Unconditional.
• Limited Liquidity: Market risk with reduced liquidity, simulating the impact of liquidity under volatile markets. Scenarios: Market Risk and High Volatility. Liquidation: Conditional.
• Negligible Liquidity: Market risk under the worst case liquidity scenarios, simulating the impact of liquidity under extreme situations. Scenarios: Market Risk and Low Trading Volume. Liquidation: Conditional.
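As an illustration only, liquidation regimes of this kind can be expressed as holdings schedules: a predetermined regime fixes the schedule in advance, while a conditional regime makes it depend on scenario values such as traded volume. The position size, daily volumes and 20% cap below are hypothetical.

```python
def proportional_schedule(position, n_days):
    # Predetermined regime: liquidate an equal fraction of the position
    # each day, independent of any scenario outcome.
    daily = position / n_days
    remaining, schedule = position, []
    for _ in range(n_days):
        remaining -= daily
        schedule.append(max(remaining, 0.0))
    return schedule

def volume_capped_schedule(position, daily_volumes, cap=0.20):
    # Conditional regime: on each day, sell at most `cap` of that day's
    # traded volume, so remaining holdings depend on the scenario's
    # volume path rather than on a fixed calendar.
    remaining, schedule = position, []
    for vol in daily_volumes:
        remaining = max(remaining - cap * vol, 0.0)
        schedule.append(remaining)
    return schedule

pred = proportional_schedule(1_000_000, 10)                     # 10% per day
cond = volume_capped_schedule(1_000_000, [400_000] * 10, 0.20)  # volume-capped
```

In the MtF framework, schedules like `pred` feed a predetermined regime directly, while `cond` would be recomputed per scenario from simulated volumes, making the holdings x_ijt scenario dependent.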
3. Conditional Liquidation: Sale of Gold Fields is restricted to 20% of each day's trading volume, assuming that, at this rate, shares can be sold at the quoted market price.
addition to the bond liquidation mechanism just described. As the settlement amount depends upon the bond MtF values under each scenario and time step, this represents an example of a conditional regime; conditional regimes are described in the next section.

As a second example of a predetermined regime, consider an attribution analysis for a portfolio over a given time period (see page 47). Standard risk/reward frameworks presume that the portfolio holdings do not change; the impact on the portfolio profit and loss due to market risk factor changes over this period is captured through the appropriate selection of scenarios for the market risk factor levels of the current and previous periods. In addition to these factors, however, in the MtF framework, the profit and loss impact due to time decay can be captured by explicitly incorporating the passage of time and the impact due to position change by applying different portfolio regimes. The portfolio mapping sequence associated with an attribution analysis is illustrated in Figure 4.3.

Conditional regimes

In a conditional regime, the portfolio strategies are dependent on the contents of the MtF Cube, M, or values that may be derived from the MtF Cube. Under these regimes, the quantity of holdings is a function of the MtF values for given scenarios and time steps. The portfolio holding, x_ijt^R, of instrument i under regime R for scenario j at time step t is defined by the function

x_ijt^R = g(M) = g(f(u_1jt, u_2jt, u_3jt, …, u_Kjt))

where g(•) represents a function that maps the contents of the MtF Cube into the portfolio holdings appropriate for each scenario and time step. Notice that portfolio holdings are a function of the MtF values contained in the MtF Cube which, in turn, are a deterministic function of the risk factor levels realized under each scenario and time step.

As an example, consider a straightforward Treasury bill roll-over strategy. At the maturity of a Treasury bill currently held in a portfolio, the proceeds from its notional amount are reinvested in a second Treasury bill. The actual quantity of the second Treasury bill purchased is a function of its MtF value at the maturity date of the current Treasury bill. At the maturity of the second Treasury bill, the proceeds are reinvested into a third Treasury bill in an amount that is a function of its MtF value at that second maturity date. The portfolio mapping sequence is illustrated in Figure 4.4.

Each scenario path through time dictates a different total return for the roll-over strategy as differing quantities of each Treasury bill will be purchased (see page 48).

As a second example, consider a delta-hedging regime applied to a portfolio consisting of an equity option, the underlying stock and a cash account. At specified dates, the portfolio is dynamically rebalanced to delta hedge the option. This is achieved by acquiring holdings in the stock (funded by the cash account) sufficient
Attribution Analysis

Consider the impact of risk factor and position changes between the previous day (t = 0) and today (t = 1) on a portfolio containing a single financial product (i = 1) with MtF values of m_1jt.

In addition to capturing the impact on profit and loss due to market risk factors, the MtF framework enables the capture of the impact on profit and loss due to time decay and position change. Time decay is modeled by explicitly incorporating the passage of time while position change is modeled by applying different portfolio regimes. In this example, two scenarios are utilized: the risk factor levels from the previous day (j = 0) and the risk factor levels from today (j = 1).

Individual portfolio MtF values are calculated based on the mapping of two position schedules associated with the change in position from t = 0 until t = 1. Portfolio regime A consists of a static portfolio based strictly on the previous day's positions, or x_1jt^A = x_1^A for all j and t. Portfolio regime B consists of a static portfolio based strictly on today's positions, or x_1jt^B = x_1^B for all j and t.

The mark-to-market value from the previous day, as well as the change in mark-to-market value due to changes in risk factors, time decay and position change, are appropriate for attribution or backtesting analysis. The impact on profit and loss of the attribution analysis is the difference between the portfolio MtF values under the two regimes and the portfolio mark-to-market value of the previous day.

Portfolio MtF values for an attribution analysis:

• Previous day mark-to-market (regime A): m_00^A = x_1^A · m_100
• Change due to market factors (regime A): m_10^A = x_1^A · m_110
• Change due to time decay (regime A): m_01^A = x_1^A · m_101
• Change due to position change (regime B): m_00^B = x_1^B · m_100
• Total profit and loss attribution (regime B): m_11^B = x_1^B · m_111
Regime A MtF
m A01 mA11
Instrument MtF ions
posit
e rd a y
’s mA00 mA10
Today m101 m111 Yest
osi
j = fac ’s
j = act ay’
1 tors
tion
k ay
k e rd
s m B01 mB11
ris Tod
ris est
f
Y
mB00 mB10
Attribution analysis schema
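The attribution table can be read as a recipe for a simple post-processing calculation. A minimal sketch in Python, in which all instrument values and positions are hypothetical; differences between the tabulated portfolio MtF values give the individual profit-and-loss attributions:

```python
# Sketch of the attribution table above with hypothetical numbers.
# m[(j, t)] is the instrument MtF value m_1jt for scenario j (0 = yesterday's
# risk factors, 1 = today's) and time step t (0 = yesterday, 1 = today).
m = {(0, 0): 100.0, (1, 0): 103.0, (0, 1): 99.5, (1, 1): 102.5}  # assumed
x_A, x_B = 50, 80   # yesterday's and today's positions (assumed)

mA = {jt: x_A * v for jt, v in m.items()}  # portfolio MtF under regime A
mB = {jt: x_B * v for jt, v in m.items()}  # portfolio MtF under regime B

pnl_market   = mA[(1, 0)] - mA[(0, 0)]     # change due to market factors
pnl_decay    = mA[(0, 1)] - mA[(0, 0)]     # change due to time decay
pnl_position = mB[(0, 0)] - mA[(0, 0)]     # change due to position change
pnl_total    = mB[(1, 1)] - mA[(0, 0)]     # total profit and loss
```

With these assumed inputs, the same instrument MtF values serve both regimes; only the position schedule changes.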
to make the overall portfolio delta neutral. The portfolio mapping sequence for this portfolio regime is illustrated in Figure 4.5.

Position holdings change as a function of the MtF values and the MtF deltas of each product. To execute a delta-hedging strategy, the MtF Cube must contain other MtF measures in addition to MtF values. Specifically, the MtF deltas for each of the financial products are required. The impact of delta hedging on the risk and reward can be captured through a conditional portfolio regime (see page 50). Additionally, changes in the MtF values of the cash account reveal the funding requirements in each scenario, thus determining the associated funding and liquidity risk. Finally, the effectiveness of different trading regimes, such as delta and delta-gamma hedging strategies, can be easily compared.

A conditional strategy is directly dependent on the contents of the MtF Cube and, thus, is a deterministic function of the scenarios that underlie it. A complex multi-stage, multi-period stochastic problem has become a single-stage, multi-period problem that can be readily analyzed.

[Figure: portfolio mapping sequence; instruments shown include the t = 1 T-Bill, t = 5 T-Bill, Equity Option and Cash Account.]

Mapping abstract basis instruments into portfolios

If the full set of financial products of interest is not contained in the pre-computed MtF Cube, an intermediate mapping is required to produce the portfolio MtF values. This intermediate mapping step determines a MtF value for a given financial product as a function (linear or non-linear) of the MtF values of the basis instruments contained in the MtF Cube. The MtF value of a portfolio containing this product is then simply a function of the product MtF value, which itself is a function of the MtF values of the basis instruments. The mapping sequence in this case includes the intermediate step, as illustrated in Figure 4.6.

There are two primary benefits of incorporating this intermediate step. The first is that all financial products of interest may not be known a priori; this step provides a mechanism for evaluating them in the post-Cube stage rather than requiring additional simulation in the pre-Cube stage. The second benefit is that the judicious choice of basis instruments and corresponding mapping rules can significantly reduce the dimensions of the MtF Cube and, accordingly, the computational requirements in the pre-Cube stage.

Consider a financial institution with an inventory of 100,000 vanilla swaps (of a common currency) and the desire to produce a risk report by simulating across 1,000 scenarios and over 10 time steps. Without this intermediate mapping step, the dimensions of the required MtF Cube are 100,000 instruments x 1,000 scenarios x 10 time steps. The number of cells contained in the MtF Cube and, hence, the number of revaluations required is one billion.

The inclusion of an intermediate step that maps abstract basis instruments into the 100,000 swaps reduces the computational requirements of this exercise significantly. If we assume that the swaps pay cash flows on 5,000 different cash flow dates (corresponding to daily payments over the next 20 years), then the number of basis instruments can be reduced to 5,000 without loss of accuracy.
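The dimension arithmetic of this example, together with a toy version of the intermediate mapping, can be sketched as follows; the basis-instrument names, MtF values and cash flow weights below are hypothetical:

```python
# Cube-size arithmetic from the swap example in the text.
full_cube = 100_000 * 1_000 * 10   # instruments x scenarios x time steps
reduced   = 5_000 * 1_000 * 10     # after mapping to 5,000 cash flow dates
print(full_cube, full_cube // reduced)   # 1000000000 20

# A toy intermediate mapping (names and numbers are hypothetical): a product's
# MtF value is a linear function of basis-instrument MtF values in the Cube.
basis_mtf = {"zcb_t2": 0.9704, "zcb_t4": 0.8869}   # basis-instrument MtF values
cash_flows = {"zcb_t2": 10.0, "zcb_t4": 110.0}     # product cash-flow weights
product_mtf = sum(w * basis_mtf[k] for k, w in cash_flows.items())
```

The portfolio MtF value is then just a weighted sum of such product MtF values, so nothing beyond the basis instruments ever needs full revaluation.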
Delta-Hedging Regime

A delta-hedging regime is applied to a portfolio consisting of positions in an equity option (i = 1), the underlying equity (i = 2) and a cash account (i = 3). At t = 0 the portfolio contains a single position in the option and a position in the underlying equity such that the overall portfolio is initially delta neutral. The cash account at t = 0 has a position of zero. Under this conditional delta-hedging regime, at t = 2 and t = 4 the portfolio is dynamically rebalanced to delta hedge the option. This is achieved by acquiring enough holdings in the equity (funded by the cash account) so that the overall portfolio becomes delta neutral again.

Prior to the first rebalancing date (t = 2), the MtF value of the portfolio can be represented as

$m^R_{jt} = x^R_{1jt} \cdot m^R_{1jt} + x^R_{2jt} \cdot m^R_{2jt} = m^R_{1jt} - \Delta_{1j0} \cdot m^R_{2jt}$, for $t \le 2$.

From the first rebalancing date (t = 2) until just prior to the second rebalancing date (t < 4), the MtF value of the portfolio can be represented as

$m^R_{jt} = x^R_{1jt} \cdot m^R_{1jt} + x^R_{2jt} \cdot m^R_{2jt} + x^R_{3jt} \cdot m^R_{3jt} = m^R_{1jt} - \Delta_{1j2} \cdot m^R_{2jt} + (\Delta_{1j2} - \Delta_{1j0}) \cdot \frac{m^R_{2j2}}{m^R_{3j2}} \cdot m^R_{3jt}$, for $2 \le t < 4$.

From the second rebalancing date (t = 4) onward, the MtF value of the dynamically rebalanced portfolio can be represented as

$m^R_{jt} = x^R_{1jt} \cdot m^R_{1jt} + x^R_{2jt} \cdot m^R_{2jt} + x^R_{3jt} \cdot m^R_{3jt} = m^R_{1jt} - \Delta_{1j4} \cdot m^R_{2jt} + \Bigl( (\Delta_{1j2} - \Delta_{1j0}) \cdot \frac{m^R_{2j2}}{m^R_{3j2}} + (\Delta_{1j4} - \Delta_{1j2}) \cdot \frac{m^R_{2j4}}{m^R_{3j4}} \Bigr) \cdot m^R_{3jt}$, for $t \ge 4$.

The MtF deltas for each of the three financial products are

The portfolio MtF values in this example not only enable the assessment of a delta-managed portfolio but, additionally, enable the assessment of funding risk by inspection of the cash account MtF values.
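The rebalancing rule in the box can be sketched as a small simulation along a single scenario path; all MtF values and deltas below are hypothetical, and the cash-account bookkeeping follows the $(\Delta_{1j2} - \Delta_{1j0}) \cdot m_{2j2} / m_{3j2}$ terms of the formulas:

```python
# One-scenario sketch of the conditional delta-hedging regime described in the
# box; all MtF values and deltas are hypothetical.
m1 = [10.0, 10.6, 11.9, 11.4, 13.1, 12.7]        # option MtF values m_1jt
m2 = [100.0, 101.0, 103.0, 102.0, 105.0, 104.0]  # equity MtF values m_2jt
m3 = [1.00, 1.01, 1.02, 1.03, 1.04, 1.05]        # cash-account unit values m_3jt
delta = [0.50, 0.52, 0.55, 0.56, 0.60, 0.61]     # option MtF deltas

x2, x3 = -delta[0], 0.0   # initial short-equity hedge and empty cash account
positions, value = [], []
for t in range(6):
    if t in (2, 4):                      # rebalancing dates
        trade = delta[t] - (-x2)         # additional shares to short
        x3 += trade * m2[t] / m3[t]      # proceeds funded into the cash account
        x2 = -delta[t]
    positions.append((x2, x3))
    value.append(m1[t] + x2 * m2[t] + x3 * m3[t])  # portfolio MtF value m_jt
```

Inspecting the cash-account holding `x3` scenario by scenario is exactly the funding-risk read-out mentioned in the box.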
The dimensions of the MtF Cube are reduced to 5,000 instruments x 1,000 scenarios x 10 time steps and the number of revaluations is reduced to 50 million, a decrease in the computational effort by a factor of 20.

As an example, a series of zero coupon all-in basis instruments can be mapped into an OTC interest rate swap. Figure 4.7 illustrates the sequence used to map the MtF values of these basis instruments into the MtF value of the swap. The MtF value of the fixed leg of the swap can be determined by a static linear function of the values of the zero coupon basis instruments, while the MtF value of the floating leg can be determined by a dynamic non-linear function of the values of the same basis instruments. The mapping of the basis instruments into the floating leg changes at each reset date and is a function of their MtF value under each scenario and time step. This effect can be captured through a conditional portfolio regime (see page 52).

If the cash flow dates of the swap do not coincide with the maturity dates of the basis instruments, a perfect mapping can be achieved by incorporating a second component in the intermediate mapping. The function of this additional mapping component is roughly equivalent to the interpolation method used to value cash flows that fall between two nodes on an interest rate term structure. The "interpolation" component of the mapping is illustrated in Figure 4.8.

[Figure 4.8: Mapping zero coupon basis instruments into a zero bond (basis instruments shown: t = 2 Zero Bond, t = t* Zero Bond, t = 3 Zero Bond).]

Including the interpolation component in the mapping can further reduce the number of basis instruments required in the MtF Cube. Consider the 100,000 swaps described above. Assuming that the institution utilizes a 10-node term structure in the pricing of these swaps, then the number of basis instruments can be reduced to 10 without loss of accuracy. The dimensions of the new MtF Cube are reduced to 10 instruments x 1,000 scenarios x 10 time steps. Correspondingly, the number of revaluations is reduced dramatically to 100,000, a decrease in the computational effort by a factor of 10,000.

Perfect mapping can still be achieved even if the swaps pay cash flows that do not fall on the term structure node points. As all independent interest rate risk factors are represented as basis instruments in the MtF Cube, the number of basis instruments need not exceed the number of nodes in the term structure (see page 53).
$m^B_{00} = 10 \cdot \Bigl( \frac{t_3 - t_2}{t_3 - t_1} \cdot m_{100} + \frac{t_2 - t_1}{t_3 - t_1} \cdot m_{300} \Bigr) + 110 \cdot \Bigl( \frac{t_5 - t_4}{t_5 - t_3} \cdot m_{300} + \frac{t_4 - t_3}{t_5 - t_3} \cdot m_{500} \Bigr) = 100.90$

In this case, the mapping is a non-linear combination of the basis instruments. This mapping is more complex; however, perfect replication of the bond can be achieved.

$m^B_{00} = 10 \cdot m_{100}^{\frac{t_3 - t_2}{t_3 - t_1} \cdot \frac{t_2}{t_1}} \cdot m_{300}^{\frac{t_2 - t_1}{t_3 - t_1} \cdot \frac{t_2}{t_3}} + 110 \cdot m_{300}^{\frac{t_5 - t_4}{t_5 - t_3} \cdot \frac{t_4}{t_3}} \cdot m_{500}^{\frac{t_4 - t_3}{t_5 - t_3} \cdot \frac{t_4}{t_5}} = 10 \cdot 0.9704^{1} \cdot 0.8869^{1/3} + 110 \cdot 0.8869^{2/3} \cdot 0.7788^{2/5} = 101.20$
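The two mappings can be checked numerically. The sketch below assumes, as the cash flow weights in the equations suggest, a bond paying 10 at t = 2 and 110 at t = 4, with discount-factor MtF values 0.9704, 0.8869 and 0.7788 at the nodes t = 1, 3, 5; it reproduces both values in the text:

```python
# Numerical check of the linear and log-linear interpolation mappings.
m100, m300, m500 = 0.9704, 0.8869, 0.7788
t1, t2, t3, t4, t5 = 1.0, 2.0, 3.0, 4.0, 5.0

# Linear interpolation of the discount factors
linear = (10 * ((t3 - t2) / (t3 - t1) * m100 + (t2 - t1) / (t3 - t1) * m300)
          + 110 * ((t5 - t4) / (t5 - t3) * m300 + (t4 - t3) / (t5 - t3) * m500))

# Non-linear (log-linear) interpolation, as in the second mapping
nonlin = (10 * m100 ** ((t3 - t2) / (t3 - t1) * t2 / t1)
             * m300 ** ((t2 - t1) / (t3 - t1) * t2 / t3)
          + 110 * m300 ** ((t5 - t4) / (t5 - t3) * t4 / t3)
               * m500 ** ((t4 - t3) / (t5 - t3) * t4 / t5))

print(round(linear, 2), round(nonlin, 2))   # 100.9 101.2
```

The log-linear form is consistent with interpolating continuously compounded zero rates rather than discount factors directly.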
In Step 5, risk/reward measures as well as other quantitative portfolio analytics are applied to the portfolio MtF table. Four general categories of analytics may be applied to

• calculate distribution descriptors from the portfolio MtF table
• transform MtF values into other MtF measures
• trade off risk and reward
• incorporate specific risk.

A key construct underlying the calculation of the distribution descriptors associated with various risk/reward measures is that the underlying input for each is the same MtF Cube. Thus, the MtF methodology provides a unifying framework for the integration of market risk, credit risk and liquidity risk measures. A variety of summary statistics can be calculated by aggregating the MtF values, $m^R_{ijt}$, across the scenario and/or time step dimensions of a portfolio MtF table. Thus, in addition to standard measures that may not account for the passage of time, the MtF framework incorporates forward-looking measures based on future distributions of MtF values. In some cases, additional information, such as scenario probabilities or counterparty default probabilities, is required. In others, the MtF values must be transformed into other measures before being used to calculate risk/reward measures.

Calculating distribution descriptors

Typical post-processing applications based on untransformed MtF values calculate statistics characterizing the market risk and liquidity risk associated with particular portfolios or portfolio regimes. Selected measures associated with these two risk classes are summarized in Table 5.1 and Table 5.2. In some cases, the applications require the scenario weightings as defined by the S x 1 probability vector p, where each cell $p_j$ (j = 1, 2, …, S) contains the probability associated with each state j. The formulations are based on the notation summarized in Table N.4 (see page 80).

The MtF Cube can be used for the assessment of reward as well as for risk measurement. Table 5.3 provides selected reward measures applied as post-processing applications.
Step 5: Producing the desired risk/reward measures
Table 5.1: Selected market risk measures

• Variance: $(\sigma^R_t)^2 = \sum_{j=1}^{S} p_j (m^R_{jt})^2 - \bigl( \sum_{j=1}^{S} p_j m^R_{jt} \bigr)^2$
• Value-at-Risk (confidence level $\alpha$): $\mathrm{VaR}^R_t(\alpha)$ such that $\Pr\{ (m^R_{0t} - m^R_{jt}) \ge \mathrm{VaR}^R_t(\alpha) \} = 1 - \alpha$
• Expected shortfall (confidence level $\alpha$): $E[S^R_t](\alpha) = \frac{1}{1-\alpha} \sum_{j=1}^{S} p_j \, (m^R_{0t} - m^R_{jt} - \mathrm{VaR}^R_t(\alpha))^{+}$
• Regret (risk-preference adjusted expected downside): $\lambda E[D^R_t] = \lambda \sum_{j=1}^{S} p_j \, (m^R_{jt} - \tau_{jt})^{-}$

Table 5.2: Selected liquidity risk measures

• Liquidity-adjusted VaR (period $t^*$, confidence level $\alpha$): $\mathrm{VaR}^R_{t,t^*}(\alpha)$ such that $\Pr\{ (m^R_{0t^*} - m^R_{jt}) \ge \mathrm{VaR}^R_{t,t^*}(\alpha) \} = 1 - \alpha$

Table 5.3: Selected reward measures

• Expected return: $R^R_{t^*} = \sum_{j=1}^{S} \frac{p_j \, m^R_{jt^*}}{m^R_{0t}} - 1$
• Expected upside: $E[U^R_t] = \sum_{j=1}^{S} p_j \, (m^R_{jt} - \tau_{jt})^{+}$
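A minimal sketch of these descriptors for a single time step, using a hypothetical five-scenario portfolio MtF table; the discrete-quantile reading of the VaR definition used here is one common choice, not the only one:

```python
import math

# Hypothetical five-scenario portfolio MtF table at one time step t.
p   = [0.2, 0.2, 0.2, 0.2, 0.2]          # scenario probabilities p_j
m   = [105.0, 98.0, 110.0, 92.0, 101.0]  # portfolio MtF values m_jt
m0  = 100.0                              # base-scenario value m_0t
tau = m0                                 # benchmark tau_jt (assumed constant)
alpha = 0.8

mean = sum(pj * mj for pj, mj in zip(p, m))
variance = sum(pj * mj * mj for pj, mj in zip(p, m)) - mean ** 2

# VaR as a discrete alpha-quantile of the losses m_0t - m_jt (equal weights)
losses = sorted(m0 - mj for mj in m)
var_a = losses[math.ceil(alpha * len(losses)) - 1]

# Expected shortfall, regret (lambda = 1) and expected upside per the tables
shortfall = sum(pj * max(m0 - mj - var_a, 0.0)
                for pj, mj in zip(p, m)) / (1 - alpha)
downside = sum(pj * max(tau - mj, 0.0) for pj, mj in zip(p, m))
upside   = sum(pj * max(mj - tau, 0.0) for pj, mj in zip(p, m))
```

All of the measures are simple reductions over the scenario dimension of the same table, which is the unifying point made above.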
Another observation from Figure 5.1 is that the portfolio loses value for all DAX moves, whether

In addition, the value of the short straddle is highly sensitive to changes in the implied
The mean and 95% VaR measures for the 30-day time horizon are calculated to be losses of 1 million EUR and 4.5 million EUR, respectively. This is not surprising. Inspection of Figure 5.3a reveals that every scenario results in a loss, and thus there appears to be risk with no possibility of reward. A standard Monte Carlo approach captures the non-linearity but misses the reward derived from the positive drift that accrues from the passage of time. In addition, standard Monte Carlo approaches typically do not account for the risk associated with changes in implied volatilities.

[Figure 5.3: Changes in portfolio MtM values across Monte Carlo scenarios]

Note that the positive drift of the portfolio is captured as most scenario paths result in a modest profit. A number of scenario paths, however, result in a loss that, in contrast, can be quite dramatic. This is the risk inherent in the portfolio if the positions are static over the 30-day time horizon. The mean and 95% VaR measures in 30 days are calculated to be a gain of 250,000 EUR and a loss of 4.0 million EUR, respectively. There is a 2.5% chance that the portfolio will be in ruin, having lost all of its initial value of 5 million EUR.

With the imposition of the delta-neutral regime, it is now possible to assess the risks of a managed portfolio. Under this regime, the mean and 95% VaR measures in 30 days are calculated to be a gain of 175,000 EUR and a loss of 750,000 EUR, respectively. There is a 2.5% chance that the portfolio will lose 20% of its initial value but no chance that the portfolio will be in ruin. Analysis of the managed portfolio reveals that delta hedging constrains the potential downside risk of the portfolio considerably, but at the expense of reduced mean performance.
Table 5.4: Transformations of portfolio MtF values into credit exposure measures

• Actual exposure: $AE^R_{jt} = \max[\, m^R_{jt}, 0 \,]$
• Potential exposure: $PE^R_{jt} = \max_{t < t^* \le T} [\, m^R_{jt^*} - AE^R_{jt}, 0 \,]$
• Total exposure: $TE^R_{jt} = AE^R_{jt} + PE^R_{jt}$
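The three transformations can be sketched directly from their definitions; the portfolio MtF values below are hypothetical:

```python
# Sketch of Table 5.4 along a single scenario j, with hypothetical portfolio
# MtF values over time steps t = 0..4.
m = [5.0, -2.0, 7.0, 3.0, -1.0]          # m_jt along the scenario

AE = [max(v, 0.0) for v in m]            # actual exposure
PE = [max([m[s] - AE[t] for s in range(t + 1, len(m))] + [0.0])
      for t in range(len(m))]            # look-ahead over t < t* <= T
TE = [a + b for a, b in zip(AE, PE)]     # total exposure
```

Only actual exposure is pointwise; potential exposure needs the whole forward path of the scenario, which is why it reads from all later columns of the portfolio MtF table.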
reinvestment through the definition of all-in instruments.

Table 5.4 summarizes the transformations of portfolio MtF values required to provide some example credit exposure measures. Note that these transformation functions are based solely on the information residing in the portfolio MtF table. To calculate actual exposure, a simple transformation is applied to the portfolio MtF value under each scenario and time step. In the slightly more complex case of potential exposure, the transformation is based on all portfolio MtF values over the t ≤ t* ≤ T time horizon for a given scenario.

Figure 5.6a illustrates the portfolio MtF values of a position with exposure to a given counterparty across three scenarios and over 10 time steps. Figure 5.6b illustrates the transformation of portfolio MtF values into counterparty MtF actual exposures across three scenarios and over 10 time steps. Figure 5.6c and Figure 5.6d illustrate the transformations of portfolio MtF values into counterparty MtF potential exposures and MtF total exposures, respectively, across three scenarios and over 10 time steps.

In other cases, additional information such as counterparty default probabilities and recovery rates may be required. This information can be incorporated statically or dynamically in the transformation algorithms embedded in the post-processing application. A summary of selected counterparty credit risk measures is presented in Table 5.5.
Table 5.5: Selected counterparty credit risk measures

• Expected counterparty credit exposure: $E[TE^R_t] = \sum_{j=1}^{S} p_j \, TE^R_{jt}$
• Expected counterparty credit loss: $L^R_t = \sum_{j=1}^{S} p_j \cdot \sum_{t^*=t}^{T} AE^R_{jt^*} \cdot p(t^*|j) \cdot (1 - r(t^*|j))$
• Expected cross-counterparty credit loss (summing over counterparties R): $L_t = \sum_{j=1}^{S} p_j \cdot \sum_{R} \sum_{t^*=t}^{T} AE^R_{jt^*} \cdot p^R(t^*|j) \cdot (1 - r^R(t^*|j))$

Selected risk-adjusted performance measures:

• Sharpe ratio: $S^R_t = R^R_t / \sigma^R_t$
• RAROC: $R^{*R}_t(\alpha) = R^R_t / \mathrm{VaR}^R_t(\alpha)$
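The expected counterparty credit loss formula can be sketched for a single counterparty with hypothetical exposures, conditional default probabilities and recovery rates:

```python
# Hypothetical two-scenario sketch of the expected counterparty credit loss:
# exposures AE_jt*, conditional default probabilities p(t*|j) and recovery
# rates r(t*|j) over two future time steps.
p   = [0.5, 0.5]                     # scenario probabilities p_j
AE  = [[4.0, 6.0], [10.0, 2.0]]      # AE[j][s]: actual exposure at t* = t + s
pd  = [[0.01, 0.02], [0.03, 0.01]]   # p(t*|j)
rec = [[0.4, 0.4], [0.4, 0.4]]       # r(t*|j)

expected_loss = sum(
    p[j] * sum(AE[j][s] * pd[j][s] * (1.0 - rec[j][s])
               for s in range(len(AE[j])))
    for j in range(len(p)))
```

Because the default and recovery inputs are conditional on the scenario j, wrong-way effects (exposure and default probability rising together) are captured automatically by this weighting.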
Examples of this approach include modeling equities by their systemic risk factors alone (see Figure 2.4 in Step 2) or by holding spreads-over-yields constant for individual bonds.

The MtF framework offers a powerful alternative approach. Specific risk may be incorporated in the post-Cube stage rather than in the pre-Cube stage by taking advantage of the independence between systemic risk factors and specific, or idiosyncratic, risk factors. Given this property, the specific components of the MtF values of a set of securities are independent of each other, conditional upon the occurrence of a given scenario. This implies that the systemic component of a MtF value can be pre-computed and stored in the MtF Cube with the specific risk component incorporated strictly in the post-Cube stage, either analytically or through a secondary sampling exercise. Methodologies that may be applied in the post-Cube stage include the law of large numbers, the central limit theorem, probability generating functions and moment generating functions.

As an example, for a given equity pricing model, the specific risk component of an individual stock is independent of the levels of the systemic risk factors that, when combined, determine its MtF value. The pre-computed MtF Cube need only contain the systemic realizations of individual stocks under each scenario and time step. Conditional upon a given scenario, the specific risk for each stock can be incorporated in the post-Cube stage to produce appropriate risk/reward measures combining both systemic and specific risks.

As a second example, for a given credit risk model, the specific, or idiosyncratic, component that contributes to the creditworthiness of a counterparty is independent of the levels of the systemic credit drivers. The pre-computed MtF Cube contains the counterparty MtF credit exposures as well as the systemic realizations of a creditworthiness index under each scenario and time step. Conditional upon a given scenario, the specific risk component of the creditworthiness index can be incorporated in the post-Cube stage. This is further explained in Step 6.
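The secondary sampling idea can be sketched as follows; the systemic values, the idiosyncratic volatilities and the additive noise model are all assumptions made for illustration, not a calibrated model:

```python
import random

# Post-Cube specific risk by secondary sampling (an assumed additive model):
# conditional on scenario j, each stock's MtF value is its systemic value
# from the Cube plus independent idiosyncratic noise.
random.seed(7)
systemic = {0: [100.0, 50.0], 1: [95.0, 52.0]}  # Cube: scenario -> stock values
sigma_specific = [2.0, 1.0]                     # idiosyncratic vols (assumed)

def sample_portfolio(j, n=10_000):
    """Sample portfolio MtF values conditional on scenario j."""
    return [sum(s + random.gauss(0.0, sd)
                for s, sd in zip(systemic[j], sigma_specific))
            for _ in range(n)]

vals = sample_portfolio(0)
mean0 = sum(vals) / len(vals)   # close to the purely systemic value 150.0
```

The key property exploited is conditional independence: within one scenario the idiosyncratic draws can be sampled separately per security, so the expensive Cube never needs to be re-simulated.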
Step 6: Advanced Mark-to-Future Applications
MtF Cubes may serve as input for applications more complex than calculating simple
risk/reward measures. The properties of linearity and conditional independence on
each scenario can be used to obtain computationally efficient methodologies. For
example, conditional independence within a particular scenario is a powerful tool that
allows the MtF framework to incorporate processes such as joint counterparty
migration effectively. In addition, portfolio or instrument MtF tables may be used as
input to a wide variety of scenario-based risk management and portfolio optimization
applications.
Financial institutions worldwide have devoted considerable effort to developing enterprise-wide systems that integrate financial information across their organizations to measure their institution's risk. Probabilistic measures, such as VaR, are now widely accepted by both financial institutions and regulators for assigning risk capital and monitoring risk. Since development efforts have been driven largely by regulatory and internal requirements to report risk numbers, the development of tools to understand and manage risk across the enterprise has generally lagged behind those designed to measure it.

Measuring risk is a passive activity; simply knowing the VaR does not provide much guidance for managing risk. In contrast, risk management is a dynamic endeavour, requiring tools that construct a comprehensive picture that integrates all types of risk as well as identify and reduce the sources of risk. Risk management tools should lead to an effective utilization of the wealth of financial products available in the markets to obtain the desired risk and reward profiles.

The Mark-to-Future framework enables the integration of various types of risks, such as market, credit and liquidity risk, and facilitates the construction of effective scenario-based risk management and optimization tools. In this section, we demonstrate these principles with two applications: an integrated market and credit risk framework and scenario-based optimization tools that trade off risk and reward efficiently.

Integrated market and credit risk

Credit risk modeling is one of the most important topics in risk management and finance today. The last decade has seen the development of models for pricing credit risky instruments and derivatives, for assessing the creditworthiness of obligors, for managing exposures of derivatives and for computing portfolio credit losses for bonds and loan portfolios.

However, common practice still treats market and credit risk separately. When measuring market risk, credit risk is commonly not taken into account; when measuring portfolio credit risk, the market is assumed to be constant. The two risks are then "added" in ad hoc ways, resulting in an incomplete picture of risk.

There are two types of credit risk measurement models: counterparty credit exposure models and
portfolio credit risk models. The integration of market and credit risk has a major impact in both cases.

Counterparty exposure models

Derivative desks traditionally manage credit risk by monitoring and placing limits on counterparty credit exposures. For derivative portfolios, counterparty exposure is generally defined as the economic loss that will be incurred on all outstanding transactions if a counterparty defaults, unadjusted for possible future recoveries. More generally, exposures may also be defined conditional not only on a counterparty default but also on a credit migration such as a downgrade. Counterparty exposure models measure and aggregate the exposures of all transactions with a given counterparty.

As explained in Step 5, the total exposure to a counterparty is defined as a single number which is the sum of the actual exposure, based on the mark-to-market of the portfolio, and the potential exposure, reflecting the changes of the counterparty exposure in the future. In the BIS regulatory model, the potential exposure is given by an add-on factor multiplying the notional of each transaction (Basle Committee on Banking Supervision 1988). Although simple to implement, the model has been widely criticized because it does not accurately account for the nature of these exposures in the future. Since exposures of derivatives such as swaps depend on the level of the market when default/migration occurs, models must capture not only the actual exposure to a counterparty at the time of the analysis but also its potential future changes.

Recently, more advanced methods based on Monte Carlo simulation (Aziz and Charupat 1998) have been implemented by financial institutions. The contingency of derivative portfolios and credit risk on the market can be captured explicitly by using a MtF framework to simulate counterparty portfolios through time over a wide range of scenarios. Furthermore, natural offsets, netting, collateral and various mitigation techniques used in practice can be modeled accurately in a MtF framework. The framework also lends itself to implementing systematic stress testing for exposures using techniques such as scenario banding (Cartolano and Verma 2000).

One of the strongest arguments for integrating market and credit risk is the desire to avoid "wrong-way" counterparty exposures. Wrong-way exposures occur when, in a given scenario, the market move increases the counterparty exposure and simultaneously weakens the counterparty's credit quality. Many analysts have identified wrong-way exposures as the cause of US commercial bank problems during the 1998 Asian crisis. Most methodologies do not measure counterparty exposures properly since they assume, either explicitly or implicitly, that interest rates and other financial factors and a counterparty's credit quality are independent.

Portfolio credit risk models

Counterparty credit risk models focus on risk at the counterparty level only; they do not attempt to capture portfolio effects such as the correlation between counterparty defaults and migrations. In contrast, portfolio credit risk models measure credit capital and are specifically designed to capture portfolio effects, specifically obligor correlations. Portfolio credit risk models popular in the industry include CreditMetrics (J.P. Morgan 1997), CreditRisk+ (Crédit Suisse Financial Products 1997), Credit Portfolio View (Wilson 1997a and 1997b) and KMV's Portfolio Manager (Kealhofer 1996). Although superficially they appear quite different (the models differ in their distributional assumptions, restrictions, calibration and solution), Gordy (1998) and Koyluoglu and Hickman (1998) show an underlying mathematical equivalence among these models. Furthermore, empirical work shows generally that all portfolio credit risk models yield similar results if the input data is consistent (Crouhy and Mark 1998, Gordy 1998).

A major limitation of portfolio credit risk models is the assumption that market risk factors, such as interest rates, are deterministic. Hence, they do not account for stochastic exposures. While this assumption has less consequence for portfolios of floating rate loans or bonds, it has great impact on derivatives such as swaps and options.

Ultimately, a comprehensive framework requires the full integration of market and credit risk.
[Figure 6.1: Integrated market and credit risk in the MtF framework. Pre-Cube stage (Part 1): scenarios are generated on the systemic risk factors, instrument MtF values are simulated over the basis instruments to produce the MtF Cube, and obligors are mapped into credit drivers. Post-Cube stage: mapping into obligors produces obligor MtF values and CWIs (Part 2); MtF values are transformed to exposures (Part 3); conditional portfolio loss distributions are estimated (Part 4).]
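The post-Cube credit pipeline (conditional default probabilities, conditional losses, aggregation across scenarios) can be sketched end-to-end under an assumed one-factor structural (Merton/Vasicek-style) default model; both the conditional-probability formula and all the numbers below are illustrative choices, not the framework's prescribed calibration:

```python
from math import erf, sqrt

# End-to-end sketch of the post-Cube credit pipeline under an assumed
# one-factor structural (Merton/Vasicek-style) default model.
def ncdf(x):                      # standard normal CDF
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def ninv(u, lo=-8.0, hi=8.0):     # inverse normal CDF by bisection
    for _ in range(80):
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if ncdf(mid) < u else (lo, mid)
    return 0.5 * (lo + hi)

p_uncond, rho = 0.02, 0.20        # unconditional PD, CWI factor loading
lgd, exposure = 0.6, 1_000_000.0  # loss given default, pool exposure
z = [-1.5, 0.0, 1.5]              # credit-driver scenario realizations
p_scen = [1 / 3, 1 / 3, 1 / 3]    # scenario probabilities

threshold = ninv(p_uncond)        # default threshold of the CWI
# Conditional default probability per scenario (worse credit driver -> higher)
cond_pd = [ncdf((threshold - sqrt(rho) * zj) / sqrt(1.0 - rho)) for zj in z]
# For a very large homogeneous pool the law of large numbers gives the
# conditional loss directly (Part 4)
cond_loss = [exposure * q * lgd for q in cond_pd]
# Aggregate across scenarios (Part 5)
expected_loss = sum(pj * l for pj, l in zip(p_scen, cond_loss))
```

Conditional on each scenario, defaults are independent, which is what lets the conditional loss collapse to a closed-form mean here instead of requiring a nested simulation.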
Correlations among obligors are determined by the joint variation of conditional probabilities across scenarios. Thus, a default/migration model defines a functional relationship that maps the CWI and the unconditional default/migration probabilities into conditional default/migration probabilities for each scenario and time step. Default/migration models can be econometric as in the logit model (Wilson 1997a, 1997b), structural as in the Merton model (J.P. Morgan 1997, Iscoe et al. 1999), reduced form or intensity based (Lando 1997 and 1998, Duffie and Singleton 1997, Jarrow and Turnbull 2000).

Part 4: Conditional portfolio loss distribution in a scenario. Conditional upon a scenario, obligor defaults and migrations are independent. In practice, the computation of conditional losses can be onerous. In the most general case, a Monte Carlo simulation can be applied to determine conditional portfolio losses. However, the observation that obligor defaults are independent permits the application of more
effective computational tools. Some of these techniques are described in Crédit Suisse (1997), Finger (1999) and Nagpal and Bahar (1999).

For example, for very large and homogeneous portfolios, the law of large numbers can be used to estimate conditional portfolio losses. As the number of obligors approaches infinity, the conditional loss distribution converges to the mean loss over that scenario; the conditional variance and higher moments become negligible. Other methods include the application of the central limit theorem (which assumes the number of obligors is large, but not necessarily as large as that required for the law of large numbers), the application of moment generating functions with numerical integration or the application of probability generating functions with a discretization of exposures.

Part 5: Aggregation of losses in all scenarios. Finally, the unconditional distribution of portfolio credit losses is obtained by averaging the conditional loss distributions over all possible scenarios.

In summary, the integration of market and credit risk is achieved in the scenarios which define explicitly the joint evolution of market risk factors and credit drivers. The Mark-to-Future framework enables the integration of market and credit risk and an accurate, computationally efficient estimation of counterparty exposures and portfolio credit risk.

An example of integrated market and credit scenarios is presented on page 68.

Risk management tools and optimization applications

In addition to monitoring risk, an effective risk management function must help the firm understand the sources of its exposures, how market or portfolio changes affect its risk profile and how to obtain optimal trade-offs between risk and reward, within and across various business lines. A comprehensive risk manager's framework must also

• represent complex portfolios simply
• decompose risk by asset and/or risk factor
• explain the effect of new trades on portfolio risk
• explain the impact of positions in non-linear instruments and of non-normal risk factor distributions on portfolio risks
• explain complex, non-intuitive, market views implicit in the portfolio as well as in the investment policy or market liquidity
• generate potential hedges and optimize portfolios.

The most widely used tools are based on extensions of the insights originally developed by Markowitz (1952) and Sharpe (1964) in modern portfolio theory. For example, Litterman (1996a, 1996b, 1997a, 1997b) describes a comprehensive set of analytical risk management tools, developed in close collaboration with the late Fischer Black and his colleagues at Goldman Sachs. These tools are based on a linear approximation of the portfolio to measure its risk and assume a joint (log)normal distribution of the underlying market risk factors, similar to the RiskMetrics VaR methodology (Longerstaey and Zangari 1996). Litterman further emphasized the dangers of managing risk using only linear approximations. However, in spite of their onerous assumptions, the insights provided by these tools are very powerful and constitute a solid conceptual basis for a risk management toolkit. (The reader is also referred to the related papers by Garman (1996, 1997) on marginal VaR and risk decomposition.)

Within the Mark-to-Future framework, these concepts can be extended to create a simulation-based risk management toolkit. The MtF simulation-based tools provide additional insights when the portfolio contains non-linearities, when the market distributions are not normal or when there are multiple horizons. Furthermore, they explicitly model discrete markets that are often observed in practice, where trading may be costly and liquidity limited. The methodology also naturally accommodates transaction costs,
liquidity and other specified user constraints, as Mausser and Rosen (1998, 1999c) demonstrate
well as investor preferences. how the MtF framework can be used to create
computationally efficient scenario-based risk
MtF risk management tools can be based on risk management tools to measure marginal risk, risk
measures other than variance; they work well contributions and triangular risk decomposition
with risk measures such as VaR, expected and to create trade risk profiles. Dembo and
shortfall and regret. In particular, these tools Rosen (1999) discuss the application of inverse
have proven very useful not only for market risk problems in portfolio replication.
analysis, but also for credit risk, for which the
exposure and loss distributions are generally far In what follows, we illustrate how these concepts
from normal (they are skewed and have fat tails), can be applied to construct optimal portfolios
and for ALM, which requires multi-period that trade off risk and reward efficiently.
stochastic decision making.
Portfolio optimization and risk/return efficient frontiers

It is widely believed that simulation-based risk management tools are impractical because they require substantial additional computational work (Dowd 1998). In fact, little or no additional simulation is required to obtain risk management analytics using efficient computational methods implemented in the MtF framework.

An extensive literature demonstrates these points. The basic concepts of MtF-based optimization applications were first developed in Dembo (1991), Dembo (1992), Dembo and King (1992) and Dembo (1995). They were more recently extended by Dembo (1998a), Dembo (1998b), Dembo (1998c), Dembo and Freeman (1998), Dembo and Rosen (1999) and Dembo and Mausser (2000). These papers show that using an expected downside risk measure leads to relevant linear programming problems which can be readily solved. Konno and Yamazaki (1991) also present applications of this type of risk measure in finance.

Rockafellar and Uryasev (2000) demonstrate that other measures such as conditional VaR (also known as expected shortfall) are also tractable and lead to linear programs. Applications of scenario optimization tools to credit risk are given in Mausser and Rosen (1999a, 1999b) and Anderson et al. (2000). While these efforts have largely focussed on one-period optimization problems, there is also a vast literature on multi-stage stochastic programming applications (see, for example, Carino and Ziemba (1998), Ziemba and Mulvey (1998) and the references therein).

Since knowledge of actual product holdings is not required to generate the MtF Cube, various risk management tools and portfolio optimization techniques can be applied in the post-processing stage. Such methodologies typically involve constructing a series of portfolios that are efficient. Efficient portfolios have the highest reward for a given level of risk or, equivalently, the lowest risk for a given level of reward. In all approaches, the key input is a single MtF Cube, while the specific optimization models are applied as post-processing applications.

Applications for these models include

• index tracking
• hedging and pricing
• asset allocation
• portfolio compression
• asset-liability management
• risk restructuring
• capital allocation.

The efficient frontier traces the optimal trade-off between risk and reward. Figure 6.2 illustrates a typical trade-off profile. By definition, no portfolio can lie above this frontier. Efficient portfolios are contained on the frontier. Portfolio A lies below the frontier and, thus, is inefficient because alternative portfolios can be constructed with either a lower level of risk for the same reward or a higher reward for the same risk.
max (x, u, d)  p^T u

s.t.   p^T d ≤ k
       u − d − (M^T − r q^T) x = 0
       x ≥ x_L
       x ≤ x_U
       u ≥ 0
       d ≥ 0
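A formulation of this kind can be handed directly to a linear programming solver. The sketch below uses SciPy's `linprog` on hypothetical data; it reads the equality constraint as defining per-scenario gains (M^T − r q^T)x, i.e., MtF values net of financing the positions at the riskless return r, with M the instrument-by-scenario MtF matrix, q current prices, p scenario probabilities and k the expected-downside budget. The data, the variable stacking and this reading of the constraint are illustrative assumptions, not the document's own inputs.

```python
# Sketch: the upside/downside LP above, solved with scipy.optimize.linprog.
import numpy as np
from scipy.optimize import linprog

M = np.array([[1.30, 1.00, 0.80],    # instrument 1 across 3 scenarios (hypothetical)
              [1.05, 1.05, 1.05]])   # instrument 2, near riskless
q = np.array([1.0, 1.0])             # current prices
p = np.array([1/3, 1/3, 1/3])        # scenario probabilities
r = 1.0                              # gross riskless return
k = 0.02                             # expected-downside budget

n, S = M.shape
A = M.T - r * q                      # per-scenario gains: A[s, j] = M[j, s] - r q[j]

# Decision vector z = [x (n), u (S), d (S)]; maximize p'u == minimize -p'u
c = np.concatenate([np.zeros(n), -p, np.zeros(S)])
A_eq = np.hstack([-A, np.eye(S), -np.eye(S)])   # u - d - A x = 0, per scenario
b_eq = np.zeros(S)
A_ub = np.concatenate([np.zeros(n + S), p]).reshape(1, -1)  # p'd <= k
b_ub = np.array([k])
bounds = [(0.0, 1.0)] * n + [(0.0, None)] * (2 * S)  # x_L = 0, x_U = 1 here

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=bounds, method="highs")
x, upside = res.x[:n], -res.fun      # holdings and optimal expected upside
```

Re-solving while sweeping the downside budget k traces out a risk/return frontier of the kind shown in Figure 6.2.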
References

Acworth, P., M. Broadie and P. Glasserman, 1997, “A comparison of some Monte Carlo and quasi Monte Carlo techniques for option pricing,” in Monte Carlo and Quasi Monte Carlo in Scientific Computing, N.Y.: Springer Verlag; 54–79.

Ahn, H., M. Bouabci and A. Penaud, 2000, “Tailor-made for tails,” Risk, 13(2), 95–98.

Anderson, F., H. Mausser, D. Rosen and S. Uryasev, 2000, “Credit risk optimization with conditional Value-at-Risk criterion,” Working Paper, University of Florida and Algorithmics Inc., forthcoming.

Aziz, J. and N. Charupat, 1998, “Calculating credit exposure and credit loss: a case study,” Algo Research Quarterly, 1(1): 31–46.

Basle Committee on Banking Supervision, 1988, International Convergence of Capital Measurements and Capital Standards, July, (http://www.bis.org).

Basle Committee on Banking Supervision, 1997, Explanatory note: modification of the Basle Capital Accord of July 1988, as amended in January 1996, (accessed September 1999, http://www.bis.org).

Black, F. and M. Scholes, 1973, “The pricing of options and corporate liabilities,” Journal of Political Economy, 81 (May–June): 637–654.

Black, F., E. Derman and W. Toy, 1990, “A one-factor model of interest rates and its application to treasury bond options,” Financial Analysts Journal, Jan/Feb, 33–39.

Black, F. and P. Karasinski, 1991, “Bond and option pricing when short rates are lognormal,” Financial Analysts Journal, July/August 1991: 52–59.

Bollerslev, T., 1986, “Generalized autoregressive conditional heteroscedasticity,” Journal of Econometrics, 31, 307–327.

Boyle, P., 1977, “Options: a Monte Carlo approach,” Journal of Financial Economics, 4(2): 323–338.

Boyle, P., M. Broadie and P. Glasserman, 1997, “Monte Carlo methods for security pricing,” Journal of Economic Dynamics and Control, 21: 1323–1352.

Broadie, M. and P. Glasserman, 1997, “Monte Carlo methods for pricing high-dimensional American options,” Global Derivatives ‘97, Paris: April 1997.

Caflisch, R., W. Morokoff and A. Owen, 1997, “Valuation of mortgage-backed securities using Brownian bridges to reduce effective dimension,” Computational Finance, 1(1): 27–46.

Cardenas, J., E. Fruchard, J. Picron, C. Reyes, K. Walters and W. Yang, 1999, “Monte Carlo within a day,” Risk, 12(2): 55–59.

Carino, D. and W. Ziemba, 1998, “Formulation of the Russell-Yasuda Kasai financial planning model,” Operations Research, 46(4): 443–449.

Carillo, S., P. Fernandez, N. Hernandez and L. Seco, 2000, “Scenario generation techniques in Mark-to-Future analysis,” Working Paper, RiskLab Madrid, Universidad Autonoma de Madrid, and RiskLab Toronto, University of Toronto.

Cartolano, P. and S. Verma, 2000, “Using scenario banding to stress test counterparty credit exposures,” Algo Research Quarterly, 2(4): 27–36.
Cox, J., J. Ingersoll and S. Ross, 1985, “A theory of the term structure of interest rates,” Econometrica, 53(2): 385–407.

CreditMetrics: The Benchmark for Understanding Credit Risk, Technical Document, 1997, New York, N.Y.: J.P. Morgan Inc.

Crédit Suisse Financial Products, 1997, CreditRisk+: A Credit Risk Management Framework, New York, N.Y.

Crouhy, M. and R. Mark, 1998, “A comparative analysis of current credit risk models,” Paper presented at the conference Credit Modeling and Regulatory Implications, London, September 1998.

Dembo, R., 1991, “Scenario optimization,” Annals of Operations Research, 30: 63–80.

Dembo, R., 1992, “Scenario optimization,” US Patent Number: 5148365, dated September 15, 1992 (filed August 15, 1989).

Dembo, R. and A. King, 1992, “Tracking models and the optimal regret distribution in asset allocation,” Applied Stochastic Models and Data Analysis, 8: 151–157.

Dembo, R., 1995, Optimal Portfolio Replication, Research Paper Series 95–01 (24 Feb. 1997; revised 4 Feb. 1999).

Dembo, R., 1998a, “Mark-to-Future,” Morgan Stanley Dean Witter Global Equity and Derivatives Markets, March 11, 1998: 6–16.

Dembo, R., 1998b, “Mark-to-Future: a consistent firm-wide paradigm for measuring risk and return,” in Risk Management and Analysis, Vol. 1: Measuring and Modeling Financial Risk, Carol Alexander (Ed.), New York: John Wiley & Sons Ltd.

Dembo, R., 1998c, “Method and apparatus for optimal portfolio replication,” US Patent Number: 5799287, dated August 25, 1998 (filed May 24, 1994).

Dembo, R. and A. Freeman, 1998, Seeing Tomorrow, New York: John Wiley & Sons Inc.

Dembo, R. and D. Rosen, 1999, “The practice of portfolio replication: a practical overview of forward and inverse problems,” Annals of Operations Research, 85: 267–284.

Dembo, R., A. Aziz, B. Boettcher, J. Farvolden, A. Shaw and M. Zerbs, 1999, “Mark-to-Future: a comprehensive guide to a revolution in the evolution of risk,” Draft Release, Algorithmics Inc.

Dembo, R. and H. Mausser, 2000, “The Put/Call Efficient Frontier,” Algo Research Quarterly, 3(1): 13–25.

Dowd, K., 1998, “VaR by increments,” Enterprise Wide Risk Management Special Report, Risk, 31–32.

Duffie, D. and K. Singleton, 1997, “Modeling term structures of defaultable bonds,” Working Paper, Graduate School of Business, Stanford University, forthcoming in Review of Financial Studies.

Embrechts, P., S. Resnick and G. Samorodnitsky, 1998, “Living on the edge,” Risk, January, 96–100.

Embrechts, P., A. McNeil and D. Straumann, 1999, “Correlation and dependency in risk management: properties and pitfalls,” Working Paper, Dept. of Mathematics, ETHZ, Zurich, Switzerland.

Engle, R. and J. Mezrich, 1995, “Grappling with GARCH,” Risk, September: 112–117.

Finger, C., 1999, “Conditional approaches for CreditMetrics portfolio distributions,” CreditMetrics Monitor, April: 14–33.

Garman, M., 1996, “Improving on VaR,” Risk, 9(5): 61–63.

Garman, M., 1997, “Taking VaR to pieces,” Risk, 10(10): 70–71.

Gordy, M., 1998, “A comparative anatomy of credit risk models,” Federal Reserve Board, Finance and Economics Discussion Series, 1998: 47.

Greenspan, A., 1999, “The evolution of bank supervision,” Speech to the American Bankers Association, Phoenix, Arizona, October 11, 1999, (http://www.bog.frb.fed.us/boarddocs/speeches/1999/).

Hull, J. and A. White, 1998a, “Value at risk when daily changes in market variables are not normally distributed,” The Journal of Derivatives, Spring, 9–19.

Hull, J. and A. White, 1998b, “Incorporating volatility into the historical simulation method for VaR,” The Journal of Risk, 1(1): 5–20.
Hull, J. and A. White, 1990, “Pricing interest-rate derivative securities,” The Review of Financial Studies, 3(4): 573–592.

Iscoe, I., A. Kreinin and D. Rosen, 1999, “An integrated market and credit risk portfolio model,” Algo Research Quarterly, 1(2): 21–37.

Jamshidian, F. and V. Zhu, 1997, “Scenario simulation theory and methodology,” Finance and Stochastics, 1(1): 43–67.

Jarrow, R. and S. Turnbull, 2000, “The intersection of market and credit risk,” Journal of Banking and Finance, 24, 271–299.

J.P. Morgan, 1996, (see Longerstaey and Zangari, 1996).

J.P. Morgan, 1997, (see CreditMetrics, 1997).

Karatzas, I. and S. Shreve, 1994, Brownian Motion and Stochastic Calculus, New York: Springer Verlag.

Kealhofer, S., 1996, “Managing default risk in portfolios of derivatives,” Derivative Credit Risk, London: Risk Publications, 49–63.

Kim, J., A. Malz and J. Mina, 1999, Long Run Technical Document, New York: RiskMetrics Group.

Konno, H. and H. Yamazaki, 1991, “Mean-absolute deviation portfolio optimization and its applications to Tokyo stock market,” Management Science, 37: 519–531.

Koyluoglu, H. and A. Hickman, 1998, “Reconcilable differences,” Risk, 11(10): 56–62.

Kreinin, A., L. Merkoulovitch, D. Rosen and M. Zerbs, 1998a, “Measuring portfolio risk using quasi Monte Carlo methods,” Algo Research Quarterly, 1(1): 17–26.

Kreinin, A., L. Merkoulovitch, D. Rosen and M. Zerbs, 1998b, “Principal component analysis in quasi Monte Carlo simulation,” Algo Research Quarterly, 1(2): 21–29.

Lando, D., 1997, Modeling Bonds and Derivatives with Default Risk, (M. Dempster and S. Pliska, Eds.), 369–393, Cambridge, U.K.: Cambridge University Press.

Lando, D., 1998, “On Cox processes and credit risky securities,” Review of Derivatives Research, 2, 99–120.

Litterman, R., 1996a, “Hot spots and hedges,” Risk Management Series, Goldman Sachs.

Litterman, R., 1996b, “Hot spots and hedges,” Journal of Portfolio Management, (Special Issue): 52–75.

Litterman, R., 1997a, “Hot spots and hedges (I),” Risk, 10(3): 42–45.

Litterman, R., 1997b, “Hot spots and hedges (II),” Risk, 10(5): 38–42.

Locke, J., 1999, “A fear of floating,” Latin Risk, (Risk Special Report), July: 4–5.

Longerstaey, J. and P. Zangari, 1996, RiskMetrics Technical Document, 4th ed., New York: Morgan Guaranty Trust Co. (Available at http://www.riskmetrics.com/rm/index.cgi).

Lopez, J., 1999, “Regulatory evaluation of value-at-risk models,” The Journal of Risk, 1(2), 37–64.

Lopez, J. and M. Saidenberg, 2000, “Evaluating credit risk models,” Journal of Banking and Finance, 24(1/2): 151–165.

Markowitz, H., 1952, “Portfolio selection,” Journal of Finance, 7(1): 77–91.

Marshall, C. and M. Siegel, 1996, “Value-at-Risk: in search of a risk measurement standard,” Draft Document, Massachusetts Inst. of Technology.

Mausser, H. and D. Rosen, 1998, “Beyond VaR: from measuring risk to managing risk,” Algo Research Quarterly, 1(2): 5–20.

Mausser, H. and D. Rosen, 1999a, “Applying scenario optimization to portfolio credit risk,” Algo Research Quarterly, 2(2): 19–33.

Mausser, H. and D. Rosen, 1999b, “Efficient risk/return frontiers for credit risk,” Algo Research Quarterly, 2(4): 35–48.

Mausser, H. and D. Rosen, 1999c, “Beyond VaR: triangular risk decomposition,” Algo Research Quarterly, 2(1): 31–43.

Merton, R., 1974, “On the pricing of corporate debt: the risk structure of interest rates,” Journal of Finance, 29: 449–470.

Meyer, L., 1999, “Implications of recent global financial crises for bank supervision and regulation,” speech before the International Finance Conference, Federal Reserve Bank of Chicago, Chicago, Ill., October 1, 1999, (http://www.bog.frb.fed.us/boarddocs/speeches/1999/).

Nagpal, K. and R. Bahar, 1999, “An analytical approach for credit risk analysis under correlated defaults,” CreditMetrics Monitor, April: 51–74.

Niederreiter, H., 1992, Random Number Generation and Quasi-Monte Carlo Methods, Philadelphia: SIAM.

Paskov, S. and J. Traub, 1995, “Faster valuation of financial derivatives,” Journal of Portfolio Management, Fall, 22(1): 113–120.

Peterson, I., 1998, The Jungles of Randomness, New York, N.Y.: John Wiley and Sons Inc.

Reimers, M. and M. Zerbs, 1999, “A multi-factor statistical model for interest rates,” Algo Research Quarterly, 2(3): 53–64.

RiskMetrics, 1996, (see Longerstaey and Zangari, 1996).

Rockafellar, R. and S. Uryasev, 2000, “Optimization of conditional Value-at-Risk,” The Journal of Risk, forthcoming.

Samuelson, P., 1991, personal communication to Stephen A. Ross.

Schoenmakers, J. and A. Heemink, 1997, “Fast valuation of financial derivatives,” Computational Finance, 1(1): 47–62.

Sharpe, W., 1964, “Capital asset prices: a theory of market equilibrium under conditions of risk,” Journal of Finance, 19(3): 425–442.

Shaw, J., 1997, “Beyond VaR and stress testing,” in VaR: Understanding and Applying Value-at-Risk, New York, N.Y.: RISK Publications, 211–224.

Sobol, I., 1967, “On the distribution of points in a cube and the approximate evaluation of integrals,” Computational Mathematics and Mathematical Physics, 7(4): 86–112.

Sundaresan, S., 2000, “Continuous-time methods in finance: a review and an assessment,” Journal of Finance, to appear in special issue of the American Finance Association 2000 proceedings.

United States Federal Reserve Board, 1999, (http://www.bog.frb.fed.us/Releases/H15/): select historical data for July 6, 1999.

Wilson, T., 1997a, “Portfolio credit risk I,” Risk, 10(9): 111–117.

Wilson, T., 1997b, “Portfolio credit risk II,” Risk, 10(10): 56–61.

Yung, E., 1999a, “Making a scene...,” Algo Research Quarterly, 2(3): 5–8.

Yung, E., 1999b, “Making a scene...,” Algo Research Quarterly, 2(1): 5–7.

Yung, E., 1999c, “Making a scene...,” Algo Research Quarterly, 2(2): 5–9.

Ziemba, W. and J. Mulvey (Eds.), 1998, Worldwide Asset and Liability Modeling, Cambridge, U.K.: Cambridge University Press, Publications of the Newton Institute.
Notation

g(·)   function mapping instrument MtF values into portfolio MtF values
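For a linear portfolio, the map g amounts to a position-weighted sum across instruments, scenario by scenario. A minimal sketch with hypothetical holdings and MtF values (the two-instrument, three-scenario data below are illustrative only):

```python
# Sketch of the aggregation map g: instrument MtF values -> portfolio MtF
# values, for the linear-portfolio case (a weighted sum per scenario).

def g(instrument_mtf, positions):
    """Map instrument MtF values to portfolio MtF values, scenario by scenario."""
    scenarios = len(next(iter(instrument_mtf.values())))
    return [
        sum(positions[name] * values[s] for name, values in instrument_mtf.items())
        for s in range(scenarios)
    ]

instrument_mtf = {             # one MtF value per instrument and scenario
    "bond":   [101.0, 100.0, 99.0],
    "equity": [55.0, 50.0, 42.0],
}
positions = {"bond": 2.0, "equity": 10.0}
portfolio_mtf = g(instrument_mtf, positions)   # one value per scenario
```

Because positions enter only at this aggregation step, the instrument-level MtF values can be simulated once and reused for any portfolio.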