
Reading List

Mathematics
2011-2012



Institute and Faculty of Actuaries

December 2012



Compiled by Scott McLachlan


INSTITUTE AND FACULTY OF ACTUARIES

LIBRARY SERVICES

Maclaurin House
18 Dublin Street
EDINBURGH EH1 3PP

Tel 0131 240 1311
Fax 0131 240 1313

Staple Inn Hall
High Holborn
LONDON WC1V 7QJ

Tel 020 7632 2114
Fax 020 7632 2111




e-mail: libraries@actuaries.org.uk


**********


THE LIBRARIES

The libraries of the Institute and Faculty of Actuaries offer a wide selection of resources, covering actuarial
science, mathematics, statistics, finance, investment, pensions, insurance, healthcare, social policy,
demography, business and risk management. Our extensive range of online resources is available to you
wherever you are.
The Libraries reserve the right to restrict the availability of any service to members of the Institute and Faculty of
Actuaries only.

ACCESS
The Libraries are open to all members of the Institute and Faculty of Actuaries. Opening hours are 9:00 to 17:00
Monday to Friday; the libraries are closed on public holidays. If you are planning a visit, please let us know so
we can ensure someone is available to welcome you.
Online access to electronic resources is available through Athens: www.openathens.net
Members are entitled to a free account. For an account please email the libraries, quoting your ARN number.

LENDING
We can post books to members and other approved borrowers in the UK and overseas. We hold multiple
copies of popular titles. If an item is not in stock we will usually buy it or obtain it from another library.

PHOTOCOPYING
We can post, fax or email single copies of periodical articles and extracts from books, subject to copyright
regulations.

ENQUIRIES
We can search for information, statistics and hard-to-trace references. We aim to respond within 24 hours.

ONLINE CATALOGUE
The online catalogue is available at: http://actuaries.soutron.net

READING LISTS
We produce topical lists of recent publications which you can download from the libraries area of the website.
We can compile customized lists on request (contact libraries@actuaries.org.uk) or you can search the library
catalogue.

THE HISTORICAL COLLECTION
The Institute's collection of historical material is housed at Staple Inn. This collection comprises all books
published before 1870, those of historical interest published 1870 - 1959 and historical studies published
subsequently. It also includes full sets of the Journal of the Institute of Actuaries, Journal of the Staple Inn
Actuarial Society, Transactions of the Faculty of Actuaries, Transactions of the International Congress of
Actuaries, the journals of many overseas actuarial bodies, copies of tuition material and a reference collection.
Opening hours are 9.00am to 5.00pm. Prospective visitors are advised to telephone in advance.

PUBLICATIONS SHOP
We stock all publications issued by the Institute and Faculty of Actuaries, including Core Reading, Formulae
and tables and titles from the list of suggested further reading for the CT and SA exams. We offer discounts on
a range of books and calculators approved for the profession's exams. You can place orders and find news of
the latest discounts at http://www.actuaries.org.uk/research-and-resources/eshop

FEES FOR SERVICES
There may be a fee for some services. Please check the libraries' pages on the website at:
www.actuaries.org.uk/research-and-resources/pages/borrowing

Contents
ACTUARIAL SCIENCE ...................................................................................................................................... 1
AGRICULTURAL INSURANCE.......................................................................................................................... 1
ALGORITHMS ................................................................................................................................................... 2
ALLAIS PARADOX ............................................................................................................................................ 2
ALTERNATIVE RISK TRANSFER ..................................................................................................................... 2
AMBIGUITY ...................................................................................................................................................... 3
ANALYSIS.......................................................................................................................................................... 3
ANNUITANTS ................................................................................................................................................... 4
ANNUITIES ....................................................................................................................................................... 4
ARCHIMEDEAN GENERATOR ......................................................................................................................... 6
ASSET ALLOCATION ........................................................................................................................................ 6
ASSET MANAGEMENT ..................................................................................................................................... 7
ASSET PRICES .................................................................................................................................................. 7
ASSETS .............................................................................................................................................................. 7
ASYMMETRIC INFORMATION ........................................................................................................................ 8
BANDWIDTH PARAMETER ............................................................................................................................. 8
BAYES THEOREM............................................................................................................................................. 8
BAYESIAN ANALYSIS ....................................................................................................................................... 9
BAYESIAN INFERENCE .................................................................................................................................. 10
BAYESIAN METHODS .................................................................................................................................... 10
BLACK-SCHOLES ............................................................................................................................................ 10
BONDS............................................................................................................................................................. 10
BONUS SYSTEMS ............................................................................................................................................ 11
BOOTSTRAP ................................................................................................................................................... 11
BROWNIAN MOTION ..................................................................................................................................... 13
CANCER .......................................................................................................................................................... 13
CAPITAL .......................................................................................................................................................... 14
CAPITAL ALLOCATION .................................................................................................................................. 14
CAPITAL ASSET PRICING MODEL ................................................................................................................ 15
CASH FLOW .................................................................................................................................................... 15
CATASTROPHE ............................................................................................................................................... 16
CATASTROPHE INSURANCE ......................................................................................................................... 16
CATASTROPHE REINSURANCE .................................................................................................................... 17
CHAIN LADDER METHODS ........................................................................................................................... 17
CLAIM FREQUENCY ....................................................................................................................................... 19
CLAIM FREQUENCY MODELS ....................................................................................................................... 20
CLAIMS ........................................................................................................................................................... 20
CLAIMS RESERVES ........................................................................................................................................ 24
CLIMATE CHANGE ......................................................................................................................................... 25
COBWEB THEOREM ...................................................................................................................................... 25
COHORTS ........................................................................................................................................................ 26
COMONOTONICITY ........................................................................................................................................ 26
COMPETITION ................................................................................................................................................ 27
COMPOUND DISTRIBUTIONS ....................................................................................................................... 27
COMPOUND INTEREST ................................................................................................................................. 27
CONFIDENCE LIMITS..................................................................................................................................... 27
CONSUMER BEHAVIOUR ............................................................................................................................... 27
CONSUMER PRICE INDEX (CPI) ................................................................................................................... 28
CONTRACTS ................................................................................................................................................... 28
CONVEX PROGRAMMING .............................................................................................................................. 28
COPULAS ......................................................................................................................................................... 29
CORRELATION ............................................................................................................................................... 34
COSTS .............................................................................................................................................................. 35
CREDIBILITY .................................................................................................................................................. 35
CREDIBILITY THEORY................................................................................................................................... 36
CRITICAL ILLNESS INSURANCE ................................................................................................................... 36
CURRENCIES .................................................................................................................................................. 37
DATA ............................................................................................................................................................... 37
DATA FITTING ............................................................................................................................................... 38
DEATH BENEFIT ............................................................................................................................................ 38
DECISION MAKING ........................................................................................................................................ 38
DEFINED BENEFIT SCHEMES ....................................................................................................................... 39
DEFINED CONTRIBUTION SCHEMES ........................................................................................................... 39
DERIVATIVES ................................................................................................................................................. 40
DEVIATION ..................................................................................................................................................... 41
DIFFERENTIAL EQUATIONS ......................................................................................................................... 42
DIFFUSION PROCESSES................................................................................................................................. 42
DISABLEMENT ............................................................................................................................................... 43
DISCOUNT RATE ............................................................................................................................................ 43
DISCOUNTING ................................................................................................................................................ 43
DISTRIBUTION THEORY ............................................................................................................................... 44
DIVIDENDS ..................................................................................................................................................... 45
DOWNSIDE RISK AVERSION ......................................................................................................................... 47
DYNAMIC PROGRAMMING ........................................................................................................................... 47
ECONOMIC CAPITAL ..................................................................................................................................... 47
ECONOMIC INDICATORS ............................................................................................................................... 48
EM ALGORITHM ............................................................................................................................................. 48
ENVIRONMENT .............................................................................................................................................. 48
EQUITIES ........................................................................................................................................................ 49
ERLANG RISK MODELS ................................................................................................................................. 50
ERLANG(2) ..................................................................................................................................................... 50
ERLANG MIXTURE ......................................................................................................................................... 50
ESTIMATION .................................................................................................................................................. 50
EUROPE .......................................................................................................................................................... 51
EXPECTATION ................................................................................................................................................ 52
EXPENSES ....................................................................................................................................................... 52
EXPERIENCE RATING .................................................................................................................................... 52
EXPOSURE RATING ....................................................................................................................................... 52
EXTREME EVENTS ......................................................................................................................................... 53
EXTREME VALUE THEORY ........................................................................................................................... 53
FAMILY ........................................................................................................................................................... 54
FINANCE ......................................................................................................................................................... 55
FINANCIAL MARKETS ................................................................................................................................... 55
FIRE INSURANCE ........................................................................................................................................... 55
FORECASTING ................................................................................................................................................ 55
FOREIGN EXCHANGE ..................................................................................................................................... 56
FOREIGN EXCHANGE MARKETS .................................................................................................................. 56
FORMULAE ..................................................................................................................................................... 56
FUND MANAGEMENT .................................................................................................................................... 57
GAME THEORY ............................................................................................................................................... 58
GAMES ............................................................................................................................................................ 58
GARCH ............................................................................................................................................................ 58
GENERAL INSURANCE .................................................................................................................................. 58
GENERAL INSURANCE COMPANY ................................................................................................................ 60
GENERALISED LINEAR MODELS .................................................................................................................. 60
GERBER-SHIU FUNCTION ............................................................................................................................. 61
GLOBAL WARMING ....................................................................................................................................... 62
GRADUATION ................................................................................................................................................. 62
GRAM-CHARLIER ........................................................................................................................................... 62
GUARANTEES ................................................................................................................................................. 63
HASTINGS ALGORITHM ................................................................................................................................ 63
HEDGING ........................................................................................................................................................ 64
HURRICANES .................................................................................................................................................. 66
IMPERFECT INFORMATION ......................................................................................................................... 66
INCOME .......................................................................................................................................................... 67
INCOME PROTECTION .................................................................................................................................. 67
INCOME PROTECTION INSURANCE ............................................................................................................. 67
INDEPENDENCE ............................................................................................................................................ 68
INDEXATION .................................................................................................................................................. 68
INFLATION ..................................................................................................................................................... 68
INFORMATION ............................................................................................................................................... 69
INSURANCE .................................................................................................................................................... 69
INSURANCE COMPANIES .............................................................................................................................. 72
INSURANCE COMPANY ................................................................................................................................. 73
INSURANCE INDUSTRY ................................................................................................................................. 73
INTEREST ....................................................................................................................................................... 73
INTEREST RATES ........................................................................................................................................... 73
INTERNATIONAL MARKETING .................................................................................................................... 75
INVESTMENT ................................................................................................................................................. 75
INVESTMENT MANAGEMENT ...................................................................................................................... 77
INVESTMENT PERFORMANCE ..................................................................................................................... 77
INVESTMENT POLICY.................................................................................................................................... 78
IRELAND ......................................................................................................................................................... 78
JUMP DIFFUSION ........................................................................................................................................... 79
KALMAN FILTER ............................................................................................................................................ 79
KERNEL DENSITY ESTIMATOR .................................................................................................................... 79
LAPLACE TRANSFORM ................................................................................................................................. 79
LIABILITIES .................................................................................................................................................... 80
LIFE ASSURANCE ........................................................................................................................................... 80
LIFE CONTINGENCIES ................................................................................................................................... 81
LIFE EXPECTATION ....................................................................................................................................... 81
LIFE INSURANCE ........................................................................................................................................... 82
LIFE PRODUCTS ............................................................................................................................................. 83
LIFE TABLES .................................................................................................................................................. 83
LINEAR EQUATIONS ...................................................................................................................................... 83
LINEAR PROGRAMMING ............................................................................................................................... 83
LONG-TAIL BUSINESS ................................................................................................................................... 84
LONG-TAIL LIABILITIES................................................................................................................................ 84
LONG TERM CARE COVER ............................................................................................................................ 84
LONGEVITY .................................................................................................................................................... 84
LONGEVITY RISK ........................................................................................................................................... 85
LONGITUDINAL STUDIES ............................................................................................................................. 87
LOSS ................................................................................................................................................................ 87
LOSS FUNCTIONS ........................................................................................................................................... 88
LOSSES ............................................................................................................................................................ 89
MARINE INSURANCE ..................................................................................................................................... 89
MARKOV CHAIN ............................................................................................................................................. 90
MARKOV PROCESSES .................................................................................................................................... 90
MARTINGALE METHODS .............................................................................................................................. 92
MATHEMATICAL MODELS ............................................................................................................................ 92
MATHEMATICS .............................................................................................................................................. 93
MATHEMATICS OF FINANCE ...................................................................................................................... 103
MODELLING ................................................................................................................................................. 103
MODELS ........................................................................................................................................................ 113
MONTE CARLO ............................................................................................................................................. 116
MONTE CARLO TECHNIQUES ..................................................................................................................... 116
MORTALITY .................................................................................................................................................. 117
MORTALITY PROJECTIONS ......................................................................................................................... 120
MORTALITY RATES ..................................................................................................................................... 120
MOTOR INSURANCE .................................................................................................................................... 122
MULTIVARIATE ANALYSIS ......................................................................................................................... 122
MULTIVARIATE TAIL .................................................................................................................................. 123
NORWAY ....................................................................................................................................................... 124
OPERATIONAL RISK .................................................................................................................................... 124
OPTIMAL REINSURANCE ............................................................................................................................ 125
OPTIMISATION ............................................................................................................................................ 125
OPTION PRICING ......................................................................................................................................... 125
OPTIONS ....................................................................................................................................................... 129
OUTSTANDING CLAIMS .............................................................................................................................. 130
PANJER'S CLASS OF COUNTING DISTRIBUTIONS .................................................................................... 130
PARETO DISTRIBUTION ............................................................................................................................. 131
PARTICIPATING POLICIES .......................................................................................................................... 131
PENSION FUND ADMINISTRATION ........................................................................................................... 131
PENSION FUNDS .......................................................................................................................................... 132
PENSION PLANS ........................................................................................................................................... 132
PENSION SCHEMES ..................................................................................................................................... 132
PENSIONS ..................................................................................................................................................... 133
PERIODICALS ............................................................................................................................................... 133
PERSONAL FINANCIAL PLANNING ............................................................................................................ 133
PHILOSOPHY ................................................................................................................................................ 134
POISSON DISTRIBUTION............................................................................................................................. 134
POISSON HIDDEN MARKOV ........................................................................................................................ 134
POISSON PROCESS ....................................................................................................................................... 134
POLICYHOLDERS ......................................................................................................................................... 137
POLLUTION .................................................................................................................................................. 137
PORTFOLIO INSURANCE ............................................................................................................................. 137
PORTFOLIO INVESTMENT .......................................................................................................................... 138
PORTFOLIO MANAGEMENT ....................................................................................................................... 139
PREDICTION ................................................................................................................................................. 140
PREMIUM CALCULATION ........................................................................................................................... 140
PREMIUM PRINCIPLES ................................................................................................................................ 141
PREMIUMS ................................................................................................................................................... 141
PRICE MECHANISM ..................................................................................................................................... 142
PRICES .......................................................................................................................................................... 142
PRICING ........................................................................................................................................................ 143
PRIVATE MEDICAL INSURANCE ................................................................................................................ 146
PROBABILITY ............................................................................................................................................... 146
PROBABILITY DISTRIBUTION .................................................................................................................... 147
PROBABILITY THEORY ............................................................................................................................... 147
PRODUCTION LAG ....................................................................................................................................... 147
PROSPECT THEORY ..................................................................................................................................... 148
QUANTITATIVE ANALYSIS ......................................................................................................................... 148
QUANTITATIVE METHODS ......................................................................................................................... 148
RATEMAKING ............................................................................................................................................... 148
RATES AND RATING .................................................................................................................................... 149
RATIONAL NUMBERS .................................................................................................................................. 149
REGRESSION ................................................................................................................................................ 149
REGULATION ............................................................................................................................................... 149
REINSURANCE ............................................................................................................................................. 150
RENEWAL PROCESS .................................................................................................................................... 152
RESERVE RISK.............................................................................................................................................. 153
RESERVING................................................................................................................................................... 153
RETIREMENT ............................................................................................................................................... 153
RETURNS ...................................................................................................................................................... 154
REVERSIONARY ANNUITY .......................................................................................................................... 154
REVIEWS ...................................................................................................................................................... 155
RISK ............................................................................................................................................................... 155
RISK ANALYSIS ............................................................................................................................................ 161
RISK ASSESSMENT ...................................................................................................................................... 161
RISK AVERSION ........................................................................................................................................... 162
RISK-BASED CAPITAL ................................................................................................................................. 162
RISK MANAGEMENT ................................................................................................................................... 163
RISK MEASUREMENT .................................................................................................................................. 164
RISK MODELS ............................................................................................................................................... 167
RISK SHARING .............................................................................................................................................. 167
RISK THEORY ............................................................................................................................................... 168
RUIN PROBABILITY ..................................................................................................................................... 169
RUIN THEORY .............................................................................................................................................. 171
SAMPLING .................................................................................................................................................... 173
SCENARIO GENERATION ............................................................................................................................ 173
SECURITIES .................................................................................................................................................. 173
SHARE PRICES ............................................................................................................................................. 174
SIMULATION ................................................................................................................................................ 174
SINGAPORE .................................................................................................................................................. 174
SOLVENCY .................................................................................................................................................... 174
SOLVENCY II ................................................................................................................................................. 175
SPARRE ANDERSEN MODEL ....................................................................................................................... 176
SPARRE ANDERSEN RISK MODEL.............................................................................................................. 176
STAPLE INN .................................................................................................................................................. 176
STATISTICAL ANALYSIS .............................................................................................................................. 176
STATISTICS................................................................................................................................................... 177
STOCHASTIC INVESTMENT MODELS ........................................................................................................ 177
STOCHASTIC MODELS ................................................................................................................................. 177
STOCHASTIC MORTALITY .......................................................................................................................... 180
STOCHASTIC PROCESSES ............................................................................................................................ 180
STOCHASTIC RESERVING ........................................................................................................................... 183
STOCHASTIC VOLATILITY .......................................................................................................................... 183
STOCK MARKET ........................................................................................................................................... 184
STOCKS AND SHARES .................................................................................................................................. 185
STOP LOSS .................................................................................................................................................... 185
STRATEGIC PLANNING ............................................................................................................................... 185
STRESS TESTS .............................................................................................................................................. 186
SURPLUS ....................................................................................................................................................... 186
SURRENDER RATES .................................................................................................................................... 187
SURVIVAL ANALYSIS ................................................................................................................................... 187
SWAPS .......................................................................................................................................................... 187
SYSTEMIC RISK ............................................................................................................................................ 188
TAIL DEPENDENCE ..................................................................................................................................... 188
TAIL RISK MEASURES ................................................................................................................................. 188
TAX ................................................................................................................................................................ 190
TERM ASSURANCE ...................................................................................................................................... 191
TERM STRUCTURE ...................................................................................................................................... 191
TERRORISM .................................................................................................................................................. 192
TIME .............................................................................................................................................................. 192
TIME SERIES ................................................................................................................................................ 193
TRADING ...................................................................................................................................................... 194
TRANSACTION COSTS ................................................................................................................................. 195
UNCERTAINTY ............................................................................................................................................. 195
UNDERWRITING .......................................................................................................................................... 195
UNIT LINKED LIFE ASSURANCE ................................................................................................................. 196
UNITED STATES ........................................................................................................................................... 196
UTILITY ......................................................................................................................................................... 197
UTILITY FUNCTION ..................................................................................................................................... 197
VALUATION .................................................................................................................................................. 197
VALUE-AT-RISK (VAR) ................................................................................................................................ 198
VARIABILITY ................................................................................................................................................ 201
VARIABLE ANNUITIES ................................................................................................................................ 201
VARIANCE ANALYSIS .................................................................................................................................. 202
VOLATILITY.................................................................................................................................................. 202
WEALTH ....................................................................................................................................................... 204
WEATHER ..................................................................................................................................................... 204
WITH PROFITS LIFE ASSURANCE .............................................................................................................. 205
WITH PROFITS POLICIES ............................................................................................................................ 205


ACTUARIAL SCIENCE

Editorial: European Actuarial Journal. Hipp, Christian [RKN: 44804]
Shelved at: online only
European Actuarial Journal (2011) 1(1) July : 1-2.
The editors welcome readers to this new international scientific journal which is published and edited by a cooperation of 13
actuarial associations of the following 11 countries: Austria, Belgium, France, Germany, Greece, Hungary, Italy, Poland, Portugal,
Slovenia, and Switzerland. EAJ is the successor of the following six actuarial journals: 1. Belgian Actuarial Bulletin, 2. Blätter der
Deutschen Gesellschaft für Versicherungs- und Finanzmathematik, 3. Boletim do Instituto dos Actuários Portugueses, 4. Giornale
dell'Istituto Italiano degli Attuari, 5. Mitteilungen der Schweizerischen Aktuarvereinigung / Bulletin de l'Association Suisse des
Actuaires, 6. Mitteilungen der Aktuarvereinigung Österreichs (Austria)
Available via Athens: Springer

Fundamentals of actuarial mathematics. Promislow, S David (2011). - 2nd ed. - Chichester: John Wiley & Sons Ltd, 2011. - 449
pages. [RKN: 45062]
Shelved at: EM/VA Shelved at: 368.01 PRO
This book provides a comprehensive introduction to actuarial mathematics, covering both deterministic and stochastic models of
life contingencies, as well as more advanced topics such as risk theory, credibility theory and multistate models. This new edition
includes additional material on credibility theory, continuous time multistate models, more complex types of contingent
insurances, flexible contracts such as universal life, the risk measures VaR and TVaR. Key Features: Covers much of the syllabus
material on the modeling examinations of the Society of Actuaries, Canadian Institute of Actuaries and the Casualty Actuarial
Society. (SOA-CIA exams MLC and C, CAS exams 3L and 4.); Extensively revised and updated with new material; Orders the
topics specifically to facilitate learning; Provides a streamlined approach to actuarial notation; Employs modern computational
methods; Contains a variety of exercises, both computational and theoretical, together with answers, enabling use for self-study.
An ideal text for students planning for a professional career as actuaries, providing a solid preparation for the modeling
examinations of the major actuarial associations. Furthermore, this book is a highly suitable reference for those wanting a sound
introduction to the subject, and for those working in insurance, annuities and pensions.

On the Lp-metric between a probability distribution and its distortion. López-Díaz, Miguel; Sordo, Miguel A; Suárez-Llorens, Alfonso
[RKN: 44784]
Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51(2) : 257-264.
In actuarial theory, the Lp-metric is used to evaluate how well a probability distribution approximates another one. In the context
of the distorted expectation hypothesis, the actuary replaces the original probability distribution by a distorted probability, so it
makes sense to interpret the Lp-metric between them as a characteristic of the underlying random variable. We show in this
paper that this is a characteristic of the variability of the random variable, study its properties and give some applications.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
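
For orientation, a minimal sketch in illustrative notation that need not match the authors' definitions: if F is the distribution
function of a risk X and g is a distortion function (increasing, with g(0) = 0 and g(1) = 1), the distorted distribution and the
Lp-distance between F and its distortion can be written as

\[
  F_g(x) = 1 - g\big(1 - F(x)\big), \qquad
  d_p\big(F, F_g\big) = \left( \int_{-\infty}^{+\infty} \big| F(x) - F_g(x) \big|^{p} \, dx \right)^{1/p}, \quad p \ge 1,
\]

and the paper's point is that d_p(F, F_g) behaves as a measure of the variability of X.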

Stochastic orders in time transformed exponential models with applications. Li, Xiaohu; Lin, Jianhua [RKN: 44974]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (1) : 47-52.
This paper studies expectations of a supermodular function of bivariate random risks following Time Transformed Exponential
(TTE) models. Comparisons of such expectations are conducted based on stochastic orders of the univariate survival
functions involved in the models, and the upper orthant-convex order between two bivariate random risks in TTE models is also established. This
corrects Theorem 2.3 of Mulero et al. (2010) and invalidates some results there. Some applications in actuarial science are
presented as well.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
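
For orientation, one common way to write this model class, in notation that may differ from the paper's: a bivariate risk (X, Y)
is said to follow a time transformed exponential (TTE) model when its joint survival function has the form

\[
  \bar{F}(x, y) = \Pr(X > x,\; Y > y) = W\big(R_1(x) + R_2(y)\big),
\]

where W is a convex, decreasing function with W(0) = 1 and R_1, R_2 are increasing time transforms with R_i(0) = 0;
equivalently, (X, Y) has an Archimedean survival copula.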

AGRICULTURAL INSURANCE

A maximum-entropy approach to the linear credibility formula. Najafabadi, Amir T. Payandeh; Hatami, Hamid; Najafabadi, Maryam
Omidi [RKN: 45738]
Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 216-221.
Payandeh [Payandeh Najafabadi, A.T., 2010. A new approach to credibility formula. Insurance: Mathematics and Economics 46,
334-338] introduced a new technique to approximate a Bayes estimator with the exact credibility form. This article employs the
well-known and powerful maximum-entropy method (MEM) to extend the results of Payandeh Najafabadi (2010) to a class of linear
credibility estimators, whenever claim sizes are distributed according to log-concave distributions. Namely, (i) it employs the
maximum-entropy method to approximate an appropriate Bayes estimator (with respect to either the square-error or the Linex
loss function and a general increasing and bounded prior distribution) by a linear combination of claim sizes; (ii) it establishes that
such an approximation coincides with the exact credibility formula whenever the required conditions for exact credibility
hold. Some properties of such an approximation are discussed. An application to crop insurance is given.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
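
For context, the exact credibility form that the approximation is matched against can be sketched in standard Bühlmann-style
notation (illustrative, not the paper's): given claims X_1, ..., X_n, the linear credibility estimator of the risk premium is

\[
  \hat{\mu} = Z\,\bar{X} + (1 - Z)\,\mu_0, \qquad Z = \frac{n}{n + k},
\]

where \bar{X} is the sample mean of the claims, \mu_0 the collective (prior) mean, and k a structure parameter (in the
Bühlmann model, the ratio of the expected process variance to the variance of the hypothetical means).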

ALGORITHMS

Comparison of two methods for superreplication. Ekstrom, Erik; Tysk, Johan Routledge, [RKN: 45798]
Shelved at: Per. AMF
Applied Mathematical Finance (2012) 19 (1-2) : 181-193.
We compare two methods for superreplication of options with convex pay-off functions. One method entails the overestimation of
an unknown covariance matrix in the sense of quadratic forms. With this method the value of the superreplicating portfolio is given
as the solution of a linear Black-Scholes (BS)-type equation. In the second method, the choice of quadratic form is made pointwise.
This leads to a fully non-linear equation, the so-called Black-Scholes-Barenblatt (BSB) equation, for the value of the
superreplicating portfolio. In general, this value is smaller for the second method than for the first method. We derive estimates for
the difference between the initial values of the superreplicating strategies obtained using the two methods.
Available via Athens: Taylor & Francis Online
http://www.openathens.net
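
As a hedged, one-asset illustration of the two equations mentioned above (the paper itself works with an N-asset covariance
matrix): if the volatility is only known to lie in an interval [σ_min, σ_max], the superreplication price v(t, s) solves the
Black-Scholes-Barenblatt equation

\[
  \partial_t v + \tfrac{1}{2}\,\Lambda\big(\partial_{ss} v\big)\, s^2\, \partial_{ss} v + r\,s\,\partial_s v - r\,v = 0,
  \qquad
  \Lambda(\gamma) = \begin{cases} \sigma_{\max}^2, & \gamma \ge 0,\\ \sigma_{\min}^2, & \gamma < 0, \end{cases}
\]

with v(T, s) equal to the payoff. For a convex payoff the solution stays convex in s, so Λ always selects σ_max^2 and the
equation reduces to the linear Black-Scholes equation with the largest volatility, which corresponds to the first (overestimation) method.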

The effect of correlation and transaction costs on the pricing of basket options. Atkinson, C; Ingpochai, P Routledge, [RKN:
45797]
Shelved at: Per. AMF
Applied Mathematical Finance (2012) 19 (1-2) : 131-179.
In this article, we examine the problem of evaluating the option price of a European call option written on N underlying assets when
there are proportional transaction costs in the market. Since the portfolio under consideration consists of multiple risky assets,
which makes numerical methods formidable, we use perturbation analyses. The article extends the model for option pricing on
uncorrelated assets, which was proposed by Atkinson and Alexandropoulos (2006, Pricing a European basket option in the
presence of proportional transaction costs, Applied Mathematical Finance, 13(3): 191-214). We determine optimal hedging
strategies as well as option prices on both correlated and uncorrelated assets. The option valuation problem is obtained by
comparing the maximized utility of wealth with and without option liability. The two stochastic control problems, which arise from
the transaction costs, are transformed to free boundary and partial differential equation problems. Once the problems have been
formulated, we establish optimal trading strategies for each of the portfolios. In addition, the optimal hedging strategies can be
found by comparing the trading strategies of the two portfolios. We provide a general procedure for solving N risky assets, which
shows that for small correlations the N-asset problem can be replaced by N(N-1)/2 two-dimensional problems and give numerical
examples for the two risky assets portfolios.
Available via Athens: Taylor & Francis Online
http://www.openathens.net

Viterbi-based estimation for Markov switching GARCH model. Routledge, [RKN: 45838]
Shelved at: Per. AMF
Applied Mathematical Finance (2012) 19 (3-4) : 219-231.
We outline a two-stage estimation method for a Markov-switching Generalized Autoregressive Conditional Heteroscedastic
(GARCH) model modulated by a hidden Markov chain. The first stage involves the estimation of a hidden Markov chain using the
Viterbi algorithm given the model parameters. The second stage uses the maximum likelihood method to estimate the model
parameters given the estimated hidden Markov chain. Applications to financial risk management are discussed through simulated
data.
Available via Athens: Taylor & Francis Online
http://www.openathens.net
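As an illustration of the first estimation stage described above, the sketch below decodes the most likely regime path of a hidden Markov chain with the standard Viterbi recursion. It is a generic sketch rather than the authors' code, and it assumes the regime-specific GARCH conditional densities have already been evaluated into a matrix of emission log-likelihoods (here called log_emission).

```python
import numpy as np

def viterbi(log_emission, log_trans, log_init):
    # log_emission: T x K log-densities of each observation under each regime
    # log_trans:    K x K log transition probabilities of the hidden chain
    # log_init:     K     log initial state probabilities
    T, K = log_emission.shape
    delta = np.zeros((T, K))            # best path log-likelihood ending in each state
    psi = np.zeros((T, K), dtype=int)   # back-pointers
    delta[0] = log_init + log_emission[0]
    for t in range(1, T):
        scores = delta[t - 1][:, None] + log_trans   # entry (i, j): come from i, go to j
        psi[t] = scores.argmax(axis=0)
        delta[t] = scores.max(axis=0) + log_emission[t]
    path = np.empty(T, dtype=int)
    path[-1] = delta[-1].argmax()
    for t in range(T - 2, -1, -1):                    # backtrack
        path[t] = psi[t + 1, path[t + 1]]
    return path
```

In the paper's second stage the GARCH parameters would then be re-estimated by maximum likelihood conditional on the decoded regime path.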

ALLAIS PARADOX

Allais for all: Revisiting the paradox in a large representative sample. Huck, Steffen; Müller, Wieland Springer, - 33 pages. [RKN:
73974]
Shelved at: Per: J Risk Uncrtnty
Journal of Risk and Uncertainty (2012) 44 (3) : 261-293.
We administer the Allais paradox questions to both a representative sample of the Dutch population and to student subjects.
Three treatments are implemented: one with the original high hypothetical payoffs, one with low hypothetical payoffs and a third
with low real payoffs. Our key findings are: (i) violations in the non-lab sample are systematic and a large bulk of violations is likely
to stem from non-familiarity with large payoffs, (ii) we can identify groups of the general population that have much higher than
average violation rates; this concerns mainly those with low education and the unemployed, and (iii) the relative treatment differences in the
population at large are accurately predicted by the lab sample, but violation rates in all lab treatments are about 15 percentage
points lower than in the corresponding non-lab treatments.

ALTERNATIVE RISK TRANSFER

Computing bounds on the expected payoff of Alternative Risk Transfer products. Villegas, Andrés M; Medaglia, Andrés L; Zuluaga,
Luis F [RKN: 44786]
Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51(2) : 271-281.
The demand for integrated risk management solutions and the need for new sources of capital have led to the development of
innovative risk management products that mix the characteristics of traditional insurance and financial products. Such products,
usually referred to as Alternative Risk Transfer (ART) products, include: (re)insurance contracts that bundle several risks under a
single policy; multi-trigger products where the payment of benefits depends upon the occurrence of several events; and
insurance-linked securities that place insurance risks in the capital market. Pricing of these complex products usually requires tailor-made
complex valuation methods that combine derivative pricing and actuarial science techniques for each product, as well as strong
distributional assumptions on the ART's underlying risk factors. We present here an alternative methodology to compute bounds
on the price of ART products when there is limited information on the distribution of the underlying risk factors. In particular, we
develop a general optimization-based method that computes upper and lower price bounds for different ART products using
market data and possibly expert information about the underlying risk factors. These bounds are useful when the structure of the
product is too complex to develop analytical or simulation valuation methods, or when the scarcity of data makes it difficult to make
strong distributional assumptions on the risk factors. We illustrate our results by computing bounds on the price of a floating
retention insurance contract, and a catastrophe equity put (CatEPut) option.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

AMBIGUITY

Ambiguity aversion, higher-order risk attitude and optimal effort. Huang, Rachel J [RKN: 45637]
Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 50 (3) : 338-345.
In this paper, we examine whether a more ambiguity-averse individual will invest in more effort to shift her initial starting wealth
distribution toward a better target distribution. We assume that the individual has ambiguous beliefs regarding two target (starting)
distributions and that one distribution is preferred to the other. We find that an increase in ambiguity aversion will decrease
(increase) the optimal effort when the cost of effort is non-monetary. When the cost of effort is monetary, the effect depends on
whether the individual would make more effort when the target (starting) distribution is the preferred one than when it is the
inferior one. We further characterize the individual's higher-order risk preferences to examine the sufficient conditions.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

ANALYSIS

Analyzing surplus appropriation schemes in participating life insurance from the insurer's and the policyholder's perspective.
Bohnert, Alexander; Gatzert, Nadine [RKN: 44991]
Insurance: Mathematics & Economics (2012) 50 (1) : 64-78.
This paper examines the impact of three surplus appropriation schemes often inherent in participating life insurance contracts on
the insurer's shortfall risk and the net present value from an insured's viewpoint. (1) In case of the bonus system, surplus is used
to increase the guaranteed death and survival benefit, leading to higher reserves; (2) the interest-bearing accumulation increases
only the survival benefit by accumulating the surplus on a separate account; and (3) surplus can also be used to shorten the
contract term, which results in an earlier payment of the survival benefit and a reduced sum of premium payments. The pool of
participating life insurance contracts with death and survival benefit is modeled actuarially with annual premium payments;
mortality rates are generated based on an extension of the Lee-Carter (1992) model, and the asset process follows a geometric
Brownian motion. In a simulation analysis, we then compare the influence of different asset portfolios and shocks to mortality on
the insurer's risk situation and the policyholder's net present value for the three surplus schemes. Our findings demonstrate that,
even though the surplus distribution and thus the amount of surplus is calculated the same way, the type of surplus appropriation
scheme has a substantial impact on the insurer's risk exposure and the policyholder's net present value.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Characterization of the American put option using convexity. Xie, Dejun; Edwards, David A; Schleiniger, Gilberto; Zhu, Qinghua
[RKN: 45463]
Applied Mathematical Finance (2011) 18 (3-4) : 353-365.
Understanding the behaviour of the American put option is one of the classic problems in mathematical finance. Considerable
efforts have been made to understand the asymptotic expansion of the optimal early exercise boundary for small time near expiry.
Here we focus on the large-time expansion of the boundary. Based on a recent development of the convexity property, we are able
to establish two integral identities pertaining to the boundary, from which the upper bound of its large-time expansion is derived.
The bound includes parameter dependence in the exponential decay to its limiting value. In addition, these time explicit identities
provide very efficient numerical approximations to the true solution to the problem.

Available via Athens: Taylor & Francis Online
http://www.openathens.net

Extending the Lee-Carter model: a three-way decomposition. Russolillo, Maria; Giordano, Giuseppe; Haberman, Steven [RKN:
45354]
Shelved at: Per: SAJ Shelved at: SCA/ACT
Scandinavian Actuarial Journal (2011) 2 : 96-117.
In this paper, we focus on a Multi-dimensional Data Analysis approach to the Lee-Carter (LC) model of mortality trends. In
particular, we extend the bilinear LC model and specify a new model based on a three-way structure, which incorporates a further
component in the decomposition of the log-mortality rates. A multi-way component analysis is performed using the Tucker3 model.
The suggested methodology allows us to obtain combined estimates for the three modes: (1) time, (2) age groups and (3) different
populations. From the results obtained by the Tucker3 decomposition, we can jointly compare, in both a numerical and graphical
way, the relationships among all three modes and obtain a time-series component as a leading indicator of the mortality trend for
a group of populations. Further, we carry out a correlation analysis of the estimated trends in order to assess the reliability of the
results of the three-way decomposition. The model's goodness of fit is assessed using an analysis of the residuals. Finally, we
discuss how the synthesised mortality index can be used to build concise projected life tables for a group of populations. An
application which compares 10 European countries is used to illustrate the approach and provide a deeper insight into the model
and its implementation.
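For readers unfamiliar with the bilinear model that the three-way decomposition extends, a minimal sketch of the standard Lee-Carter fit via singular value decomposition is given below. It is a generic illustration with the usual identifiability constraints, not the Tucker3 extension developed in the paper.

```python
import numpy as np

def fit_lee_carter(log_m):
    # log_m: ages x years matrix of log central death rates
    a = log_m.mean(axis=1)                          # average age profile a_x
    U, s, Vt = np.linalg.svd(log_m - a[:, None], full_matrices=False)
    b, k = U[:, 0], s[0] * Vt[0]                    # first bilinear term b_x * k_t
    b, k = b / b.sum(), k * b.sum()                 # constraint: sum(b_x) = 1
    a, k = a + b * k.mean(), k - k.mean()           # constraint: sum(k_t) = 0
    return a, b, k
```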

Stochastic expansion for the pricing of call options with discrete dividends. Etore, Pierre; Gobet, Emmanuel Routledge, [RKN:
45839]
Shelved at: Per. AMF
Applied Mathematical Finance (2012) 19 (3-4) : 233-264.
In the context of an asset paying affine-type discrete dividends, we present closed analytical approximations for the pricing of
European vanilla options in the Black–Scholes model with time-dependent parameters. They are obtained using a stochastic
Taylor expansion around a shifted lognormal proxy model. The final formulae are, respectively, first-, second- and third-order
approximations w.r.t. the fixed part of the dividends. Using Cameron–Martin transformations, we provide explicit representations
of the correction terms as Greeks in the Black–Scholes model. The use of Malliavin calculus enables us to provide tight error
estimates for our approximations. Numerical experiments show that this approach yields very accurate results, in particular
compared with known approximations of Bos, Gairat and Shepeleva (2003. Dealing with discrete dividends. Risk Magazine, 16:
109–112) and Veiga and Wystup (2009, Closed formula for options with discrete dividends and its derivatives, Applied
Mathematical Finance, 16(6): 517–531), and quicker than the iterated integration procedure of Haug, Haug and Lewis (2003, Back
to basics: a new approach to the discrete dividend problem, Wilmott Magazine, pp. 37–47) or than the binomial tree method of
Vellekoop and Nieuwenhuis (2006, Efficient pricing of derivatives on assets with discrete dividends, Applied Mathematical
Finance, 13(3), pp. 265–284).
Available via Athens: Taylor & Francis Online
http://www.openathens.net

Tail distortion risk and its asymptotic analysis. Zhu, Li; Li, Haijun [RKN: 45728]
Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 115-121.
A distortion risk measure used in finance and insurance is defined as the expected value of potential loss under a scenario
probability measure. In this paper, the tail distortion risk measure is introduced to assess tail risks of excess losses modeled by the
right tails of loss distributions. The asymptotic linear relation between tail distortion and value-at-risk is derived for heavy-tailed
losses with the linear proportionality constant depending only on the distortion function and the tail index. Various examples
involving tail distortions for location-invariant, scale-invariant, and shape-invariant loss distribution families are also presented to
illustrate the results.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

ANNUITANTS

An application of comonotonicity theory in a stochastic life annuity framework. Liu, Xiaoming; Jang, Jisoo; Kim, Sun Mee [RKN:
40022]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (2) : 271-279.
A life annuity contract is an insurance instrument which pays pre-scheduled living benefits conditional on the survival of the
annuitant. In order to manage the risk borne by annuity providers, one needs to take into account all sources of uncertainty that
affect the value of future obligations under the contract. In this paper, we define the concept of annuity rate as the conditional
expected present value random variable of future payments of the annuity, given the future dynamics of its risk factors. The
annuity rate deals with the non-diversifiable systematic risk contained in the life annuity contract, and it involves mortality risk as
well as investment risk. While it is plausible to assume that there is no correlation between the two risks, each affects the annuity
rate through a combination of dependent random variables. In order to understand the probabilistic profile of the annuity rate, we
apply comonotonicity theory to approximate its quantile function. We also derive accurate upper and lower bounds for prediction
intervals for annuity rates. We use the Lee–Carter model for mortality risk and the Vasicek model for the term structure of interest
rates with an annually renewable fixed-income investment policy. Different investment strategies can be handled using this
framework.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

ANNUITIES

A computationally efficient algorithm for estimating the distribution of future annuity values under interest-rate and longevity
risks. Dowd, Kevin; Blake, David; Cairns, Andrew J G Society of Actuaries, - 11 pages. [RKN: 74836]
Shelved at: Per: NAAJ (Oxf) Per NAAJ (Lon) Shelved at: JOU
North American Actuarial Journal (2011) 15 (2) : 237-247.
This paper proposes a computationally efficient algorithm for quantifying the impact of interest-rate risk and longevity risk on the
distribution of annuity values in the distant future. The algorithm simulates the state variables out to the end of the horizon period
and then uses a Taylor series approximation to compute approximate annuity values at the end of that period, thereby avoiding a
computationally expensive simulation-within-simulation problem. Illustrative results suggest that annuity values are likely to rise
considerably but are also quite uncertain. These findings have some unpleasant implications both for defined contribution pension
plans and for defined benefit plan sponsors considering using annuities to hedge their exposure to these risks at some point in the
future.
http://www.soa.org/news-and-publications/publications/journals/naaj/naaj-detail.aspx

Entropy, longevity and the cost of annuities. Haberman, Steven; Khalaf-Allah, Marwa; Verrall, Richard [RKN: 40013]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (2) : 197-204.
This paper presents an extension of the application of the concept of entropy to annuity costs. Keyfitz (1985) introduced the
concept of entropy, and analysed this in the context of continuous changes in life expectancy. He showed that a higher level of
entropy indicates that the life expectancy has a greater propensity to respond to a change in the force of mortality than a lower
level of entropy. In other words, a high level of entropy means that further reductions in mortality rates would have an impact on
measures like life expectancy. In this paper, we apply this to the cost of annuities and show how it allows the sensitivity of the cost
of a life annuity contract to changes in longevity to be summarized in a single figure index.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
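As a concrete illustration of the entropy concept referred to above, the sketch below computes Keyfitz's entropy, H = -∫ l(x) ln l(x) dx / ∫ l(x) dx, from a discretised survival curve. It is a generic numerical example, not the paper's annuity-cost index; the exponential survival curve used in the check is purely illustrative.

```python
import numpy as np

def keyfitz_entropy(lx, step=1.0):
    # lx: survival function l(x) on an equally spaced age grid, with l(0) = 1
    lx = np.asarray(lx, dtype=float)
    e0 = np.trapz(lx, dx=step)                        # life expectancy at birth
    with np.errstate(divide="ignore", invalid="ignore"):
        integrand = np.where(lx > 0, -lx * np.log(lx), 0.0)
    return np.trapz(integrand, dx=step) / e0

# check: exponential survival l(x) = exp(-mu*x) has entropy approximately 1
ages = np.arange(0, 120, 0.1)
print(keyfitz_entropy(np.exp(-0.05 * ages), step=0.1))
```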

Gram-Charlier processes and equity-indexed annuities. Chateau, Jean-Pierre; Dufresne, Daniel (2012). - Victoria: University of
Melbourne, 2012. - 32 pages. [RKN: 73948]
A Gram-Charlier distribution has a density that is a polynomial times a normal density. The historical connection between actuarial
science and the Gram-Charlier expansions goes back to the 19th century. A critical review of the financial literature on the
Gram-Charlier distribution is made. Properties of the Gram-Charlier distributions are derived, including moments, tail estimates,
moment indeterminacy of the exponential of a Gram-Charlier distributed variable, non-existence of a continuous-time Lévy process
with Gram-Charlier increments, as well as formulas for option prices and their sensitivities. A procedure for simulating
Gram-Charlier distributions is given. Multiperiod Gram-Charlier modelling of asset returns is described, apparently for the first
time. Formulas for equity indexed annuities premium option values are given, and a numerical illustration shows the importance of
skewness and kurtosis of the risk neutral density.
No.1 (1993) onwards available online. Download as PDF.
http://www.economics.unimelb.edu.au/ACT/papers.shtml

Longevity risk management for life and variable annuities: the effectiveness of static hedging using longevity bonds and
derivatives. Ngai, Andrew; Sherris, Michael [RKN: 44980]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (1) : 100-114.
For many years, the longevity risk of individuals has been underestimated, as survival probabilities have improved across the
developed world. The uncertainty and volatility of future longevity has posed significant risk issues for both individuals and product
providers of annuities and pensions. This paper investigates the effectiveness of static hedging strategies for longevity risk
management using longevity bonds and derivatives (q-forwards) for the retail products: life annuity, deferred life annuity, indexed
life annuity, and variable annuity with guaranteed lifetime benefits. Improved market and mortality models are developed for the
underlying risks in annuities. The market model is a regime-switching vector error correction model for GDP, inflation, interest
rates, and share prices. The mortality model is a discrete-time logit model for mortality rates with age dependence. Models were
estimated using Australian data. The basis risk between annuitant portfolios and population mortality was based on UK
experience. Results show that static hedging using q-forwards or longevity bonds reduces the longevity risk substantially for life
annuities, but significantly less for deferred annuities. For inflation-indexed annuities, static hedging of longevity is less effective
because of the inflation risk. Variable annuities provide limited longevity protection compared to life annuities and indexed
annuities, and as a result longevity risk hedging adds little value for these products.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Managing longevity and disability risks in life annuities with long term care. Levantesi, Susanna; Menzietti, Massimiliano [RKN:
45642]
Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 50 (3) : 391-401.
The aim of the paper is twofold. Firstly, it develops a model for risk assessment in a portfolio of life annuities with long term care
benefits. These products are usually represented by a Markovian Multi-State model and are affected by both longevity and
disability risks. Here, a stochastic projection model is proposed in order to represent the future evolution of mortality and disability
transition intensities. Data from the Italian National Institute of Social Security (INPS) and from Human Mortality Database (HMD)
are used to estimate the model parameters. Secondly, it investigates the solvency in a portfolio of enhanced pensions. To this aim
a risk model based on the portfolio risk reserve is proposed and different rules to calculate solvency capital requirements for life
underwriting risk are examined. Such rules are then compared with the standard formula proposed by the Solvency II project.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Optimal commutable annuities to minimize the probability of lifetime ruin. Wang, Ting; Young, Virginia R [RKN: 45536]
Insurance: Mathematics & Economics (2012) 50 (1) : 200-216.
We find the minimum probability of lifetime ruin of an investor who can invest in a market with a risky and a riskless asset and who
can purchase a commutable life annuity. The surrender charge of a life annuity is a proportion of its value. Ruin occurs when the
total of the value of the risky and riskless assets and the surrender value of the life annuity reaches zero. We find the optimal
investment strategy and optimal annuity purchase and surrender strategies in two situations: (i) the value of the risky and riskless
assets is allowed to be negative, with the imputed surrender value of the life annuity keeping the total positive; (ii) the value of the
risky and riskless assets is required to be non-negative. In the first case, although the individual has the flexibility to buy or sell at
any time, we find that the individual will not buy a life annuity unless she can cover all her consumption via the annuity and she will
never sell her annuity. In the second case, the individual surrenders just enough annuity income to keep her total assets positive.
However, in this second case, the individual's annuity purchasing strategy depends on the size of the proportional surrender
charge. When the charge is large enough, the individual will not buy a life annuity unless she can cover all her consumption, the
so-called safe level. When the charge is small enough, the individual will buy a life annuity at a wealth lower than this safe level.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

A utility-based comparison of pension funds and life insurance companies under regulatory constraints. Broeders, Dirk; Chen,
An; Koos, Birgit [RKN: 44969]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (1) : 1-10.
This paper compares two different types of annuity providers, i.e. defined benefit pension funds and life insurance companies.
One of the key differences is that the residual risk in pension funds is collectively borne by the beneficiaries and the sponsor's
shareholders while in the case of life insurers it is borne by the external shareholders. First, this paper employs a contingent claim
approach to evaluate the risk return tradeoff for annuitants. For that, we take into account the differences in contract specifications
and in regulatory regimes. Second, a welfare analysis is conducted to examine whether a consumer with power utility experiences
utility gains if she chooses a defined benefit plan or a life annuity contract over a defined contribution plan. We demonstrate that
regulation can be designed to support a level playing field amongst different financial institutions.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

ARCHIMEDEAN GENERATOR

Archimedean copulas derived from Morgenstern utility functions. Spreeuw, Jaap (2012). - London: Cass Business School, 2012.
- 21 pages. [RKN: 70094]
The (additive) generator of an Archimedean copula - as well as the inverse of the generator - is a strictly decreasing and convex
function, while Morgenstern utility functions (applying to risk-averse decision makers) are non-decreasing and concave. This
provides a basis for deriving either a generator of Archimedean copulas, or its inverse, from a Morgenstern utility function. If we
derive the generator in this way, dependence properties of an Archimedean copula that are often taken to be desirable, match with
generally sought after properties of the corresponding utility function. It is shown how well known copula families are derived from
established utility functions. Also, some new copula families are derived, and their properties are discussed. If, on the other hand,
we instead derive the inverse of the generator from the utility function, there is a link between the magnitude of measures of risk
attitude (like the very common Arrow-Pratt coefficient of absolute risk aversion) and the strength of dependence featured by the
corresponding Archimedean copula.
1995 onwards available online. Download as PDF.
http://www.cass.city.ac.uk/research-and-faculty/faculties/faculty-of-actuarial-science-and-insurance/publications/actuarial-resear
ch-reports
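For orientation, the sketch below shows the basic generator mechanism the report builds on: an Archimedean copula is assembled as C(u,v) = φ⁻¹(φ(u) + φ(v)) from a strictly decreasing, convex generator φ. The Clayton generator is used here purely as a familiar example; it is not presented as one of the utility-derived families introduced in the report.

```python
import numpy as np

def clayton_copula(u, v, theta):
    # Clayton generator phi(t) = (t**(-theta) - 1) / theta, theta > 0,
    # which is strictly decreasing and convex on (0, 1], as required.
    phi = lambda t: (np.power(t, -theta) - 1.0) / theta
    phi_inv = lambda s: np.power(1.0 + theta * s, -1.0 / theta)
    return phi_inv(phi(u) + phi(v))

# the joint probability C(0.3, 0.4) rises towards min(0.3, 0.4) as dependence strengthens
for theta in (0.5, 2.0, 8.0):
    print(theta, clayton_copula(0.3, 0.4, theta))
```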

ASSET ALLOCATION

On optimal pension management in a stochastic framework with exponential utility. Ma, Qing-Ping [RKN: 44976]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (1) : 61-69.
This paper reconsiders the optimal asset allocation problem in a stochastic framework for defined-contribution pension plans with
exponential utility, which has been investigated by Battocchio and Menoncin [Battocchio, P., Menoncin, F., 2004. Optimal pension
management in a stochastic framework. Insurance: Mathematics and Economics 34, 79–95]. When there are three types of
asset, cash, bond and stock, and a non-hedgeable wage risk, the optimal pension portfolio composition is horizon dependent for
pension plan members whose terminal utility is an exponential function of real wealth (nominal wealth-to-price index ratio). With
market parameters usually assumed, wealth invested in bond and stock increases as retirement approaches, and wealth invested
in cash asset decreases. The present study also shows that there are errors in the formulation of the wealth process and control
variables in solving the optimization problem in the study of Battocchio and Menoncin, which render their solution erroneous and
lead to wrong results in their numerical simulation.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

On the distribution of the (un)bounded sum of random variables. Cherubini, Umberto; Mulinacci, Sabrina; Romagnoli, Silvia [RKN:
62610]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (1) : 56-63.
We propose a general treatment of random variables aggregation accounting for the dependence among variables and bounded
or unbounded support of their sum. The approach is based on the extension to the concept of convolution to dependent variables,
involving copula functions. We show that some classes of copula functions (such as Marshall–Olkin and elliptical) cannot be used
to represent the dependence structure of two variables whose sum is bounded, while Archimedean copulas can be applied only if
the generator becomes linear beyond some point. As for the application, we study the problem of capital allocation between risks
when the sum of losses is bounded.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Reactive investment strategies. Leung, Andrew P [RKN: 44979]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (1) : 89-99.
Asset liability management is a key aspect of the operation of all financial institutions. In this endeavor, asset allocation is
considered the most important element of investment management. Asset allocation strategies may be static, and as such are
usually assessed under asset models of various degrees of complexity and sophistication. In recent years attention has turned to
dynamic strategies, which promise to control risk more effectively. In this paper we present a new class of dynamic asset strategies,
which respond to actual events. Hence they are referred to as reactive strategies. They cannot be characterized as a series of
specific asset allocations over time, but comprise rules for determining such allocations as the world evolves. Though they depend
on how asset returns and other financial variables are modeled, they are otherwise objective in nature. The resulting strategies are
optimal, in the sense that they can be shown to outperform all other strategies of their type when no asset allocation constraints
are imposed. Where such constraints are imposed, the strategies may be demonstrated to be almost optimal, and dramatically
more effective than static strategies.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

ASSET MANAGEMENT

Markowitz's mean-variance asset-liability management with regime switching: a multi-period model. Chen, Peng; Yang, Hailiang
[RKN: 45252]
Applied Mathematical Finance (2011) 18 (1-2) : 29-50.
This paper considers an optimal portfolio selection problem under Markowitz's mean-variance portfolio selection problem in a
multi-period regime-switching model. We assume that there are n + 1 securities in the market. Given an economic state which is
modelled by a finite state Markov chain, the return of each security at a fixed time point is a random variable. The return random
variables may be different if the economic state is changed even for the same security at the same time point. We start our
analysis from the no-liability case, in the spirit of Li and Ng (2000), both the optimal investment strategy and the efficient frontier
are derived. Then we add uncontrollable liability into the model. By direct comparison with the no-liability case, the optimal strategy
can be derived explicitly.
Available via Athens: Taylor & Francis Online
http://www.openathens.net

ASSET PRICES

On the spurious correlation between sample betas and mean returns. Levy, Moshe Routledge, [RKN: 45843]
Shelved at: Per. AMF
Applied Mathematical Finance (2012) 19 (3-4) : 341-360.
Cornerstone asset pricing models, such as capital asset pricing model (CAPM) and arbitrage pricing theory (APT), yield
theoretical predictions about the relationship between expected returns and exposure to systematic risk, as measured by beta(s).
Numerous studies have investigated the empirical validity of these models. We show that even if no relationship holds between
true expected returns and betas in the population, the existence of low-probability extreme outcomes induces a spurious
correlation between the sample means and the sample betas. Moreover, the magnitude of this purely spurious correlation is
similar to the empirically documented correlation, and the regression slopes and intercepts are very similar as well. This result
does not necessarily constitute evidence against the theoretical asset pricing models, but it does shed new light on previous
empirical results, and it points to an issue that should be carefully considered in the empirical testing of these models. The analysis
points to the dangers of relying on simple least squares regression for drawing conclusions about the validity of equilibrium pricing
models.
Available via Athens: Taylor & Francis Online
http://www.openathens.net

ASSETS

The implied market price of weather risk. Härdle, Wolfgang Karl; Cabrera, Brenda López Routledge, [RKN: 45795]
Shelved at: Per. AMF
Applied Mathematical Finance (2012) 19 (1-2) : 59-95.
Weather derivatives (WD) are end-products of a process known as securitization that transforms non-tradable risk factors
(weather) into tradable financial assets. For pricing and hedging non-tradable assets, one essentially needs to incorporate the
market price of risk (MPR), which is an important parameter of the associated equivalent martingale measure (EMM). The majority
of papers so far has priced non-tradable assets assuming zero or constant MPR, but this assumption yields biased prices and has
never been quantified earlier under the EMM framework. Given that liquid-derivative contracts based on daily temperature are
traded on the Chicago Mercantile Exchange (CME), we infer the MPR from traded futures-type contracts (CAT, CDD, HDD and
AAT). The results show how the MPR significantly differs from 0, how it varies in time and changes in sign. It can be
parameterized, given its dependencies on time and temperature seasonal variation. We establish connections between the
market risk premium (RP) and the MPR.
Available via Athens: Taylor & Francis Online
http://www.openathens.net

Portfolio selection problem with multiple risky assets under the constant elasticity of variance model. Zhao, Hui; Rong, Ximin
[RKN: 45534]
Insurance: Mathematics & Economics (2012) 50 (1) : 179-190.
This paper focuses on the constant elasticity of variance (CEV) model for studying the utility maximization portfolio selection
problem with multiple risky assets and a risk-free asset. The Hamilton–Jacobi–Bellman (HJB) equation associated with the
portfolio optimization problem is established. By applying a power transform and a variable change technique, we derive the
explicit solution for the constant absolute risk aversion (CARA) utility function when the elasticity coefficient is -1 or 0. In order to
obtain a general optimal strategy for all values of the elasticity coefficient, we propose a model with two risky assets and one
risk-free asset and solve it under a given assumption. Furthermore, we analyze the properties of the optimal strategies and
discuss the effects of market parameters on the optimal strategies. Finally, a numerical simulation is presented to illustrate the
similarities and differences between the results of the two models proposed in this paper.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

ASYMMETRIC INFORMATION

The endogenous price dynamics of emission allowances and an application to CO2 option pricing. Chesney, Marc; Taschini, Luca
[RKN: 45878]
Shelved at: Per. AMF
Applied Mathematical Finance (2012) 19 (5-6) : 447-475.
Market mechanisms are increasingly being used as a tool for allocating somewhat scarce but unpriced rights and resources, and
the European Emission Trading Scheme is an example. By means of dynamic optimization in the context of firms covered by such
environmental regulations, this article generates endogenously the price dynamics of emission permits under asymmetric
information, allowing inter-temporal banking and borrowing. In the market, there are a finite number of firms and each firm's
pollution emission follows an exogenously given stochastic process. We prove the discounted permit price is a martingale with
respect to the relevant filtration. The model is solved numerically. Finally, a closed-form pricing formula for European-style options
is derived.
Available via Athens: Taylor & Francis Online
http://www.openathens.net

BANDWIDTH PARAMETER

A Bayesian approach to parameter estimation for kernel density estimation via transformations. Liu, Qing; Pitt, David; Zhang,
Xibin; Wu, Xueyuan Institute and Faculty of Actuaries; Cambridge University Press, - 13 pages. [RKN: 74950]
Shelved at: Per: AAS (Oxf) Per: AAS (Lon)
Annals of Actuarial Science (2011) 5(2) : 181-193.
In this paper, we present a Markov chain Monte Carlo (MCMC) simulation algorithm for estimating parameters in the kernel
density estimation of bivariate insurance claim data via transformations. Our data set consists of two types of auto insurance claim
costs and exhibits a high-level of skewness in the marginal empirical distributions. Therefore, the kernel density estimator based
on original data does not perform well. However, the density of the original data can be estimated through estimating the density of
the transformed data using kernels. It is well known that the performance of a kernel density estimator is mainly determined by the
bandwidth, and only in a minor way by the kernel. In the current literature, there have been some developments in the area of
estimating densities based on transformed data, where bandwidth selection usually depends on pre-determined transformation
parameters. Moreover, in the bivariate situation, the transformation parameters were estimated for each dimension individually.
We use a Bayesian sampling algorithm and present a Metropolis-Hastings sampling procedure to sample the bandwidth and
transformation parameters from their posterior density. Our contribution is to estimate the bandwidths and transformation
parameters simultaneously within a Metropolis-Hastings sampling procedure. Moreover, we demonstrate that the correlation
between the two dimensions is better captured through the bivariate density estimator based on transformed data.
http://www.actuaries.org.uk/research-and-resources/pages/access-journals
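A minimal, non-Bayesian sketch of the transformation idea described above is given below: the density of skewed claim data is estimated on the log scale and mapped back with the change-of-variables formula f_X(x) = f_Y(log x)/x. The bandwidth here comes from scipy's default rule rather than the posterior sampling scheme of the paper, and the simulated claim data are purely illustrative.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
claims = rng.lognormal(mean=8.0, sigma=1.2, size=500)   # skewed toy claim costs

y = np.log(claims)                   # transform to a roughly symmetric scale
kde_y = stats.gaussian_kde(y)        # default (Scott's rule) bandwidth

def claim_density(x):
    # back-transform: f_X(x) = f_Y(log x) / x for x > 0
    x = np.asarray(x, dtype=float)
    return kde_y(np.log(x)) / x

grid = np.linspace(claims.min(), np.percentile(claims, 99), 400)
print(np.trapz(claim_density(grid), grid))   # probability mass over the plotted range
```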

BAYES THEOREM

Bayesian multivariate Poisson models for insurance ratemaking. Bermudez, Lluis; Karlis, Dimitris [RKN: 40017]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (2) : 226-236.
When actuaries face the problem of pricing an insurance contract that contains different types of coverage, such as a motor
insurance or a homeowners insurance policy, they usually assume that types of claim are independent. However, this assumption
may not be realistic: several studies have shown that there is a positive correlation between types of claim. Here we introduce
different multivariate Poisson regression models in order to relax the independence assumption, including zero-inflated models to
account for excess of zeros and overdispersion. These models have been largely ignored to date, mainly because of their
computational difficulties. Bayesian inference based on MCMC helps to resolve this problem (and also allows us to derive, for
several quantities of interest, posterior summaries to account for uncertainty). Finally, these models are applied to an automobile
insurance claims database with three different types of claim. We analyse the consequences for pure and loaded premiums when
the independence assumption is relaxed by using different multivariate Poisson regression models together with their zero-inflated
versions.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Bayesian theory and methods with applications. Savchuk, Vladimir; Tsokos, Chris P (2011). Atlantis Press, 2011. - 317 pages.
[RKN: 74707]
Shelved at: 519.5
Bayesian methods are growing more and more popular, finding new practical applications in the fields of health sciences,
engineering, environmental sciences, business and economics and social sciences, among others. This book explores the use of
Bayesian analysis in the statistical estimation of the unknown phenomenon of interest. The contents demonstrate that where such
methods are applicable, they offer the best possible estimate of the unknown. Beyond presenting Bayesian theory and methods of
analysis, the text is illustrated with a variety of applications to real world problems.

BAYESIAN ANALYSIS

A Bayesian approach for estimating extreme quantiles under a semiparametric mixture model. Cabras, Stefano; Castellanos,
Maria Eugenia [RKN: 45301]
Shelved at: Per: Astin Bull (Oxf) Shelved at: JOU/AST
ASTIN Bulletin (2011) 41 (1) : 87-106.
In this paper we propose an additive mixture model, where one component is the Generalized Pareto distribution (GPD) that
allows us to estimate extreme quantiles. GPD plays an important role in modeling extreme quantiles for the wide class of
distributions belonging to the maximum domain of attraction of an extreme value model. One of the main difficulties with this
modeling approach is the choice of the threshold u, such that all observations greater than u enter into the likelihood function of the
GPD model. Difficulties are due to the fact that GPD parameter estimators are sensitive to the choice of u. In this work we estimate
u, and other parameters, using suitable priors in a Bayesian approach. In particular, we propose to model all data, extremes and
non-extremes, using a semiparametric model for data below u, and the GPD for the exceedances over u. In contrast to the usual
estimation techniques for u, in this setup we account for uncertainty on all GPD parameters, including u, via their posterior
distributions. A Monte Carlo study shows that posterior credible intervals also have frequentist coverages. We further illustrate the
advantages of our approach on two applications from insurance.
online access via International Actuarial Association:
http://www.actuaries.org/index.cfm?lang=EN&DSP=PUBLICATIONS&ACT=ASTIN BULLETIN
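For readers who want to experiment, the sketch below is the standard frequentist peaks-over-threshold calculation that the paper's Bayesian semiparametric mixture generalises: a GPD is fitted to exceedances over a fixed threshold u and an extreme quantile is read off. The simulated losses and the 95% threshold are illustrative assumptions; in the paper, u itself is estimated with the other parameters.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
losses = stats.lomax.rvs(c=2.5, size=5000, random_state=rng)   # heavy-tailed toy losses

u = np.quantile(losses, 0.95)                     # threshold fixed by hand in this sketch
excess = losses[losses > u] - u
xi, _, sigma = stats.genpareto.fit(excess, floc=0.0)   # ML fit of the GPD to the excesses

def extreme_quantile(p, n=losses.size, n_u=excess.size):
    # standard POT estimator: q_p = u + (sigma/xi) * (((n/n_u) * (1 - p))**(-xi) - 1)
    return u + sigma / xi * (((n / n_u) * (1.0 - p)) ** (-xi) - 1.0)

print(extreme_quantile(0.999))   # estimated 99.9% loss quantile
```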

Bayesian modelling of the time delay between diagnosis and settlement for Critical Illness Insurance using a Burr
generalised-linear-type model. Ozkok, Erengul; Streftaris, George; Waters, Howard R; Wilkie, A David [RKN: 45600]
Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 50 (2) : 266-279.
We discuss Bayesian modelling of the delay between dates of diagnosis and settlement of claims in Critical Illness Insurance
using a Burr distribution. The data are supplied by the UK Continuous Mortality Investigation and relate to claims settled in the
years 1999–2005. There are non-recorded dates of diagnosis and settlement and these are included in the analysis as missing
values using their posterior predictive distribution and MCMC methodology. The possible factors affecting the delay (age, sex,
smoker status, policy type, benefit amount, etc.) are investigated under a Bayesian approach. A 3-parameter Burr
generalised-linear-type model is fitted, where the covariates are linked to the mean of the distribution. Variable selection using
Bayesian methodology to obtain the best model with different prior distribution setups for the parameters is also applied. In
particular, Gibbs variable selection methods are considered, and results are confirmed using exact marginal likelihood findings
and related Laplace approximations. For comparison purposes, a lognormal model is also considered.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Bayesian stochastic mortality modelling for two populations. Cairns, Andrew J G; Blake, David; Dowd, Kevin; Coughlan, Guy D;
Khalaf-Allah, Marwa [RKN: 45299]
Shelved at: Per: Astin Bull (Oxf) Shelved at: JOU/AST
ASTIN Bulletin (2011) 41 (1) : 29-59.
This paper introduces a new framework for modelling the joint development over time of mortality rates in a pair of related
populations with the primary aim of producing consistent mortality forecasts for the two populations. The primary aim is achieved
by combining a number of recent and novel developments in stochastic mortality modelling, but these, additionally, provide us with
a number of side benefits and insights for stochastic mortality modelling. By way of example, we propose an Age-Period-Cohort
model which incorporates a mean-reverting stochastic spread that allows for different trends in mortality improvement rates in the
short-run, but parallel improvements in the long run. Second, we fit the model using a Bayesian framework that allows us to
combine estimation of the unobservable state variables and the parameters of the stochastic processes driving them into a single
procedure. Key benefits of this include dampening down of the impact of Poisson variation in death counts, full allowance for
parameter uncertainty, and the flexibility to deal with missing data. The framework is designed for large populations coupled with
a small sub-population and is applied to the England & Wales national and Continuous Mortality Investigation assured lives males
populations. We compare and contrast results based on the two-population approach with single-population results.
online access via International Actuarial Association:
http://www.actuaries.org/index.cfm?lang=EN&DSP=PUBLICATIONS&ACT=ASTIN BULLETIN

Bayesian theory and methods with applications. Savchuk, Vladimir; Tsokos, Chris P (2011). Atlantis Press, 2011. - 317 pages.
[RKN: 74707]
Shelved at: 519.5
Bayesian methods are growing more and more popular, finding new practical applications in the fields of health sciences,
engineering, environmental sciences, business and economics and social sciences, among others. This book explores the use of
Bayesian analysis in the statistical estimation of the unknown phenomenon of interest. The contents demonstrate that where such
methods are applicable, they offer the best possible estimate of the unknown. Beyond presenting Bayesian theory and methods of
analysis, the text is illustrated with a variety of applications to real world problems.

BAYESIAN INFERENCE

Estimating copulas for insurance from scarce observations, expert opinion and prior information : A Bayesian approach.
Arbenz, Philipp; Canestraro, Davide - 20 pages. [RKN: 70751]
Shelved at: Per: Astin Bull (Oxf) Shelved at: JOU
ASTIN Bulletin (2012) 42 (1) : 271-290.
A prudent assessment of dependence is crucial in many stochastic models for insurance risks. Copulas have become popular to
model such dependencies. However, estimation procedures for copulas often lead to large parameter uncertainty when
observations are scarce. In this paper, we propose a Bayesian method which combines prior information (e.g. from regulators),
observations and expert opinion in order to estimate copula parameters and determine the estimation uncertainty. The
combination of different sources of information can significantly reduce the parameter uncertainty compared to the use of only one
source. The model can also account for uncertainty in the marginal distributions. Furthermore, we describe the methodology for
obtaining expert opinion and explain involved psychological effects and popular fallacies. We exemplify the approach in a case
study.
http://www.actuaries.org/index.cfm?lang=EN&DSP=PUBLICATIONS&ACT=ASTIN BULLETIN

BAYESIAN METHODS

Automated graduation using Bayesian trans-dimensional models. Verrall, Richard J; Haberman, S Institute and Faculty of
Actuaries; Cambridge University Press, - 21 pages. [RKN: 74953]
Shelved at: Per: AAS (Oxf) Per: AAS (Lon)
Annals of Actuarial Science (2011) 5(2) : 231-251.
This paper presents a new method of graduation which uses parametric formulae together with Bayesian reversible jump Markov
chain Monte Carlo methods. The aim is to provide a method which can be applied to a wide range of data, and which does not
require a lot of adjustment or modification. The method also does not require one particular parametric formula to be selected:
instead, the graduated values are a weighted average of the values from a range of formulae. In this way, the new method can be
seen as an automatic graduation method which we believe can be applied in many cases without any adjustments and provide
satisfactory graduated values. An advantage of a Bayesian approach is that it allows for model uncertainty unlike standard
methods of graduation.
http://www.actuaries.org.uk/research-and-resources/pages/access-journals

A maximum-entropy approach to the linear credibility formula. Najafabadi, Amir T. Payandeh; Hatami, Hamid; Najafabadi, Maryam
Omidi [RKN: 45738]
Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 216-221.
Payandeh [Payandeh Najafabadi, A.T., 2010. A new approach to credibility formula. Insurance: Mathematics and Economics 46,
334–338] introduced a new technique to approximate a Bayes estimator with the exact credibility's form. This article employs the
well-known and powerful maximum-entropy method (MEM) to extend the results of Payandeh Najafabadi (2010) to a class of linear
credibility, whenever claim sizes are distributed according to log-concave distributions. Namely, (i) it employs the
maximum-entropy method to approximate an appropriate Bayes estimator (with respect to either the square-error or the Linex
loss functions and a general increasing and bounded prior distribution) by a linear combination of claim sizes; (ii) it establishes that
such an approximation coincides with the exact credibility formula whenever the required conditions for exact credibility (see
below) are held. Some properties of such an approximation are discussed. An application to crop insurance is given.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
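As background to the linear credibility formula being approximated, the sketch below computes the classical Bühlmann credibility premium Z·X̄_i + (1−Z)·X̄ with Z = n/(n+k). It is the textbook estimator, shown only for orientation, not the maximum-entropy approximation developed in the article; the toy claim histories are illustrative.

```python
import numpy as np

def buhlmann_premiums(claims_by_risk):
    # claims_by_risk: equal-length claim histories, one row per risk
    data = np.asarray(claims_by_risk, dtype=float)
    n = data.shape[1]                                        # years of history per risk
    risk_means = data.mean(axis=1)
    collective_mean = risk_means.mean()
    within = data.var(axis=1, ddof=1).mean()                 # expected process variance
    between = max(risk_means.var(ddof=1) - within / n, 0.0)  # variance of hypothetical means
    k = within / between if between > 0 else np.inf
    Z = n / (n + k)                                          # credibility factor
    return Z * risk_means + (1 - Z) * collective_mean

print(buhlmann_premiums([[3, 5, 4], [0, 1, 2], [10, 8, 12]]))
```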

BLACK-SCHOLES

Fourier transforms, option pricing and controls. Joshi, Mark; Yang, Chao (2011). - Victoria: University of Melbourne, 2011. - 20
pages. [RKN: 74773]
We incorporate a simple and effective control-variate into Fourier inversion formulas for vanilla option prices. The control-variate
used in this paper is the Black-Scholes formula whose volatility parameter is determined in a generic non-arbitrary fashion. We
analyze contour dependence both in terms of value and speed of convergence. We use Gaussian quadrature rules to invert
Fourier integrals, and numerical results suggest that performing the contour integration along the real axis leads to the best pricing
performance.
No.1 (1993) onwards available online. Download as PDF.
http://www.economics.unimelb.edu.au/ACT/papers.shtml

BONDS

Bonds and options in exponentially affine bond models. Bermin, Hans-Peter [RKN: 45881]
Shelved at: Per. AMF
Applied Mathematical Finance (2012) 19 (5-6) : 513-534.
In this article we apply the Flesaker–Hughston approach to invert the yield curve and to price various options by letting the
randomness in the economy be driven by a process closely related to the short rate, called the abstract short rate. This process is
a pure deterministic translation of the short rate itself, and we use the deterministic shift to calibrate the models to the initial yield
curve. We show that we can solve for the shift needed in closed form by transforming the problem to a new probability measure.
Furthermore, when the abstract short rate follows a Cox–Ingersoll–Ross (CIR) process we compute bond option and swaption
prices in closed form. We also propose a short-rate specification under the risk-neutral measure that allows the yield curve to be
inverted and is consistent with the CIR dynamics for the abstract short rate, thus giving rise to closed form bond option and
swaption prices.
Available via Athens: Taylor & Francis Online
http://www.openathens.net

Optimal asset allocation for DC pension plans under inflation. Han, Nan-wei; Hung, Mao-Wei [RKN: 45734]
Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 172-181.
In this paper, the stochastic dynamic programming approach is used to investigate the optimal asset allocation for a
defined-contribution pension plan with downside protection under stochastic inflation. The plan participant invests the fund wealth
and the stochastic interim contribution flows into the financial market. The nominal interest rate model is described by the
Cox–Ingersoll–Ross (Cox et al., 1985) dynamics. To cope with the inflation risk, the inflation indexed bond is included in the asset
menu. The retired individuals receive an annuity that is indexed by inflation and a downside protection on the amount of this
annuity is considered. The closed-form solution is derived under the CRRA utility function. Finally, a numerical application is
presented to characterize the dynamic behavior of the optimal investment strategy.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Statistical analysis of the spreads of catastrophe bonds at the time of issue. Papachristou, Dimitris [RKN: 45308]
Shelved at: Per: Astin Bull (Oxf) Shelved at: JOU/AST
ASTIN Bulletin (2011) 41 (1) : 251-273.
In this paper the catastrophe bond prices, as determined by the market, are analysed. The limited published work in this area has
been carried out mainly by cat bond investors and is based either on intuition, or on simple linear regression on one factor or on
comparisons of the prices of cat bonds with similar features. In this paper a Generalised Additive Model is fitted to the market data.
The statistical significance of different factors which may affect the cat bond prices is examined and the effect of these factors on
the prices is measured. A statistical framework and analysis could provide insight into the cat bond pricing and could have
applications among other things in the construction of a cat bond portfolio, cat bond price indices and in understanding changes of
the price of risk over time.
online access via International Actuarial Association:
http://www.actuaries.org/index.cfm?lang=EN&DSP=PUBLICATIONS&ACT=ASTIN BULLETIN

BONUS SYSTEMS

Risk comparison of different bonus distribution approaches in participating life insurance. Zemp, Alexandra [RKN: 44967]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (2) : 249-264.
The fair pricing of explicit and implicit options in life insurance products has received broad attention in the academic literature over
the past years. Participating life insurance (PLI) contracts have been the focus especially. These policies are typically
characterized by a term life insurance, a minimum interest rate guarantee, and bonus participation rules with regard to the
insurer's asset returns or reserve situation. Researchers replicate these bonus policies quite differently. We categorize and
formally present the most common PLI bonus distribution mechanisms. These bonus models closely mirror the Danish, German,
British, and Italian regulatory framework. Subsequently, we perform a comparative analysis of the different bonus models with
regard to risk valuation. We calibrate contract parameters so that the compared contracts have a net present value of zero and the
same safety level as the initial position, using risk-neutral valuation. Subsequently, we analyze the effect of changes in the asset
volatility and in the initial reserve amount (per contract) on the value of the default put option (DPO), while keeping all other
parameters constant. Our results show that DPO values obtained with the PLI bonus distribution model of Bacinello (2001), which
replicates the Italian regulatory framework, are most sensitive to changes in volatility and initial reserves.

Available via Athens: Palgrave MacMillan
http://www.openathens.net

BOOTSTRAP

Bootstrapping individual claim histories. Rosenlund, Stig - 34 pages. [RKN: 70752]
Shelved at: Per: Astin Bull (Oxf) Shelved at: JOU
ASTIN Bulletin (2012) 42 (1) : 291-324.
The bootstrap method BICH is given for estimating mean square prediction errors and predictive distributions of non-life claim
reserves under weak conditions. The dates of claim occurrence, reporting and finalization and the payment dates and amounts of
individual finalized historic claims form a claim set from which samples with replacement are drawn. We assume that all claims are
independent and that the historic claims are distributed as the object claims, possibly after inflation adjustment and segmentation
on a background variable, whose distribution could have changed over time due to portfolio change. Also we introduce the new
reserving function RDC, using all these dates and payments for reserve predictions. We study three reserving functions: chain
ladder, the Schnieper (1991) method and RDC. Checks with simulated cases obeying the assumptions of Mack (1999) for chain
ladder and Liu and Verrall (2009) for Schnieper's method, respectively, confirm the validity of our method. BICH is used to
compare the three reserving functions, of which RDC is found overall best in simulated cases.
http://www.actuaries.org/index.cfm?lang=EN&DSP=PUBLICATIONS&ACT=ASTIN BULLETIN

Capital allocation using the bootstrap. Kim, Joseph H T Society of Actuaries, - 18 pages. [RKN: 74919]
Shelved at: Per: NAAJ (Oxf) Per NAAJ (Lon) Shelved at: JOU
North American Actuarial Journal (2011) 15 (4) : 499-516.
This paper investigates the use of the bootstrap in capital allocation. In particular, for the distortion risk measure (DRM) class, we
show that the exact bootstrap estimate is available in analytic form for the allocated capital. We then theoretically justify the
bootstrap bias correction for the allocated capital induced from the concave DRM when the conditional mean function is strictly
monotone. A numerical example shows a tradeoff exists between the bias reduction and variance increase in bootstrapping the
allocated capital. However, unlike the aggregate capital case, the variance increase of the bias-corrected allocated capital
estimate substantially outweighs the benefit of bias correction, making the bootstrap bias correction at the allocated capital level
not as useful. Overall, the exact bootstrap without bias correction offers an efficient method for determining allocation over the
ordinary resampling bootstrap estimate and the empirical counterpart.
http://www.soa.org/news-and-publications/publications/journals/naaj/naaj-detail.aspx
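The sketch below illustrates the ordinary resampling bootstrap mentioned in the abstract for one familiar member of the distortion class, the CTE/TVaR allocation E[X_i | S ≥ VaR_α(S)]. It is a simplified illustration on simulated data, not the exact analytic bootstrap or the bias correction derived in the paper.

```python
import numpy as np

def cte_allocation(sample, alpha=0.99):
    # allocated capital per line of business under the CTE rule E[X_i | S >= VaR_alpha(S)]
    s = sample.sum(axis=1)
    return sample[s >= np.quantile(s, alpha)].mean(axis=0)

def bootstrap_allocation(losses, alpha=0.99, n_boot=1000, seed=0):
    # losses: n_obs x n_lines array of jointly observed losses
    rng = np.random.default_rng(seed)
    losses = np.asarray(losses, dtype=float)
    n = losses.shape[0]
    point = cte_allocation(losses, alpha)
    boots = np.array([cte_allocation(losses[rng.integers(0, n, n)], alpha)
                      for _ in range(n_boot)])
    return point, boots.std(axis=0)      # allocation and its bootstrap standard error

rng = np.random.default_rng(42)
toy = rng.lognormal(mean=0.0, sigma=1.0, size=(2000, 3))   # three illustrative loss lines
print(bootstrap_allocation(toy, alpha=0.95, n_boot=200))
```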

Dependent loss reserving using copulas. Shi, Peng; Frees, Edward W - 38 pages. [RKN: 74743]
Shelved at: Per: Astin Bull (Oxf) Shelved at: JOU
ASTIN Bulletin (2011) 41 (2) : 449-486.
Modeling dependencies among multiple loss triangles has important implications for the determination of loss reserves, a critical
element of risk management and capital allocation practices of property-casualty insurers. In this article, we propose a copula
regression model for dependent lines of business that can be used to predict unpaid losses and hence determine loss reserves.
The proposed method, relating the payments in different run-off triangles through a copula function, allows the analyst to use
flexible parametric families for the loss distribution and to understand the associations among lines of business. Based on the
copula model, a parametric bootstrap procedure is developed to incorporate the uncertainty in parameter estimates. To illustrate
this method, we consider an insurance portfolio consisting of personal and commercial automobile lines. When applied to the data
of a major US property-casualty insurer, our method provides comparable point prediction of unpaid losses with the industry's
standard practice, chain-ladder estimates. Moreover, our flexible structure allows us to easily compute the entire predictive
distribution of unpaid losses. This procedure also readily yields accident year reserves, calendar year reserves, as well as the
aggregate reserves. One important implication of the dependence modeling is that it allows analysts to quantify the diversification
effects in risk capital analysis. We demonstrate these effects by calculating commonly used risk measures, including value at risk
and conditional tail expectation, for the insurer's combined portfolio of personal and commercial automobile lines.
http://www.actuaries.org/index.cfm?lang=EN&DSP=PUBLICATIONS&ACT=ASTIN BULLETIN
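A hedged sketch of the copula idea for two dependent lines of business (this is not the authors' copula regression model; the Gaussian copula, the gamma marginals and every parameter value below are assumptions for illustration):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
n, rho = 100_000, 0.5    # simulations and assumed dependence between the two lines

# Gaussian copula: correlated normals -> uniforms -> gamma marginals for unpaid losses
z = rng.multivariate_normal(mean=[0.0, 0.0], cov=[[1.0, rho], [rho, 1.0]], size=n)
u = stats.norm.cdf(z)
personal   = stats.gamma.ppf(u[:, 0], a=2.0, scale=500.0)   # line 1 unpaid losses
commercial = stats.gamma.ppf(u[:, 1], a=3.0, scale=400.0)   # line 2 unpaid losses

total = personal + commercial
var_995 = np.quantile(total, 0.995)
tvar_995 = total[total >= var_995].mean()
print(f"VaR 99.5%: {var_995:,.0f}   TVaR 99.5%: {tvar_995:,.0f}")
```

The diversification effect mentioned in the abstract can be read off by comparing these portfolio figures with the sum of the stand-alone risk measures of the two lines.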

Double chain ladder. Martínez Miranda, María Dolores; Nielsen, Jens Perch; Verrall, Richard - 18 pages. [RKN: 70743]
Shelved at: Per: Astin Bull (Oxf) Shelved at: JOU
ASTIN Bulletin (2012) 42 (1) : 59-76.
By adding the information of reported count data to a classical triangle of reserving data, we derive a surprisingly simple method for
forecasting IBNR and RBNS claims. A simple relationship between development factors allows one to involve and then estimate the
reporting and payment delay. Bootstrap methods provide prediction errors and make possible the inference about IBNR and
RBNS claims, separately.
http://www.actuaries.org/index.cfm?lang=EN&DSP=PUBLICATIONS&ACT=ASTIN BULLETIN

A generalized linear model with smoothing effects for claims reserving. Björkwall, Susanna; Hössjer, Ola; Ohlsson, Esbjörn; Verrall,
Richard [RKN: 44972]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (1) : 27-37.
In this paper, we continue the development of the ideas introduced in England and Verrall (2001) by suggesting the use of a
reparameterized version of the generalized linear model (GLM) which is frequently used in stochastic claims reserving. This model
enables us to smooth the origin, development and calendar year parameters in a similar way as is often done in practice, but still
keep the GLM structure. Specifically, we use this model structure in order to obtain reserve estimates and to systemize the model
selection procedure that arises in the smoothing process. Moreover, we provide a bootstrap procedure to achieve a full predictive
distribution.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
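The underlying cross-classified GLM (before the reparameterisation and smoothing of the paper) can be fitted in a few lines; the miniature triangle and the use of statsmodels below are purely illustrative assumptions, not the authors' implementation:

```python
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Tiny made-up incremental run-off triangle in long format (origin year x development year)
data = pd.DataFrame({
    "origin": [0, 0, 0, 1, 1, 2],
    "dev":    [0, 1, 2, 0, 1, 0],
    "incr":   [100.0, 60.0, 20.0, 110.0, 70.0, 120.0],
})

# Over-dispersed Poisson structure: log link, Poisson variance, Pearson-scale dispersion
fit = smf.glm("incr ~ C(origin) + C(dev)", data=data,
              family=sm.families.Poisson()).fit(scale="X2")

# Predict the unobserved future cells and sum them to a reserve estimate
future = pd.DataFrame({"origin": [1, 2, 2], "dev": [2, 1, 2]})
future["fitted"] = fit.predict(future)
print(future)
print("reserve estimate:", round(future["fitted"].sum(), 1))
```

The paper replaces the unconstrained origin, development and calendar factors with smoothed parametric shapes while keeping this GLM structure, and bootstraps the fitted model to obtain a full predictive distribution.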

On the wrong foot : Letter to the editor. Cox, Andrew Staple Inn Actuarial Society, - 1 pages. [RKN: 73932]
Shelved at: Per: Actuary (Oxf) Per: Actuary (Lon) Shelved at: JOU
The Actuary (2012) May : 6.
Suggesting that Verrall and England should have accepted Leong's use of the phrase 'bootstrap model' as it has changed in
meaning to become a shorthand for the Poisson model in some actuarial circles.
http://www.theactuary.com/

Parameter uncertainty in exponential family tail estimation. Landsman, Z; Tsanakas, A - 30 pages. [RKN: 70746]
Shelved at: Per: Astin Bull (Oxf) Shelved at: JOU
ASTIN Bulletin (2012) 42 (1) : 123-152.
Actuaries are often faced with the task of estimating tails of loss distributions from just a few observations. Thus estimates of tail
probabilities (reinsurance prices) and percentiles (solvency capital requirements) are typically subject to substantial parameter
uncertainty. We study the bias and MSE of estimators of tail probabilities and percentiles, with focus on 1-parameter exponential
families. Using asymptotic arguments it is shown that tail estimates are subject to significant positive bias. Moreover, the use of
bootstrap predictive distributions, which has been proposed in the actuarial literature as a way of addressing parameter
uncertainty, is seen to double the estimation bias. A bias corrected estimator is thus proposed. It is then shown that the MSE of the
MLE, the parametric bootstrap and the bias corrected estimators only differ in terms of order O(n^-2), which provides
decision-makers with some flexibility as to which estimator to use. The accuracy of asymptotic methods, even for small samples,
is demonstrated exactly for the exponential and related distributions, while other 1-parameter distributions are considered in a
simulation study. We argue that the presence of positive bias may be desirable in solvency capital calculations, though not
necessarily in pricing problems.
http://www.actuaries.org/index.cfm?lang=EN&DSP=PUBLICATIONS&ACT=ASTIN BULLETIN
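The bias effects described above are easy to reproduce in a small simulation. The sketch below is only illustrative (the sample size, threshold and replication counts are assumed), and the exponential case is chosen because the true tail probability is known in closed form:

```python
import numpy as np

rng = np.random.default_rng(42)

true_mean = 1.0
threshold = 5.0                              # tail event {X > threshold}
true_tail = np.exp(-threshold / true_mean)   # exact for the exponential distribution

n, n_sim, n_boot = 20, 2000, 200             # small sample, many repetitions
mle_est, boot_est = [], []

for _ in range(n_sim):
    x = rng.exponential(true_mean, size=n)
    mhat = x.mean()                                   # MLE of the exponential mean
    mle_est.append(np.exp(-threshold / mhat))         # plug-in tail probability

    # Parametric bootstrap predictive estimate: average the plug-in over refitted samples
    boot_means = rng.exponential(mhat, size=(n_boot, n)).mean(axis=1)
    boot_est.append(np.exp(-threshold / boot_means).mean())

print("true tail probability    :", true_tail)
print("mean MLE plug-in estimate:", np.mean(mle_est))
print("mean bootstrap predictive:", np.mean(boot_est))
```

If the asymptotic result quoted in the abstract carries over to this toy setting, both columns should overshoot the true tail probability, with the bootstrap predictive estimate showing roughly twice the bias of the plug-in estimate.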

These boots are made for working : Letter to the editor. Verrall, Richard; England, Peter Staple Inn Actuarial Society, - 1 pages.
[RKN: 73929]
Shelved at: Per: Actuary (Oxf) Per: Actuary (Lon) Shelved at: JOU
The Actuary (2012) April : 6.
Corrects Jessica Leong (December edition) in her article referring to the Bootstrap model. This should be referred to as a Chain
Ladder model, as bootstrapping is a statistical procedure.
http://www.theactuary.com/

BROWNIAN MOTION

Arbitrage in skew Brownian motion models. Rossello, Damiano [RKN: 44989]
Insurance: Mathematics & Economics (2012) 50 (1) : 50-56.
Empirical skewness of asset returns can be reproduced by stochastic processes other than the Brownian motion with drift. Some
authors have proposed the skew Brownian motion for pricing as well as interest rate modelling. Although the asymmetric feature of
random return involved in the stock price process is driven by a parsimonious one-dimensional model, we will show how this is
intrinsically incompatible with a modern theory of arbitrage in continuous time. Application to investment performance and to the
Black-Scholes pricing model clearly emphasizes how this process can provide some kind of arbitrage.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

On a mean reverting dividend strategy with Brownian motion. Avanzi, Benjamin; Wong, Bernard [RKN: 44781]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51(2) : 229-238.
In actuarial risk theory, the introduction of dividend pay-outs in surplus models goes back to de Finetti (1957). Dividend strategies
that can be found in the literature often yield pay-out patterns that are inconsistent with actual practice. One issue is the high
variability of the dividend payment rates over time. We aim at addressing that problem by specifying a dividend strategy that yields
stable dividend pay-outs over time. In this paper, we model the surplus of a company with a Brownian risk model. Dividends are
paid at a constant rate g of the company's modified surplus (after distribution of dividends), which operates as a buffer reservoir to
yield a regular flow of shareholders' income. The dividend payment rate reverts around the drift μ of the original process, whereas
the modified surplus itself reverts around the level l = μ/g. We determine the distribution of the present value of dividends when the
surplus process is never absorbed. After introducing an absorbing barrier a (inferior to the initial surplus) and stating the Laplace
transform of the time of absorption, we derive the expected present value of dividends until absorption. The latter is then also
determined if dividends are not paid whenever the surplus is too close to the absorbing barrier. The calculation of the optimal value
of the parameter l (and equivalently g) is discussed. We conclude by comparing both barrier and mean reverting dividend
strategies.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
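Reading the abstract above, the post-dividend surplus appears to behave like an Ornstein-Uhlenbeck-type process that reverts to μ/g. The Euler scheme below is only a sketch under that reading, and every parameter value is an assumption:

```python
import numpy as np

rng = np.random.default_rng(0)

mu, sigma, g = 2.0, 1.5, 0.25     # drift, volatility, dividend rate (assumed values)
x0, T, dt = 10.0, 40.0, 0.01      # initial surplus, horizon, time step
steps = int(T / dt)

# Assumed dynamic, inferred from the abstract: dZ = (mu - g*Z) dt + sigma dW
z = np.empty(steps + 1)           # modified (post-dividend) surplus
z[0] = x0
dividends = np.zeros(steps)

for t in range(steps):
    dw = rng.normal(0.0, np.sqrt(dt))
    dividends[t] = g * z[t] * dt                 # dividend paid over [t, t+dt]
    z[t + 1] = z[t] + mu * dt + sigma * dw - dividends[t]

print("long-run target level mu/g :", mu / g)
print("average surplus (2nd half) :", z[steps // 2:].mean())
print("average dividend rate      :", dividends[steps // 2:].mean() / dt)
```

In steady state the surplus average should settle near μ/g and the dividend rate near μ, which is the "regular flow of shareholders' income" the abstract describes.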

A two-dimensional extension of Bougerol's identity in law for the exponential functional of Brownian motion. Dufresne, D; Yor, M
(2011). - Victoria: University of Melbourne, 2011. - 15 pages. [RKN: 74772]
No.1 (1993) onwards available online. Download as PDF.
http://www.economics.unimelb.edu.au/ACT/papers.shtml

CANCER

The genetics of breast and ovarian cancer IV: a model of breast cancer progression. Lu, Baopeng; Macdonald, Angus S; Waters,
Howard R [RKN: 44921]
Shelved at: Per: SAJ Shelved at: SCA/ACT
Scandinavian Actuarial Journal (2011) 4 : 239-266.
Gui et al. (2006), in Part III of a series of papers, proposed a dynamic family history model of breast cancer and ovarian cancer in
which the development of a family history was represented explicitly as a transition between states, and then applied this model to
life insurance and critical illness insurance. In this study, the authors extend the model to income protection insurance. In this
paper, Part IV of the series, the authors construct and parameterise a semi-Markov model for the life history of a woman with
breast cancer, in which events such as diagnosis, treatment, recovery and recurrence are incorporated. In Part V, we then show:
(a) estimates of premium ratings depending on genotype or family history; and (b) the impact of adverse selection under various
moratoria on the use of genetic information.

The genetics of breast and ovarian cancer V: application to income protection insurance. Lu, Baopeng; Macdonald, Angus S;
Waters, Howard R; Yu, Fei [RKN: 44922]
Shelved at: Per: SAJ Shelved at: SCA/ACT
Scandinavian Actuarial Journal (2011) 4 : 267-291.
In part IV, we presented a comprehensive model of a life history of a woman at risk of breast cancer (BC), in which relevant events
such as diagnosis, treatment, recovery and recurrence were represented explicitly, and corresponding transition intensities were
estimated. In this part, the authors study some applications to income protection insurance (IPI) business. The authors calculate
premiums based either on genetic test results or more practically on a family history of breast cancer. They then extend the model
into an Income Protection Insurance model by incorporating rates of insurance-buying behaviour, in order to estimate the possible
costs of adverse selection, in terms of increased premiums, under various moratoria on the use of genetic information.


CAPITAL

Optimal dividends and capital injections in the dual model with diffusion. Avanzi, Benjamin; Shen, Jonathan; Wong, Bernard - 34
pages. [RKN: 74748]
Shelved at: Per: Astin Bull (Oxf) Shelved at: JOU
ASTIN Bulletin (2011) 41 (2) : 611-644.
The dual model with diffusion is appropriate for companies with continuous expenses that are offset by stochastic and irregular
gains. Examples include research-based or commission-based companies. In this context, Avanzi and Gerber (2008) showed
how to determine the expected present value of dividends, if a barrier strategy is followed. In this paper, we further include capital
injections and allow for (proportional) transaction costs both on dividends and capital injections. We determine the optimal
dividend and (unconstrained) capital injection strategy (among all possible strategies) when jumps are hyperexponential. This
strategy happens to be either a dividend barrier strategy without capital injections, or another dividend barrier strategy with forced
injections when the surplus is null to prevent ruin. The latter is also shown to be the optimal dividend and capital injection strategy,
if ruin is not allowed to occur. Both the choice to inject capital or not and the level of the optimal barrier depend on the parameters
of the model. In all cases, we determine the optimal dividend barrier and show its existence and uniqueness. We also provide
closed form representations of the value functions when the optimal strategy is applied. Results are illustrated.
http://www.actuaries.org/index.cfm?lang=EN&DSP=PUBLICATIONS&ACT=ASTIN BULLETIN

Stochastic comparisons of capital allocations with applications. Xu, Maochao; Hu, Taizhong [RKN: 45633]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 50 (3) : 293-298.
This paper studies capital allocation problems using a general loss function. Stochastic comparisons are conducted for general
loss functions in several scenarios: independent and identically distributed risks; independent but non-identically distributed risks;
comonotonic risks. Applications in optimal capital allocations and policy limits allocations are discussed as well.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

TVaR-based capital allocation for multivariate compound distributions with positive continuous claim amounts. Cossette,
Helene; Mailhot, Melina; Marceau, Etienne [RKN: 45598]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 50 (2) : 247-256.
In this paper, we consider a portfolio of n dependent risks X1,...,Xn and we study the stochastic behavior of the aggregate claim
amount S = X1 + ... + Xn. Our objective is to determine the amount of economic capital needed for the whole portfolio and to
compute the amount of capital to be allocated to each risk X1,...,Xn. To do so, we use a top-down approach. For (X1,...,Xn), we
consider risk models based on multivariate compound distributions defined with a multivariate counting distribution. We use the
TVaR to evaluate the total capital requirement of the portfolio based on the distribution of S, and we use the TVaR-based capital
allocation method to quantify the contribution of each risk. To simplify the presentation, the claim amounts are assumed to be
continuously distributed. For multivariate compound distributions with continuous claim amounts, we provide general formulas for
the cumulative distribution function of S, for the TVaR of S and the contribution to each risk. We obtain closed-form expressions for
those quantities for multivariate compound distributions with gamma and mixed Erlang claim amounts. Finally, we treat in detail
the multivariate compound Poisson distribution case. Numerical examples are provided in order to examine the impact of the
dependence relation on the TVaR of S, the contribution to each risk of the portfolio, and the benefit of the aggregation of several
risks.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
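As a numerical counterpart to the closed-form results above, a crude Monte Carlo version of TVaR-based allocation is sketched below. The common-shock Poisson counts, the gamma claim sizes and all parameter values are assumptions, not the authors' multivariate compound model:

```python
import numpy as np

rng = np.random.default_rng(3)
n, kappa = 200_000, 0.99

# Common-shock Poisson counts induce positive dependence between the two lines
m0 = rng.poisson(1.0, n)                 # shared shock
n1 = m0 + rng.poisson(2.0, n)
n2 = m0 + rng.poisson(3.0, n)

def compound_gamma(counts, shape, scale):
    # Aggregate of `counts` i.i.d. Gamma(shape, scale) claims per scenario,
    # using the fact that their sum is Gamma(counts * shape, scale).
    total_shape = counts * shape
    out = np.zeros(counts.shape)
    pos = total_shape > 0
    out[pos] = rng.gamma(total_shape[pos], scale)
    return out

s1 = compound_gamma(n1, shape=2.0, scale=100.0)
s2 = compound_gamma(n2, shape=1.5, scale=150.0)
s = s1 + s2

var_k = np.quantile(s, kappa)
tail = s >= var_k
print("TVaR of S            :", round(s[tail].mean()))
print("allocation to line 1 :", round(s1[tail].mean()))
print("allocation to line 2 :", round(s2[tail].mean()))
```

By construction the two allocated amounts sum to the TVaR of S, which is the property the paper exploits analytically for gamma and mixed Erlang claim amounts.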

CAPITAL ALLOCATION

Alarm system for insurance companies : A strategy for capital allocation. Das, S; Kratz, M [RKN: 45723]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 53-65.
One possible way of risk management for an insurance company is to develop an early and appropriate alarm system before the
possible ruin. The ruin is defined through the status of the aggregate risk process, which in turn is determined by premium
accumulation as well as claim settlement outgo for the insurance company. The main purpose of this work is to design an effective
alarm system, i.e. to define alarm times and to recommend augmentation of capital of suitable magnitude at those points to reduce
the chance of ruin. To draw a fair measure of the effectiveness of the alarm system, a comparison is drawn between an alarm system, with
capital being added at the sound of every alarm, and the corresponding system without any alarm, but an equivalently higher initial
capital. Analytical results are obtained in general setup and this is backed up by simulated performances with various types of loss
severity distributions. This provides a strategy for suitably spreading out the capital and yet addressing survivability concerns at
factory level.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Analytic loss distributional approach models for operational risk from the α-stable doubly stochastic compound processes and
implications for capital allocation. Peters, Gareth W; Shevchenko, Pavel V; Young, Mark; Yip, Wendy [RKN: 44957]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (3) : 565-579.
Under the Basel II standards, the Operational Risk (OpRisk) advanced measurement approach is not prescriptive regarding the
class of statistical model utilized to undertake capital estimation. It has however become well accepted to utilize a Loss
Distributional Approach (LDA) paradigm to model the individual OpRisk loss processes corresponding to the Basel II Business
line/event type. In this paper we derive a novel class of doubly stochastic α-stable family LDA models. These models provide the
ability to capture the heavy tailed loss processes typical of OpRisk, whilst also providing analytic expressions for the compound
process's annual loss density and distributions, as well as the aggregated compound process's annual loss models. In particular
we develop models of the annual loss processes in two scenarios. The first scenario considers the loss processes with a
stochastic intensity parameter, resulting in inhomogeneous compound Poisson processes annually. The resulting arrival
processes of losses under such a model will have independent counts over increments within the year. The second scenario
considers discretization of the annual loss processes into monthly increments with dependent time increments as captured by a
binomial process with a stochastic probability of success changing annually. Each of these models will be coupled under an
LDA framework with heavy-tailed severity models comprised of α-stable severities for the loss amounts per loss event. In this paper
we will derive analytic results for the annual loss distribution density and distribution under each of these models and study their
properties.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

CAPITAL ASSET PRICING MODEL

Calibration of stock betas from skews of implied volatilities. Fouque, Jean-Pierre; Kollman, Eli [RKN: 45256]
Applied Mathematical Finance (2011) 18 (1-2) : 119-137.
We develop call option price approximations for both the market index and an individual asset using a singular perturbation of a
continuous-time capital asset pricing model in a stochastic volatility environment. These approximations show the role played by
the asset's beta parameter as a component of the parameters of the call option price of the asset. They also show how these
parameters, in combination with the parameters of the call option price for the market, can be used to extract the beta parameter.
Finally, a calibration technique for the beta parameter is derived using the estimated option price parameters of both the asset and
market index. The resulting estimator of the beta parameter is not only simple to implement but has the advantage of being
forward looking as it is calibrated from skews of implied volatilities.
Available via Athens: Taylor & Francis Online
http://www.openathens.net

CASH FLOW

Cash flow simulation for a model of outstanding liabilities based on claim amounts and claim numbers. Martinez Miranda, Maria
Dolores; Nielsen, Bent; Nielsen, Jens Perch; Verrall, Richard [RKN: 45302]
Shelved at: Per: Astin Bull (Oxf) Shelved at: JOU/AST
ASTIN Bulletin (2011) 41 (1) : 107-129.
In this paper we develop a full stochastic cash flow model of outstanding liabilities for the model developed in Verrall, Nielsen and
Jessen (2010). This model is based on the simple triangular data available in most non-life insurance companies. By using more
data, it is expected that the method will have less volatility than the celebrated chain ladder method. Eventually, our method will
lead to lower solvency requirements for those insurance companies that decide to collect counts data and replace their
conventional chain ladder method.
online access via International Actuarial Association:
http://www.actuaries.org/index.cfm?lang=EN&DSP=PUBLICATIONS&ACT=ASTIN BULLETIN

Classical and singular stochastic control for the optimal dividend policy when there is regime switching. Sotomayor, Luz R;
Cadenillas, Abel [RKN: 45128]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (3) : 344-354.
Motivated by economic and empirical arguments, we consider a company whose cash surplus is affected by macroeconomic
conditions. Specifically, we model the cash surplus as a Brownian motion with drift and volatility modulated by an observable
continuous-time Markov chain that represents the regime of the economy. The objective of the management is to select the
dividend policy that maximizes the expected total discounted dividend payments to be received by the shareholders. We study two
different cases: bounded dividend rates and unbounded dividend rates. These cases generate, respectively, problems of classical
stochastic control with regime switching and singular stochastic control with regime switching. We solve these problems, and
obtain the first analytical solutions for the optimal dividend policy in the presence of business cycles. We prove that the optimal
dividend policy depends strongly on macroeconomic conditions.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Convex order approximations in the case of cash flows of mixed signs. Dhaene, Jan; Goovaerts, Marc; Vanmaele, Michèle; Van
Weert, Koen [RKN: 44783]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51(2) : 249-256.
In Van Weert et al. (2010) [K Van Weert, J Dhaene, M Goovaerts, Optimal portfolio selection for general provisioning and terminal
wealth problems, Insurance: Mathematics and Economics, 47 (1) (2010): 90-97], results are obtained showing that, when
allowing some of the cash flows to be negative, convex order lower bound approximations can still be used to solve general
investment problems in a context of provisioning or terminal wealth. In this paper, a correction and further clarification of the
reasoning of Van Weert et al. (2010) are given, thereby significantly expanding the scope of problems and cash flow patterns for
which the terminal wealth or initial provision can be accurately approximated. Also an interval for the probability level is derived in
which the quantiles of the lower bound approximation can be computed. Finally, it is shown how one can move from a context of
provisioning of future obligations to a saving and terminal wealth problem by inverting the time axis.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

CATASTROPHE

Extreme Events: Robust Portfolio Construction in the Presence of Fat Tails. Kemp, Malcolm H D (2011). - Chichester: John Wiley
& Sons Ltd, 2011. - 312 pages. [RKN: 13141]
Shelved at: EF/JNH (Lon) Shelved at: 368.01 KEM
Markets are fat-tailed; extreme outcomes occur more often than many might hope, or indeed the statistics or normal distributions
might indicate. In this book, the author provides readers with the latest tools and techniques on how best to adapt portfolio
construction techniques to cope with extreme events. Beginning with an overview of portfolio construction and market drivers, the
book will analyze fat tails, what they are, their behavior, how they can differ and what their underlying causes are. The book will
then move on to look at portfolio construction techniques which take into account fat tailed behavior, and how to stress test your
portfolio against extreme events. Finally, the book will analyze really extreme events in the context of portfolio choice and
problems. The book will offer readers: Ways of understanding and analyzing sources of extreme events, Tools for analyzing the
key drivers of risk and return, their potential magnitude and how they might interact, Methodologies for achieving efficient portfolio
construction and risk budgeting, Approaches for catering for the time-varying nature of the world in which we live, Back-stop
approaches for coping with really extreme events, Illustrations and real life examples of extreme events across asset classes. This
will be an indispensable guide for portfolio and risk managers who will need to better protect their portfolios against extreme events
which, within the financial markets, occur more frequently than we might expect.

CATASTROPHE INSURANCE

Ambiguity aversion and an intertemporal equilibrium model of catastrophe-linked securities pricing. Zhu, Wenge [RKN: 44973]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (1) : 38-46.
To explain several stylized facts concerning catastrophe-linked securities premium spread, the author proposes an intertemporal
equilibrium model by allowing agents to act in a robust control framework against model misspecification with respect to rare
events. The author has presented closed-form pricing formulas in some special cases and tested the model using empirical data
and simulation.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Calculating catastrophe. Woo, Gordon (2011). - London: Imperial College Press, 2011. - 355 pages. [RKN: 73989]
Shelved at: 363.34
Calculating Catastrophe has been written to explain, to a general readership, the underlying philosophical ideas and scientific
principles that govern catastrophic events, both natural and man-made. Knowledge of the broad range of catastrophes deepens
understanding of individual modes of disaster. This book will be of interest to anyone aspiring to understand catastrophes better,
but will be of particular value to those engaged in public and corporate policy, and the financial markets.

Global warming, extreme weather events, and forecasting tropical cyclones. Chang, Carolyn W; Chang, Jack S K; Guan Lim, Kian -
25 pages. [RKN: 70744]
Shelved at: Per: Astin Bull (Oxf) Shelved at: JOU
ASTIN Bulletin (2012) 42 (1) : 77-101.
Global warming has more than doubled the likelihood of extreme weather events, e.g. the 2003 European heat wave, the growing
intensity of rain and snow in the Northern Hemisphere, and the increasing risk of flooding in the United Kingdom. It has also
induced an increasing number of deadly tropical cyclones with a continuing trend. Many individual meteorological dynamic
simulations and statistical models are available for forecasting hurricanes but they neither forecast hurricane intensity well nor
produce a clear-cut consensus. We develop a novel hurricane forecasting model by straddling two seemingly unrelated disciplines,
physical science and finance, based on the well-known price discovery function of trading in financial markets. Traders of
hurricane derivative contracts employ all available forecasting models, public or proprietary, to forecast hurricanes in order to
make their pricing and trading decisions. By using transactional price changes of these contracts that continuously clear the
market supply and demand as the predictor, and with calibration to extract the embedded hurricane information by developing
hurricane futures and futures option pricing models, one can gain a forward-looking market-consensus forecast out of all of the
individual forecasting models employed. Our model can forecast when a hurricane will make landfall, how destructive it will be,
and how this destructive power will evolve from inception to landing. While the NHC (National Hurricane Center) blends 50 plus
individual forecasting results for its consensus model forecasts using a subjective approach, our aggregate is market-based.
Believing their proprietary forecasts are sufficiently different from our market-based forecasts, traders could also examine the
discrepancy for a potential trading opportunity using hurricane derivatives. We also provide a real case analysis of Hurricane Irene
in 2011 using our methodology.
http://www.actuaries.org/index.cfm?lang=EN&DSP=PUBLICATIONS&ACT=ASTIN BULLETIN

Pricing catastrophe swaps: a contingent claims approach. Braun, Alexander [RKN: 44954]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (3) : 520-536.
In this paper, we comprehensively analyze the catastrophe (cat) swap, a financial instrument which has attracted little scholarly
attention to date. We begin with a discussion of the typical contract design, the current state of the market, as well as major areas
of application. Subsequently, a two-stage contingent claims pricing approach is proposed, which distinguishes between the main
risk drivers ex-ante as well as during the loss reestimation phase and additionally incorporates counterparty default risk.
Catastrophe occurrence is modeled as a doubly stochastic Poisson process (Cox process) with mean-reverting
Ornstein-Uhlenbeck intensity. In addition, we fit various parametric distributions to normalized historical loss data for hurricanes
and earthquakes in the US and find the heavy-tailed Burr distribution to be the most adequate representation for loss severities.
Applying our pricing model to market quotes for hurricane and earthquake contracts, we derive implied Poisson intensities which
are subsequently condensed into a common factor for each peril by means of exploratory factor analysis. Further examining the
resulting factor scores, we show that a first order autoregressive process provides a good fit. Hence, its continuous-time limit, the
Ornstein-Uhlenbeck process should be well suited to represent the dynamics of the Poisson intensity in a cat swap pricing model.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

CATASTROPHE REINSURANCE

The influence of non-linear dependencies on the basis risk of industry loss warranties. Gatzert, Nadine; Kellner, Ralf [RKN: 44983]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (1) : 132-144.
Index-linked catastrophic loss instruments represent an alternative to traditional reinsurance to hedge against catastrophic losses.
The use of these instruments comes with benefits, such as a reduction of moral hazard and higher transparency. However, at the
same time, it introduces basis risk as a crucial key risk factor, since the index and the company's losses are usually not fully
dependent. The aim of this paper is to examine the impact of basis risk on an insurer's solvency situation when an industry loss
warranty contract is used for hedging. Since previous literature has consistently stressed the importance of a high degree of
dependence between the company's losses and the industry index, we extend previous studies by allowing for non-linear
dependencies between relevant processes (high-risk and low-risk assets, insurance company's loss and industry index). The
analysis shows that both the type and degree of dependence play a considerable role with regard to basis risk and solvency
capital requirements and that other factors, such as relevant contract parameters of index-linked catastrophic loss instruments,
should not be neglected to obtain a comprehensive and holistic view of their effect upon risk reduction.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

CHAIN LADDER METHODS

Chain ladder correlations. Taylor, Greg (2011). - Victoria: University of Melbourne, 2011. - 16 pages. [RKN: 74758]
Correlations of future observations are investigated within the recursive and non-recursive chain ladder models. The recursive
models considered are the Mack and ODP Mack models; the non-recursive models are the ODP cross-classified models. Distinct
similarities are found between the correlations within the recursive and non-recursive models, but distinct differences also emerge.
The ordering of corresponding correlations within the recursive and non-recursive models is also investigated.
No.1 (1993) onwards available online. Download as PDF.
http://www.economics.unimelb.edu.au/ACT/papers.shtml

Detection and correction of outliers in the bivariate chain-ladder method. Verdonck, T; Van Wouwe, M [RKN: 44961]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (2) : 188-193.
The expected profit or loss of a non-life insurance company is determined for the whole of its multiple business lines. This implies
the study of the claims reserving problem for a portfolio consisting of several correlated run-off triangles. A popular technique to
deal with such a portfolio is the multivariate chain-ladder method of . However, it is well known that the chain-ladder method is
very sensitive to outlying data. For the univariate case, we have already developed a robust version of the chain-ladder method.
In this article we propose two techniques to detect and correct outlying values in a bivariate situation. The methodologies are
illustrated and compared on real examples from practice.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Double chain ladder. Martínez Miranda, María Dolores; Nielsen, Jens Perch; Verrall, Richard - 18 pages. [RKN: 70743]
Shelved at: Per: Astin Bull (Oxf) Shelved at: JOU
ASTIN Bulletin (2012) 42 (1) : 59-76.
By adding the information of reported count data to a classical triangle of reserving data, we derive a surprisingly simple method for
forecasting IBNR and RBNS claims. A simple relationship between development factors allows one to involve and then estimate the
reporting and payment delay. Bootstrap methods provide prediction errors and make possible the inference about IBNR and
RBNS claims, separately.
http://www.actuaries.org/index.cfm?lang=EN&DSP=PUBLICATIONS&ACT=ASTIN BULLETIN

Higher moments of the claims development result in general insurance. Salzmann, Robert; Wüthrich, Mario V; Merz, Michael - 30
pages. [RKN: 70755]
Shelved at: Per: Astin Bull (Oxf) Shelved at: JOU
ASTIN Bulletin (2012) 42 (1) : 355-384.
The claims development result (CDR) is one of the major risk drivers in the profit and loss statement of a general insurance
company. Therefore, the CDR has become a central object of interest under new solvency regulation. In current practice, simple
methods based on the first two moments of the CDR are implemented to find a proxy for the distribution of the CDR. Such
approximations based on the first two moments are rather rough and may fail to appropriately describe the shape of the
distribution of the CDR. In this paper we provide an analysis of higher moments of the CDR. Within a Bayes chain ladder
framework we consider two different models for which it is possible to derive analytical solutions for the higher moments of the
CDR. Based on higher moments we can e.g. calculate the skewness and the excess kurtosis of the distribution of the CDR and
obtain refined approximations. Moreover, a case study investigates and answers questions raised in IASB [9].
http://www.actuaries.org/index.cfm?lang=EN&DSP=PUBLICATIONS&ACT=ASTIN BULLETIN

A homotopy class of semi-recursive chain ladder models. Taylor, Greg (2011). - Victoria: University of Melbourne, 2011. - 23
pages. [RKN: 74759]
The chain ladder algorithm is known to produce maximum likelihood estimates of the parameters of certain recursive and
non-recursive models. These types of models represent two extremes of dependency within rows of a data array. Whereas
observations within a row of a non-recursive model are stochastically independent, each observation of a recursive model is, in
expectation, directly proportional to the immediately preceding observation from the same row. The correlation structures of
forecasts also differ as between recursive and non-recursive models. The present paper constructs a family of models that forms a
bridge between recursive and non-recursive models and so provides a continuum of intermediate cases in terms of dependency
structure. The intermediate models are called semi-recursive.
No.1 (1993) onwards available online. Download as PDF.
http://www.economics.unimelb.edu.au/ACT/papers.shtml

The influence of individual claims on the chain-ladder estimates: Analysis and diagnostic tool. Verdonck, T; Debruyne, M [RKN:
39932]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (1) : 85-98.
The chain-ladder method is a widely used technique to forecast the reserves that have to be kept regarding claims that are known
to exist, but for which the actual size is unknown at the time the reserves have to be set. In practice it can be easily seen that even
one outlier can lead to a huge over- or underestimation of the overall reserve when using the chain-ladder method. This indicates
that individual claims can be very influential when determining the chain-ladder estimates. In this paper the effect of contamination
is mathematically analyzed by calculating influence functions in the generalized linear model framework corresponding to the
chain-ladder method. It is proven that the influence functions are unbounded, confirming the sensitivity of the chain-ladder method
to outliers. A robust alternative is introduced to estimate the generalized linear model parameters in a more outlier resistant way.
Finally, based on the influence functions and the robust estimators, a diagnostic tool is presented highlighting the influence of
every individual claim on the classical chain-ladder estimates. With this tool it is possible to detect immediately which claims have
an abnormally positive or negative influence on the reserve estimates. Further examination of these influential points is then
advisable. A study of artificial and real run-off triangles shows the good performance of the robust chain-ladder method and the
diagnostic tool.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
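A compact way to see the sensitivity to outliers discussed above is to run the classical volume-weighted chain ladder on a small cumulative triangle and then contaminate a single cell. The triangle and the outlier below are invented purely for illustration; this is the standard chain ladder, not the robust alternative proposed in the paper:

```python
import numpy as np

def chain_ladder_reserve(tri):
    """Classical volume-weighted chain ladder on a cumulative triangle (np.nan = unobserved)."""
    tri = np.asarray(tri, dtype=float)
    n = tri.shape[0]
    latest = np.array([row[~np.isnan(row)][-1] for row in tri])   # latest observed diagonal
    full = tri.copy()
    for j in range(n - 1):
        obs = ~np.isnan(tri[:, j + 1])                      # rows observed in both columns
        factor = tri[obs, j + 1].sum() / tri[obs, j].sum()  # development factor j -> j+1
        missing = np.isnan(full[:, j + 1])
        full[missing, j + 1] = full[missing, j] * factor    # project unobserved cells
    return (full[:, -1] - latest).sum()                     # ultimate minus latest = reserve

triangle = np.array([
    [100.0, 160.0, 180.0],
    [110.0, 175.0, np.nan],
    [120.0, np.nan, np.nan],
])
print("reserve                      :", round(chain_ladder_reserve(triangle), 1))

contaminated = triangle.copy()
contaminated[1, 1] = 400.0                                   # one outlying cumulative cell
print("reserve with a single outlier:", round(chain_ladder_reserve(contaminated), 1))
```

With these made-up numbers, the single contaminated cell markedly inflates the estimated reserve, which is the over- or underestimation the abstract warns about.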

Maximum likelihood and estimation efficiency of the chain ladder. Taylor, Greg [RKN: 45303]
Shelved at: Per: Astin Bull (Oxf) Shelved at: JOU/AST
ASTIN Bulletin (2011) 41 (1) : 131-155.
The chain ladder is considered in relation to certain recursive and non-recursive models of claim observations. The recursive
models resemble the (distribution free) Mack model but are augmented with distributional assumptions. The nonrecursive models
are generalisations of Poisson cross-classified structure for which the chain ladder is known to be maximum likelihood. The error
distributions considered are drawn from the exponential dispersion family.
Each of these models is examined with respect to sufficient statistics and completeness (Section 5), minimum variance estimators
(Section 6) and maximum likelihood (Section 7). The chain ladder is found to provide maximum likelihood and minimum variance
unbiased estimates of loss reserves under a wide range of recursive models. Similar results are obtained for a much more
restricted range of non-recursive models.
These results lead to a full classification of this paper's chain ladder models with respect to the estimation properties (bias,
minimum variance) of the chain ladder algorithm (Section 8).
online access via International Actuarial Association:
http://www.actuaries.org/index.cfm?lang=EN&DSP=PUBLICATIONS&ACT=ASTIN BULLETIN

On the wrong foot : Letter to the editor. Cox, Andrew Staple Inn Actuarial Society, - 1 pages. [RKN: 73932]
Shelved at: Per: Actuary (Oxf) Per: Actuary (Lon) Shelved at: JOU
The Actuary (2012) May : 6.
Suggesting that Verrall and England should have accepted Leong's use of the phrase 'bootstrap model' as it has changed in
meaning to become a shorthand for the Poisson model in some actuarial circles.
http://www.theactuary.com/

Prediction of outstanding payments in a Poisson cluster model. Jessen, Anders Hedegaard; Mikosch, Thomas; Samorodnitsky,
Gennady [RKN: 45490]
Shelved at: Per: SAJ Shelved at: SCA/ACT
Scandinavian Actuarial Journal (2011) 3 : 214-237.
We consider a simple Poisson cluster model for the payment numbers and the corresponding total payments for insurance claims
arriving in a given year. Due to the Poisson structure one can give reasonably explicit expressions for the prediction of the
payment numbers and total payments in future periods given the past observations of the payment numbers. One can also derive
reasonably explicit expressions for the corresponding prediction errors. In the (a, b) class of Panjer's claim size distributions, these
expressions can be evaluated by simple recursive algorithms. We study the conditions under which the predictions are
asymptotically linear as the number of past payments becomes large. We also demonstrate that, in other regimes, the prediction
may be far from linear. For example, a staircase-like pattern may arise as well. We illustrate how the theory works on real-life data,
also in comparison with the chain ladder method.

Prediction uncertainty in the Bornhuetter-Ferguson claims reserving method: revisited. Alai, D H; Merz, M; Wüthrich, Mario V
Faculty of Actuaries and Institute of Actuaries; Cambridge University Press, [RKN: 39998]
Shelved at: Per: AAS (Oxf) Per: AAS (Lon)
Annals of Actuarial Science (2011) 5(1) : 7-17.
We revisit the stochastic model of Alai et al. (2009) for the Bornhuetter-Ferguson claims reserving method, Bornhuetter &
Ferguson (1972). We derive an estimator of its conditional mean square error of prediction (MSEP) using an approach that is
based on generalized linear models and maximum likelihood estimators for the model parameters. This approach leads to simple
formulas, which can easily be implemented in a spreadsheet.
http://www.actuaries.org.uk/research-and-resources/pages/access-journals

These boots are made for working : Letter to the editor. Verrall, Richard; England, Peter Staple Inn Actuarial Society, - 1 pages.
[RKN: 73929]
Shelved at: Per: Actuary (Oxf) Per: Actuary (Lon) Shelved at: JOU
The Actuary (2012) April : 6.
Corrects Jessica Leong (December edition) in her article referring to the Bootstrap model. This should be referred to as a Chain
Ladder model, as bootstrapping is a statistical procedure.
http://www.theactuary.com/

CLAIM FREQUENCY

The joint distribution of the time to ruin and the number of claims until ruin in the classical risk model. Dickson, David C M [RKN:
45636]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 50 (3) : 334-337.
We use probabilistic arguments to derive an expression for the joint density of the time to ruin and the number of claims until ruin
in the classical risk model. From this we obtain a general expression for the probability function of the number of claims until ruin.
We also consider the moments of the number of claims until ruin and illustrate our results in the case of exponentially distributed
individual claims. Finally, we briefly discuss joint distributions involving the surplus prior to ruin and deficit at ruin.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
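A quick, brute-force way to look at the joint quantities the paper derives analytically is to simulate the classical risk model with exponential claims over a finite horizon. The premium rate, claim intensity, horizon and simulation size below are assumed values, and the sketch only approximates the joint behaviour rather than reproducing the paper's exact expressions:

```python
import numpy as np

rng = np.random.default_rng(5)

u, c = 10.0, 1.2                  # initial surplus and premium rate (assumed)
lam, mean_claim = 1.0, 1.0        # Poisson intensity and exponential claim mean
horizon, n_sim = 100.0, 5_000

ruin_times, ruin_counts = [], []
for _ in range(n_sim):
    t, surplus, k = 0.0, u, 0
    while True:
        w = rng.exponential(1.0 / lam)                       # time to the next claim
        t += w
        if t >= horizon:
            break
        surplus += c * w - rng.exponential(mean_claim)       # premium income minus the claim
        k += 1
        if surplus < 0:                                      # ruin can only occur at claim instants
            ruin_times.append(t)
            ruin_counts.append(k)
            break

print("estimated P(ruin before horizon):", len(ruin_times) / n_sim)
print("mean time to ruin | ruin        :", round(np.mean(ruin_times), 2))
print("mean claims until ruin | ruin   :", round(np.mean(ruin_counts), 2))
```

Collecting the pairs (time to ruin, claims until ruin) from such runs gives an empirical picture of the joint distribution whose density the paper obtains by probabilistic arguments.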

Modeling of claim exceedances over random thresholds for related insurance portfolios. Eryilmaz, Serkan; Gebizlioglu, Omer L;
Tank, Fatih [RKN: 44951]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (3) : 496-500.
Large claims in an actuarial risk process are of special importance for the actuarial decision making about several issues like
pricing of risks, determination of retention treaties and capital requirements for solvency. This paper presents a model about claim
occurrences in an insurance portfolio that exceed the largest claim of another portfolio providing the same sort of insurance
coverages. Two cases are taken into consideration: independent and identically distributed claims and exchangeable dependent
claims in each of the portfolios. Copulas are used to model the dependence situations. Several theorems and examples are
presented for the distributional properties and expected values of the critical quantities under concern.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

A new look at the homogeneous risk model. Lefèvre, Claude; Picard, Philippe [RKN: 44953]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (3) : 512-519.
The present paper aims to revisit a homogeneous risk model investigated in earlier work. First, a claim arrival process is defined on
a fixed time interval by assuming that the arrival times satisfy an order statistic property. Then, the variability and the covariance of
an aggregate claim amount process are discussed. The distribution of the aggregate discounted claims is also examined. Finally, a
closed-form expression for the non-ruin probability is derived in terms of a family of Appell polynomials. This formula holds for all
claim distributions, even dependent. It generalizes several results obtained so far.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

On maximum likelihood and pseudo-maximum likelihood estimation in compound insurance models with deductibles. Paulsen,
Jostein; Stubo, Knut [RKN: 45298]
Shelved at: Per: Astin Bull (Oxf) Shelved at: JOU/AST
ASTIN Bulletin (2011) 41 (1) : 1-28.
Non-life insurance payouts consist of two factors: claimsizes and claim frequency. When calculating e.g. next year's premium, it is
vital to correctly model these factors and to estimate the unknown parameters. A standard way is to separately estimate in the
claimsize and the claim frequency models.
Often there is a deductible with each single claim, and this deductible can be quite large, particularly in inhomogeneous cases
such as industrial fire insurance or marine insurance. Not taking the deductibles into account can lead to serious bias in the
estimates and consequent implications when applying the model.
When the deductibles are nonidentical, in a full maximum likelihood estimation all unknown parameters have to be estimated
simultaneously. An alternative is to use pseudo-maximum likelihood, i.e. first estimate the claimsize model, taking the deductibles
into account, and then use the estimated probability that a claim exceeds the deductible as an offset in the claim frequency
estimation. This latter method is less efficient, but due to complexity or time considerations, it may be the preferred option.
In this paper we will provide rather general formulas for the relative efficiency of the pseudo maximum likelihood estimators in the
i.i.d. case. Two special cases will be studied in detail, and we conclude the paper by comparing the methods on some marine
insurance data.
online access via International Actuarial Association:
http://www.actuaries.org/index.cfm?lang=EN&DSP=PUBLICATIONS&ACT=ASTIN BULLETIN

Risk models based on time series for count random variables. Cossette, Hélène; Marceau, Étienne; Toureille, Florent [RKN: 10962]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (1) : 19-28.
In this paper, we generalize the classical discrete time risk model by introducing a dependence relationship in time between the
claim frequencies. The models used are the Poisson autoregressive model and the Poisson moving average model. In particular,
the aggregate claim amount and related quantities such as the stop-loss premium, value at risk and tail value at risk are discussed
within this framework.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
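One common construction of a Poisson autoregressive count series is binomial thinning with Poisson innovations (an INAR(1)-type model); the sketch below uses that construction with assumed parameters and may differ in detail from the specification in the paper. It simulates monthly claim counts with serial dependence, builds the annual aggregate claim amount and reads off a stop-loss premium and tail measures:

```python
import numpy as np

rng = np.random.default_rng(11)

alpha, lam = 0.4, 3.0               # thinning parameter and innovation mean (assumed)
months, n_sim = 12, 20_000
mean_claim, retention = 100.0, 4500.0

annual = np.empty(n_sim)
for s in range(n_sim):
    n_prev = rng.poisson(lam / (1 - alpha))        # start near the stationary mean
    total = 0.0
    for _ in range(months):
        # Poisson AR step: binomial thinning of last month's count plus new Poisson arrivals
        n_t = rng.binomial(n_prev, alpha) + rng.poisson(lam)
        total += rng.exponential(mean_claim, size=n_t).sum()   # aggregate claims this month
        n_prev = n_t
    annual[s] = total

var99 = np.quantile(annual, 0.99)
print("mean annual aggregate claims:", round(annual.mean()))
print("stop-loss premium E[(S-d)+] :", round(np.maximum(annual - retention, 0.0).mean()))
print("VaR 99% / TVaR 99%          :", round(var99), "/", round(annual[annual >= var99].mean()))
```

Setting alpha = 0 recovers an independent-increments compound Poisson model, which makes the effect of the temporal dependence on the stop-loss premium and the tail measures easy to isolate.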

Risk processes with shot noise Cox claim number process and reserve dependent premium rate. Macci, Claudio; Torrisi, Giovanni
Luca [RKN: 39936]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (1) : 134-145.
We consider a suitable scaling, called the slow Markov walk limit, for a risk process with shot noise Cox claim number process and
reserve dependent premium rate. We provide large deviation estimates for the ruin probability. Furthermore, we find an
asymptotically efficient law for the simulation of the ruin probability using importance sampling. Finally, we present asymptotic
bounds for ruin probabilities in the Bayesian setting.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

CLAIM FREQUENCY MODELS

Risk models based on time series for count random variables. Cossette, Hélène; Marceau, Étienne; Toureille, Florent [RKN: 10962]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (1) : 19-28.
In this paper, we generalize the classical discrete time risk model by introducing a dependence relationship in time between the
claim frequencies. The models used are the Poisson autoregressive model and the Poisson moving average model. In particular,
the aggregate claim amount and related quantities such as the stop-loss premium, value at risk and tail value at risk are discussed
within this framework.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

CLAIMS

Claims development result in the paid-incurred chain reserving method. Happ, Sebastian; Merz, Michael; Wüthrich, Mario V [RKN:
45724]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 66-72.
We present the one-year claims development result (CDR) in the paid-incurred chain (PIC) reserving model. The PIC reserving
model presented in Merz and Wüthrich (2010) is a Bayesian stochastic claims reserving model that considers simultaneously
claims payments and incurred losses information and allows for deriving the full predictive distribution of the outstanding loss
liabilities. In this model we study the conditional mean square error of prediction (MSEP) for the one-year CDR uncertainty, which
is the crucial uncertainty view under Solvency II.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Explicit ruin formulas for models with dependence among risks. Albrecher, Hansjörg; Constantinescu, Corina; Loisel, Stéphane
[RKN: 40021]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (2) : 265-270.
We show that a simple mixing idea allows one to establish a number of explicit formulas for ruin probabilities and related quantities
in collective risk models with dependence among claim sizes and among claim inter-occurrence times. Examples include
compound Poisson risk models with completely monotone marginal claim size distributions that are dependent according to
Archimedean survival copulas as well as renewal risk models with dependent inter-occurrence times.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

The influence of individual claims on the chain-ladder estimates: Analysis and diagnostic tool. Verdonck, T; Debruyne, M [RKN:
39932]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (1) : 85-98.
The chain-ladder method is a widely used technique to forecast the reserves that have to be kept regarding claims that are known
to exist, but for which the actual size is unknown at the time the reserves have to be set. In practice it can be easily seen that even
one outlier can lead to a huge over- or underestimation of the overall reserve when using the chain-ladder method. This indicates
that individual claims can be very influential when determining the chain-ladder estimates. In this paper the effect of contamination
is mathematically analyzed by calculating influence functions in the generalized linear model framework corresponding to the
chain-ladder method. It is proven that the influence functions are unbounded, confirming the sensitivity of the chain-ladder method
to outliers. A robust alternative is introduced to estimate the generalized linear model parameters in a more outlier resistant way.
Finally, based on the influence functions and the robust estimators, a diagnostic tool is presented highlighting the influence of
every individual claim on the classical chain-ladder estimates. With this tool it is possible to detect immediately which claims have
an abnormally positive or negative influence on the reserve estimates. Further examination of these influential points is then
advisable. A study of artificial and real run-off triangles shows the good performance of the robust chain-ladder method and the
diagnostic tool.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Joint moments of discounted compound renewal sums. Léveillé, Ghislain; Adékambi, Franck [RKN: 44930]
Shelved at: Per: SAJ Shelved at: SCA/ACT
Scandinavian Actuarial Journal (2012) 1 : 40-55.
The first two moments and the covariance of the aggregate discounted claims have been found for a stochastic interest rate, from
which the inflation rate has been subtracted, and for a claims number process that is an ordinary or a delayed renewal
process. Hereafter we extend the preceding results by presenting recursive formulas for the joint moments of this risk process, for
a constant interest rate, and non-recursive formulas for higher joint moments when the interest rate is stochastic. Examples are
given for exponential claims inter-arrival times and for the Ho-Lee-Merton interest rate model.

Mathematical investigation of the Gerber-Shiu function in the case of dependent inter-claim time and claim size. Mihalyko, Eva
Orban; Mihalyko, Csaba [RKN: 45132]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (3) : 378-383.
In this paper we investigate the well-known Gerber-Shiu expected discounted penalty function in the case of dependence
between the inter-claim times and the claim amounts. We set up an integral equation for it and we prove the existence and
uniqueness of its solution in the set of bounded functions. We show that if δ > 0, the limit property of the solution is not a regularity
condition, but the characteristic of the solution even in the case when the net profit condition is not fulfilled. It is the consequence
of the choice of the penalty function for a given density function. We present an example when the Gerber-Shiu function is not
bounded, consequently, it does not tend to zero. Using an operator technique we also prove exponential boundedness.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

A maximum-entropy approach to the linear credibility formula. Najafabadi, Amir T. Payandeh; Hatami, Hamid; Najafabadi, Maryam
Omidi [RKN: 45738]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 216-221.
Payandeh [Payandeh Najafabadi, A.T., 2010. A new approach to credibility formula. Insurance: Mathematics and Economics 46,
334-338] introduced a new technique to approximate a Bayes estimator with the exact credibility's form. This article employs a
well known and powerful maximum-entropy method (MEM) to extend results of Payandeh Najafabadi (2010) to a class of linear
credibility, whenever claim sizes have been distributed according to log-concave distributions. Namely, (i) it employs the
maximum-entropy method to approximate an appropriate Bayes estimator (with respect to either the square-error or the Linex
loss functions and general increasing and bounded prior distribution) by a linear combination of claim sizes; (ii) it establishes that
such an approximation coincides with the exact credibility formula whenever the required conditions for the exact credibility (see
below) are satisfied. Some properties of such an approximation are discussed. An application to crop insurance is given.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Maximum likelihood and estimation efficiency of the chain ladder. Taylor, Greg [RKN: 45303]
Shelved at: Per: Astin Bull (Oxf) Shelved at: JOU/AST
ASTIN Bulletin (2011) 41 (1) : 131-155.
The chain ladder is considered in relation to certain recursive and non-recursive models of claim observations. The recursive
models resemble the (distribution free) Mack model but are augmented with distributional assumptions. The nonrecursive models
are generalisations of Poisson cross-classified structure for which the chain ladder is known to be maximum likelihood. The error
distributions considered are drawn from the exponential dispersion family.
Each of these models is examined with respect to sufficient statistics and completeness (Section 5), minimum variance estimators
(Section 6) and maximum likelihood (Section 7). The chain ladder is found to provide maximum likelihood and minimum variance
unbiased estimates of loss reserves under a wide range of recursive models. Similar results are obtained for a much more
restricted range of non-recursive models.
These results lead to a full classification of this paper's chain ladder models with respect to the estimation properties (bias,
minimum variance) of the chain ladder algorithm (Section 8).
online access via International Actuarial Association:
http://www.actuaries.org/index.cfm?lang=EN&DSP=PUBLICATIONS&ACT=ASTIN BULLETIN
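As background for readers less familiar with the algorithm discussed above, the following minimal sketch computes volume-weighted chain-ladder development factors and the resulting reserve estimates. The run-off triangle and all figures are invented for illustration; this is not code from the paper.
```python
import numpy as np

# Illustrative cumulative run-off triangle (rows = accident years,
# columns = development years); np.nan marks future, unobserved cells.
triangle = np.array([
    [1000., 1800., 2100., 2200.],
    [1100., 2000., 2300., np.nan],
    [1200., 2100., np.nan, np.nan],
    [1300., np.nan, np.nan, np.nan],
])

n = triangle.shape[1]
factors = []
for j in range(n - 1):
    # Volume-weighted chain-ladder factor: sum of column j+1 over sum of
    # column j, using only rows where the later cell is observed.
    mask = ~np.isnan(triangle[:, j + 1])
    factors.append(triangle[mask, j + 1].sum() / triangle[mask, j].sum())

# Project the unobserved cells by successive multiplication.
completed = triangle.copy()
for i in range(triangle.shape[0]):
    for j in range(n - 1):
        if np.isnan(completed[i, j + 1]):
            completed[i, j + 1] = completed[i, j] * factors[j]

ultimates = completed[:, -1]
latest = np.array([row[~np.isnan(row)][-1] for row in triangle])
reserves = ultimates - latest

print("development factors:", np.round(factors, 3))
print("estimated reserves by accident year:", np.round(reserves, 1))
```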

Modeling dependent yearly claim totals including zero claims in private health insurance. Erhardt, Vinzenz; Czado, Claudia [RKN:
45781]
Shelved at: Per: SAJ Shelved at: SCA/ACT
Scandinavian Actuarial Journal (2012) 2 : 106-129.
In insurance applications yearly claim totals of different coverage fields are often dependent. In many cases there are numerous
claim totals which are zero. A marginal claim distribution will have an additional point mass at zero, hence this probability function
(pf) will not be continuous at zero and the cumulative distribution functions will not be uniform. Therefore using a copula approach
to model dependency is not straightforward. We will illustrate how to express the joint pf by copulas with discrete and continuous
margins. A pair-copula construction will be used for the fit of the continuous copula, allowing appropriate copulas to be chosen for each pair of margins.
http://www.openathens.net/

Modelling claims run-off with reversible jump Markov chain Monte Carlo methods. Verrall, Richard; Hössjer, Ola; Björkwall,
Susanna - 24 pages. [RKN: 70742]
Shelved at: Per: Astin Bull (Oxf) Shelved at: JOU
ASTIN Bulletin (2012) 42 (1) : 35-58.
In this paper we describe a new approach to modelling the development of claims run-off triangles. This method replaces the usual
ad hoc practical process of extrapolating a development pattern to obtain tail factors with an objective procedure. An example is
given, illustrating the results in a practical context, and the WinBUGS code is supplied.
http://www.actuaries.org/index.cfm?lang=EN&DSP=PUBLICATIONS&ACT=ASTIN BULLETIN

Multivariate insurance models: an overview. Anastasiadis, Simon; Chukova, Stefanka [RKN: 45739]
Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 222-227.
This literature review summarizes the results from a collection of research papers that relate to modeling insurance claims and the
processes associated with them. We consider work by more than 55 authors, published or presented between 1971 and 2008.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

On maximum likelihood and pseudo-maximum likelihood estimation in compound insurance models with deductibles. Paulsen,
Jostein; Stubo, Knut [RKN: 45298]
Shelved at: Per: Astin Bull (Oxf) Shelved at: JOU/AST
ASTIN Bulletin (2011) 41 (1) : 1-28.
Non-life insurance payouts consist of two factors: claim sizes and claim frequency. When calculating, e.g., next year's premium, it is vital to correctly model these factors and to estimate the unknown parameters. A standard way is to estimate the claim size and claim frequency models separately.
Often there is a deductible with each single claim, and this deductible can be quite large, particularly in inhomogeneous cases
such as industrial fire insurance or marine insurance. Not taking the deductibles into account can lead to serious bias in the
estimates and consequent implications when applying the model.
When the deductibles are nonidentical, in a full maximum likelihood estimation all unknown parameters have to be estimated
simultaneously. An alternative is to use pseudo-maximum likelihood, i.e. first estimate the claim size model, taking the deductibles
into account, and then use the estimated probability that a claim exceeds the deductible as an offset in the claim frequency
estimation. This latter method is less efficient, but due to complexity or time considerations, it may be the preferred option.
In this paper we will provide rather general formulas for the relative efficiency of the pseudo-maximum likelihood estimators in the
i.i.d. case. Two special cases will be studied in detail, and we conclude the paper by comparing the methods on some marine
insurance data.
online access via International Actuarial Association:
http://www.actuaries.org/index.cfm?lang=EN&DSP=PUBLICATIONS&ACT=ASTIN BULLETIN

On the analysis of a general class of dependent risk processes. Willmot, Gordon E; Woo, Jae-Kyung [RKN: 45730]
Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 134-141.
A generalized Sparre Andersen risk process is examined, whereby the joint distribution of the interclaim time and the ensuing
claim amount is assumed to have a particular mathematical structure. This structure is present in various dependency models
which have previously been proposed and analyzed. It is then shown that this structure in turn often implies particular functional
forms for joint discounted densities of ruin related variables including some or all of the deficit at ruin, the surplus immediately prior
to ruin, and the surplus after the second last claim. Then, employing a fairly general interclaim time structure which involves a
combination of Erlang type densities, a complete identification of a generalized Gerber–Shiu function is provided. These results are then applied to a situation involving a mixed Erlang claim amount assumption. Various examples and
special cases of the model are then considered, including one involving a bivariate Erlang mixture model.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

On the moments of aggregate discounted claims with dependence introduced by a FGM copula. Bargès, Mathieu; Cossette, Hélène; Loisel, Stéphane; Marceau, Etienne [RKN: 45306]
Shelved at: Per: Astin Bull (Oxf) Shelved at: JOU/AST
ASTIN Bulletin (2011) 41 (1) : 215-238.
In this paper, we investigate the computation of the moments of the compound Poisson sums with discounted claims when
introducing dependence between the interclaim time and the subsequent claim size. The dependence structure between the two
random variables is defined by a Farlie-Gumbel-Morgenstern copula. Assuming that the claim distribution has finite moments, we
give expressions for the first and the second moments and then we obtain a general formula for any mth order moment. The
results are illustrated with applications to premium calculation and approximations based on moment matching methods.
online access via International Actuarial Association:
http://www.actuaries.org/index.cfm?lang=EN&DSP=PUBLICATIONS&ACT=ASTIN BULLETIN
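As a rough illustration of the quantity studied in this paper, the sketch below estimates the first two moments of compound Poisson discounted claims by brute-force Monte Carlo when the inter-claim time and the subsequent claim size are coupled by a Farlie-Gumbel-Morgenstern copula. The closed-form expressions of the paper are not reproduced here, and all parameter values (theta, lam, beta, delta) are invented.
```python
import numpy as np

rng = np.random.default_rng(42)

theta = 0.5      # FGM dependence parameter in [-1, 1] (illustrative)
lam   = 1.0      # claim arrival rate (exponential inter-claim times)
beta  = 2.0      # exponential claim-size rate
delta = 0.05     # force of interest used for discounting

def fgm_conditional(u, w, theta):
    """Sample v from the FGM conditional distribution C(v | U = u) at level w."""
    a = theta * (1.0 - 2.0 * u)
    if abs(a) < 1e-12:
        return w
    return ((1.0 + a) - np.sqrt((1.0 + a) ** 2 - 4.0 * a * w)) / (2.0 * a)

def discounted_total(horizon=150.0):
    """One path of the discounted aggregate claims up to a long horizon."""
    t, total = 0.0, 0.0
    while True:
        u, w = rng.uniform(), rng.uniform()
        inter = -np.log(1.0 - u) / lam       # inter-claim time, Exp(lam)
        v = fgm_conditional(u, w, theta)     # dependent uniform for the claim
        claim = -np.log(1.0 - v) / beta      # claim size, Exp(beta)
        t += inter
        if t > horizon:
            return total
        total += np.exp(-delta * t) * claim

sims = np.array([discounted_total() for _ in range(5_000)])
print("estimated first moment :", round(sims.mean(), 3))
print("estimated second moment:", round(np.mean(sims ** 2), 3))
```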

Pricing catastrophe swaps: a contingent claims approach. Braun, Alexander [RKN: 44954]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (3) : 520-536.
In this paper, we comprehensively analyze the catastrophe (cat) swap, a financial instrument which has attracted little scholarly
attention to date. We begin with a discussion of the typical contract design, the current state of the market, as well as major areas
of application. Subsequently, a two-stage contingent claims pricing approach is proposed, which distinguishes between the main
risk drivers ex-ante as well as during the loss reestimation phase and additionally incorporates counterparty default risk.
Catastrophe occurrence is modeled as a doubly stochastic Poisson process (Cox process) with mean-reverting
Ornstein–Uhlenbeck intensity. In addition, we fit various parametric distributions to normalized historical loss data for hurricanes
and earthquakes in the US and find the heavy-tailed Burr distribution to be the most adequate representation for loss severities.
Applying our pricing model to market quotes for hurricane and earthquake contracts, we derive implied Poisson intensities which
are subsequently condensed into a common factor for each peril by means of exploratory factor analysis. Further examining the
resulting factor scores, we show that a first-order autoregressive process provides a good fit. Hence, its continuous-time limit, the Ornstein–Uhlenbeck process, should be well suited to represent the dynamics of the Poisson intensity in a cat swap pricing model.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Robust-efficient credibility models with heavy-tailed claims: A mixed linear models perspective. Dornheim, Harald; Brazauskas,
Vytaras [RKN: 39365]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (1) : 72-84.
In actuarial practice, regression models serve as a popular statistical tool for analyzing insurance data and tariff ratemaking. In this
paper, we consider classical credibility models that can be embedded within the framework of mixed linear models. For inference
about fixed effects and variance components, likelihood-based methods such as (restricted) maximum likelihood estimators are
commonly pursued. However, it is well-known that these standard and fully efficient estimators are extremely sensitive to small
deviations from hypothesized normality of random components as well as to the occurrence of outliers. To obtain better estimators
for premium calculation and prediction of future claims, various robust methods have been successfully adapted to credibility
theory in the actuarial literature. The objective of this work is to develop robust and efficient methods for credibility when
heavy-tailed claims are approximately log-location-scale distributed. To accomplish that, we first show how to express additive credibility models such as the Bühlmann–Straub and Hachemeister ones as mixed linear models with symmetric or asymmetric errors. Then, we adjust adaptively truncated likelihood methods and compute highly robust credibility estimates for the ordinary but heavy-tailed claims part. Finally, we treat the identified excess claims separately and find robust-efficient credibility premiums. Practical performance of this approach is examined, via simulations, under several contaminating scenarios. A widely studied real-data set from workers' compensation insurance is used to illustrate functional capabilities of the new robust credibility
estimators.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Ruin by dynamic contagion claims. Dassios, Angelos; Zhao, Hongbiao [RKN: 45726]
Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 93-106.
In this paper, we consider a risk process with the arrival of claims modelled by a dynamic contagion process, a generalisation of
the Cox process and Hawkes process introduced by Dassios and Zhao (2011). We derive results for the infinite horizon model that
are generalisations of the Cramér–Lundberg approximation, Lundberg's fundamental equation, some asymptotics as well as
bounds for the probability of ruin. Special attention is given to the case of exponential jumps and a numerical example is provided.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Sensitivity of risk measures with respect to the normal approximation of total claim distributions. Krätschmer, Volker; Zähle, Henryk [RKN: 44935]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (3) : 335-344.
A simple and commonly used method to approximate the total claim distribution of a (possibly weakly dependent) insurance
collective is the normal approximation. In this article, we investigate the error made when the normal approximation is plugged into a fairly general distribution-invariant risk measure. We focus on the rate of convergence of the error relative to the number of clients, we specify the relative error's asymptotic distribution, and we illustrate our results by means of a numerical example.
Regarding the risk measure, we take into account distortion risk measures as well as distribution-invariant coherent risk
measures.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
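To make the setting concrete, the short sketch below compares the Value-at-Risk of a simulated aggregate claim distribution with the value obtained after plugging in a normal approximation with matched mean and variance. It is a toy illustration with invented portfolio parameters, not the asymptotic analysis of the paper.
```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

n_clients = 500      # number of (independent) clients, illustrative
p_claim   = 0.1      # probability that a client produces a claim in the year
mu_sev    = 2.0      # lognormal log-scale parameters of the claim severity
sigma_sev = 1.0
alpha     = 0.99     # level of the risk measure (here: Value-at-Risk)

def simulate_totals(n_paths):
    """Simulate the aggregate claim amount of the collective n_paths times."""
    counts = rng.binomial(n_clients, p_claim, size=n_paths)
    totals = np.zeros(n_paths)
    for i, k in enumerate(counts):
        if k:
            totals[i] = rng.lognormal(mu_sev, sigma_sev, size=k).sum()
    return totals

totals = simulate_totals(50_000)

# Value-at-Risk from the simulated distribution versus the value obtained
# when a normal distribution with matched mean and variance is plugged in.
var_sim  = np.quantile(totals, alpha)
var_norm = norm.ppf(alpha, loc=totals.mean(), scale=totals.std())

print(f"simulated VaR_99%            : {var_sim:9.1f}")
print(f"normal-approximation VaR_99% : {var_norm:9.1f}")
print(f"relative error               : {(var_norm - var_sim) / var_sim:+.2%}")
```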

A statistical basis for claims experience monitoring. Taylor, Greg (2011). - Victoria: University of Melbourne, 2011. - 32 pages.
[RKN: 74757]
By claims experience monitoring is meant the systematic comparison of the forecasts from a claims model with claims experience
as it emerges subsequently. In the event that the stochastic properties of the forecasts are known, the comparison can be
represented as a collection of probabilistic statements. This is stochastic monitoring.
The paper defines this process rigorously in terms of statistical hypothesis testing. If the model is a regression model (which is the
case for most stochastic claims models), then the natural form of hypothesis test is a number of likelihood ratio tests, one for each
parameter in the valuation model. Such testing is shown to be very easily implemented by means of GLM software.
This tests the formal structure of the claims model and is referred to as micro-testing. There may be other quantities (e.g. amount
of claim payments in a defined interval) that require testing for practical reasons. This sort of testing is referred to as macro-testing,
and its formulation is also discussed.
No.1 (1993) onwards available online. Download as PDF.
http://www.economics.unimelb.edu.au/ACT/papers.shtml




CLAIMS RESERVES

Bootstrapping individual claim histories. Rosenlund, Stig - 34 pages. [RKN: 70752]
Shelved at: Per: Astin Bull (Oxf) Shelved at: JOU
ASTIN Bulletin (2012) 42 (1) : 291-324.
The bootstrap method BICH is given for estimating mean square prediction errors and predictive distributions of non-life claim
reserves under weak conditions. The dates of claim occurrence, reporting and finalization and the payment dates and amounts of
individual finalized historic claims form a claim set from which samples with replacement are drawn. We assume that all claims are
independent and that the historic claims are distributed as the object claims, possibly after inflation adjustment and segmentation
on a background variable, whose distribution could have changed over time due to portfolio change. Also we introduce the new
reserving function RDC, using all these dates and payments for reserve predictions. We study three reserving functions: chain
ladder, the Schnieper (1991) method and RDC. Checks with simulated cases obeying the assumptions of Mack (1999) for chain
ladder and Liu and Verrall (2009) for Schnieper's method, respectively, confirm the validity of our method. BICH is used to
compare the three reserving functions, of which RDC is found overall best in simulated cases.
http://www.actuaries.org/index.cfm?lang=EN&DSP=PUBLICATIONS&ACT=ASTIN BULLETIN

Detection and correction of outliers in the bivariate chain-ladder method. Verdonck, T; Van Wouwe, M [RKN: 44961]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (2) : 188-193.
The expected profit or loss of a non-life insurance company is determined for the whole of its multiple business lines. This implies
the study of the claims reserving problem for a portfolio consisting of several correlated run-off triangles. A popular technique to deal with such a portfolio is the multivariate chain-ladder method. However, it is well known that the chain-ladder method is very sensitive to outlying data. For the univariate case, we have already developed a robust version of the chain-ladder method.
In this article we propose two techniques to detect and correct outlying values in a bivariate situation. The methodologies are
illustrated and compared on real examples from practice.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Diagonal effects in claims reserving. Jessen, Anders Hedegaard; Rietdorf, Niels [RKN: 45147]
Shelved at: Per: SAJ Shelved at: SCA/ACT
Scandinavian Actuarial Journal (2011) 1 : 21-37.
In this paper we present two different approaches to how one can include diagonal effects in non-life claims reserving based on
run-off triangles. Empirical analyses suggest that the approaches in Zehnwirth (2003) and Kuang et al. (2008a, 2008b) do not work
well with low-dimensional run-off triangles because estimation uncertainty is too large. To overcome this problem we consider
similar models with a smaller number of parameters. These are closely related to the framework considered in Verbeek (1972) and
Taylor (1977, 2000); the separation method. We explain that these models can be interpreted as extensions of the multiplicative
Poisson models introduced by Hachemeister & Stanard (1975) and Mack (1991).

A generalized linear model with smoothing effects for claims reserving. Björkwall, Susanna; Hössjer, Ola; Ohlsson, Esbjörn; Verrall,
Richard [RKN: 44972]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (1) : 27-37.
In this paper, we continue the development of the ideas introduced in England and Verrall (2001) by suggesting the use of a
reparameterized version of the generalized linear model (GLM) which is frequently used in stochastic claims reserving. This model
enables us to smooth the origin, development and calendar year parameters in a similar way as is often done in practice, but still
keep the GLM structure. Specifically, we use this model structure in order to obtain reserve estimates and to systemize the model
selection procedure that arises in the smoothing process. Moreover, we provide a bootstrap procedure to achieve a full predictive
distribution.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

The influence of individual claims on the chain-ladder estimates: Analysis and diagnostic tool. Verdonck, T; Debruyne, M [RKN:
39932]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (1) : 85-98.
The chain-ladder method is a widely used technique to forecast the reserves that have to be kept regarding claims that are known
to exist, but for which the actual size is unknown at the time the reserves have to be set. In practice it can be easily seen that even
one outlier can lead to a huge over- or underestimation of the overall reserve when using the chain-ladder method. This indicates
that individual claims can be very influential when determining the chain-ladder estimates. In this paper the effect of contamination
is mathematically analyzed by calculating influence functions in the generalized linear model framework corresponding to the
chain-ladder method. It is proven that the influence functions are unbounded, confirming the sensitivity of the chain-ladder method
to outliers. A robust alternative is introduced to estimate the generalized linear model parameters in a more outlier resistant way.
Finally, based on the influence functions and the robust estimators, a diagnostic tool is presented highlighting the influence of
every individual claim on the classical chain-ladder estimates. With this tool it is possible to detect immediately which claims have
an abnormally positive or negative influence on the reserve estimates. Further examination of these influential points is then
advisable. A study of artificial and real run-off triangles shows the good performance of the robust chain-ladder method and the
diagnostic tool.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Modeling of claim exceedances over random thresholds for related insurance portfolios. Eryilmaz, Serkan; Gebizlioglu, Omer L;
Tank, Fatih [RKN: 44951]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (3) : 496-500.
Large claims in an actuarial risk process are of special importance for the actuarial decision making about several issues like
pricing of risks, determination of retention treaties and capital requirements for solvency. This paper presents a model for claim
occurrences in an insurance portfolio that exceed the largest claim of another portfolio providing the same sort of insurance
coverages. Two cases are taken into consideration: independent and identically distributed claims and exchangeable dependent
claims in each of the portfolios. Copulas are used to model the dependence situations. Several theorems and examples are
presented for the distributional properties and expected values of the critical quantities under concern.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Prediction uncertainty in the Bornhuetter-Ferguson claims reserving method: revisited. Alai, D H; Merz, M; Wüthrich, Mario V
Faculty of Actuaries and Institute of Actuaries; Cambridge University Press, [RKN: 39998]
Shelved at: Per: AAS (Oxf) Per: AAS (Lon)
Annals of Actuarial Science (2011) 5(1) : 7-17.
We revisit the stochastic model of Alai et al. (2009) for the Bornhuetter-Ferguson claims reserving method, Bornhuetter &
Ferguson (1972). We derive an estimator of its conditional mean square error of prediction (MSEP) using an approach that is
based on generalized linear models and maximum likelihood estimators for the model parameters. This approach leads to simple
formulas, which can easily be implemented in a spreadsheet.
http://www.actuaries.org.uk/research-and-resources/pages/access-journals
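For reference, the deterministic Bornhuetter-Ferguson calculation underlying the stochastic model reads as in the sketch below: the reserve is the prior ultimate multiplied by the expected unreported fraction implied by chain-ladder development factors. All figures are invented; the estimator of the conditional MSEP derived in the paper is not implemented here.
```python
import numpy as np

# Invented inputs for one small portfolio.
latest_cumulative = np.array([2200., 2300., 2100., 1300.])   # claims paid to date
prior_ultimates   = np.array([2400., 2500., 2600., 2700.])   # e.g. premium * prior loss ratio
dev_factors       = np.array([1.75, 1.15, 1.05])             # age-to-age factors

# Cumulative development factor to ultimate for each accident year,
# with the oldest year assumed fully developed.
cdf_to_ultimate = np.array([
    1.0,                               # accident year 1: fully developed
    dev_factors[2],                    # accident year 2: one step remaining
    dev_factors[1] * dev_factors[2],   # accident year 3: two steps remaining
    dev_factors.prod(),                # accident year 4: all steps remaining
])

# Bornhuetter-Ferguson: reserve = prior ultimate * expected unreported fraction.
unreported_fraction = 1.0 - 1.0 / cdf_to_ultimate
bf_reserve = prior_ultimates * unreported_fraction
bf_ultimate = latest_cumulative + bf_reserve

print("unreported fraction:", np.round(unreported_fraction, 3))
print("BF reserves        :", np.round(bf_reserve, 1))
print("BF ultimates       :", np.round(bf_ultimate, 1))
```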

CLIMATE CHANGE

Future building water loss projections posed by climate change. Haug, Ola; Dimakos, Xeni K; Vardal, Jofrid F; Aldrin, Magne;
Meze-Hausken, Elisabeth [RKN: 45146]
Shelved at: Per: SAJ Shelved at: SCA/ACT
Scandinavian Actuarial Journal (2011) 1 : 1-20.
The insurance industry, like other parts of the financial sector, is vulnerable to climate change. Life as well as non-life products are
affected and knowledge of future loss levels is valuable. Risk and premium calculations may be updated accordingly, and
dedicated loss-preventive measures can be communicated to customers and regulators. We have established statistical claims
models for the coherence between externally inflicted water damage to private buildings in Norway and selected meteorological
variables. Based on these models and downscaled climate predictions from the Hadley centre HadAM3H climate model, the
estimated loss level of a future scenario period (2071-2100) is compared to that of a control period (1961-1990). In spite of
substantial estimation uncertainty, our analyses identify an incontestable increase in the claims level along with some regional
variability. Of the uncertainties inherently involved in such predictions, only the error due to model fit is quantifiable.

Global warming, extreme weather events, and forecasting tropical cyclones. Chang, Carolyn W; Chang, Jack S K; Guan Lim, Kian -
25 pages. [RKN: 70744]
Shelved at: Per: Astin Bull (Oxf) Shelved at: JOU
ASTIN Bulletin (2012) 42 (1) : 77-101.
Global warming has more than doubled the likelihood of extreme weather events, e.g. the 2003 European heat wave, the growing
intensity of rain and snow in the Northern Hemisphere, and the increasing risk of flooding in the United Kingdom. It has also
induced an increasing number of deadly tropical cyclones with a continuing trend. Many individual meteorological dynamic
simulations and statistical models are available for forecasting hurricanes, but they neither forecast hurricane intensity well nor produce a clear-cut consensus. We develop a novel hurricane forecasting model by straddling two seemingly unrelated disciplines, physical science and finance, based on the well-known price discovery function of trading in financial markets. Traders of
hurricane derivative contracts employ all available forecasting models, public or proprietary, to forecast hurricanes in order to
make their pricing and trading decisions. By using transactional price changes of these contracts that continuously clear the
market supply and demand as the predictor, and with calibration to extract the embedded hurricane information by developing
hurricane futures and futures option pricing models, one can gain a forward-looking market-consensus forecast out of all of the
individual forecasting models employed. Our model can forecast when a hurricane will make landfall, how destructive it will be,
and how this destructive power will evolve from inception to landing. While the NHC (National Hurricane Center) blends 50 plus
individual forecasting results for its consensus model forecasts using a subjective approach, our aggregate is market-based.
Believing their proprietary forecasts are sufficiently different from our market-based forecasts, traders could also examine the
discrepancy for a potential trading opportunity using hurricane derivatives. We also provide a real case analysis of Hurricane Irene
in 2011 using our methodology.
http://www.actuaries.org/index.cfm?lang=EN&DSP=PUBLICATIONS&ACT=ASTIN BULLETIN

COBWEB THEOREM

Cobweb Theorems with production lags and price forecasting. Dufresne, Daniel; Vazquez-Abad, Felisa J (2012). - Victoria:
University of Melbourne, 2012. - 27 pages. [RKN: 73800]
The classical cobweb theorem is extended to include production lags and price forecasts. Price forecasting based on a longer
period has a stabilizing effect on prices. Longer production lags do not necessarily lead to unstable prices; very long lags lead to
cycles of constant amplitude. The classical cobweb requires the elasticity of demand to be greater than that of supply; this is not necessarily the case in a more general setting, where price forecasting has a stabilizing effect. Random shocks are also considered.
No.1 (1993) onwards available online. Download as PDF.
http://www.economics.unimelb.edu.au/ACT/papers.shtml

COHORTS

A dynamic parameterization modeling for the age-period-cohort mortality. Hatzopoulos, P; Haberman, Steven [RKN: 44959]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (2) : 155-174.
An extended version of a dynamic parametric model is proposed for analyzing mortality structures, incorporating the cohort effect. A
one-factor parameterized exponential polynomial in age effects within the generalized linear models (GLM) framework is used.
Sparse principal component analysis (SPCA) is then applied to time-dependent GLM parameter estimates and provides
(marginal) estimates for a two-factor principal component (PC) approach structure. Modeling the two-factor residuals in the same
way, in age-cohort effects, provides estimates for the (conditional) three-factor age–period–cohort model. The age-time and cohort-related components are extrapolated using dynamic linear regression (DLR) models. An application is presented for England & Wales males (1841–2006).
Available via Athens: Palgrave MacMillan
http://www.openathens.net

COMONOTONICITY

An application of comonotonicity theory in a stochastic life annuity framework. Liu, Xiaoming; Jang, Jisoo; Kim, Sun Mee [RKN:
40022]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (2) : 271-279.
A life annuity contract is an insurance instrument which pays pre-scheduled living benefits conditional on the survival of the
annuitant. In order to manage the risk borne by annuity providers, one needs to take into account all sources of uncertainty that
affect the value of future obligations under the contract. In this paper, we define the concept of annuity rate as the conditional
expected present value random variable of future payments of the annuity, given the future dynamics of its risk factors. The
annuity rate deals with the non-diversifiable systematic risk contained in the life annuity contract, and it involves mortality risk as
well as investment risk. While it is plausible to assume that there is no correlation between the two risks, each affects the annuity
rate through a combination of dependent random variables. In order to understand the probabilistic profile of the annuity rate, we
apply comonotonicity theory to approximate its quantile function. We also derive accurate upper and lower bounds for prediction
intervals for annuity rates. We use the Lee–Carter model for mortality risk and the Vasicek model for the term structure of interest
rates with an annually renewable fixed-income investment policy. Different investment strategies can be handled using this
framework.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Approximation of bivariate copulas by patched bivariate Fréchet copulas. Zheng, Yanting; Yang, Jingping; Huang, Jianhua Z [RKN:
40019]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (2) : 246-256.
Bivariate Fréchet (BF) copulas characterize dependence as a mixture of three simple structures: comonotonicity, independence
and countermonotonicity. They are easily interpretable but have limitations when used as approximations to general dependence
structures. To improve the approximation property of the BF copulas and keep the advantage of easy interpretation, we develop a
new copula approximation scheme by using BF copulas locally and patching the local pieces together. Error bounds and a
probabilistic interpretation of this approximation scheme are developed. The new approximation scheme is compared with several
existing copula approximations, including shuffle of min, checkmin, checkerboard and Bernstein approximations and exhibits
better performance, especially in characterizing the local dependence. The utility of the new approximation scheme in insurance
and finance is illustrated in the computation of the rainbow option prices and stop-loss premiums.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
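The bivariate Fréchet family referred to above is the three-component mixture of the comonotonicity, independence and countermonotonicity copulas. The sketch below simulates from such a mixture with invented weights and checks the implied Spearman correlation; it illustrates the building block only, not the patched local construction proposed in the paper.
```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(1)

# Mixture weights for the comonotonic (M), independent (Pi) and
# countermonotonic (W) components; they must sum to one.
p_m, p_pi, p_w = 0.3, 0.5, 0.2

def sample_bf_copula(n):
    """Draw n pairs (U, V) from the bivariate Frechet copula mixture."""
    u = rng.uniform(size=n)
    component = rng.choice(3, size=n, p=[p_m, p_pi, p_w])
    v = np.where(component == 0, u,                 # comonotonic: V = U
        np.where(component == 2, 1.0 - u,           # countermonotonic: V = 1 - U
                 rng.uniform(size=n)))              # independent component
    return u, v

u, v = sample_bf_copula(100_000)

# Spearman's rho is linear in the copula, so the mixture value is p_m - p_w.
rho, _ = spearmanr(u, v)
print("theoretical Spearman rho:", p_m - p_w)
print("empirical Spearman rho  :", round(rho, 3))
```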

Characterization of upper comonotonicity via tail convex order. Nam, Hee Seok; Tang, Qihe; Yang, Fan [RKN: 45130]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (3) : 368-373.
In this paper, we show a characterization of upper comonotonicity via tail convex order. For any given marginal distributions, a
maximal random vector with respect to tail convex order is proved to be upper comonotonic under suitable conditions. As an
application, we consider the computation of the Haezendonck risk measure of the sum of upper comonotonic random variables
with exponential marginal distributions.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Measuring comonotonicity in M-dimensional vectors. Koch, Inge; De Schepper, Ann [RKN: 45305]
Shelved at: Per: Astin Bull (Oxf) Shelved at: JOU/AST
ASTIN Bulletin (2011) 41 (1) : 191-213.
In this contribution, a new measure of comonotonicity for m-dimensional vectors is introduced, with values between zero,
representing the independent situation, and one, reflecting a completely comonotonic situation. The main characteristics of this
coefficient are examined, and the relations with common dependence measures are analysed. A sample-based version of the
comonotonicity coefficient is also derived. Special attention is paid to the explanation of the accuracy of the convex order bound
method of Goovaerts, Dhaene et al. in the case of cash flows with Gaussian discounting processes.
online access via International Actuarial Association:
http://www.actuaries.org/index.cfm?lang=EN&DSP=PUBLICATIONS&ACT=ASTIN BULLETIN

COMPETITION

Competitive insurance market in the presence of ambiguity. Anwar, Sajid; Zheng, Mingli [RKN: 44992]
Insurance: Mathematics & Economics (2012) 50 (1) : 79-84.
Within the context of a competitive insurance market, this paper examines the impact of ambiguity on the behavior of buyers and
sellers. Ambiguity is described through a probability measure on an extended state space that includes extra ambiguous states. It
is shown that if insurers face the same or less ambiguity than their customers, a unique equilibrium exists where customers are
fully insured. On the other hand, if insurers face more ambiguity than their customers, customers will be underinsured and it is
even possible that customers may not purchase any insurance.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

COMPOUND DISTRIBUTIONS

Matrix-form recursive evaluation of the aggregate claims distribution revisited. Siaw, Kok Keng; Wu, Xueyuan; Pitt, David; Wang,
Yan Institute and Faculty of Actuaries; Cambridge University Press, - 17 pages. [RKN: 74949]
Shelved at: Per: AAS (Oxf) Per: AAS (Lon)
Annals of Actuarial Science (2011) 5(2) : 163-179.
This paper aims to evaluate the aggregate claims distribution under the collective risk model when the number of claims follows a
so-called generalised (a, b, 1) family distribution. The definition of the generalised (a, b, 1) family of distributions is given first, then
a simple matrix-form recursion for the compound generalised (a, b, 1) distributions is derived to calculate the aggregate claims
distribution with discrete non-negative individual claims. Continuous individual claims are discussed as well and an integral
equation of the aggregate claims distribution is developed. Moreover, a recursive formula for calculating the moments of
aggregate claims is also obtained in this paper. With the recursive calculation framework being established, members that belong
to the generalised (a, b, 1) family are discussed. As an illustration of potential applications of the proposed generalised (a, b, 1)
distribution family on modelling insurance claim numbers, two numerical examples are given. The first example illustrates the
calculation of the aggregate claims distribution using a matrix-form Poisson for claim frequency with logarithmic claim sizes. The
second example is based on real data and illustrates maximum likelihood estimation for a set of distributions in the generalised (a,
b, 1) family.
http://www.actuaries.org.uk/research-and-resources/pages/access-journals
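As a point of reference, the sketch below implements the classical Panjer recursion for a compound Poisson distribution with discretised claim sizes, which is the well-known special case that the generalised (a, b, 1) matrix-form recursion of the paper extends. The claim-count and severity parameters are invented.
```python
import numpy as np

def compound_poisson_pmf(lam, severity_pmf, s_max):
    """Panjer recursion for a compound Poisson aggregate-claims distribution.

    severity_pmf[x] = P(X = x) for x = 0..len-1, with severity_pmf[0] == 0
    (strictly positive claim amounts on a discretised monetary unit).
    """
    f = np.asarray(severity_pmf, dtype=float)
    g = np.zeros(s_max + 1)
    g[0] = np.exp(-lam)                 # P(S = 0) = P(N = 0) when f(0) = 0
    for s in range(1, s_max + 1):
        x = np.arange(1, min(s, len(f) - 1) + 1)
        # Poisson belongs to the (a, b, 0) class with a = 0, b = lam.
        g[s] = (lam / s) * np.sum(x * f[x] * g[s - x])
    return g

# Illustrative example: Poisson(2) claim counts, claim sizes 1, 2 or 3.
g = compound_poisson_pmf(lam=2.0, severity_pmf=[0.0, 0.5, 0.3, 0.2], s_max=30)
print("P(S = 0..5):", np.round(g[:6], 4))
print("mean of S  :", round(np.sum(np.arange(31) * g), 4), "(exact: 2 * 1.7 = 3.4)")
```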

COMPOUND INTEREST

On a multi-threshold compound Poisson surplus process with interest. Mitric, Ilie-Radu [RKN: 45353]
Shelved at: Per: SAJ Shelved at: SCA/ACT
Scandinavian Actuarial Journal (2011) 2 : 75-95.
We consider a multi-threshold compound Poisson surplus process. When the initial surplus is between any two consecutive
thresholds, the insurer has the option to choose the respective premium rate and interest rate. Also, the model allows for
borrowing the current amount of deficit whenever the surplus falls below zero. Starting from the integro-differential equations
satisfied by the Gerber-Shiu function that appear in Yang et al. (2008), we consider exponentially and phase-type(2) distributed
claim sizes, in which cases we are able to transform the integro-differential equations into ordinary differential equations. As a
result, we obtain explicit expressions for the Gerber-Shiu function.

CONFIDENCE LIMITS

Jackknife empirical likelihood method for some risk measures and related quantities. Peng, Liang; Qi, Yongcheng; Wang, Ruodu;
Yang, Jingping [RKN: 45731]
Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 142-150.
Quantifying risks is of importance in insurance. In this paper, we employ the jackknife empirical likelihood method to construct
confidence intervals for some risk measures and related quantities studied by Jones and Zitikis (2003). A simulation study shows
the advantages of the new method over the normal approximation method and the naive bootstrap method.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
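The naive bootstrap used as a comparison method in the paper can be written in a few lines; the sketch below builds a percentile bootstrap confidence interval for a conditional tail expectation of an invented heavy-tailed loss sample. It is illustrative only and does not implement the jackknife empirical likelihood procedure.
```python
import numpy as np

rng = np.random.default_rng(7)

def cte(losses, alpha=0.95):
    """Conditional tail expectation: mean loss beyond the alpha-quantile."""
    q = np.quantile(losses, alpha)
    return losses[losses >= q].mean()

# Illustrative heavy-tailed loss sample (Pareto-type, via inverse transform).
losses = (rng.uniform(size=1_000) ** (-1.0 / 2.5) - 1.0) * 10.0

# Naive percentile bootstrap for the CTE.
boot = np.array([
    cte(rng.choice(losses, size=losses.size, replace=True))
    for _ in range(5_000)
])
lo, hi = np.quantile(boot, [0.025, 0.975])

print(f"point estimate of CTE_95: {cte(losses):8.2f}")
print(f"95% bootstrap interval  : ({lo:.2f}, {hi:.2f})")
```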

CONSUMER BEHAVIOUR

Optimal investment and consumption decision of a family with life insurance. Kwak, Minsuk; Shin, Yong Hyun; Choi, U Jin [RKN:
40011]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (2) : 176-188.
We study an optimal portfolio and consumption choice problem of a family that combines life insurance for parents who receive
deterministic labor income until the fixed time T. We consider utility functions of parents and children separately and assume that
parents have an uncertain lifetime. If parents die before time T, children have no labor income and they choose the optimal
consumption and portfolio with remaining wealth and life insurance benefit. The object of the family is to maximize the weighted
average of utility of parents and that of children. We obtain analytic solutions for the value function and the optimal policies, and
then analyze how changes in the weight of the parents' utility function and other factors affect the optimal policies.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

CONSUMER PRICE INDEX (CPI)

Muddling one's means? : Letter to the Editor. Sibbett, Trevor A Staple Inn Actuarial Society, - 1 page. [RKN: 70912]
Shelved at: Per: Actuary (Oxf) Per: Actuary (Lon) Shelved at: JOU
The Actuary (2012) October : 6.
Further remarks to the letter by Roy Colbran regarding the use of geometric means in the CPI.
http://www.theactuary.com/

CONTRACTS

A performance analysis of participating life insurance contracts. Faust, Roger; Schmeiser, Hato; Zemp, Alexandra [RKN: 45733]
Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 158-171.
Participating life insurance contracts are one of the most important products in the European life insurance market. Even though
these contract forms are very common, only very little research has been conducted in respect to their performance. Hence, we
conduct a performance analysis to provide a decision support for policyholders. We decompose a participating life insurance
contract in a term life insurance and a savings part and simulate the cash flow distribution of the latter. Simulation results are
compared with cash flows resulting from two benchmarks investing in the same portfolio of assets but without investment
guarantees and bonus distribution schemes, in order to measure the impact of these two product features. To provide a realistic
picture within the two alternatives, we take transaction costs and wealth transfers between different groups of policyholders into
account. We show that the payoff distribution strongly depends on the initial reserve situation and managerial discretion. Results
indicate that policyholders will in general profit from a better payoff distribution of the participating life insurance compared to a
mutual fund benchmark but not compared to an exchange-traded fund benchmark portfolio.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

CONVEX PROGRAMMING

Characterization of upper comonotonicity via tail convex order. Nam, Hee Seok; Tang, Qihe; Yang, Fan [RKN: 45130]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (3) : 368-373.
In this paper, we show a characterization of upper comonotonicity via tail convex order. For any given marginal distributions, a
maximal random vector with respect to tail convex order is proved to be upper comonotonic under suitable conditions. As an
application, we consider the computation of the Haezendonck risk measure of the sum of upper comonotonic random variables
with exponential marginal distributions.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Convex order approximations in the case of cash flows of mixed signs. Dhaene, Jan; Goovaerts, Marc; Vanmaele, Michèle; Van
Weert, Koen [RKN: 44783]
Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51(2) : 249-256.
In Van Weert et al. (2010) [K Van Weert, J Dhaene, M Goovaerts, Optimal portfolio selection for general provisioning and terminal
wealth problems, Insurance: Mathematics and Economics, 47 (1) (2010): 90–97], results are obtained showing that, when
allowing some of the cash flows to be negative, convex order lower bound approximations can still be used to solve general
investment problems in a context of provisioning or terminal wealth. In this paper, a correction and further clarification of the
reasoning of Van Weert et al. (2010) are given, thereby significantly expanding the scope of problems and cash flow patterns for
which the terminal wealth or initial provision can be accurately approximated. Also an interval for the probability level is derived in
which the quantiles of the lower bound approximation can be computed. Finally, it is shown how one can move from a context of
provisioning of future obligations to a saving and terminal wealth problem by inverting the time axis.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

COPULAS

Adaptive Importance Sampling for simulating copula-based distributions. Bee, Marco [RKN: 40018]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (2) : 237-245.
In this paper, we propose a generalization of importance sampling, called Adaptive Importance Sampling, to approximate
simulation of copula-based distributions. Unlike existing methods for copula simulation that have appeared in the literature, this
algorithm is broad enough to be used for any absolutely continuous copula. We provide details of the algorithm including rules for
stopping the iterative process and consequently assess its performance using extensive Monte Carlo experiments. To assist in its
extension to several dimensions, we discuss procedures for identifying the crucial parameters in order to achieve desirable results
especially as the size of the dimension increases. Finally, for practical illustration, we demonstrate the use of the algorithm to price
First-to-Default credit swap, an important credit derivative instrument in the financial market. The method works exquisitely well
even for large dimensions making it a valuable tool for simulating from many different classes of copulas including those which
have been difficult to sample from using traditional techniques.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
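For readers new to the underlying idea, the sketch below shows plain (non-adaptive) importance sampling for a small tail probability of a standard normal variable, using an exponentially tilted proposal. It illustrates the reweighting principle only; the adaptive, copula-based algorithm of the paper is not reproduced, and the numbers are invented.
```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)

# Target: the small exceedance probability P(X > c) for X ~ N(0, 1).
c = 4.0
n = 100_000

# Crude Monte Carlo: almost no samples fall in the tail.
crude = (rng.standard_normal(n) > c).mean()

# Importance sampling: draw from N(c, 1) and reweight by the likelihood ratio
# phi(x) / phi(x - c) = exp(-c * x + c**2 / 2).
x = rng.normal(loc=c, scale=1.0, size=n)
weights = np.exp(-c * x + 0.5 * c ** 2)
is_estimate = np.mean((x > c) * weights)

print("true value       :", norm.sf(c))
print("crude Monte Carlo:", crude)
print("importance sample:", is_estimate)
```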

Approximation of bivariate copulas by patched bivariate Fréchet copulas. Zheng, Yanting; Yang, Jingping; Huang, Jianhua Z [RKN:
40019]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (2) : 246-256.
Bivariate Fréchet (BF) copulas characterize dependence as a mixture of three simple structures: comonotonicity, independence
and countermonotonicity. They are easily interpretable but have limitations when used as approximations to general dependence
structures. To improve the approximation property of the BF copulas and keep the advantage of easy interpretation, we develop a
new copula approximation scheme by using BF copulas locally and patching the local pieces together. Error bounds and a
probabilistic interpretation of this approximation scheme are developed. The new approximation scheme is compared with several
existing copula approximations, including shuffle of min, checkmin, checkerboard and Bernstein approximations and exhibits
better performance, especially in characterizing the local dependence. The utility of the new approximation scheme in insurance
and finance is illustrated in the computation of the rainbow option prices and stop-loss premiums.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Archimedean copulas in finite and infinite dimensions - with application to ruin problems. Constantinescu, Corina; Hashorva,
Enkelejd; Ji, Lanpeng [RKN: 44950]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (3) : 487-495.
In this paper the authors discuss the link between Archimedean copulas and Dirichlet distributions for both finite and infinite dimensions. With motivation from recent papers, the authors apply their results to certain ruin problems.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
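Archimedean copulas of any dimension can be simulated with the Marshall-Olkin frailty construction; the sketch below does this for a Clayton copula (chosen purely for illustration, it is not one of the cases analysed in the paper) and feeds the dependent uniforms into exponential claim-size marginals. All parameters are invented.
```python
import numpy as np
from scipy.stats import kendalltau

rng = np.random.default_rng(3)

def clayton_sample(n, d, theta):
    """Marshall-Olkin frailty sampling of a d-dimensional Clayton copula."""
    v = rng.gamma(shape=1.0 / theta, scale=1.0, size=(n, 1))   # common frailty
    e = rng.exponential(size=(n, d))                           # independent Exp(1)
    # psi(t) = (1 + t)^(-1/theta) is the Clayton generator inverse.
    return (1.0 + e / v) ** (-1.0 / theta)

# Dependent uniforms fed into exponential claim-size marginals (3 lines).
u = clayton_sample(n=20_000, d=3, theta=2.0)
claims = -np.log(1.0 - u) / 0.1            # Exp(0.1) marginals, mean 10
aggregate = claims.sum(axis=1)

tau, _ = kendalltau(u[:, 0], u[:, 1])
print("empirical Kendall tau :", round(tau, 3), "(theoretical theta/(theta+2) = 0.5)")
print("99% aggregate quantile:", round(np.quantile(aggregate, 0.99), 1))
```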

Comparison of increasing directionally convex transformations of random vectors with a common copula. Belzunce, Felix;
Suarez-Llorens, Alfonso; Sordo, Miguel A [RKN: 45641]
Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 50 (3) : 385-390.
Let X and Y be two random vectors in Rn sharing the same dependence structure, that is, with a common copula. As many authors
have pointed out, results of the following form are of interest: under which conditions, the stochastic comparison of the marginals
of X and Y is a sufficient condition for the comparison of the expected values for some transformations of these random vectors?
Assuming that the components are ordered in the univariate dispersive orderwhich can be interpreted as a multivariate
dispersion ordering between the vectorsthe main purpose of this work is to show that a weak positive dependence property, such
as the positive association property, is enough for the comparison of the variance of any increasing directionally convex
transformation of the vectors. Some applications in premium principles, optimization and multivariate distortions are described.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

A copula approach to test asymmetric information with applications to predictive modeling. Shi, Peng; Valdez, Emiliano A [RKN:
44965]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (2) : 226-239.
In this article, we present a copula regression model for testing asymmetric information as well as for predictive modeling
applications in automobile insurance market. We use the Frank copula to jointly model the type of coverage and the number of
accidents, with the dependence parameter providing for evidence of the relationship between the choice of coverage and the
frequency of accidents. This dependence therefore provides an indication of the presence (or absence) of asymmetric information.
The type of coverage is in some sense ordered so that coverage with higher ordinals indicate the most comprehensive coverage.
Henceforth, a positive relationship would indicate that more coverage is chosen by high risk policyholders, and vice versa. This
presence of asymmetric information could be due to either adverse selection or moral hazard, a distinction often made in the
economics or insurance literature, or both. We calibrated our copula model using a one-year cross-sectional observation of claims
arising from a major automobile insurer in Singapore. Our estimation results indicate a significant positive coverage–risk
relationship. However, when we correct for the bias resulting from possible underreporting of accidents, we find that the positive
association vanishes. We further used our estimated model for other possible actuarial applications. In particular, we are able to
demonstrate the effect of coverage choice on the incidence of accidents, and based on which, the pure premium is derived. In
general, a positive margin is observed when compared with the gross premium available in our empirical database.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Copula based hierarchical risk aggregation through sample reordering. Arbenz, Philipp; Hummel, Christoph; Mainik, Georg [RKN:
45729]
Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 122-133.
For high-dimensional risk aggregation purposes, most popular copula classes are too restrictive in terms of attainable
dependence structures. These limitations aggravate with increasing dimension. We study a hierarchical risk aggregation method
which is flexible in high dimensions. With this method it suffices to specify a low dimensional copula for each aggregation step in
the hierarchy. Copulas and margins of arbitrary kind can be combined. We give an algorithm for numerical approximation which
introduces dependence between originally independent marginal samples through reordering.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
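The reordering idea can be shown in two dimensions: independent marginal samples are rearranged so that their ranks follow the ranks of a sample from the target copula, which imposes the dependence without changing the marginals. The sketch below is a bare-bones illustration with an invented Gaussian copula, not the hierarchical aggregation algorithm of the paper.
```python
import numpy as np

rng = np.random.default_rng(11)
n = 10_000

# Step 1: independent samples from two arbitrary marginal distributions.
x = rng.lognormal(mean=0.0, sigma=0.8, size=n)     # line of business 1
y = rng.gamma(shape=2.0, scale=3.0, size=n)        # line of business 2

# Step 2: a sample from the target copula; here a Gaussian copula with
# correlation 0.7 (purely illustrative).
rho = 0.7
z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=n)

# Step 3: reorder each marginal sample so its ranks follow the copula sample.
def reorder(margin, driver):
    """Rearrange `margin` so that its ranks equal the ranks of `driver`."""
    ranks = np.argsort(np.argsort(driver))          # rank of each driver value
    return np.sort(margin)[ranks]

x_dep = reorder(x, z[:, 0])
y_dep = reorder(y, z[:, 1])

def rank_corr(a, b):
    ra, rb = np.argsort(np.argsort(a)), np.argsort(np.argsort(b))
    return np.corrcoef(ra, rb)[0, 1]

# Marginals are unchanged (same values, new order); dependence is introduced.
print("rank correlation before reordering:", round(rank_corr(x, y), 3))
print("rank correlation after reordering :", round(rank_corr(x_dep, y_dep), 3))
```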

Copula models for insurance claim numbers with excess zeros and time-dependence. Zhao, XiaoBing; Zhou, Xian [RKN: 45535]
Insurance: Mathematics & Economics (2012) 50 (1) : 191-199.
This paper develops two copula models for fitting the insurance claim numbers with excess zeros and time-dependence. The joint
distribution of the claims in two successive periods is modeled by a copula with discrete or continuous marginal distributions. The
first model fits two successive claims by a bivariate copula with discrete marginal distributions. In the second model, a copula is
used to model the random effects of the conjoint numbers of successive claims with continuous marginal distributions.
The zero-inflated phenomenon is taken into account in the above copula models. Maximum likelihood is applied to estimate the
parameters of the discrete copula model. A two-step procedure is proposed to estimate the parameters in the second model, with
the first step to estimate the marginals, followed by the second step to estimate the unobserved random effect variables and the
copula parameter. Simulations are performed to assess the proposed models and methodologies.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Dependence modeling in non-life insurance using the Bernstein copula. Diers, Dorothea; Eling, Martin; Marek, Sebastian D [RKN:
45646]
Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 50 (3) : 430-436.
This paper illustrates the modeling of dependence structures of non-life insurance risks using the Bernstein copula. We conduct a
goodness-of-fit analysis and compare the Bernstein copula with other widely used copulas. Then, we illustrate the use of the
Bernstein copula in a value-at-risk and tail-value-at-risk simulation study. For both analyses we utilize German claims data on
storm, flood, and water damage insurance for calibration. Our results highlight the advantages of the Bernstein copula, including
its flexibility in mapping inhomogeneous dependence structures and its easy use in a simulation context due to its representation
as mixture of independent Beta densities. Practitioners and regulators working toward appropriate modeling of dependences in a
risk management and solvency context can benefit from our results.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
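The "mixture of independent Beta densities" representation mentioned above makes simulation straightforward: pick a grid cell with probability equal to the box mass of the underlying copula on that cell, then draw independent Beta variates. The sketch below does this for a Bernstein copula built on a Clayton base copula; the order and parameters are invented and the German claims data of the paper are not used.
```python
import numpy as np

rng = np.random.default_rng(5)

def clayton_cdf(u, v, theta=2.0):
    """Base copula on which the Bernstein copula is built (illustrative)."""
    u = np.clip(u, 1e-12, 1.0)
    v = np.clip(v, 1e-12, 1.0)
    return (u ** -theta + v ** -theta - 1.0) ** (-1.0 / theta)

m = 10                                     # Bernstein order (grid resolution)
grid = np.linspace(0.0, 1.0, m + 1)
U, V = np.meshgrid(grid, grid, indexing="ij")
C = clayton_cdf(U, V)

# Box masses of the base copula on each grid cell act as mixture weights.
w = C[1:, 1:] - C[1:, :-1] - C[:-1, 1:] + C[:-1, :-1]
w = np.clip(w, 0.0, None)
w /= w.sum()

def sample_bernstein(n):
    """Sample the Bernstein copula as a mixture of independent Beta densities."""
    cells = rng.choice(m * m, size=n, p=w.ravel())
    i, j = np.divmod(cells, m)
    u = rng.beta(i + 1, m - i)             # Beta(i+1, m-i) component
    v = rng.beta(j + 1, m - j)
    return u, v

u, v = sample_bernstein(50_000)
print("marginal means (should be ~0.5):", round(u.mean(), 3), round(v.mean(), 3))
print("empirical P(U < 0.05, V < 0.05):", round(np.mean((u < 0.05) & (v < 0.05)), 4))
```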

Dependent loss reserving using copulas. Shi, Peng; Frees, Edward W - 38 pages. [RKN: 74743]
Shelved at: Per: Astin Bull (Oxf) Shelved at: JOU
ASTIN Bulletin (2011) 41 (2) : 449-486.
Modeling dependencies among multiple loss triangles has important implications for the determination of loss reserves, a critical
element of risk management and capital allocation practices of property-casualty insurers. In this article, we propose a copula
regression model for dependent lines of business that can be used to predict unpaid losses and hence determine loss reserves.
The proposed method, relating the payments in different run-off triangles through a copula function, allows the analyst to use
flexible parametric families for the loss distribution and to understand the associations among lines of business. Based on the
copula model, a parametric bootstrap procedure is developed to incorporate the uncertainty in parameter estimates. To illustrate
this method, we consider an insurance portfolio consisting of personal and commercial automobile lines. When applied to the data
of a major US property-casualty insurer, our method provides comparable point prediction of unpaid losses with the industry's
standard practice, chain-ladder estimates. Moreover, our flexible structure allows us to easily compute the entire predictive
distribution of unpaid losses. This procedure also readily yields accident year reserves, calendar year reserves, as well as the
aggregate reserves. One important implication of the dependence modeling is that it allows analysts to quantify the diversification
effects in risk capital analysis. We demonstrate these effects by calculating commonly used risk measures, including value at risk
and conditional tail expectation, for the insurer's combined portfolio of personal and commercial automobile lines.
http://www.actuaries.org/index.cfm?lang=EN&DSP=PUBLICATIONS&ACT=ASTIN BULLETIN

Efficient algorithms for basket default swap pricing with multivariate Archimedean copulas. Choe, Geon Ho; Jang, Hyun Jin [RKN:
40014]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (2) : 205-213.
We introduce a new importance sampling method for pricing basket default swaps employing exchangeable Archimedean copulas
and nested Gumbel copulas. We establish more realistic dependence structures than existing copula models for credit risks in the
underlying portfolio, and propose an appropriate density for importance sampling by analyzing multivariate Archimedean copulas.
To justify efficiency and accuracy of the proposed algorithms, we present numerical examples and compare them with the crude
Monte Carlo simulation, and finally show that our proposed estimators produce considerably smaller variances.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Estimating copulas for insurance from scarce observations, expert opinion and prior information : A Bayesian approach.
Arbenz, Philipp; Canestraro, Davide - 20 pages. [RKN: 70751]
Shelved at: Per: Astin Bull (Oxf) Shelved at: JOU
ASTIN Bulletin (2012) 42 (1) : 271-290.
A prudent assessment of dependence is crucial in many stochastic models for insurance risks. Copulas have become popular to
model such dependencies. However, estimation procedures for copulas often lead to large parameter uncertainty when
observations are scarce. In this paper, we propose a Bayesian method which combines prior information (e.g. from regulators),
observations and expert opinion in order to estimate copula parameters and determine the estimation uncertainty. The
combination of different sources of information can significantly reduce the parameter uncertainty compared to the use of only one
source. The model can also account for uncertainty in the marginal distributions. Furthermore, we describe the methodology for
obtaining expert opinion and explain involved psychological effects and popular fallacies. We exemplify the approach in a case
study.
http://www.actuaries.org/index.cfm?lang=EN&DSP=PUBLICATIONS&ACT=ASTIN BULLETIN

Extreme value behavior of aggregate dependent risks. Chen, Die; Mao, Tiantian; Pan, Xiaoming; Hu, Taizhong [RKN: 44995]
Insurance: Mathematics & Economics (2012) 50 (1) : 99-108.
Consider a portfolio of n identically distributed risks with dependence structure modeled by an Archimedean survival copula. Wüthrich (2003) and Alink et al. (2004) proved that the probability of a large aggregate loss scales like the probability of a large individual loss, times a proportionality factor. This factor depends on the dependence strength and the tail behavior of the individual risk, according to whether the tail behavior belongs to the maximum domain of attraction of the Fréchet, the Weibull or the Gumbel distribution. We investigate properties of these factors with respect to the dependence parameter and/or the tail behavior parameter, and revisit the asymptotic behavior of conditional tail expectations of aggregate risks for the Weibull and the Gumbel cases by using a different method. The main results strengthen and complement some results in Alink et al. (2004, 2005), Barbe et al. (2006), and Embrechts et al. (2009).
Available via Athens: Palgrave MacMillan
http://www.openathens.net

A generalized beta copula with applications in modeling multivariate long-tailed data. Yang, Xipei; Frees, Edward W; Zhang,
Zhengjun [RKN: 44968]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (2) : 265-284.
This work proposes a new copula class that the authors call the MGB2 copula. The new copula originates from extracting the
dependence function of the multivariate GB2 distribution (MGB2) whose marginals follow the univariate generalized beta
distribution of the second kind (GB2). The MGB2 copula can capture non-elliptical and asymmetric dependencies among marginal
coordinates and provides a simple formulation for multi-dimensional applications. This new class features positive tail dependence
in the upper tail and tail independence in the lower tail. Furthermore, it includes some well-known copula classes, such as the
Gaussian copula, as special or limiting cases.
To illustrate the usefulness of the MGB2 copula, the authors build a trivariate MGB2 copula model of bodily injury liability closed
claims. Extended GB2 distributions are chosen to accommodate the right-skewness and the long-tailedness of the outcome
variables. For the regression component, location parameters with continuous predictors are introduced using a nonlinear additive
function. For comparison purposes, we also consider the Gumbel and t copulas, alternatives that capture the upper tail
dependence. The paper introduces a conditional plot graphical tool for assessing the validation of the MGB2 copula. Quantitative
and graphical assessment of the goodness of fit demonstrate the advantages of the MGB2 copula over the other copulas.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Measurement and modelling of dependencies in economic capital. Shaw, R A; Smith, A D; Spivak, G S - 99 pages. [RKN: 73863]
Shelved at: Per: BAJ (Oxf) Per: BAJ (Lon) Shelved at: REF
BAJ (2011) 16 (3) : 601-699.
This paper covers a number of different topics related to the measurement and modelling of dependency within economic capital
models. The scope of the paper is relatively wide. We address in some detail the different approaches to modelling dependencies
ranging from the more common variance-covariance matrix approach, to the consideration of the use of copulas and the more
sophisticated causal models that feature feedback loops and other systems design ideas.
There are many data and model uncertainties in modelling dependency and so we have also endeavoured to cover topics such as
spurious relationships and wrong-way risk to highlight some of the uncertainties involved.
With the advent of the internal model approval process under Solvency II, senior management needs to have a greater
understanding of dependency methodology. We have devoted a section of this paper to a discussion of possible different ways to
communicate the results of modelling to the board, senior management and other interested parties within an insurance company.
We have endeavoured throughout this paper to include as many numerical examples as possible to help in the understanding of
the key points, including our discussion of model parameterisation and the communication to an insurance executive of the impact
of dependency on economic capital modelling results.
The economic capital model can be seen as a combination of two key components: the marginal risk distribution of each risk and
the aggregation methodology which combines these into a single aggregate distribution or capital number. This paper is
concerned with the aggregation part, the methods and assumptions employed and the issues arising, and not the determination of
the marginal risk distributions which is equally of importance and in many cases equally as complex.
http://www.actuaries.org.uk/research-and-resources/pages/members-access-journals
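
As a minimal illustration of the two aggregation approaches contrasted in the paper (a sketch only, not the authors' numerical examples; the lognormal margins, the correlation of 0.4 and the 99.5% confidence level are invented), the snippet below compares a variance-covariance combination of stand-alone capital figures with a Gaussian-copula Monte Carlo aggregation of the same marginal risk distributions:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
level = 0.995                                     # assumed confidence level
rho = 0.4                                         # assumed linear correlation between the two risks
margins = [stats.lognorm(s=0.8, scale=100.0),     # hypothetical marginal loss models
           stats.lognorm(s=0.5, scale=150.0)]

# Stand-alone capital: VaR of each risk in excess of its mean
standalone = np.array([m.ppf(level) - m.mean() for m in margins])

# 1) Variance-covariance aggregation of the stand-alone figures
corr = np.array([[1.0, rho], [rho, 1.0]])
var_covar_capital = np.sqrt(standalone @ corr @ standalone)

# 2) Gaussian-copula Monte Carlo aggregation of the full marginal distributions
z = rng.multivariate_normal(mean=[0.0, 0.0], cov=corr, size=500_000)
u = stats.norm.cdf(z)                             # copula sample on the unit square
losses = np.column_stack([m.ppf(u[:, i]) for i, m in enumerate(margins)])
total = losses.sum(axis=1)
copula_capital = np.quantile(total, level) - total.mean()

print(f"variance-covariance capital: {var_covar_capital:,.1f}")
print(f"Gaussian-copula capital:     {copula_capital:,.1f}")
```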

Measurement and modelling of dependencies in economic capital : Abstract of the London discussion. Shaw, Richard - 21 pages.
[RKN: 73864]
Shelved at: Per: BAJ (Oxf) Per: BAJ (Lon) Shelved at: REF
BAJ (2011) 16 (3) : 701-721.
This abstract relates to the following paper:
Shaw, R.A., Smith, A.D. & Spivak, G.S. Measurement and modelling of dependencies in economic
capital. British Actuarial Journal, 16 (3).
http://www.actuaries.org.uk/research-and-resources/pages/members-access-journals

Measuring comonotonicity in M-dimensional vectors. Koch, Inge; De Schepper, Ann [RKN: 45305]
Shelved at: Per: Astin Bull (Oxf) Shelved at: JOU/AST
ASTIN Bulletin (2011) 41 (1) : 191-213.
In this contribution, a new measure of comonotonicity for m-dimensional vectors is introduced, with values between zero,
representing the independent situation, and one, reflecting a completely comonotonic situation. The main characteristics of this
coefficient are examined, and the relations with common dependence measures are analysed. A sample-based version of the
comonotonicity coefficient is also derived. Special attention is paid to the explanation of the accuracy of the convex order bound
method of Goovaerts, Dhaene et al. in the case of cash flows with Gaussian discounting processes.
online access via International Actuarial Association:
http://www.actuaries.org/index.cfm?lang=EN&DSP=PUBLICATIONS&ACT=ASTIN BULLETIN

Modeling dependence dynamics through copulas with regime switching. Silva Filho, Osvaldo Candido da; Ziegelmann, Flavio
Augusto; Dueker, Michael J [RKN: 45638]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 50 (3) : 346-356.
Measuring dynamic dependence between international financial markets has recently attracted great interest in financial
econometrics because the observed correlations rose dramatically during the 2008-09 global financial crisis. Here, we propose a
novel approach for measuring dependence dynamics. We include a hidden Markov chain (MC) in the equation describing
dependence dynamics, allowing the unobserved time-varying dependence parameter to vary according to both a restricted ARMA
process and an unobserved two-state MC. Estimation is carried out via the inference for the margins in conjunction with
filtering/smoothing algorithms. We use block bootstrapping to estimate the covariance matrix of our estimators. Monte Carlo
simulations compare the performance of regime switching and no switching models, supporting the regime-switching
specification. Finally the proposed approach is applied to empirical data, through the study of the S&P500 (USA), FTSE100 (UK)
and BOVESPA (Brazil) stock market indexes.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
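
A toy version of the regime-switching idea (much simpler than the paper's ARMA-plus-Markov-chain specification and its inference-for-margins estimation; the transition matrix and the two correlation levels below are invented) lets the dependence parameter of a bivariate Gaussian copula follow a hidden two-state chain:

```python
import numpy as np
from scipy.stats import norm, kendalltau

rng = np.random.default_rng(1)

rho = [0.2, 0.8]                          # hypothetical 'calm' and 'crisis' dependence regimes
P = np.array([[0.98, 0.02],               # assumed transition matrix of the hidden chain
              [0.05, 0.95]])

T = 5_000
states = np.zeros(T, dtype=int)
for t in range(1, T):
    states[t] = rng.choice(2, p=P[states[t - 1]])

# Bivariate Gaussian-copula observations whose correlation follows the hidden chain
u = np.empty((T, 2))
for t in range(T):
    r = rho[states[t]]
    z = rng.multivariate_normal([0.0, 0.0], [[1.0, r], [r, 1.0]])
    u[t] = norm.cdf(z)

for s in (0, 1):
    tau, _ = kendalltau(u[states == s, 0], u[states == s, 1])
    print(f"regime {s}: {np.sum(states == s)} observations, empirical Kendall tau = {tau:.2f}")
```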

Modeling of claim exceedances over random thresholds for related insurance portfolios. Eryilmaz, Serkan; Gebizlioglu, Omer L;
Tank, Fatih [RKN: 44951]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (3) : 496-500.
Large claims in an actuarial risk process are of special importance for the actuarial decision making about several issues like
pricing of risks, determination of retention treaties and capital requirements for solvency. This paper presents a model about claim
occurrences in an insurance portfolio that exceed the largest claim of another portfolio providing the same sort of insurance
coverages. Two cases are taken into consideration: independent and identically distributed claims and exchangeable dependent
claims in each of the portfolios. Copulas are used to model the dependence situations. Several theorems and examples are
presented for the distributional properties and expected values of the critical quantities under concern.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Multivariate longitudinal modeling of insurance company expenses. Shi, Peng [RKN: 45737]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 204-215.
Insurers, investors and regulators are interested in understanding the behaviour of insurance company expenses, due to the high
operating cost of the industry. Expense models can be used for prediction, to identify unusual behavior, and to measure firm
efficiency. Current literature focuses on the study of total expenses that consist of three components: underwriting, investment and
loss adjustment. A joint study of expenses by type is to deliver more information and is critical in understanding their relationship.

This paper introduces a copula regression model to examine the three types of expenses in a longitudinal context. In our method,
elliptical copulas are employed to accommodate the between-subject contemporaneous and lag dependencies, as well as the
within-subject serial correlations of the three types. Flexible distributions are allowed for the marginals of each type with covariates
incorporated in distribution parameters. A model validation procedure based on a t-plot method is proposed for in-sample and
out-of-sample validation purposes. The multivariate longitudinal model effectively addresses the typical features of expenses
data: the heavy tails, the strong individual effects and the lack of balance.

The analysis is performed using property-casualty insurance company expenses data from the National Association of Insurance
Commissioners of years 2001-2006. A unique set of covariates is determined for each type of expenses. We found that
underwriting expenses and loss adjustment expenses are complements rather than substitutes. The model is shown to be
successful in efficiency classification. Also, a multivariate predictive density is derived to quantify the future values of an insurer's
expenses.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

On allocation of upper limits and deductibles with dependent frequencies and comonotonic severities. Li, Xiaohu; You, Yinping
[RKN: 45645]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 50 (3) : 423-429.
With the assumption of Archimedean copula for the occurrence frequencies of the risks covered by an insurance policy, this note
further investigates the allocation problem of upper limits and deductibles addressed in Hua and Cheung (2008a). Sufficient
conditions for a risk averse policyholder to well allocate the upper limits and the deductibles are built, respectively.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

On the distortion of a copula and its margins. Valdez, Emiliano A; Xiao, Yugu [RKN: 44923]
Shelved at: Per: SAJ Shelved at: SCA/ACT
Scandinavian Actuarial Journal (2011) 4 : 292-237.
This article examines the notion of distortion of copulas, a natural extension of distortion within the univariate framework. We study
three approaches to this extension: (1) distortion of the margins alone while keeping the original copula structure; (2) distortion of
the margins while simultaneously altering the copula structure; and (3) synchronized distortion of the copula and its margins.
When applying distortion within the multivariate framework, it is important to preserve the properties of a copula function. For the
first two approaches, this is a rather straightforward result; however, for the third approach, the proof has been exquisitely
constructed in Morillas (2005). These three approaches unify the different types of multivariate distortion that are only sparsely
scattered in the literature. Our contribution in this paper is to further consider this unifying framework: we give numerous examples
to illustrate and we examine their properties particularly with some aspects of ordering multivariate risks. The extension of
multivariate distortion can be practically implemented in risk management where there is a need to perform aggregation and
attribution of portfolios of correlated risks. Furthermore, ancillary to the results discussed in this article, we are able to generalize
the formula developed by Genest & Rivest (2001) for computing the distribution of the probability integral transformation of a
random vector and extend it to the case within the distortion framework. For purposes of illustration, we applied the distortion
concept to value excess of loss reinsurance for an insurance policy where the loss amount could vary by type of loss.
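
A minimal sketch of the first approach above, distorting the margins while keeping the copula, is given below. It applies a proportional-hazard distortion of the survival function, S*(x) = S(x)^kappa, to exponential margins joined by a Gaussian copula and checks that the rank dependence (hence the copula) is unchanged; the copula, the margins and kappa are arbitrary choices, not taken from the article:

```python
import numpy as np
from scipy.stats import norm, kendalltau

rng = np.random.default_rng(7)
n, corr, rate, kappa = 20_000, 0.6, 0.01, 0.5   # assumed copula correlation, exponential rate, PH parameter

# Sample from a bivariate Gaussian copula
z = rng.multivariate_normal([0.0, 0.0], [[1.0, corr], [corr, 1.0]], size=n)
u = norm.cdf(z)

def exp_quantile(u, lam):
    """Inverse distribution function of an exponential margin with rate lam."""
    return -np.log(1.0 - u) / lam

x_orig = exp_quantile(u, rate)            # original exponential margins
# Proportional-hazard distortion of the survival function, S*(x) = S(x)**kappa:
# for an exponential margin this is again exponential, now with rate kappa * rate
x_dist = exp_quantile(u, kappa * rate)    # distorted margins, same copula

tau_orig, _ = kendalltau(x_orig[:, 0], x_orig[:, 1])
tau_dist, _ = kendalltau(x_dist[:, 0], x_dist[:, 1])
print(f"Kendall's tau before distortion: {tau_orig:.3f}, after distortion: {tau_dist:.3f}")
```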


On the distribution of the (un)bounded sum of random variables. Cherubini, Umberto; Mulinacci, Sabrina; Romagnoli, Silvia [RKN:
62610]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (1) : 56-63.
We propose a general treatment of random variables aggregation accounting for the dependence among variables and bounded
or unbounded support of their sum. The approach is based on the extension of the concept of convolution to dependent variables,
involving copula functions. We show that some classes of copula functions (such as Marshall-Olkin and elliptical) cannot be used
to represent the dependence structure of two variables whose sum is bounded, while Archimedean copulas can be applied only if
the generator becomes linear beyond some point. As for the application, we study the problem of capital allocation between risks
when the sum of losses is bounded.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

On the invariant properties of notions of positive dependence and copulas under increasing transformations. Cai, Jun; Wei, Wei
[RKN: 44988]
Insurance: Mathematics & Economics (2012) 50 (1) : 43-49.
Notions of positive dependence and copulas play important roles in modeling dependent risks. The invariant properties of notions
of positive dependence and copulas under increasing transformations are often used in the studies of economics, finance,
insurance and many other fields. In this paper, we examine the notions of the conditionally increasing (CI), the conditionally
increasing in sequence (CIS), the positive dependence through the stochastic ordering (PDS), and the positive dependence
through the upper orthant ordering (PDUO). We first use counterexamples to show that the statements in Theorem 3.10.19 of
Müller and Stoyan (2002) about the invariant properties of CIS and CI under increasing transformations are not true. We then
prove that the invariant properties of CIS and CI hold under strictly increasing transformations. Furthermore, we give rigorous
proofs for the invariant properties of PDS and PDUO under increasing transformations. These invariant properties enable us to
show that a continuous random vector is PDS (PDUO) if and only if its copula is PDS (PDUO). In addition, using the properties of
generalized left-continuous and right-continuous inverse functions, we give a rigorous proof for the invariant property of copulas
under increasing transformations on the components of any random vector. This result generalizes Proposition 4.7.4 of Denuit et
al. (2005) and Proposition 5.6 of McNeil et al. (2005).
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Optimal control and dependence modeling of insurance portfolios with Lévy dynamics. Bäuerle, Nicole; Blatter, Anja [RKN: 45134]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (3) : 398-405.
In this paper we are interested in optimizing proportional reinsurance and investment policies in a multidimensional Lévy-driven
insurance model. The criterion is that of maximizing exponential utility. Solving the classical Hamilton-Jacobi-Bellman equation
yields that the optimal retention level keeps a constant amount of claims regardless of time and the company's wealth level. A
special feature of our construction is to allow for dependencies of the risk reserves in different business lines. Dependence is
modeled via an Archimedean Lévy copula. We derive a sufficient and necessary condition for an Archimedean Lévy generator to
create a multidimensional positive Lévy copula in arbitrary dimension. Based on these results we identify structure conditions for
the generator and the Lévy measure of an Archimedean Lévy copula under which an insurance company reinsures a larger
fraction of claims from one business line than from another.

Available via Athens: Palgrave MacMillan
http://www.openathens.net

Pricing compound Poisson processes with the Farlie-Gumbel-Morgenstern dependence structure. Marri, Fouad; Furman,
Edward [RKN: 45732]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 151-157.
Convenient expressions for the Esscher pricing functional in the context of the compound Poisson processes with dependent loss
amounts and loss inter-arrival times are developed. To this end, the moment generating function of the aforementioned dependent
processes is derived and studied. Various implications of the dependence are discussed and exemplified numerically.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Risk concentration of aggregated dependent risks: The second-order properties. Tong, Bin; Wu, Chongfeng; Xu, Weidong [RKN:
44999]
Insurance: Mathematics & Economics (2012) 50 (1) : 139-149.
Under the current regulatory guidelines for banks and insurance companies, the quantification of diversification benefits due to risk
aggregation plays a prominent role. In this paper we establish second-order approximation of risk concentration associated with a
random vector in terms of Value at Risk (VaR) within the methodological framework of second-order regular variation and the
theory of Archimedean copula. Moreover, we find that the rate of convergence of the first-order approximation of risk concentration
depends on the interplay between the tail behavior of the marginal loss random variables and their dependence structure.
Specifically, we find that the rate of convergence is determined by either the second-order parameter of the Archimedean copula
generator or the second-order parameter of the tail margins, leading to either the so-called dependence dominated case or
margin dominated case.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Tails of correlation mixtures of elliptical copulas. Manner, Hans; Segers, Johan [RKN: 39938]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (1) : 153-160.
Correlation mixtures of elliptical copulas arise when the correlation parameter is driven itself by a latent random process. For such
copulas, both penultimate and asymptotic tail dependence are much larger than for ordinary elliptical copulas with the same
unconditional correlation. Furthermore, for Gaussian and Student t-copulas, tail dependence at sub-asymptotic levels is generally
larger than in the limit, which can have serious consequences for estimation and evaluation of extreme risk. Finally, although
correlation mixtures of Gaussian copulas inherit the property of asymptotic independence, at the same time they fall in the newly
defined category of near asymptotic dependence. The consequences of these findings for modeling are assessed by means of a
simulation study and a case study involving financial time series.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
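
The penultimate-tail effect described above is easy to reproduce numerically. The sketch below (an invented illustration, not the authors' simulation study) compares conditional joint exceedance probabilities for a Gaussian copula whose correlation is randomly 0.2 or 0.9 with those of a fixed-correlation copula matching the average correlation:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(3)
n = 1_000_000

def gaussian_copula_pair(rho):
    """One bivariate Gaussian-copula observation per entry of the correlation array rho."""
    z1 = rng.standard_normal(len(rho))
    z2 = rho * z1 + np.sqrt(1.0 - rho ** 2) * rng.standard_normal(len(rho))
    return norm.cdf(z1), norm.cdf(z2)

# Correlation mixture: rho is 0.2 or 0.9 with equal probability (average 0.55)
u1_mix, u2_mix = gaussian_copula_pair(rng.choice([0.2, 0.9], size=n))
# Ordinary Gaussian copula with (roughly) the same unconditional correlation
u1_fix, u2_fix = gaussian_copula_pair(np.full(n, 0.55))

for q in (0.99, 0.999):
    p_mix = np.mean((u1_mix > q) & (u2_mix > q)) / (1.0 - q)   # estimate of P(U2 > q | U1 > q)
    p_fix = np.mean((u1_fix > q) & (u2_fix > q)) / (1.0 - q)
    print(f"q={q}: conditional exceedance, mixture = {p_mix:.3f}, fixed correlation = {p_fix:.3f}")
```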

CORRELATION

The effect of correlation and transaction costs on the pricing of basket options. Atkinson, C; Ingpochai, P Routledge, [RKN:
45797]
Shelved at: Per. AMF
Applied Mathematical Finance (2012) 19 (1-2) : 131-179.
In this article, we examine the problem of evaluating the option price of a European call option written on N underlying assets when
there are proportional transaction costs in the market. Since the portfolio under consideration consists of multiple risky assets,
which makes numerical methods formidable, we use perturbation analyses. The article extends the model for option pricing on
uncorrelated assets, which was proposed by Atkinson and Alexandropoulos (2006, Pricing a European basket option in the
presence of proportional transaction cost, Applied Mathematical Finance, 13(3): 191-214). We determine optimal hedging
strategies as well as option prices on both correlated and uncorrelated assets. The option valuation problem is obtained by
comparing the maximized utility of wealth with and without option liability. The two stochastic control problems, which arise from
the transaction costs, are transformed to free boundary and partial differential equation problems. Once the problems have been
formulated, we establish optimal trading strategies for each of the portfolios. In addition, the optimal hedging strategies can be
found by comparing the trading strategies of the two portfolios. We provide a general procedure for solving N risky assets, which
shows that for small correlations the N asset problem can be replaced by N(N-1)/2 two-dimensional problems and give numerical
examples for the two risky assets portfolios.
Available via Athens: Taylor & Francis Online
http://www.openathens.net

Estimating copulas for insurance from scarce observations, expert opinion and prior information : A Bayesian approach.
Arbenz, Philipp; Canestraro, Davide - 20 pages. [RKN: 70751]
Shelved at: Per: Astin Bull (Oxf) Shelved at: JOU
ASTIN Bulletin (2012) 42 (1) : 271-290.
A prudent assessment of dependence is crucial in many stochastic models for insurance risks. Copulas have become popular to
model such dependencies. However, estimation procedures for copulas often lead to large parameter uncertainty when
observations are scarce. In this paper, we propose a Bayesian method which combines prior information (e.g. from regulators),
observations and expert opinion in order to estimate copula parameters and determine the estimation uncertainty. The
combination of different sources of information can significantly reduce the parameter uncertainty compared to the use of only one
source. The model can also account for uncertainty in the marginal distributions. Furthermore, we describe the methodology for
obtaining expert opinion and explain involved psychological effects and popular fallacies. We exemplify the approach in a case
study.
http://www.actuaries.org/index.cfm?lang=EN&DSP=PUBLICATIONS&ACT=ASTIN BULLETIN

Measuring comonotonicity in M-dimensional vectors. Koch, Inge; De Schepper, Ann [RKN: 45305]
Shelved at: Per: Astin Bull (Oxf) Shelved at: JOU/AST
ASTIN Bulletin (2011) 41 (1) : 191-213.
In this contribution, a new measure of comonotonicity for m-dimensional vectors is introduced, with values between zero,
representing the independent situation, and one, reflecting a completely comonotonic situation. The main characteristics of this
coefficient are examined, and the relations with common dependence measures are analysed. A sample-based version of the
comonotonicity coefficient is also derived. Special attention is paid to the explanation of the accuracy of the convex order bound
method of Goovaerts, Dhaene et al. in the case of cash flows with Gaussian discounting processes.
online access via International Actuarial Association:
http://www.actuaries.org/index.cfm?lang=EN&DSP=PUBLICATIONS&ACT=ASTIN BULLETIN

On the spurious correlation between sample betas and mean returns. Levy, Moshe Routledge, [RKN: 45843]
Shelved at: Per. AMF
Applied Mathematical Finance (2012) 19 (3-4) : 341-360.
Cornerstone asset pricing models, such as capital asset pricing model (CAPM) and arbitrage pricing theory (APT), yield
theoretical predictions about the relationship between expected returns and exposure to systematic risk, as measured by beta(s).
Numerous studies have investigated the empirical validity of these models. We show that even if no relationship holds between
true expected returns and betas in the population, the existence of low-probability extreme outcomes induces a spurious
correlation between the sample means and the sample betas. Moreover, the magnitude of this purely spurious correlation is
similar to the empirically documented correlation, and the regression slopes and intercepts are very similar as well. This result
does not necessarily constitute evidence against the theoretical asset pricing models, but it does shed new light on previous
empirical results, and it points to an issue that should be carefully considered in the empirical testing of these models. The analysis
points to the dangers of relying on simple least squares regression for drawing conclusions about the validity of equilibrium pricing
models.
Available via Athens: Taylor & Francis Online
http://www.openathens.net
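
The mechanism can be reproduced with a crude Monte Carlo along the following lines (an invented illustration, far simpler than the paper's analysis): every asset is given the same true expected return, the market factor contains a rare extreme outcome, and yet a cross-sectional regression of sample mean returns on sample betas produces a non-zero slope roughly equal to the realised deviation of the market mean from its true mean:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)

n_assets, n_months = 100, 120
betas = rng.uniform(0.5, 1.5, size=n_assets)      # true betas, assumed range
mu = 0.005                                        # identical true expected return for every asset

# Market factor with a rare extreme outcome (a 'crash' month with small probability)
p_crash, crash_size = 0.01, 0.30
market = rng.normal(0.0, 0.04, size=n_months)
market[rng.random(n_months) < p_crash] -= crash_size
true_market_mean = -p_crash * crash_size          # population mean of the factor

# r_it = mu + beta_i * (market_t - E[market]) + noise, so E[r_it] = mu regardless of beta_i
returns = mu + np.outer(betas, market - true_market_mean) \
          + rng.normal(0.0, 0.05, size=(n_assets, n_months))

sample_means = returns.mean(axis=1)
sample_betas = np.array([np.polyfit(market, r, 1)[0] for r in returns])

slope, intercept, r_value, p_value, stderr = stats.linregress(sample_betas, sample_means)
print(f"realised market mean minus its true mean: {market.mean() - true_market_mean:+.4f}")
print(f"cross-sectional slope of mean return on beta: {slope:+.4f} (true slope is zero)")
```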

Tails of correlation mixtures of elliptical copulas. Manner, Hans; Segers, Johan [RKN: 39938]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (1) : 153-160.
Correlation mixtures of elliptical copulas arise when the correlation parameter is driven itself by a latent random process. For such
copulas, both penultimate and asymptotic tail dependence are much larger than for ordinary elliptical copulas with the same
unconditional correlation. Furthermore, for Gaussian and Student t-copulas, tail dependence at sub-asymptotic levels is generally
larger than in the limit, which can have serious consequences for estimation and evaluation of extreme risk. Finally, although
correlation mixtures of Gaussian copulas inherit the property of asymptotic independence, at the same time they fall in the newly
defined category of near asymptotic dependence. The consequences of these findings for modeling are assessed by means of a
simulation study and a case study involving financial time series.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

COSTS

Assessing the costs of protection in a context of switching stochastic regimes. Barrieu, Pauline; Bellamy, Nadine; Sahut,
Jean-Michel [RKN: 45880]
Shelved at: Per. AMF
Applied Mathematical Finance (2012) 19 (5-6) : 495-511.
We consider the problem of cost assessment in the context of switching stochastic regimes. The dynamics of a given asset include
a background noise, described by a Brownian motion and a random shock, the impact of which is characterized by changes in the
coefficient diffusions. A particular economic agent that is directly exposed to variations in the underlying asset price, incurs some
costs, F(L), when the underlying asset price reaches a certain threshold, L. Ideally, the agent would make advance provision, or
hedge, for these costs at time 0. We evaluate the amount of provision, or the hedging premium, M(L), for these costs in the
disrupted environment, with changes in the regime for a given time horizon, and analyse the sensitivity of this amount to possible
model misspecifications.

Available via Athens: Taylor & Francis Online
http://www.openathens.net

Insurance pricing with complete information, state-dependent utility, and production costs. Ramsay, Colin M; Oguledo, Victor I
[RKN: 45649]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 50 (3) : 462-469.
We consider a group of identical risk-neutral insurers selling single-period indemnity insurance policies. The insurance market
consists of individuals with common state-dependent utility function who are identical except for their known accident probability q.
Insurers incur production costs (commonly called expenses or transaction costs by actuaries) that are proportional to the amount
of insurance purchased and to the premium charged. By introducing the concept of insurance desirability, we prove that the
existence of insurer expenses generates a pair of constants qmin and qmax that naturally partitions the applicant pool into three
mutually exclusive and exhaustive groups of individuals: those individuals with accident probability q ∈ [0, qmin) are insurable but do
not desire insurance, those individuals with accident probability q ∈ [qmin, qmax] are insurable and desire insurance, and those
individuals with accident probability q ∈ (qmax, 1] desire insurance but are uninsurable. We also prove that, depending on the level of
q and the marginal rate of substitution between states, it may be optimal for individuals to buy complete (full) insurance, partial
insurance, or no insurance at all. Finally, we prove that when q is known in monopolistic markets (i.e., markets with a single
insurer), applicants may be induced to over insure whenever partial insurance is bought.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

CREDIBILITY

Portfolio adjusting optimization with added assets and transaction costs based on credibility measures. Zhang, Wei-Guo; Zhang,
Xi-Li; Chen, Yunxia [RKN: 44937]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (3) : 353-360.
In response to changeful financial markets and investors' capital, we discuss a portfolio adjusting problem with additional risk
assets and a riskless asset based on credibility theory. We propose two credibilistic mean-variance portfolio adjusting models with
general fuzzy returns, which take lending, borrowing, transaction cost, additional risk assets and capital into consideration in
portfolio adjusting process. We present crisp forms of the models when the returns of risk assets are some deterministic fuzzy
variables such as trapezoidal, triangular and interval types. We also employ a quadratic programming solution algorithm for
obtaining an optimal adjusting strategy. The comparisons of numerical results from different models illustrate the efficiency of the
proposed models and the algorithm.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Robust-efficient credibility models with heavy-tailed claims: A mixed linear models perspective. Dornheim, Harald; Brazauskas,
Vytaras [RKN: 39365]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (1) : 72-84.
In actuarial practice, regression models serve as a popular statistical tool for analyzing insurance data and tariff ratemaking. In this
paper, we consider classical credibility models that can be embedded within the framework of mixed linear models. For inference
about fixed effects and variance components, likelihood-based methods such as (restricted) maximum likelihood estimators are
commonly pursued. However, it is well-known that these standard and fully efficient estimators are extremely sensitive to small
deviations from hypothesized normality of random components as well as to the occurrence of outliers. To obtain better estimators
for premium calculation and prediction of future claims, various robust methods have been successfully adapted to credibility
theory in the actuarial literature. The objective of this work is to develop robust and efficient methods for credibility when
heavy-tailed claims are approximately log-location-scale distributed. To accomplish that, we first show how to express additive
credibility models such as Bühlmann-Straub and Hachemeister ones as mixed linear models with symmetric or asymmetric
errors. Then, we adjust adaptively truncated likelihood methods and compute highly robust credibility estimates for the ordinary
but heavy-tailed claims part. Finally, we treat the identified excess claims separately and find robust-efficient credibility premiums.
Practical performance of this approach is examined, via simulations, under several contaminating scenarios. A widely studied
real-data set from workers' compensation insurance is used to illustrate functional capabilities of the new robust credibility
estimators.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

CREDIBILITY THEORY

Evolutionary credibility theory: A generalized linear mixed modeling approach. Lai, Tze Leung; Sun, Kevin Haoyu Society of
Actuaries, - 12 pages. [RKN: 70146]
Shelved at: Per: NAAJ (Oxf) Per NAAJ (Lon) Shelved at: JOU
North American Actuarial Journal (2012) 16 (2) : 273-284.
The conventional approach to evolutionary credibility theory assumes a linear state-space model for the longitudinal claims data
so that Kalman filters can be used to estimate the claims' expected values, which are assumed to form an autoregressive time
series. We propose a class of linear mixed models as an alternative to linear state-space models for evolutionary credibility and
show that the predictive performance is comparable to that of the Kalman filter when the claims are generated by a linear
state-space model. More importantly, this approach can be readily extended to generalized linear mixed models for the
longitudinal claims data. We illustrate its applications by addressing the excess-zeros issue, namely that a substantial fraction of
policies does not have claims at various times in the period under consideration.
http://www.soa.org/news-and-publications/publications/journals/naaj/naaj-detail.aspx

CRITICAL ILLNESS INSURANCE

Bayesian modelling of the time delay between diagnosis and settlement for Critical Illness Insurance using a Burr
generalised-linear-type model. Ozkok, Erengul; Streftaris, George; Waters, Howard R; Wilkie, A David [RKN: 45600]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 50 (2) : 266-279.
We discuss Bayesian modelling of the delay between dates of diagnosis and settlement of claims in Critical Illness Insurance
using a Burr distribution. The data are supplied by the UK Continuous Mortality Investigation and relate to claims settled in the
years 1999-2005. There are non-recorded dates of diagnosis and settlement and these are included in the analysis as missing
values using their posterior predictive distribution and MCMC methodology. The possible factors affecting the delay (age, sex,
smoker status, policy type, benefit amount, etc.) are investigated under a Bayesian approach. A 3-parameter Burr
generalised-linear-type model is fitted, where the covariates are linked to the mean of the distribution. Variable selection using
Bayesian methodology to obtain the best model with different prior distribution setups for the parameters is also applied. In
particular, Gibbs variable selection methods are considered, and results are confirmed using exact marginal likelihood findings
and related Laplace approximations. For comparison purposes, a lognormal model is also considered.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

The genetics of breast and ovarian cancer IV: a model of breast cancer progression. Lu, Baopeng; Macdonald, Angus S; Waters,
Howard R [RKN: 44921]
Shelved at: Per: SAJ Shelved at: SCA/ACT
Scandinavian Actuarial Journal (2011) 4 : 239-266.
Gui et al. (2006), in Part III of a series of papers, proposed a dynamic family history model of breast cancer and ovarian cancer in
which the development of a family history was represented explicitly as a transition between states, and then applied this model to
life insurance and critical illness insurance. In this study, the authors extend the model to income protection insurance. In this
paper, Part IV of the series, the authors construct and parameterise a semi-Markov model for the life history of a woman with
breast cancer, in which events such as diagnosis, treatment, recovery and recurrence are incorporated. In Part V, we then show:
(a) estimates of premium ratings depending on genotype or family history; and (b) the impact of adverse selection under various
moratoria on the use of genetic information.

CURRENCIES

On cross-currency models with stochastic volatility and correlated interest rates. Grzelak, Lech A; Oosterlee, Cornelis W
Routledge, [RKN: 45793]
Shelved at: Per. AMF
Applied Mathematical Finance (2012) 19 (1-2) : 1-35.
We construct multi-currency models with stochastic volatility (SV) and correlated stochastic interest rates with a full matrix of
correlations. We first deal with a foreign exchange (FX) model of Heston-type, in which the domestic and foreign interest rates are
generated by the short-rate process of Hull-White (Hull, J. and White, A. [1990] Pricing interest-rate derivative securities, Review
of Financial Studies, 3, pp. 573-592). We then extend the framework by modelling the interest rate by an SV displaced-diffusion
(DD) Libor Market Model (Andersen, L. B. G. and Andreasen, J. [2000] Volatility skews and extensions of the libor market model,
Applied Mathematical Finance, 7(1), pp. 1-32), which can model an interest rate smile. We provide semi-closed form
approximations which lead to efficient calibration of the multi-currency models. Finally, we add a correlated stock to the framework
and discuss the construction, model calibration and pricing of equity-FX-interest rate hybrid pay-offs.
Available via Athens: Taylor & Francis Online
http://www.openathens.net

The stochastic intrinsic currency volatility model : A consistent framework for multiple FX rates and their volatilities. Doust, Paul
Routledge, [RKN: 45876]
Shelved at: Per. AMF
Applied Mathematical Finance (2012) 19 (5-6) : 381-445.
The SABR and the Heston stochastic volatility models are widely used for foreign exchange (FX) option pricing. Although they are
able to reproduce the market's volatility smiles and skews for single FX rates, they cannot be extended to model multiple FX rates
in a consistent way. This is because when two FX rates with a common currency are described by either SABR or Heston
processes, the stochastic process for the third FX rate associated with the three currencies will not be a SABR or a Heston
process. A consistent description of the FX market should be symmetric so that all FX rates are described by the same type of
stochastic process. This article presents a way of doing that using the concept of intrinsic currency values. To model FX volatility
curves, the intrinsic currency framework is extended by allowing the volatility of each intrinsic currency value to be stochastic. This
makes the framework more realistic, while preserving all FX market symmetries. The result is a new SABR-style option pricing
formula and a model that can simultaneously be calibrated to the volatility smiles of all possible currency pairs under
consideration. Consequently, it is more powerful than modelling the volatility curves of each currency pair individually and offers a
methodology for comparing the volatility curves of different currency pairs against each other. One potential application of this is in
the pricing of FX options, such as vanilla options on less liquid currency pairs. Given the volatilities of less liquid currencies
against, for example, USD and EUR, the model could then be used to calculate the volatility smiles of those less liquid currencies
against all other currencies.
Available via Athens: Taylor & Francis Online
http://www.openathens.net

DATA

Assessing the performance of different volatility estimators : A Monte Carlo analysis. Cartea, Alvaro; Karyampas, Dimitrios [RKN:
45882]
Shelved at: Per. AMF
Applied Mathematical Finance (2012) 19 (5-6) : 535-552.
We test the performance of different volatility estimators that have recently been proposed in the literature and have been
designed to deal with problems arising when ultra high-frequency data are employed: microstructure noise and price
discontinuities. Our goal is to provide an extensive simulation analysis for different levels of noise and frequency of jumps to
compare the performance of the proposed volatility estimators. We conclude that the maximum likelihood estimator filter (MLE-F),
a two-step parametric volatility estimator proposed by Cartea and Karyampas (2011a, The relationship between the volatility of
returns and the number of jumps in financial markets, SSRN eLibrary, Working Paper Series, SSRN), outperforms most of the
well-known high-frequency volatility estimators when different
assumptions about the path properties of stock dynamics are used.
Available via Athens: Taylor & Francis Online
http://www.openathens.net

Extending the Lee-Carter model: a three-way decomposition. Russolillo, Maria; Giordano, Giuseppe; Haberman, Steven [RKN:
45354]
Shelved at: Per: SAJ Shelved at: SCA/ACT
Scandinavian Actuarial Journal (2011) 2 : 96-117.
In this paper, we focus on a Multi-dimensional Data Analysis approach to the Lee-Carter (LC) model of mortality trends. In
particular, we extend the bilinear LC model and specify a new model based on a three-way structure, which incorporates a further
component in the decomposition of the log-mortality rates. A multi-way component analysis is performed using the Tucker3 model.
The suggested methodology allows us to obtain combined estimates for the three modes: (1) time, (2) age groups and (3) different
populations. From the results obtained by the Tucker3 decomposition, we can jointly compare, in both a numerical and graphical
way, the relationships among all three modes and obtain a time-series component as a leading indicator of the mortality trend for
a group of populations. Further, we carry out a correlation analysis of the estimated trends in order to assess the reliability of the
results of the three-way decomposition. The model's goodness of fit is assessed using an analysis of the residuals. Finally, we
discuss how the synthesised mortality index can be used to build concise projected life tables for a group of populations. An
application which compares 10 European countries is used to illustrate the approach and provide a deeper insight into the model
and its implementation.
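
For orientation, the sketch below shows the standard two-way Lee-Carter decomposition that the paper extends (not the Tucker3 three-way model itself), estimated from the leading singular value of a centred log-mortality matrix; the synthetic mortality surface is invented, and the identification constraints sum(b_x) = 1 and mean(k_t) = 0 follow the usual convention:

```python
import numpy as np

rng = np.random.default_rng(5)
ages, years = np.arange(0, 100), np.arange(1950, 2010)

# Synthetic log-mortality surface ln m_{x,t} = a_x + b_x * k_t + noise (all numbers invented)
true_ax = -9.0 + 0.08 * ages
true_bx = np.full(len(ages), 1.0 / len(ages))
true_kt = np.linspace(20.0, -20.0, len(years))
log_m = true_ax[:, None] + np.outer(true_bx, true_kt) + rng.normal(0.0, 0.02, (len(ages), len(years)))

# Lee-Carter estimation: a_x = row averages, then a rank-1 SVD of the centred matrix
a_x = log_m.mean(axis=1)
U, s, Vt = np.linalg.svd(log_m - a_x[:, None], full_matrices=False)
b_x, k_t = U[:, 0], s[0] * Vt[0, :]

# Usual identification constraints: sum(b_x) = 1 and mean(k_t) = 0
c = b_x.sum()
b_x, k_t = b_x / c, k_t * c
a_x = a_x + b_x * k_t.mean()      # re-centring k_t shifts the age effect
k_t = k_t - k_t.mean()

explained = s[0] ** 2 / np.sum(s ** 2)
print(f"variance captured by the first (Lee-Carter) component: {explained:.3f}")
```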

Fitting insurance claims to skewed distributions: are the skew-normal and skew-student good models? Eling, Martin [RKN:
44782]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51(2) : 239-248.
This paper analyzes whether the skew-normal and skew-student distributions recently discussed in the finance literature are
reasonable models for describing claims in property-liability insurance. We consider two well-known datasets from actuarial
science and fit a number of parametric distributions to these data. Also the non-parametric transformation kernel approach is
considered as a benchmark model. We find that the skew-normal and skew-student are reasonably competitive compared to other
models in the literature when describing insurance data. In addition to goodness-of-fit tests, tail risk measures such as value at risk
and tail value at risk are estimated for the datasets under consideration.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
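
A comparison of this kind can be run with generic tools along the lines sketched below (this is not the authors' code or data; the synthetic claims sample, the candidate distributions and the 99% level are assumptions made for illustration):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(9)
claims = rng.lognormal(mean=8.0, sigma=1.2, size=5_000)   # synthetic right-skewed claim sizes

candidates = {"skew-normal": stats.skewnorm, "lognormal": stats.lognorm}
level = 0.99

for name, dist in candidates.items():
    params = dist.fit(claims)
    loglik = np.sum(dist.logpdf(claims, *params))
    var = dist.ppf(level, *params)                        # fitted 99% value at risk
    tail = claims[claims > var]
    tvar = tail.mean() if tail.size else float("nan")     # empirical mean beyond the fitted VaR
    print(f"{name:>12}: log-likelihood = {loglik:,.0f}, VaR99 = {var:,.0f}, empirical TVaR99 = {tvar:,.0f}")
```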

DATA FITTING

A new class of models for heavy tailed distributions in finance and insurance risk. Ahn, Soohan; Kim, Joseph H T; Ramaswami,
Vaidyanathan [RKN: 45722]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 43-52.
Many insurance loss data are known to be heavy-tailed. In this article we study the class of Log phase-type (LogPH) distributions
as a parametric alternative in fitting heavy tailed data. Transformed from the popular phase-type distribution class, the LogPH
introduced by Ramaswami exhibits several advantages over other parametric alternatives. We analytically derive its tail related
quantities including the conditional tail moments and the mean excess function, and also discuss its tail thickness in the context of
extreme value theory. Because of its denseness proved herein, we argue that the LogPH can offer a rich class of heavy-tailed loss
distributions without separate modeling for the tail side, which is the case for the generalized Pareto distribution (GPD). As a
numerical example we use the well-known Danish fire data to calibrate the LogPH model and compare the result with that of the
GPD. We also present fitting results for a set of insurance guarantee loss data.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

DEATH BENEFIT

Valuing equity-linked death benefits and other contingent options : A discounted density approach. Gerber, Hans U; Shiu, Elias S
W; Yang, Hailiang [RKN: 45725]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 73-92.
Motivated by the Guaranteed Minimum Death Benefits in various deferred annuities, we investigate the calculation of the expected
discounted value of a payment at the time of death. The payment depends on the price of a stock at that time and possibly also on
the history of the stock price. If the payment turns out to be the payoff of an option, we call the contract for the payment a (life)
contingent option. Because each time-until-death distribution can be approximated by a combination of exponential distributions,
the analysis is made for the case where the time until death is exponentially distributed, i.e., under the assumption of a constant
force of mortality. The time-until-death random variable is assumed to be independent of the stock price process which is a
geometric Brownian motion. Our key tool is a discounted joint density function. A substantial series of closed-form formulas is
obtained, for the contingent call and put options, for lookback options, for barrier options, for dynamic fund protection, and for
dynamic withdrawal benefits. In a section on several stocks, the method of Esscher transforms proves to be useful for finding
among others an explicit result for valuing contingent Margrabe options or exchange options. For the case where the contracts
have a finite expiry date, closed-form formulas are found for the contingent call and put options. From these, results for De
Moivre's law are obtained as limits. We also discuss equity-linked death benefit reserves and investment strategies for maintaining
such reserves. The elasticity of the reserve with respect to the stock price plays an important role. Whereas in the most important
applications the stopping time is the time of death, it could be different in other applications, for example, the time of the next
catastrophe.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
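
One building block of this approach, the expected discounted value of the stock price at an exponentially distributed time of death, has a simple closed form that is easy to check numerically: with constant force of mortality lambda, force of interest delta and geometric Brownian motion drift mu, E[e^(-delta T) S(T)] = lambda S(0) / (lambda + delta - mu) whenever lambda + delta > mu. The sketch below verifies this identity by simulation; the parameter values are arbitrary, and the payoff is the plain stock value rather than one of the option payoffs treated in the paper:

```python
import numpy as np

rng = np.random.default_rng(2024)

s0, mu, sigma = 100.0, 0.04, 0.20     # assumed stock price, GBM drift and volatility
lam, delta = 0.05, 0.05               # assumed constant force of mortality and force of interest

n = 1_000_000
T = rng.exponential(scale=1.0 / lam, size=n)          # simulated times of death
ST = s0 * np.exp((mu - 0.5 * sigma ** 2) * T + sigma * np.sqrt(T) * rng.standard_normal(n))
mc_value = np.mean(np.exp(-delta * T) * ST)           # Monte Carlo expected discounted payment

closed_form = lam * s0 / (lam + delta - mu)           # valid because lam + delta > mu
print(f"Monte Carlo estimate: {mc_value:.2f}, closed form: {closed_form:.2f}")
```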

DECISION MAKING

Behavioral optimal insurance. Sung, K C; Yam, S C P; Yung, S P; Zhou, J H [RKN: 44944]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (3) : 418-428.
The present work studies the optimal insurance policy offered by an insurer adopting a proportional premium principle to an
insured whose decision-making behavior is modeled by Kahneman and Tversky's Cumulative Prospect Theory with convex
probability distortions. We show that, under a fixed premium rate, the optimal insurance policy is a generalized insurance layer
(that is, either an insurance layer or a stop-loss insurance). This optimal insurance decision problem is resolved by first converting
it into three different sub-problems similar to those in ; however, as we now demand a more regular optimal solution, a completely
different approach has been developed to tackle them. When the premium is regarded as a decision variable and there is no risk
loading, the optimal indemnity schedule in this form has no deductibles but a cap; further results also suggest that the deductible
amount will be reduced if the risk loading is decreased. As a whole, our paper provides a theoretical explanation for the popularity
of limited coverage insurance policies in the market as observed by many socio-economists, which serves as a mathematical
bridge between behavioral finance and actuarial science.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

DEFINED BENEFIT SCHEMES

A utility-based comparison of pension funds and life insurance companies under regulatory constraints. Broeders, Dirk; Chen,
An; Koos, Birgit [RKN: 44969]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (1) : 1-10.
This paper compares two different types of annuity providers, i.e. defined benefit pension funds and life insurance companies.
One of the key differences is that the residual risk in pension funds is collectively borne by the beneficiaries and the sponsor's
shareholders while in the case of life insurers it is borne by the external shareholders. First, this paper employs a contingent claim
approach to evaluate the risk return tradeoff for annuitants. For that, we take into account the differences in contract specifications
and in regulatory regimes. Second, a welfare analysis is conducted to examine whether a consumer with power utility experiences
utility gains if she chooses a defined benefit plan or a life annuity contract over a defined contribution plan. We demonstrate that
regulation can be designed to support a level playing field amongst different financial institutions.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

DEFINED CONTRIBUTION SCHEMES

Optimal asset allocation for DC pension plans under inflation. Han, Nan-wei; Hung, Mao-Wei [RKN: 45734]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 172-181.
In this paper, the stochastic dynamic programming approach is used to investigate the optimal asset allocation for a
defined-contribution pension plan with downside protection under stochastic inflation. The plan participant invests the fund wealth
and the stochastic interim contribution flows into the financial market. The nominal interest rate model is described by the
Cox-Ingersoll-Ross (Cox et al., 1985) dynamics. To cope with the inflation risk, the inflation indexed bond is included in the asset
menu. The retired individuals receive an annuity that is indexed by inflation and a downside protection on the amount of this
annuity is considered. The closed-form solution is derived under the CRRA utility function. Finally, a numerical application is
presented to characterize the dynamic behavior of the optimal investment strategy.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Three retirement decision models for defined contribution pension plan members: A simulation study. MacDonald,
Bonnie-Jeanne; Cairns, Andrew J G [RKN: 14542]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (1) : 1-18.
This paper examines the hypothetical retirement behavior of defined contribution (DC) pension plan participants. Using a Monte
Carlo simulation approach, we compare and discuss three retirement decision models: the two-thirds replacement ratio
benchmark model, the option-value of continued work model and a newly-developed one-year retirement decision model. Unlike
defined benefit (DB) pension plans where economic incentives create spikes in retirement at particular ages, all three retirement
decision models suggest that the retirement ages of DC participants are much more smoothly distributed over a wide range of
ages. We find that the one-year model possesses several advantages over the other two models when representing the
theoretical retirement choice of a DC pension plan participant. First, its underlying theory for retirement decision-making is more
feasible given the distinct features and pension drivers of a DC plan. Second, its specifications produce a more logical relationship
between an individual's decision to retire and his/her age and accumulated retirement wealth. Lastly, although the one-year model
is less complex than the option-value model as the DC participant's scope is only one year, the retirement decision is optimal over
all future projected years if projections are made using reasonable financial assumptions.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

A utility-based comparison of pension funds and life insurance companies under regulatory constraints. Broeders, Dirk; Chen,
An; Koos, Birgit [RKN: 44969]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (1) : 1-10.
This paper compares two different types of annuity providers, i.e. defined benefit pension funds and life insurance companies.
One of the key differences is that the residual risk in pension funds is collectively borne by the beneficiaries and the sponsor's
shareholders while in the case of life insurers it is borne by the external shareholders. First, this paper employs a contingent claim
approach to evaluate the risk return tradeoff for annuitants. For that, we take into account the differences in contract specifications
and in regulatory regimes. Second, a welfare analysis is conducted to examine whether a consumer with power utility experiences
utility gains if she chooses a defined benefit plan or a life annuity contract over a defined contribution plan. We demonstrate that
regulation can be designed to support a level playing field amongst different financial institutions.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

DERIVATIVES

Accelerating pathwise Greeks in the Libor market model. Joshi, Mark; Wiguna, Alexander (2011). - Victoria: University of
Melbourne, 2011. - 27 pages. [RKN: 73754]
In the framework of the displaced-diffusion LIBOR market model, we derive the pathwise adjoint method for the iterative
predictor-corrector and Glasserman-Zhao drift approximations in the spot measure. This allows us to compute fast deltas and vegas
under these schemes. We compare the discretisation bias obtained when computing Greeks with these methods to those obtained under
the log-Euler and predictor-corrector approximations by performing tests with interest rate caplets and cancellable receiver swaps.
The two predictor-corrector type methods were the most accurate by far. In particular, we found the iterative predictor-corrector
method to be more accurate and slightly faster than the predictor-corrector method, the Glasserman-Zhao method to be relatively fast
but highly inconsistent, and the log-Euler method to be reasonably accurate but only at low volatilities. Standard errors were not
significantly different across all four discretisations.
No.1 (1993) onwards available online. Download as PDF.
http://www.economics.unimelb.edu.au/ACT/papers.shtml

Bias reduction for pricing American options by least-squares Monte Carlo. Kan, Kin Hung Felix; Reesor, Mark R Routledge,
[RKN: 45837]
Shelved at: Per. AMF
Applied Mathematical Finance (2012) 19 (3-4) : 195-217.
We derive an approximation to the bias in regression-based Monte Carlo estimators of American option values. This derivation
holds for general asset-price processes of any dimensionality and for general pay-off structures. It uses the large sample
properties of least-squares regression estimators. Bias-corrected estimators result by subtracting the bias approximation from the
uncorrected estimator at each exercise opportunity. Numerical results show that the bias-corrected estimator outperforms its
uncorrected counterpart across all combinations of number of exercise opportunities, option moneyness and sample size. Finally,
the results suggest significant computational efficiency increases can be realized through trivial parallel implementations using the
corrected estimator.
Available via Athens: Taylor & Francis Online
http://www.openathens.net
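
For context, the sketch below implements the plain, uncorrected regression-based estimator that the paper improves upon: a basic Longstaff-Schwartz least-squares Monte Carlo valuation of a Bermudan put on a single asset with a quadratic regression basis. The bias-correction term derived in the article is not reproduced, and all parameters are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(6)

# Hypothetical Bermudan put parameters
s0, strike, r, sigma, T = 100.0, 100.0, 0.05, 0.2, 1.0
n_paths, n_steps = 100_000, 50
dt = T / n_steps
disc = np.exp(-r * dt)

# Simulate geometric Brownian motion paths at the exercise dates dt, 2*dt, ..., T
z = rng.standard_normal((n_paths, n_steps))
s = s0 * np.exp(np.cumsum((r - 0.5 * sigma ** 2) * dt + sigma * np.sqrt(dt) * z, axis=1))

cash_flow = np.maximum(strike - s[:, -1], 0.0)        # payoff if held to expiry

# Backward induction with least-squares regression on in-the-money paths
for t in range(n_steps - 2, -1, -1):
    cash_flow *= disc                                  # discount continuation values one step back
    itm = strike > s[:, t]
    if itm.sum() < 10:
        continue
    x = s[itm, t]
    basis = np.column_stack([np.ones_like(x), x, x ** 2])
    coeff, *_ = np.linalg.lstsq(basis, cash_flow[itm], rcond=None)
    continuation = basis @ coeff                       # estimated continuation value
    exercise = np.maximum(strike - x, 0.0)
    do_exercise = exercise > continuation
    idx = np.where(itm)[0][do_exercise]
    cash_flow[idx] = exercise[do_exercise]             # exercise replaces the future cash flow

price = disc * cash_flow.mean()                        # discount from the first exercise date to time 0
print(f"least-squares Monte Carlo price of the Bermudan put: {price:.3f}")
```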

Corrections to the prices of derivatives due to market incompleteness. German, David [RKN: 45258]
Applied Mathematical Finance (2011) 18 (1-2) : 155-187.
We compute the first-order corrections to marginal utility-based prices with respect to a 'small' number of random endowments in
the framework of three incomplete financial models. They are a stochastic volatility model, a basis risk and market portfolio model
and a credit-risk model with jumps and stochastic recovery rate.

Available via Athens: Taylor & Francis Online
http://www.openathens.net

Effects of risk management on cost efficiency and cost function of the U.S. Property and liability insurers. Lin, Hong-Jen; Wen,
Min-Ming; Yang, Charles C Society of Actuaries, - 12 pages. [RKN: 74918]
Shelved at: Per: NAAJ (Oxf) Per NAAJ (Lon) Shelved at: JOU
North American Actuarial Journal (2011) 15 (4) : 487-498.
This paper adopts the one-step stochastic frontier approach to investigate the impact of risk management tools of derivatives and
reinsurance on cost efficiency of U.S. property-liability insurance companies. The stochastic frontier approach considers both the
mean and variance of cost efficiency. The sample includes both stock and mutual insurers. Among the findings, the cost function
of the entire sample carries the concavity feature, and insurers tend to use financial derivatives for firm value creation. The results
also show that for the entire sample the use of derivatives enhances the mean of cost efficiency but is accompanied by larger
efficiency volatility. Nevertheless, the utilization of financial derivatives mitigates efficiency volatility for mutual insurers. This
research provides important insights for the practice of risk management in the property-liability insurance industry.
http://www.soa.org/news-and-publications/publications/journals/naaj/naaj-detail.aspx

The implied market price of weather risk. Härdle, Wolfgang Karl; Cabrera, Brenda Lopez Routledge, [RKN: 45795]
Shelved at: Per. AMF
Applied Mathematical Finance (2012) 19 (1-2) : 59-95.
Weather derivatives (WD) are end-products of a process known as securitization that transforms non-tradable risk factors
(weather) into tradable financial assets. For pricing and hedging non-tradable assets, one essentially needs to incorporate the
market price of risk (MPR), which is an important parameter of the associated equivalent martingale measure (EMM). The majority
of papers so far has priced non-tradable assets assuming zero or constant MPR, but this assumption yields biased prices and has
never been quantified earlier under the EMM framework. Given that liquid-derivative contracts based on daily temperature are
traded on the Chicago Mercantile Exchange (CME), we infer the MPR from traded futures-type contracts (CAT, CDD, HDD and
AAT). The results show how the MPR significantly differs from 0, how it varies in time and changes in sign. It can be
parameterized, given its dependencies on time and temperature seasonal variation. We establish connections between the
market risk premium (RP) and the MPR.
Available via Athens: Taylor & Francis Online
http://www.openathens.net


Longevity risk management for life and variable annuities: the effectiveness of static hedging using longevity bonds and
derivatives. Ngai, Andrew; Sherris, Michael [RKN: 44980]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (1) : 100-114.
For many years, the longevity risk of individuals has been underestimated, as survival probabilities have improved across the
developed world. The uncertainty and volatility of future longevity has posed significant risk issues for both individuals and product
providers of annuities and pensions. This paper investigates the effectiveness of static hedging strategies for longevity risk
management using longevity bonds and derivatives (q-forwards) for the retail products: life annuity, deferred life annuity, indexed
life annuity, and variable annuity with guaranteed lifetime benefits. Improved market and mortality models are developed for the
underlying risks in annuities. The market model is a regime-switching vector error correction model for GDP, inflation, interest
rates, and share prices. The mortality model is a discrete-time logit model for mortality rates with age dependence. Models were
estimated using Australian data. The basis risk between annuitant portfolios and population mortality was based on UK
experience. Results show that static hedging using q-forwards or longevity bonds reduces the longevity risk substantially for life
annuities, but significantly less for deferred annuities. For inflation-indexed annuities, static hedging of longevity is less effective
because of the inflation risk. Variable annuities provide limited longevity protection compared to life annuities and indexed
annuities, and as a result longevity risk hedging adds little value for these products.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

On modelling and pricing rainfall derivatives with seasonality. Leobacher, Gunther; Ngare, Philip [RKN: 45254]
Applied Mathematical Finance (2011) 18 (1-2) : 71-91.
We are interested in pricing rainfall options written on precipitation at specific locations. We assume the existence of a tradeable
financial instrument in the market whose price process is affected by the quantity of rainfall. We then construct a suitable
'Markovian gamma' model for the rainfall process which accounts for the seasonal change of precipitation and show how
maximum likelihood estimators can be obtained for its parameters.
Available via Athens: Taylor & Francis Online
http://www.openathens.net

A recursive approach to mortality-linked derivative pricing. Shang, Zhaoning; Goovaerts, Marc; Dhaene, Jan [RKN: 44966]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (2) : 240-248.
In this paper, we develop a recursive method to derive an exact numerical and nearly analytical representation of the Laplace
transform of the transition density function with respect to the time variable for time-homogeneous diffusion processes. We further
apply this recursion algorithm to the pricing of mortality-linked derivatives. Given an arbitrary stochastic future lifetime T, the
probability distribution function of the present value of a cash flow depending on T can be approximated by a mixture of
exponentials, based on Jacobi polynomial expansions. In case of mortality-linked derivative pricing, the required Laplace inversion
can be avoided by introducing this mixture of exponentials as an approximation of the distribution of the survival time T in the
recursion scheme. This approximation significantly improves the efficiency of the algorithm.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
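For orientation, the mixture-of-exponentials approximation referred to above can be written in a generic textbook form (an illustrative
sketch with placeholder weights a_i and rates lambda_i, not notation taken from the paper):
\[ \Pr(T > t) \;\approx\; \sum_{i=1}^{n} a_i\, e^{-\lambda_i t}, \qquad \sum_{i=1}^{n} a_i = 1, \;\; \lambda_i > 0. \]
Because the Laplace transform of such a mixture is a rational function of the transform variable, using it in the recursion scheme is
what allows the required Laplace inversion to be avoided, as the abstract notes.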

Valuation of two-factor interest rate contingent claims using Green's theorem. Sorwar, Ghulam; Barone-Adesi, Giovanni [RKN:
45460]
Applied Mathematical Finance (2011) 18 (3-4) : 277-289.
Over the years a number of two-factor interest rate models have been proposed that have formed the basis for the valuation of
interest rate contingent claims. This valuation equation often takes the form of a partial differential equation that is solved using the
finite difference approach. In the case of two-factor models this has resulted in solving two second-order partial derivatives leading
to boundary errors, as well as numerous first-order derivatives. In this article we demonstrate that using Green's theorem,
second-order derivatives can be reduced to first-order derivatives that can be easily discretized; consequently, two-factor partial
differential equations are easier to discretize than one-factor partial differential equations. We illustrate our approach by applying
it to value contingent claims based on the two-factor CIR model. We provide numerical examples that illustrate that our approach
shows excellent agreement with analytical prices and the popular Crank-Nicolson method.
Available via Athens: Taylor & Francis Online
http://www.openathens.net

DEVIATION

Risk processes with shot noise Cox claim number process and reserve dependent premium rate. Macci, Claudio; Torrisi, Giovanni
Luca [RKN: 39936]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (1) : 134-145.
We consider a suitable scaling, called the slow Markov walk limit, for a risk process with shot noise Cox claim number process and
reserve dependent premium rate. We provide large deviation estimates for the ruin probability. Furthermore, we find an
asymptotically efficient law for the simulation of the ruin probability using importance sampling. Finally, we present asymptotic
bounds for ruin probabilities in the Bayesian setting.
Available via Athens: Palgrave MacMillan
http://www.openathens.net


DIFFERENTIAL EQUATIONS

Arithmetic Asian options under stochastic delay models. McWilliams, Nairn; Sabanis, Sotirios Routledge, [RKN: 45522]
Applied Mathematical Finance (2011) 18 (5-6) : 423-446.
Motivated by the increasing interest in past-dependent asset pricing models, shown in recent years by market practitioners and
prominent authors such as Hobson and Rogers (1998, Complete models with stochastic volatility, Mathematical Finance, 8(1):
27-48), we explore option pricing techniques for arithmetic Asian options under a stochastic delay differential equation approach.
We obtain explicit closed-form expressions for a number of lower and upper bounds and compare their accuracy numerically.
Available via Athens: Taylor & Francis Online
http://www.openathens.net
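For reference, a fixed-strike arithmetic Asian call has the standard payoff (textbook form, not quoted from the paper):
\[ \Bigl( \tfrac{1}{T}\int_0^T S_t \, dt - K \Bigr)^{+}, \]
with S the underlying price, K the strike and T the maturity; the bounds above concern the price of such average-based payoffs when
S is driven by a stochastic delay differential equation.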

Small-time asymptotics for an uncorrelated local-stochastic volatility model. Forde, Martin; Jacquier, Antoine Routledge, [RKN:
45526]
Applied Mathematical Finance (2011) 18 (5-6) : 517-535.
We add some rigour to the work of Henry-Labordère (Henry-Labordère, P. 2009. "Analysis, Geometry, and Modeling in Finance:
Advanced Methods in Option Pricing", New York, London: Chapman & Hall) on the small-time behaviour of a local-stochastic
volatility model with zero correlation at leading order. We do this using the Freidlin-Wentzell (FW) theory of large deviations for
stochastic differential equations (SDEs), and then converting to a differential geometry problem of computing the shortest
geodesic from a point to a vertical line on a Riemannian manifold, whose metric is induced by the inverse of the diffusion
coefficient. The solution to this variable endpoint problem is obtained using a transversality condition, where the geodesic is
perpendicular to the vertical line under the aforementioned metric. We then establish the corresponding small-time asymptotic
behaviour for call options using Hölder's inequality, and for the implied volatility (using a general result in Roper and Rutkowski
(Roper, M. and Rutkowski, M. forthcoming. "A note on the behaviour of the Black-Scholes implied volatility close to expiry",
International Journal of Theoretical and Applied Finance)).
We also derive a series expansion for the implied volatility in the small-maturity limit, in powers of the log-moneyness, and we
show how to calibrate such a model to the observed implied volatility smile in the small-maturity limit.
Available via Athens: Taylor & Francis Online
http://www.openathens.net

DIFFUSION PROCESSES

Good-deal bounds in a regime-switching diffusion market. Donnelly, Catherine Routledge, [RKN: 45525]
Applied Mathematical Finance (2011) 18 (5-6) : 491-515.
We consider option pricing in a regime-switching diffusion market. As the market is incomplete, there is no unique price for a
derivative. We apply the good-deal pricing bounds idea to obtain ranges for the price of a derivative. As an illustration, we calculate
the good-deal pricing bounds for a European call option and we also examine the stability of these bounds when we change the
generator of the Markov chain which drives the regime-switching. We find that the pricing bounds depend strongly on the choice of
the generator.

Available via Athens: Taylor & Francis Online
http://www.openathens.net

On absolute ruin minimization under a diffusion approximation model. Luo, Shangzhen; Taksar, Michael [RKN: 39935]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (1) : 123-133.
In this paper, we assume that the surplus process of an insurance entity is represented by a pure diffusion. The company can
invest its surplus into a Black-Scholes risky asset and a risk-free asset. We impose investment restrictions that only a limited
amount is allowed in the risky asset and that no short-selling is allowed. We further assume that when the surplus level becomes
negative, the company can borrow to continue financing. The ultimate objective is to seek an optimal investment strategy that
minimizes the probability of absolute ruin, i.e. the probability that the liminf of the surplus process is negative infinity. The
corresponding Hamilton-Jacobi-Bellman (HJB) equation is analyzed and a verification theorem is proved; applying the HJB
method we obtain explicit expressions for the S-shaped minimal absolute ruin function and its associated optimal investment
strategy. In the second part of the paper, we study the optimization problem with both investment and proportional reinsurance
control. There the minimal absolute ruin function and the feedback optimal investment-reinsurance control are found explicitly as
well.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

On the approximation of the SABR model : A probabilistic approach. Kennedy, Joanne E; Mitra, Subhankar; Pham, Duy [RKN:
45883]
Shelved at: Per. AMF
Applied Mathematical Finance (2012) 19 (5-6) : 553-586.
In this article, we derive a probabilistic approximation for three different versions of the SABR model: Normal, Log-Normal and a
displaced diffusion version for the general case. Specifically, we focus on capturing the terminal distribution of the underlying
process (conditional on the terminal volatility) to arrive at the implied volatilities of the corresponding European options for all
strikes and maturities. Our resulting method allows us to work with a variety of parameters that cover long-dated options and
highly stressed market conditions. This is a different feature from other current approaches that rely on the assumption of very
small total volatility and usually fail for maturities longer than 10 years or for large volatility of volatility (Volvol).
Available via Athens: Taylor & Francis Online
http://www.openathens.net


Optimal dividends and capital injections in the dual model with diffusion. Avanzi, Benjamin; Shen, Jonathan; Wong, Bernard - 34
pages. [RKN: 74748]
Shelved at: Per: Astin Bull (Oxf) Shelved at: JOU
ASTIN Bulletin (2011) 41 (2) : 611-644.
The dual model with diffusion is appropriate for companies with continuous expenses that are offset by stochastic and irregular
gains. Examples include research-based or commission-based companies. In this context, Avanzi and Gerber (2008) showed
how to determine the expected present value of dividends, if a barrier strategy is followed. In this paper, we further include capital
injections and allow for (proportional) transaction costs both on dividends and capital injections. We determine the optimal
dividend and (unconstrained) capital injection strategy (among all possible strategies) when jumps are hyperexponential. This
strategy happens to be either a dividend barrier strategy without capital injections, or another dividend barrier strategy with forced
injections when the surplus is null to prevent ruin. The latter is also shown to be the optimal dividend and capital injection strategy,
if ruin is not allowed to occur. Both the choice to inject capital or not and the level of the optimal barrier depend on the parameters
of the model. In all cases, we determine the optimal dividend barrier and show its existence and uniqueness. We also provide
closed form representations of the value functions when the optimal strategy is applied. Results are illustrated.
http://www.actuaries.org/index.cfm?lang=EN&DSP=PUBLICATIONS&ACT=ASTIN BULLETIN

DISABLEMENT

Managing longevity and disability risks in life annuities with long term care. Levantesi, Susanna; Menzietti, Massimiliano [RKN:
45642]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 50 (3) : 391-401.
The aim of the paper is twofold. Firstly, it develops a model for risk assessment in a portfolio of life annuities with long term care
benefits. These products are usually represented by a Markovian Multi-State model and are affected by both longevity and
disability risks. Here, a stochastic projection model is proposed in order to represent the future evolution of mortality and disability
transition intensities. Data from the Italian National Institute of Social Security (INPS) and from Human Mortality Database (HMD)
are used to estimate the model parameters. Secondly, it investigates the solvency in a portfolio of enhanced pensions. To this aim
a risk model based on the portfolio risk reserve is proposed and different rules to calculate solvency capital requirements for life
underwriting risk are examined. Such rules are then compared with the standard formula proposed by the Solvency II project.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

DISCOUNT RATE

Avoiding the curves: Direct elicitation of time preferences. Laury, Susan K; McInnes, Melayne Morgan; Swarthout, J Todd Springer,
- 36 pages. [RKN: 73971]
Shelved at: Per: J Risk Uncrtnty
Journal of Risk and Uncertainty (2012) 44 (3) : 181-217.
We propose and test a new method for eliciting curvature-controlled discount rates that are invariant to the form of the utility
function. Our method uses a single elicitation task and obtains individual discount rates without knowledge of risk attitude or
parametric assumptions about the form of the utility function. We compare our method to a double elicitation technique in which
the utility function and discount rate are jointly estimated. Our experiment shows that these methods yield consistent estimates of
the discount rate, which is reassuring given the wide range of estimates in the literature. We find little evidence of probability
weighting, but in a second experiment, we observe that discount rates are sensitive to the length of the front-end delay, suggesting
present bias. When the front-end delay is at least two weeks, we estimate average discount rates to be 11.3 and 12.2% in the two
experiments.

DISCOUNTING

Joint moments of discounted compound renewal sums. Léveillé, Ghislain; Adékambi, Franck [RKN: 44930]
Shelved at: Per: SAJ Shelved at: SCA/ACT
Scandinavian Actuarial Journal (2012) 1 : 40-55.
The first two moments and the covariance of the aggregate discounted claims have been found for a stochastic interest rate, from
which the inflation rate has been subtracted, and for a claims number process that is an ordinary or a delayed renewal
process. Hereafter we extend the preceding results by presenting recursive formulas for the joint moments of this risk process, for
a constant interest rate, and non-recursive formulas for higher joint moments when the interest rate is stochastic. Examples are
given for exponential claims inter-arrival times and for the Ho-Lee-Merton interest rate model.
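For readers new to the object studied, the aggregate discounted claims process usually takes the standard form (illustrative
notation, with a constant force of interest delta):
\[ Z(t) = \sum_{i=1}^{N(t)} e^{-\delta T_i} X_i, \]
where N(t) is the ordinary or delayed renewal claim-count process, T_i the claim arrival times and X_i the claim amounts; the
paper's recursive and non-recursive formulas concern the joint moments of such sums.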

On the moments of aggregate discounted claims with dependence introduced by a FGM copula. Bargès, Mathieu; Cossette,
Hélène; Loisel, Stéphane; Marceau, Étienne [RKN: 45306]
Shelved at: Per: Astin Bull (Oxf) Shelved at: JOU/AST
ASTIN Bulletin (2011) 41 (1) : 215-238.
In this paper, we investigate the computation of the moments of the compound Poisson sums with discounted claims when
introducing dependence between the interclaim time and the subsequent claim size. The dependence structure between the two
random variables is defined by a Farlie-Gumbel-Morgenstern copula. Assuming that the claim distribution has finite moments, we
give expressions for the first and the second moments and then we obtain a general formula for any mth order moment. The
results are illustrated with applications to premium calculation and approximations based on moment matching methods.
online access via International Actuarial Association:
http://www.actuaries.org/index.cfm?lang=EN&DSP=PUBLICATIONS&ACT=ASTIN BULLETIN
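The Farlie-Gumbel-Morgenstern copula used above to couple the interclaim time and the subsequent claim size has the well-known
form (standard textbook expression, with dependence parameter theta in [-1, 1]):
\[ C_\theta(u, v) = u v \bigl[ 1 + \theta (1-u)(1-v) \bigr], \qquad u, v \in [0, 1]; \]
its simple polynomial structure is one reason explicit moment formulas remain tractable.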

Valuing equity-linked death benefits and other contingent options : A discounted density approach. Gerber, Hans U; Shiu, Elias S
W; Yang, Hailiang [RKN: 45725]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 73-92.
Motivated by the Guaranteed Minimum Death Benefits in various deferred annuities, we investigate the calculation of the expected
discounted value of a payment at the time of death. The payment depends on the price of a stock at that time and possibly also on
the history of the stock price. If the payment turns out to be the payoff of an option, we call the contract for the payment a (life)
contingent option. Because each time-until-death distribution can be approximated by a combination of exponential distributions,
the analysis is made for the case where the time until death is exponentially distributed, i.e., under the assumption of a constant
force of mortality. The time-until-death random variable is assumed to be independent of the stock price process which is a
geometric Brownian motion. Our key tool is a discounted joint density function. A substantial series of closed-form formulas is
obtained, for the contingent call and put options, for lookback options, for barrier options, for dynamic fund protection, and for
dynamic withdrawal benefits. In a section on several stocks, the method of Esscher transforms proves to be useful for finding
among others an explicit result for valuing contingent Margrabe options or exchange options. For the case where the contracts
have a finite expiry date, closed-form formulas are found for the contingent call and put options. From these, results for De
Moivre's law are obtained as limits. We also discuss equity-linked death benefit reserves and investment strategies for maintaining
such reserves. The elasticity of the reserve with respect to the stock price plays an important role. Whereas in the most important
applications the stopping time is the time of death, it could be different in other applications, for example, the time of the next
catastrophe.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
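The discounted density idea can be illustrated with the following generic valuation identity (a sketch assuming, as in the abstract, a
constant force of mortality mu, a force of interest delta, and independence of the time of death T and the stock price S; it is not a
formula quoted from the paper):
\[ \mathbb{E}\bigl[ e^{-\delta T} b(S_T) \bigr] = \int_0^\infty \mu\, e^{-(\mu+\delta)t}\, \mathbb{E}\bigl[ b(S_t) \bigr]\, dt, \]
for a payoff b depending only on the stock price at death; path-dependent benefits such as lookbacks require the joint discounted
density treated in the paper.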

DISTRIBUTION THEORY

Convolutions of multivariate phase-type distributions. Berdel, Jasmin; Hipp, Christian [RKN: 45131]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (3) : 374-377.
This paper is concerned with multivariate phase-type distributions introduced by Assaf et al. (1984). We show that the sum of two
independent bivariate vectors each with a bivariate phase-type distribution is again bivariate phase-type and that this is no longer
true for higher dimensions. Further, we show that the distribution of the sum over different components of a vector with multivariate
phase-type distribution is not necessarily multivariate phase-type either, if the dimension of the components is two or larger.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Folded and log-folded-t distributions as models for insurance loss data. Brazauskas, Vytaras; Kleefeld, Andreas [RKN: 45149]
Shelved at: Per: SAJ Shelved at: SCA/ACT
Scandinavian Actuarial Journal (2011) 1 : 59-74.
A rich variety of probability distributions has been proposed in the actuarial literature for fitting of insurance loss data. Examples
include: lognormal, log-t, various versions of Pareto, loglogistic, Weibull, gamma and its variants, and generalized beta of the
second kind distributions, among others. In this paper, we supplement the literature by adding the log-folded-normal and
log-folded-t families. Shapes of the density function and key distributional properties of the 'folded' distributions are presented
along with three methods for the estimation of parameters: method of maximum likelihood; method of moments; and method of
trimmed moments. Further, large and small-sample properties of these estimators are studied in detail. Finally, we fit the newly
proposed distributions to data which represent the total damage done by 827 fires in Norway for the year 1988. The fitted models
are then employed in a few quantitative risk management examples, where point and interval estimates for several value-at-risk
measures are calculated.
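For reference, folding a distribution means taking the absolute value of the underlying random variable: if X has density f on the real
line, the folded density is the standard
\[ f_{|X|}(x) = f(x) + f(-x), \qquad x \ge 0, \]
and the log-folded families above combine this folding with a logarithmic transformation of the loss variable.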

A new discrete distribution with actuarial applications. Gómez-Déniz, Emilio; Sarabia, José María; Calderín-Ojeda, Enrique [RKN:
45135]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (3) : 406-412.
A new discrete distribution depending on two parameters, one satisfying a < 1 with a ≠ 0 and the other lying strictly between 0
and 1, is introduced in this paper. The new distribution is unimodal with a zero vertex, and both overdispersion (mean larger than
the variance) and underdispersion (mean lower than the variance) are encountered depending on the values of its parameters.
Besides, an equation for the probability density function of the compound version, when the claim severities are discrete, is
derived. The particular case obtained when a tends to zero reduces to the geometric distribution. Thus, the geometric distribution
can be considered as a limiting case of the new distribution.
After reviewing some of its properties, we investigated the problem of parameter estimation. Expected frequencies were
calculated for numerous examples, including short and long tailed count data, providing a very satisfactory fit.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

A new proof of Cheung's characterization of comonotonicity. Mao, Tiantian; Hu, Taizhong [RKN: 40015]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (2) : 214-216.
It is well known that if a random vector with given marginal distributions is comonotonic, it has the largest sum in the sense of the
convex order. Cheung (2008) proved that the converse of this assertion is also true, provided that all marginal distribution
functions are continuous and that the underlying probability space is atomless. This continuity assumption on the marginals was
removed by Cheung (2010). In this short note, we give a new and simple proof of Cheung's result without the assumption that the
underlying probability space is atomless.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

A quantitative comparison of the Lee-Carter Model under different types of non-Gaussian innovations. Wang, Chou-Wen; Huang,
Hong-Chih; Liu, I-Chien Palgrave Macmillan, [RKN: 44909]
Shelved at: Per: Geneva (Oxf)
Geneva Papers on Risk and Insurance (2011) 36(4) : 675-696.
In the classical Lee-Carter model, the mortality indices, which are assumed to follow a random walk with drift, are normally
distributed. However, for long-term mortality data, the error terms of the Lee-Carter model and the mortality indices have tails
thicker than those of a normal distribution and appear to be skewed. This study therefore adopts five non-Gaussian distributions
to model the error terms of the Lee-Carter model: Student's t-distribution and its skew extension (i.e., the generalised hyperbolic
skew Student's t-distribution); one finite-activity Lévy model (the jump diffusion distribution); and two infinite-activity or pure jump
models (the variance gamma and the normal inverse Gaussian). With mortality data from six countries over the period 1900-2007,
both in-sample model selection criteria (e.g., Bayesian information criterion, Kolmogorov-Smirnov test, Anderson-Darling test,
Cramér-von Mises test) and out-of-sample projection errors indicate a preference for modelling the Lee-Carter model with
non-Gaussian innovations.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
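For context, the classical Lee-Carter model referred to above is usually written as (standard notation, not reproduced from the paper)
\[ \ln m_{x,t} = a_x + b_x k_t + \varepsilon_{x,t}, \qquad k_t = k_{t-1} + d + e_t, \]
where m_{x,t} is the central death rate at age x in year t and k_t is the mortality index following a random walk with drift d; the study
above replaces the Gaussian assumption on the innovations e_t (and error terms epsilon_{x,t}) with the heavier-tailed alternatives
listed.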

DIVIDENDS

Classical and singular stochastic control for the optimal dividend policy when there is regime switching. Sotomayor, Luz R;
Cadenillas, Abel [RKN: 45128]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (3) : 344-354.
Motivated by economic and empirical arguments, we consider a company whose cash surplus is affected by macroeconomic
conditions. Specifically, we model the cash surplus as a Brownian motion with drift and volatility modulated by an observable
continuous-time Markov chain that represents the regime of the economy. The objective of the management is to select the
dividend policy that maximizes the expected total discounted dividend payments to be received by the shareholders. We study two
different cases: bounded dividend rates and unbounded dividend rates. These cases generate, respectively, problems of classical
stochastic control with regime switching and singular stochastic control with regime switching. We solve these problems, and
obtain the first analytical solutions for the optimal dividend policy in the presence of business cycles. We prove that the optimal
dividend policy depends strongly on macroeconomic conditions.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Dividends and reinsurance under a penalty for ruin. Liang, Zhibin; Young, Virginia R [RKN: 45647]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 50 (3) : 437-445.
We find the optimal dividend strategy in a diffusion risk model under a penalty for ruin, as in Thonhauser and Albrecher (2007),
although we allow for both a positive and a negative penalty. Furthermore, we determine the optimal proportional reinsurance
strategy, when so-called expensive reinsurance is available; that is, the premium loading on reinsurance is greater than the
loading on the directly written insurance. One can think of our model as taking the one in Taksar (2000, Section 6) and adding a
penalty for ruin. We use the Legendre transform to obtain the optimal dividend and reinsurance strategies. Not surprisingly, the
optimal dividend strategy is a barrier strategy. Also, we investigate the effect of the penalty P on the optimal strategies. In
particular, we show that the optimal barrier increases with respect to P, while the optimal proportion retained and the value
function decrease with respect to P. In the end, we explore the time of ruin, and find that the expected time of ruin increases with
respect to P under a net profit condition.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Lévy risk model with two-sided jumps and a barrier dividend strategy. Bo, Lijun; Song, Renming; Tang, Dan; Wang, Tongjin; Yang,
Xuewei [RKN: 45601]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 50 (2) : 280-291.
In this paper, we consider a general Lévy risk model with two-sided jumps and a constant dividend barrier. We connect the ruin
problem of the ex-dividend risk process with the first passage problem of the Lévy process reflected at its running maximum. We
prove that if the positive jumps of the risk model form a compound Poisson process and the remaining part is a spectrally negative
Lévy process with unbounded variation, the Laplace transform (as a function of the initial surplus) of the upward entrance time of
the reflected (at the running infimum) Lévy process exhibits the smooth pasting property at the reflecting barrier. When the surplus
process is described by a double exponential jump diffusion in the absence of dividend payment, we derive some explicit
expressions for the Laplace transform of the ruin time, the distribution of the deficit at ruin, and the total expected discounted
dividends. Numerical experiments concerning the optimal barrier strategy are performed and new empirical findings are
presented.
Available via Athens: Palgrave MacMillan
http://www.openathens.net





New analytic approach to address put-call parity violation due to discrete dividends. Buryak, Alexander; Guo, Ivan Routledge,
[RKN: 45794]
Shelved at: Per. AMF
Applied Mathematical Finance (2012) 19 (1-2) : 37-58.
The issue of developing simple Black-Scholes (BS)-type approximations for pricing European options with large discrete
dividends has been popular since the early 2000s, with a few different approaches reported during the last 10 years. Moreover, it
has been claimed that at least some of the resulting expressions represent high-quality approximations which closely match the
results obtained by the use of numerics. In this article we review, on the one hand, these previously suggested BS-type
approximations and, on the other hand, different versions of the corresponding Crank-Nicolson (CN) numerical schemes with a
primary focus on their boundary condition variations. Unexpectedly, we often observe substantial deviations between the analytical
and numerical results which may be especially pronounced for European puts. Moreover, our analysis demonstrates that any
BS-type approximation which adjusts put parameters identically to call parameters has an inherent problem of failing to detect a
little known put-call parity violation phenomenon. To address this issue, we derive a new analytic pricing approximation which is in
better agreement with the corresponding numerical results in comparison with any of the previously known analytic approaches
for European calls and puts with large discrete dividends.
Available via Athens: Taylor & Francis Online
http://www.openathens.net

On the threshold dividend strategy for a generalized jump-diffusion risk model. Chi, Yichun; Lin, X. Sheldon [RKN: 45126]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (3) : 326-337.
In this paper, we generalize the Cramér-Lundberg risk model perturbed by diffusion to incorporate jumps due to surplus
fluctuation and to relax the positive loading condition. Assuming that the surplus process has exponential upward and arbitrary
downward jumps, we analyze the expected discounted penalty (EDP) function of Gerber and Shiu (1998) under the threshold
dividend strategy. An integral equation for the EDP function is derived using the Wiener-Hopf factorization. As a result, an explicit
analytical expression is obtained for the EDP function by solving the integral equation. Finally, phase-type downward jumps are
considered and a matrix representation of the EDP function is presented.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Optimal dividend policies for compound Poisson processes : The case of bounded dividend rates. Azcue, Pablo; Muler, Nora
[RKN: 45721]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 26-42.
We consider in this paper the optimal dividend problem for an insurance company whose uncontrolled reserve process evolves as
a classical Cramér-Lundberg model with arbitrary claim-size distribution. Our objective is to find the dividend payment policy
which maximizes the cumulative expected discounted dividend pay-outs until the time of bankruptcy imposing a ceiling on the
dividend rates. We characterize the optimal value function as the unique bounded viscosity solution of the associated
Hamilton-Jacobi-Bellman equation. We prove that there exists an optimal dividend strategy and that this strategy is stationary
with a band structure. We study the regularity of the optimal value function. We find a characterization result to check optimality
even in the case where the optimal value function is not differentiable. We construct examples where the claim-size distribution is
smooth but the optimal dividend policy is not threshold and the optimal value function is not differentiable. We study the survival
probability of the company under the optimal dividend policy. We also present examples where the optimal dividend policy has
infinitely many bands even in the case that the claim-size distribution has a bounded density.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Stochastic expansion for the pricing of call options with discrete dividends. Étoré, Pierre; Gobet, Emmanuel Routledge, [RKN:
45839]
Shelved at: Per. AMF
Applied Mathematical Finance (2012) 19 (3-4) : 233-264.
In the context of an asset paying affine-type discrete dividends, we present closed analytical approximations for the pricing of
European vanilla options in the Black-Scholes model with time-dependent parameters. They are obtained using a stochastic
Taylor expansion around a shifted lognormal proxy model. The final formulae are, respectively, first-, second- and third-order
approximations w.r.t. the fixed part of the dividends. Using Cameron-Martin transformations, we provide explicit representations
of the correction terms as Greeks in the Black-Scholes model. The use of Malliavin calculus enables us to provide tight error
estimates for our approximations. Numerical experiments show that this approach yields very accurate results, in particular
compared with known approximations of Bos, Gairat and Shepeleva (2003, Dealing with discrete dividends, Risk Magazine, 16:
109-112) and Veiga and Wystup (2009, Closed formula for options with discrete dividends and its derivatives, Applied
Mathematical Finance, 16(6): 517-531), and it is quicker than the iterated integration procedure of Haug, Haug and Lewis (2003,
Back to basics: a new approach to the discrete dividend problem, Wilmott Magazine, pp. 37-47) or the binomial tree method of
Vellekoop and Nieuwenhuis (2006, Efficient pricing of derivatives on assets with discrete dividends, Applied Mathematical
Finance, 13(3), pp. 265-284).
Available via Athens: Taylor & Francis Online
http://www.openathens.net






DOWNSIDE RISK AVERSION

Decreasing absolute risk aversion, prudence and increased downside risk aversion. Meyer, Jack; Liu, Liqun Springer, - 18 pages.
[RKN: 73973]
Shelved at: Per: J Risk Uncrtnty
Journal of Risk and Uncertainty (2012) 44 (3) : 243-260.
Downside risk increases have previously been characterized as changes preferred by all decision makers u(x) with u'''(x) > 0. For
risk averse decision makers, u'''(x) > 0 also defines prudence. This paper finds that downside risk increases can also be
characterized as changes preferred by all decision makers displaying decreasing absolute risk aversion (DARA) since those
changes involve random variables that have equal means. Building on these findings, the paper proposes using "more
decreasingly absolute risk averse" or "more prudent" as alternative definitions of increased downside risk aversion. These
alternative definitions generate a transitive ordering, while the existing definition based on a transformation function with a positive
third derivative does not. Other properties of the new definitions of increased downside risk aversion are also presented.
http://www.openathens.net
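As a reminder of the standard definitions behind the abstract (textbook notation, with u the utility function):
\[ A(x) = -\frac{u''(x)}{u'(x)}, \qquad \text{DARA: } A'(x) < 0, \qquad \text{prudence (for a risk averter): } u'''(x) > 0. \]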

DYNAMIC PROGRAMMING

Dynamic portfolio optimization in discrete-time with transaction costs. Atkinson, Colin; Quek, Gary Routledge, [RKN: 45840]
Shelved at: Per. AMF
Applied Mathematical Finance (2012) 19 (3-4) : 265-298.
A discrete-time model of portfolio optimization is studied under the effects of proportional transaction costs. A general class of
underlying probability distributions is assumed for the returns of the asset prices. An investor with an exponential utility function
seeks to maximize the utility of terminal wealth by determining the optimal investment strategy at the start of each time step.
Dynamic programming is used to derive an algorithm for computing the optimal value function and optimal boundaries of the
no-transaction region at each time step. In the limit of small transaction costs, perturbation analysis is applied to obtain the optimal
value function and optimal boundaries at any time step in the rebalancing of the portfolio.
Available via Athens: Taylor & Francis Online
http://www.openathens.net
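The dynamic-programming structure described above can be sketched generically as a backward recursion (illustrative notation
only, not the authors' exact formulation):
\[ V_t(w, x) = \max_{a} \mathbb{E}\bigl[ V_{t+1}(w', x') \mid w, x, a \bigr], \qquad V_T(w, x) = -e^{-\gamma w}, \]
where w is wealth, x the holding in the risky asset, a the rebalancing decision net of proportional transaction costs, gamma the
exponential risk-aversion parameter and (w', x') the post-return state; the no-transaction region at each step is the set of states
where a = 0 is optimal.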

ECONOMIC CAPITAL

Measurement and modelling of dependencies in economic capital. Shaw, R A; Smith, A D; Spivak, G S - 99 pages. [RKN: 73863]
Shelved at: Per: BAJ (Oxf) Per: BAJ (Lon) Shelved at: REF
BAJ (2011) 16 (3) : 601-699.
This paper covers a number of different topics related to the measurement and modelling of dependency within economic capital
models. The scope of the paper is relatively wide. We address in some detail the different approaches to modelling dependencies
ranging from the more common variance-covariance matrix approach, to the consideration of the use of copulas and the more
sophisticated causal models that feature feedback loops and other systems design ideas.
There are many data and model uncertainties in modelling dependency and so we have also endeavoured to cover topics such as
spurious relationships and wrong-way risk to highlight some of the uncertainties involved.
With the advent of the internal model approval process under Solvency II, senior management needs to have a greater
understanding of dependency methodology. We have devoted a section of this paper to a discussion of possible different ways to
communicate the results of modelling to the board, senior management and other interested parties within an insurance company.
We have endeavoured throughout this paper to include as many numerical examples as possible to help in the understanding of
the key points, including our discussion of model parameterisation and the communication to an insurance executive of the impact
of dependency on economic capital modelling results.
The economic capital model can be seen as a combination of two key components: the marginal risk distribution of each risk and
the aggregation methodology which combines these into a single aggregate distribution or capital number. This paper is
concerned with the aggregation part, the methods and assumptions employed and the issues arising, and not the determination of
the marginal risk distributions which is equally of importance and in many cases equally as complex.
http://www.actuaries.org.uk/research-and-resources/pages/members-access-journals
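A minimal sketch of the variance-covariance aggregation approach mentioned above is given below (illustrative only: the function
name, correlation entries and capital figures are invented for the example, and a real internal model involves far more structure).

import numpy as np

def aggregate_capital(standalone, corr):
    # Variance-covariance aggregation: sqrt(c' R c) for standalone capitals c and correlation matrix R.
    c = np.asarray(standalone, dtype=float)
    R = np.asarray(corr, dtype=float)
    return float(np.sqrt(c @ R @ c))

# Three hypothetical risks with pairwise correlation 0.25.
caps = [100.0, 60.0, 40.0]
R = np.array([[1.00, 0.25, 0.25],
              [0.25, 1.00, 0.25],
              [0.25, 0.25, 1.00]])
print(aggregate_capital(caps, R))  # below the simple sum of 200, reflecting the diversification benefit

The same calculation is the usual starting point before copulas or the more sophisticated causal models discussed in the paper are
considered.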

Measurement and modelling of dependencies in economic capital : Abstract of the London discussion. Shaw, Richard - 21 pages.
[RKN: 73864]
Shelved at: Per: BAJ (Oxf) Per: BAJ (Lon) Shelved at: REF
BAJ (2011) 16 (3) : 701-721.
This abstract relates to the following paper:
Shaw, R.A., Smith, A.D. & Spivak, G.S. Measurement and modelling of dependencies in economic
capital. British Actuarial Journal, 16 (3).
http://www.actuaries.org.uk/research-and-resources/pages/members-access-journals





ECONOMIC INDICATORS

Yet more on a stochastic economic model : Part 1: updating and refitting, 1995 to 2009. Wilkie, A D; Sahin, Sule; Cairns, A J G;
Kleinow, Torsten Faculty of Actuaries and Institute of Actuaries; Cambridge University Press, [RKN: 40001]
Shelved at: Per: AAS (Oxf) Per: AAS (Lon)
Annals of Actuarial Science (2011) 5(1) : 53-99.
In this paper we review the Wilkie asset model for a variety of UK economic indices, including the Retail Prices Index, both without
and with an ARCH model, the wages index, share dividend yields, share dividends and share prices, long term bond yields, short
term bond yields and index-linked bond yields, in each case by updating the parameters to June 2009. We discuss how the model
has performed from 1994 to 2009 and estimate the values of the parameters and their confidence intervals over various
sub-periods to study their stability. Our analysis shows that the residuals of many of the series are much fatter-tailed than in a
normal distribution. We observe also that besides the stochastic uncertainty built into the model by the random innovations there
is also parameter uncertainty arising from the estimated values of the parameters.
http://www.actuaries.org.uk/research-and-resources/pages/access-journals
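For orientation, the retail price inflation component of the Wilkie model has, in its basic (non-ARCH) form, an AR(1) structure of
roughly the following shape (a from-memory sketch in the usual Wilkie notation; the paper should be consulted for the exact
parameterisation):
\[ I(t) = QMU + QA\,\bigl( I(t-1) - QMU \bigr) + QSD \cdot QZ(t), \qquad I(t) = \ln Q(t) - \ln Q(t-1), \]
where Q(t) is the Retail Prices Index, QMU the mean force of inflation, QA the autoregressive parameter, QSD the innovation
standard deviation and QZ(t) a series of independent standard normal innovations.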

EM ALGORITHM

Modeling dependent risks with multivariate Erlang mixtures. Lee, Simon C K; Lin, X Sheldon - 28 pages. [RKN: 70747]
Shelved at: Per: Astin Bull (Oxf) Shelved at: JOU
ASTIN Bulletin (2012) 42 (1) : 153-180.
In this paper, we introduce a class of multivariate Erlang mixtures and present its desirable properties. We show that a multivariate
Erlang mixture could be an ideal multivariate parametric model for insurance modeling, especially when modeling dependence is
a concern. When multivariate losses are governed by a multivariate Erlang mixture, many quantities of interest such as joint
density and Laplace transform, moments, and Kendall's tau have a closed form. Further, the class is closed under convolutions
and mixtures, which enables us to model aggregate losses in a straightforward way. We also introduce a new concept called
quasi-comonotonicity that can be useful to derive an upper bound for individual losses in a multivariate stochastic order and upper
bounds for stop-loss premiums of the aggregate loss. Finally, an EM algorithm tailored to multivariate Erlang mixtures is presented
and numerical experiments are performed to test the efficiency of the algorithm.
http://www.actuaries.org/index.cfm?lang=EN&DSP=PUBLICATIONS&ACT=ASTIN BULLETIN
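As background, a univariate Erlang mixture (the building block of the multivariate class above) has a density of the standard form
(illustrative notation, with mixing weights alpha_k summing to one and a common scale theta):
\[ f(x) = \sum_{k=1}^{m} \alpha_k\, \frac{x^{k-1} e^{-x/\theta}}{\theta^{k} (k-1)!}, \qquad x > 0; \]
loosely speaking, the multivariate class studied in the paper mixes products of such Erlang densities with a common scale, which is
what yields the closed forms and the closure under convolutions and mixtures described above.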

ENVIRONMENT

The endogenous price dynamics of emission allowances and an application to CO2 option pricing. Chesney, Marc; Taschini, Luca
[RKN: 45878]
Shelved at: Per. AMF
Applied Mathematical Finance (2012) 19 (5-6) : 447-475.
Market mechanisms are increasingly being used as a tool for allocating somewhat scarce but unpriced rights and resources, and
the European Emission Trading Scheme is an example. By means of dynamic optimization in the contest of firms covered by such
environmental regulations, this article generates endogenously the price dynamics of emission permits under asymmetric
information, allowing inter-temporal banking and borrowing. In the market, there are a finite number of firms and each firm's
pollution emission follows an exogenously given stochastic process. We prove the discounted permit price is a martingale with
respect to the relevant filtration. The model is solved numerically. Finally, a closed-form pricing formula for European-style options
is derived.

Available via Athens: Taylor & Francis Online
http://www.openathens.net

The implied market price of weather risk. Härdle, Wolfgang Karl; Cabrera, Brenda López Routledge, [RKN: 45795]
Shelved at: Per. AMF
Applied Mathematical Finance (2012) 19 (1-2) : 59-95.
Weather derivatives (WD) are end-products of a process known as securitization that transforms non-tradable risk factors
(weather) into tradable financial assets. For pricing and hedging non-tradable assets, one essentially needs to incorporate the
market price of risk (MPR), which is an important parameter of the associated equivalent martingale measure (EMM). The majority
of papers so far has priced non-tradable assets assuming zero or constant MPR, but this assumption yields biased prices and has
never been quantified earlier under the EMM framework. Given that liquid-derivative contracts based on daily temperature are
traded on the Chicago Mercantile Exchange (CME), we infer the MPR from traded futures-type contracts (CAT, CDD, HDD and
AAT). The results show how the MPR significantly differs from 0, how it varies in time and changes in sign. It can be
parameterized, given its dependencies on time and temperature seasonal variation. We establish connections between the
market risk premium (RP) and the MPR.
Available via Athens: Taylor & Francis Online
http://www.openathens.net




EQUITIES

New analytic approach to address put-call parity violation due to discrete dividends. Buryak, Alexander; Guo, Ivan Routledge,
[RKN: 45794]
Shelved at: Per. AMF
Applied Mathematical Finance (2012) 19 (1-2) : 37-58.
The issue of developing simple Black-Scholes (BS)-type approximations for pricing European options with large discrete
dividends has been popular since the early 2000s, with a few different approaches reported during the last 10 years. Moreover, it
has been claimed that at least some of the resulting expressions represent high-quality approximations which closely match the
results obtained by the use of numerics. In this article we review, on the one hand, these previously suggested BS-type
approximations and, on the other hand, different versions of the corresponding Crank-Nicolson (CN) numerical schemes with a
primary focus on their boundary condition variations. Unexpectedly, we often observe substantial deviations between the analytical
and numerical results which may be especially pronounced for European puts. Moreover, our analysis demonstrates that any
BS-type approximation which adjusts put parameters identically to call parameters has an inherent problem of failing to detect a
little known put-call parity violation phenomenon. To address this issue, we derive a new analytic pricing approximation which is in
better agreement with the corresponding numerical results in comparison with any of the previously known analytic approaches
for European calls and puts with large discrete dividends.
Available via Athens: Taylor & Francis Online
http://www.openathens.net

Quantile hedging for equity-linked contracts. Klusik, Przemyslaw; Palmowski, Zbigniew [RKN: 40023]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (2) : 280-286.
We consider an equity-linked contract whose payoff depends on the lifetime of the policy holder and the stock price. We provide
the best strategy for an insurance company assuming limited capital for the hedging. The main idea of the proof consists in
reducing the construction of such strategies for a given claim to a problem of superhedging for a modified claim, which is the
solution to a static optimization problem of the Neyman-Pearson type. This modified claim is given via some sets constructed in
an iterative way. Some numerical examples are also given.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Stochastic expansion for the pricing of call options with discrete dividends. Étoré, Pierre; Gobet, Emmanuel Routledge, [RKN:
45839]
Shelved at: Per. AMF
Applied Mathematical Finance (2012) 19 (3-4) : 233-264.
In the context of an asset paying affine-type discrete dividends, we present closed analytical approximations for the pricing of
European vanilla options in the Black-Scholes model with time-dependent parameters. They are obtained using a stochastic
Taylor expansion around a shifted lognormal proxy model. The final formulae are, respectively, first-, second- and third-order
approximations w.r.t. the fixed part of the dividends. Using Cameron-Martin transformations, we provide explicit representations
of the correction terms as Greeks in the Black-Scholes model. The use of Malliavin calculus enables us to provide tight error
estimates for our approximations. Numerical experiments show that this approach yields very accurate results, in particular
compared with known approximations of Bos, Gairat and Shepeleva (2003, Dealing with discrete dividends, Risk Magazine, 16:
109-112) and Veiga and Wystup (2009, Closed formula for options with discrete dividends and its derivatives, Applied
Mathematical Finance, 16(6): 517-531), and it is quicker than the iterated integration procedure of Haug, Haug and Lewis (2003,
Back to basics: a new approach to the discrete dividend problem, Wilmott Magazine, pp. 37-47) or the binomial tree method of
Vellekoop and Nieuwenhuis (2006, Efficient pricing of derivatives on assets with discrete dividends, Applied Mathematical
Finance, 13(3), pp. 265-284).

Available via Athens: Taylor & Francis Online
http://www.openathens.net

Valuing equity-linked death benefits and other contingent options : A discounted density approach. Gerber, Hans U; Shiu, Elias S
W; Yang, Hailiang [RKN: 45725]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 73-92.
Motivated by the Guaranteed Minimum Death Benefits in various deferred annuities, we investigate the calculation of the expected
discounted value of a payment at the time of death. The payment depends on the price of a stock at that time and possibly also on
the history of the stock price. If the payment turns out to be the payoff of an option, we call the contract for the payment a (life)
contingent option. Because each time-until-death distribution can be approximated by a combination of exponential distributions,
the analysis is made for the case where the time until death is exponentially distributed, i.e., under the assumption of a constant
force of mortality. The time-until-death random variable is assumed to be independent of the stock price process which is a
geometric Brownian motion. Our key tool is a discounted joint density function. A substantial series of closed-form formulas is
obtained, for the contingent call and put options, for lookback options, for barrier options, for dynamic fund protection, and for
dynamic withdrawal benefits. In a section on several stocks, the method of Esscher transforms proves to be useful for finding
among others an explicit result for valuing contingent Margrabe options or exchange options. For the case where the contracts
have a finite expiry date, closed-form formulas are found for the contingent call and put options. From these, results for De
Moivre's law are obtained as limits. We also discuss equity-linked death benefit reserves and investment strategies for maintaining
such reserves. The elasticity of the reserve with respect to the stock price plays an important role. Whereas in the most important
applications the stopping time is the time of death, it could be different in other applications, for example, the time of the next
catastrophe.
Available via Athens: Palgrave MacMillan
http://www.openathens.net


ERLANG RISK MODELS

Erlang risk models and finite time ruin problems. Dickson, David C M; Li, Shuanming [RKN: 44884]
Shelved at: Per: SAJ Shelved at: SCA/ACT
Scandinavian Actuarial Journal (2012) 3 : 183-202.
We consider the joint density of the time of ruin and deficit at ruin in the Erlang(n) risk model. We give a general formula for this
joint density and illustrate how the components of this formula can be found in the special case when n=2. We then show how the
formula can be implemented numerically for a general value of n. We also discuss how the ideas extend to the generalised
Erlang(n) risk model.
Available via Athens access
http://www.openathens.net/
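For reference, Erlang(n) here refers to the inter-claim time distribution; in the n = 2 case the inter-claim density is the standard
\[ f(t) = \beta^2\, t\, e^{-\beta t}, \qquad t > 0, \]
i.e. the distribution of the sum of two independent exponential waiting times with a common (placeholder) rate beta.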

ERLANG(2)

The distributions of some quantities for Erlang(2) risk models. Dickson, David C M; Li, Shuanming (2012). - Victoria: University of
Melbourne, 2012. - 18 pages. [RKN: 73947]
We study the distributions of [1] the first time that the surplus reaches a given level and [2] the duration of negative surplus in a
Sparre Andersen risk process with the inter-claim times being Erlang(2) distributed. These distributions can be obtained through
the inversion of Laplace transforms using the inversion relationship for the Erlang(2) risk model given by Dickson and Li (2010).
No.1 (1993) onwards available online. Download as PDF.
http://www.economics.unimelb.edu.au/ACT/papers.shtml

ERLANG MIXTURE

Modeling dependent risks with multivariate Erlang mixtures. Lee, Simon C K; Lin, X Sheldon - 28 pages. [RKN: 70747]
Shelved at: Per: Astin Bull (Oxf) Shelved at: JOU
ASTIN Bulletin (2012) 42 (1) : 153-180.
In this paper, we introduce a class of multivariate Erlang mixtures and present its desirable properties. We show that a multivariate
Erlang mixture could be an ideal multivariate parametric model for insurance modeling, especially when modeling dependence is
a concern. When multivariate losses are governed by a multivariate Erlang mixture, many quantities of interest such as joint
density and Laplace transform, moments, and Kendall's tau have a closed form. Further, the class is closed under convolutions
and mixtures, which enables us to model aggregate losses in a straightforward way. We also introduce a new concept called
quasi-comonotonicity that can be useful to derive an upper bound for individual losses in a multivariate stochastic order and upper
bounds for stop-loss premiums of the aggregate loss. Finally, an EM algorithm tailored to multivariate Erlang mixtures is presented
and numerical experiments are performed to test the efficiency of the algorithm.
http://www.actuaries.org/index.cfm?lang=EN&DSP=PUBLICATIONS&ACT=ASTIN BULLETIN

ESTIMATION

Assessing the performance of different volatility estimators : A Monte Carlo analysis. Cartea, Alvaro; Karyampas, Dimitrios [RKN:
45882]
Shelved at: Per. AMF
Applied Mathematical Finance (2012) 19 (5-6) : 535-552.
We test the performance of different volatility estimators that have recently been proposed in the literature and have been
designed to deal with problems arising when ultra high-frequency data are employed: microstructure noise and price
discontinuities. Our goal is to provide an extensive simulation analysis for different levels of noise and frequency of jumps to
compare the performance of the proposed volatility estimators. We conclude that the maximum likelihood estimator filter (MLE-F),
a two-step parametric volatility estimator proposed by Cartea and Karyampas (2011a, The relationship between the volatility of
returns and the number of jumps in financial markets, SSRN eLibrary, Working Paper Series, SSRN), outperforms most of the
well-known high-frequency volatility estimators when different assumptions about the path properties of stock dynamics are used.
Available via Athens: Taylor & Francis Online
http://www.openathens.net

A Bayesian approach for estimating extreme quantiles under a semiparametric mixture model. Cabras, Stefano; Castellanos,
Maria Eugenia [RKN: 45301]
Shelved at: Per: Astin Bull (Oxf) Shelved at: JOU/AST
ASTIN Bulletin (2011) 41 (1) : 87-106.
In this paper we propose an additive mixture model, where one component is the Generalized Pareto distribution (GPD) that
allows us to estimate extreme quantiles. GPD plays an important role in modeling extreme quantiles for the wide class of
distributions belonging to the maximum domain of attraction of an extreme value model. One of the main difficulties with this
modeling approach is the choice of the threshold u, such that all observations greater than u enter into the likelihood function of the
GPD model. Difficulties are due to the fact that GPD parameter estimators are sensitive to the choice of u. In this work we estimate
u, and other parameters, using suitable priors in a Bayesian approach. In particular, we propose to model all data, extremes and
non-extremes, using a semiparametric model for data below u, and the GPD for the exceedances over u. In contrast to the usual
estimation techniques for u, in this setup we account for uncertainty on all GPD parameters, including u, via their posterior
distributions. A Monte Carlo study shows that posterior credible intervals also have frequentist coverages. We further illustrate the
advantages of our approach on two applications from insurance.
online access via International Actuarial Association:
http://www.actuaries.org/index.cfm?lang=EN&DSP=PUBLICATIONS&ACT=ASTIN BULLETIN
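For orientation, the GPD enters through the standard peaks-over-threshold model (textbook form, not quoted from the paper): for
losses exceeding the threshold u,
\[ \Pr(X > x \mid X > u) = \Bigl( 1 + \xi\, \frac{x - u}{\sigma} \Bigr)^{-1/\xi}, \qquad x > u, \]
with shape xi and scale sigma (the xi = 0 case being the exponential limit); the Bayesian approach above places priors jointly on u,
xi and sigma, so that the posterior reflects uncertainty about the threshold as well.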

Maximum likelihood and estimation efficiency of the chain ladder. Taylor, Greg [RKN: 45303]
Shelved at: Per: Astin Bull (Oxf) Shelved at: JOU/AST
ASTIN Bulletin (2011) 41 (1) : 131-155.
The chain ladder is considered in relation to certain recursive and non-recursive models of claim observations. The recursive
models resemble the (distribution free) Mack model but are augmented with distributional assumptions. The nonrecursive models
are generalisations of Poisson cross-classified structure for which the chain ladder is known to be maximum likelihood. The error
distributions considered are drawn from the exponential dispersion family.
Each of these models is examined with respect to sufficient statistics and completeness (Section 5), minimum variance estimators
(Section 6) and maximum likelihood (Section 7). The chain ladder is found to provide maximum likelihood and minimum variance
unbiased estimates of loss reserves under a wide range of recursive models. Similar results are obtained for a much more
restricted range of non-recursive models.
These results lead to a full classification of this paper's chain ladder models with respect to the estimation properties (bias,
minimum variance) of the chain ladder algorithm (Section 8).
online access via International Actuarial Association:
http://www.actuaries.org/index.cfm?lang=EN&DSP=PUBLICATIONS&ACT=ASTIN BULLETIN
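As a reminder, the chain ladder algorithm whose estimation properties are classified above estimates development factors from
cumulative claims C_{i,j} (accident period i, development period j) by the familiar
\[ \hat{f}_j = \frac{\sum_i C_{i,j+1}}{\sum_i C_{i,j}}, \]
with the sums taken over the accident periods for which both entries are observed, and forecasts obtained by multiplying each
accident period's latest cumulative figure by the product of the remaining estimated factors.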

Viterbi-based estimation for Markov switching GARCH model. Routledge, [RKN: 45838]
Shelved at: Per. AMF
Applied Mathematical Finance (2012) 19 (3-4) : 219-231.
We outline a two-stage estimation method for a Markov-switching Generalized Autoregressive Conditional Heteroscedastic
(GARCH) model modulated by a hidden Markov chain. The first stage involves the estimation of a hidden Markov chain using the
Viterbi algorithm given the model parameters. The second stage uses the maximum likelihood method to estimate the model
parameters given the estimated hidden Markov chain. Applications to financial risk management are discussed through simulated
data.
Available via Athens: Taylor & Francis Online
http://www.openathens.net
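
The first stage described in the abstract relies on the Viterbi algorithm for decoding the most likely hidden regime path. The sketch below is a generic log-space Viterbi implementation, not the paper's code; the two-regime Gaussian likelihoods, transition matrix and simulated returns in the usage example are illustrative assumptions standing in for the GARCH specification.

    import numpy as np

    def viterbi(log_pi, log_A, log_B):
        """Most likely hidden-state path of an HMM.
        log_pi: (K,) log initial state probabilities
        log_A:  (K, K) log transition probabilities, A[i, j] = P(s_t = j | s_{t-1} = i)
        log_B:  (T, K) log observation likelihoods for each time step and state
        """
        T, K = log_B.shape
        delta = np.zeros((T, K))            # best log score ending in each state
        psi = np.zeros((T, K), dtype=int)   # back-pointers
        delta[0] = log_pi + log_B[0]
        for t in range(1, T):
            scores = delta[t - 1][:, None] + log_A      # scores[i, j]: from state i to state j
            psi[t] = np.argmax(scores, axis=0)
            delta[t] = scores[psi[t], np.arange(K)] + log_B[t]
        path = np.empty(T, dtype=int)
        path[-1] = np.argmax(delta[-1])
        for t in range(T - 2, -1, -1):      # backtrack
            path[t] = psi[t + 1, path[t + 1]]
        return path

    # Toy usage: two volatility regimes with different Gaussian standard deviations
    rng = np.random.default_rng(1)
    returns = np.concatenate([rng.normal(0, 0.01, 200), rng.normal(0, 0.03, 200)])
    sig = np.array([0.01, 0.03])
    log_B = -0.5 * (returns[:, None] / sig) ** 2 - np.log(sig)   # log density up to a constant
    log_A = np.log(np.array([[0.98, 0.02], [0.02, 0.98]]))
    states = viterbi(np.log(np.array([0.5, 0.5])), log_A, log_B)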

EUROPE

Determination of the probability distribution measures from market option prices using the method of maximum entropy in the
mean. Gzyl, Henryk; Mayoral, Silvia Routledge, [RKN: 45841]
Shelved at: Per. AMF
Applied Mathematical Finance (2012) 19 (3-4) : 299-312.
We consider the problem of recovering the risk-neutral probability distribution of the price of an asset, when the information
available consists of the market price of derivatives of European type having the asset as underlying. The information available
may or may not include the spot value of the asset as data. When we only know the true empirical law of the underlying, our
method will provide a measure that is absolutely continuous with respect to the empirical law, thus making our procedure model
independent. If we assume that the prices of the derivatives include risk premia and/or transaction prices, using this method it is
possible to estimate those values, as well as the no-arbitrage prices. This is of interest not only when the market is not complete,
but also if for some reason we do not have information about the model for the price of the underlying.

Available via Athens: Taylor & Francis Online
http://www.openathens.net

Editorial: European Actuarial Journal. Hipp, Christian [RKN: 44804]
Shelved at: online only
European Actuarial Journal (2011) 1(1) July : 1-2.
The editors welcome readers to this new international scientific journal which is published and edited by a cooperation of 13
actuarial associations of the following 11 countries: Austria, Belgium, France, Germany, Greece, Hungary, Italy, Poland, Portugal,
Slovenia, and Switzerland. EAJ is the successor of the following six actuarial journals: 1. Belgian Actuarial Bulletin, 2. Blätter der
Deutschen Gesellschaft für Versicherungs- und Finanzmathematik, 3. Boletim do Instituto dos Actuários Portugueses, 4. Giornale
dell'Istituto Italiano degli Attuari, 5. Mitteilungen der Schweizerischen Aktuarvereinigung/Bulletin de l'Association Suisse des
Actuaires, 6. Mitteilungen der Aktuarvereinigung Österreichs (Austria)
Available via Athens: Springer





EXPECTATION

A note on subadditivity of zero-utility premiums. Denuit, Michel; Eeckhoudt, Louis; Menegatti, Mario [RKN: 45307]
Shelved at: Per: Astin Bull (Oxf) Shelved at: JOU/AST
ASTIN Bulletin (2011) 41 (1) : 239-250.
Many papers in the literature have adopted the expected utility paradigm to analyze insurance decisions. Insurance companies
manage policies by growing, by adding independent risks. Even if adding risks generally ultimately decreases the probability of
insolvency, the impact on the insurer's expected utility is less clear. Indeed, it is not true that the risk aversion toward the additional
loss generated by a new policy included in an insurance portfolio always decreases with the number of contracts already
underwritten. The present paper derives conditions under which zero-utility premium principles are subadditive for independent
risks. It is shown that subadditivity is the exception rather than the rule: the zero-utility premium principle generates a
superadditive risk premium for most common utility functions. For instance, all completely monotonic utility functions generate
superadditive zero-utility premiums. The main message of the present paper is thus that the zero-utility premium for a marginal
policy is generally not sufficient to guarantee the formation of insurance portfolios without additional capital.
online access via International Actuarial Association:
http://www.actuaries.org/index.cfm?lang=EN&DSP=PUBLICATIONS&ACT=ASTIN BULLETIN
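
For reference, the zero-utility premium pi(X) discussed in this entry is defined, for a utility function u and initial wealth w, by the indifference condition

    \[ u(w) \;=\; \mathbb{E}\!\left[ u\big(w + \pi(X) - X\big) \right], \]

and subadditivity for independent risks X and Y means \pi(X+Y) \le \pi(X) + \pi(Y). The paper's message is that for most common utility functions the opposite, superadditive, inequality holds.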

EXPENSES

Multivariate longitudinal modeling of insurance company expenses. Shi, Peng [RKN: 45737]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 204-215.
Insurers, investors and regulators are interested in understanding the behaviour of insurance company expenses, due to the high
operating cost of the industry. Expense models can be used for prediction, to identify unusual behavior, and to measure firm
efficiency. Current literature focuses on the study of total expenses that consist of three components: underwriting, investment and
loss adjustment. A joint study of expenses by type is expected to deliver more information and is critical in understanding their relationship.

This paper introduces a copula regression model to examine the three types of expenses in a longitudinal context. In our method,
elliptical copulas are employed to accommodate the between-subject contemporaneous and lag dependencies, as well as the
within-subject serial correlations of the three types. Flexible distributions are allowed for the marginals of each type with covariates
incorporated in distribution parameters. A model validation procedure based on a t-plot method is proposed for in-sample and
out-of-sample validation purposes. The multivariate longitudinal model effectively addresses the typical features of expenses
data: the heavy tails, the strong individual effects and the lack of balance.

The analysis is performed using property-casualty insurance company expenses data from the National Association of Insurance
Commissioners for the years 2001–2006. A unique set of covariates is determined for each type of expense. We found that
underwriting expenses and loss adjustment expenses are complements rather than substitutes. The model is shown to be
successful in efficiency classification. Also, a multivariate predictive density is derived to quantify the future values of an insurer's
expenses.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

EXPERIENCE RATING

Experience and exposure rating for property per risk excess of loss reinsurance revisited. Desmedt, S; Snoussi, M; Chenut, X;
Walhin, J F - 38 pages. [RKN: 70750]
Shelved at: Per: Astin Bull (Oxf) Shelved at: JOU
ASTIN Bulletin (2012) 42 (1) : 233-270.
Experience and exposure rating are traditionally considered to be independent but complementary methods for pricing property
per risk excess of loss reinsurance. Strengths and limitations of these techniques are well-known. In practice, both methods often
lead to quite different prices. In this paper we show that limitations of traditional experience rating can be overcome by taking into
account historical profile information by means of exposure curves. For pricing unused or rarely used capacity, we propose to use
exposure rating, calibrated on the experience rate of a working layer. We compare the method presented with more traditional
methods based on the information which is generally available to the reinsurer.
http://www.actuaries.org/index.cfm?lang=EN&DSP=PUBLICATIONS&ACT=ASTIN BULLETIN

EXPOSURE RATING

Experience and exposure rating for property per risk excess of loss reinsurance revisited. Desmedt, S; Snoussi, M; Chenut, X;
Walhin, J F - 38 pages. [RKN: 70750]
Shelved at: Per: Astin Bull (Oxf) Shelved at: JOU
ASTIN Bulletin (2012) 42 (1) : 233-270.
Experience and exposure rating are traditionally considered to be independent but complementary methods for pricing property
per risk excess of loss reinsurance. Strengths and limitations of these techniques are well-known. In practice, both methods often
lead to quite different prices. In this paper we show that limitations of traditional experience rating can be overcome by taking into
account historical profile information by means of exposure curves. For pricing unused or rarely used capacity, we propose to use
exposure rating, calibrated on the experience rate of a working layer. We compare the method presented with more traditional
methods based on the information which is generally available to the reinsurer.
http://www.actuaries.org/index.cfm?lang=EN&DSP=PUBLICATIONS&ACT=ASTIN BULLETIN

EXTREME EVENTS

Global warming, extreme weather events, and forecasting tropical cyclones. Chang, Carolyn W; Chang, Jack S K; Guan Lim, Kian -
25 pages. [RKN: 70744]
Shelved at: Per: Astin Bull (Oxf) Shelved at: JOU
ASTIN Bulletin (2012) 42 (1) : 77-101.
Global warming has more than doubled the likelihood of extreme weather events, e.g. the 2003 European heat wave, the growing
intensity of rain and snow in the Northern Hemisphere, and the increasing risk of flooding in the United Kingdom. It has also
induced an increasing number of deadly tropical cyclones with a continuing trend. Many individual meteorological dynamic
simulations and statistical models are available for forecasting hurricanes but they neither forecast well hurricane intensity nor
produce clear-cut consensus. We develop a novel hurricane forecasting model by straddling two seemingly unrelated disciplines,
physical science and finance, based on the well-known price discovery function of trading in financial markets. Traders of
hurricane derivative contracts employ all available forecasting models, public or proprietary, to forecast hurricanes in order to
make their pricing and trading decisions. By using transactional price changes of these contracts that continuously clear the
market supply and demand as the predictor, and with calibration to extract the embedded hurricane information by developing
hurricane futures and futures option pricing models, one can gain a forward-looking market-consensus forecast out of all of the
individual forecasting models employed. Our model can forecast when a hurricane will make landfall, how destructive it will be,
and how this destructive power will evolve from inception to landing. While the NHC (National Hurricane Center) blends 50 plus
individual forecasting results for its consensus model forecasts using a subjective approach, our aggregate is market-based.
Believing their proprietary forecasts are sufficiently different from our market-based forecasts, traders could also examine the
discrepancy for a potential trading opportunity using hurricane derivatives. We also provide a real case analysis of Hurricane Irene
in 2011 using our methodology.
http://www.actuaries.org/index.cfm?lang=EN&DSP=PUBLICATIONS&ACT=ASTIN BULLETIN

EXTREME VALUE THEORY

Asymptotics for risk capital allocations based on Conditional Tail Expectation. Asimit, Alexandru V; Furman, Edward; Tang, Qihe;
Vernic, Raluca [RKN: 44933]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (3) : 310-324.
An investigation of the limiting behavior of a risk capital allocation rule based on the Conditional Tail Expectation (CTE) risk
measure is carried out. More specifically, with the help of general notions of Extreme Value Theory (EVT), the aforementioned risk
capital allocation is shown to be asymptotically proportional to the corresponding Value-at-Risk (VaR) risk measure. The existing
methodology acquired for VaR can therefore be applied to a somewhat less well-studied CTE. In the context of interest, the EVT
approach is seemingly well-motivated by modern regulations, which openly strive for the excessive prudence in determining risk
capitals.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
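
As background, with S = X_1 + ... + X_n denoting the aggregate loss, the CTE-based allocation to risk X_i at confidence level p that this strand of the literature works with is usually written

    \[ \mathrm{CTE}_p(X_i;S) \;=\; \mathbb{E}\big[\,X_i \mid S > \mathrm{VaR}_p(S)\,\big], \qquad \mathrm{VaR}_p(S) \;=\; \inf\{\,s : \mathbb{P}(S \le s) \ge p\,\}. \]

The asymptotic result summarised in the abstract states that, as p tends to 1 and under suitable extreme value conditions, this allocation becomes proportional to the corresponding Value-at-Risk.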

Bias-reduced estimators for bivariate tail modelling. Beirlant, J; Dierckx, G; Guillou, A [RKN: 44971]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (1) : 18-26.
Ledford and Tawn (1997) introduced a flexible bivariate tail model based on the coefficient of tail dependence and on the
dependence of the extreme values of the random variables. In this paper, we extend the concept by specifying the slowly varying
part of the model as done by Hall (1982) with the univariate case. Based on Beirlant et al. (2009), we propose a bias-reduced
estimator for the coefficient of tail dependence and for the estimation of small tail probabilities. We discuss the properties of these
estimators via simulations and a real-life example. Furthermore, we discuss some theoretical asymptotic aspects of this approach.

Available via Athens: Palgrave MacMillan
http://www.openathens.net

Composite Lognormal–Pareto model with random threshold. Pigeon, Mathieu; Denuit, Michel [RKN: 45488]
Shelved at: Per: SAJ Shelved at: SCA/ACT
Scandinavian Actuarial Journal (2011) 3 : 177-192.
This paper further considers the composite Lognormal–Pareto model proposed by Cooray & Ananda (2005) and suitably modified
by Scollnik (2007). This model is based on a Lognormal density up to an unknown threshold value and a Pareto density thereafter.
Instead of using a single threshold value applying uniformly to the whole data set, the model proposed in the present paper allows
for heterogeneity with respect to the threshold and lets it vary among observations. Specifically, the threshold value for a particular
observation is seen as the realization of a positive random variable and the mixed composite Lognormal–Pareto model is obtained
by averaging over the population of interest. The performance of the composite Lognormal–Pareto model and of its mixed
extension is compared using the well-known Danish fire losses data set.
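
A generic two-piece composite density of the kind referred to in this entry, with a lognormal body below a threshold theta and a Pareto tail above it, can be written (up to the continuity and smoothness constraints imposed in the cited papers) as

    \[ f(x) \;=\; \begin{cases} r\,\dfrac{f_1(x)}{F_1(\theta)}, & 0 < x \le \theta,\\[1ex] (1-r)\,\dfrac{f_2(x)}{1-F_2(\theta)}, & x > \theta, \end{cases} \]

where f_1, F_1 are the lognormal density and distribution function, f_2, F_2 their Pareto counterparts, and r in (0,1) is the mixing weight. The model of this paper additionally treats theta as a positive random variable and averages the composite density over its distribution.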




Estimating the distortion parameter of the proportional-hazard premium for heavy-tailed losses. Brahimi, Brahim; Meraghni,
Djamel; Necir, Abdelhakim; Zitikis, Ricardas [RKN: 44934]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (3) : 325-334.
The distortion parameter reflects the amount of loading in insurance premiums. A specific value of a given premium determines a
value of the distortion parameter, which depends on the underlying loss distribution. Estimating the parameter, therefore,
becomes a statistical inferential problem, which has been initiated by Jones and Zitikis [Jones, B.L., Zitikis, R., 2007. Risk
measures, distortion parameters, and their empirical estimation. Insurance: Mathematics and Economics, 41, 279–297] in the
case of the distortion premium and tackled within the framework of the central limit theorem. Heavy-tailed losses do not fall into
this framework as they rely on the extreme-value theory. In this paper, we concentrate on a special but important distortion
premium, called the proportional-hazard premium, and propose an estimator for its distortion parameter in the case of heavy-tailed
losses. We derive an asymptotic distribution of the estimator, construct a practically implementable confidence interval for the
distortion parameter, and illustrate the performance of the interval in a simulation study.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
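
For orientation, in one common parameterisation the proportional-hazard premium studied here is the distortion premium obtained from the proportional hazard transform: for a non-negative loss X with survival function S_X and distortion parameter rho >= 1,

    \[ \Pi_\rho(X) \;=\; \int_0^\infty \big(S_X(x)\big)^{1/\rho}\,dx . \]

Estimating the distortion parameter from an observed premium amounts to inverting this relation, which is the inferential problem the paper addresses for heavy-tailed losses.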

Extreme value behavior of aggregate dependent risks. Chen, Die; Mao, Tiantian; Pan, Xiaoming; Hu, Taizhong [RKN: 44995]
Insurance: Mathematics & Economics (2012) 50 (1) : 99-108.
Consider a portfolio of n identically distributed risks with dependence structure modeled by an Archimedean survival copula.
Wüthrich (2003) and Alink et al. (2004) proved that the probability of a large aggregate loss scales like the probability of a large
individual loss, times a proportionality factor. This factor depends on the dependence strength and on the tail behavior of the
individual risk, according to whether the tail behavior belongs to the maximum domain of attraction of the Fréchet, the Weibull or
the Gumbel distribution. We investigate properties of these factors with respect to the dependence parameter and/or the tail
behavior parameter, and revisit the asymptotic behavior of conditional tail expectations of aggregate risks for the Weibull and the
Gumbel cases by using a different method. The main results strengthen and complement some results in Alink et al. (2004, 2005),
Barbe et al. (2006) and Embrechts et al. (2009).
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Laws of small numbers : Extremes and rare events. Falk, Michael; Hüsler, Jürg; Reiss, Rolf-Dieter (2011). - 3rd, revised and extended
ed. Birkhäuser, 2011. - 509 pages. [RKN: 73659]
Shelved at: 519.24 22 LAW
"The intention of the book is to give a mathematically oriented development of the theory of rare events underlying various
applications. This characteristic of the book was strengthened in the second edition by incorporating various new results. In this
third edition, the dramatic change of focus of extreme value theory has been taken into account: from concentrating on maxima of
observations it has shifted to large observations, defined as exceedances over high thresholds. One emphasis of the present third
edition lies on multivariate generalized Pareto distributions, their representations, properties such as their peaks-over-threshold
stability, simulation, testing and estimation."

A new class of models for heavy tailed distributions in finance and insurance risk. Ahn, Soohan; Kim, Joseph H T; Ramaswami,
Vaidyanathan [RKN: 45722]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 43-52.
Many insurance loss data are known to be heavy-tailed. In this article we study the class of Log phase-type (LogPH) distributions
as a parametric alternative in fitting heavy tailed data. Transformed from the popular phase-type distribution class, the LogPH
introduced by Ramaswami exhibits several advantages over other parametric alternatives. We analytically derive its tail related
quantities including the conditional tail moments and the mean excess function, and also discuss its tail thickness in the context of
extreme value theory. Because of its denseness proved herein, we argue that the LogPH can offer a rich class of heavy-tailed loss
distributions without separate modeling for the tail side, which is the case for the generalized Pareto distribution (GPD). As a
numerical example we use the well-known Danish fire data to calibrate the LogPH model and compare the result with that of the
GPD. We also present fitting results for a set of insurance guarantee loss data.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
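
The mean excess function mentioned in this abstract is, for a loss X and threshold u,

    \[ e(u) \;=\; \mathbb{E}\,[\,X - u \mid X > u\,], \]

and for a generalized Pareto distribution with shape xi < 1 and scale sigma it is linear in u, e(u) = (sigma + xi*u)/(1 - xi). Comparing the mean excess behaviour of a fitted LogPH model with such benchmarks is one standard way of assessing tail thickness in the sense discussed here.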

FAMILY

Optimal investment and consumption decision of a family with life insurance. Kwak, Minsuk; Shin, Yong Hyun; Choi, U Jin [RKN:
40011]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (2) : 176-188.
We study an optimal portfolio and consumption choice problem of a family that combines life insurance for parents who receive
deterministic labor income until the fixed time T. We consider utility functions of parents and children separately and assume that
parents have an uncertain lifetime. If parents die before time T, children have no labor income and they choose the optimal
consumption and portfolio with remaining wealth and life insurance benefit. The object of the family is to maximize the weighted
average of utility of parents and that of children. We obtain analytic solutions for the value function and the optimal policies, and
then analyze how changes in the weight of the parents' utility function and other factors affect the optimal policies.
Available via Athens: Palgrave MacMillan
http://www.openathens.net


FINANCE

The endogenous price dynamics of emission allowances and an application to CO2 option pricing. Chesney, Marc; Taschini, Luca
[RKN: 45878]
Shelved at: Per. AMF
Applied Mathematical Finance (2012) 19 (5-6) : 447-475.
Market mechanisms are increasingly being used as a tool for allocating somewhat scarce but unpriced rights and resources, and
the European Emission Trading Scheme is an example. By means of dynamic optimization in the context of firms covered by such
environmental regulations, this article generates endogenously the price dynamics of emission permits under asymmetric
information, allowing inter-temporal banking and borrowing. In the market, there are a finite number of firms and each firm's
pollution emission follows an exogenously given stochastic process. We prove the discounted permit price is a martingale with
respect to the relevant filtration. The model is solved numerically. Finally, a closed-form pricing formula for European-style options
is derived.

Available via Athens: Taylor & Francis Online
http://www.openathens.net

FINANCIAL MARKETS

Modeling dependence dynamics through copulas with regime switching. Silva Filho, Osvaldo Candido da; Ziegelmann, Flavio
Augusto; Dueker, Michael J [RKN: 45638]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 50 (3) : 346-356.
Measuring dynamic dependence between international financial markets has recently attracted great interest in financial
econometrics because the observed correlations rose dramatically during the 2008–09 global financial crisis. Here, we propose a
novel approach for measuring dependence dynamics. We include a hidden Markov chain (MC) in the equation describing
dependence dynamics, allowing the unobserved time-varying dependence parameter to vary according to both a restricted ARMA
process and an unobserved two-state MC. Estimation is carried out via the inference for the margins in conjunction with
filtering/smoothing algorithms. We use block bootstrapping to estimate the covariance matrix of our estimators. Monte Carlo
simulations compare the performance of regime switching and no switching models, supporting the regime-switching
specification. Finally the proposed approach is applied to empirical data, through the study of the S&P500 (USA), FTSE100 (UK)
and BOVESPA (Brazil) stock market indexes.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

FIRE INSURANCE

Modeling with Weibull-Pareto models. Scollnik, David P M; Sun, Chenchen Society of Actuaries, - 13 pages. [RKN: 70138]
Shelved at: Per: NAAJ (Oxf) Per NAAJ (Lon) Shelved at: JOU
North American Actuarial Journal (2012) 16 (2) : 260-272.
In this paper we develop several composite Weibull-Pareto models and suggest their use to model loss payments and other forms
of actuarial data. These models all comprise a Weibull distribution up to a threshold point, and some form of Pareto distribution
thereafter. They are similar in spirit to some composite lognormal-Pareto models that have previously been considered in the
literature. All of these models are applied, and their performance compared, in the context of a real world fire insurance data set.
http://www.soa.org/news-and-publications/publications/journals/naaj/naaj-detail.aspx

Multivariate density estimation using dimension reducing information and tail flattening transformations. Buch-Kromann, Tine;
Guillén, Montserrat; Linton, Oliver; Nielsen, Jens Perch [RKN: 39933]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (1) : 99-110.
We propose a nonparametric multiplicative bias corrected transformation estimator designed for heavy tailed data. The
multiplicative correction is based on prior knowledge and has a dimension reducing effect at the same time as the original
dimension of the estimation problem is retained. Adding a tail flattening transformation improves the estimation
significantly, particularly in the tail, and provides significant graphical advantages by allowing the density estimation to be
visualized in a simple way. The combined method is demonstrated on a fire insurance data set and in a data-driven simulation
study.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

FORECASTING

Mortality density forecasts: An analysis of six stochastic mortality models. Cairns, Andrew J G; Blake, David; Dowd, Kevin; Epstein,
David; Khalaf-Allah, Marwa; Coughlan, Guy D [RKN: 45129]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (3) : 355-367.
This paper develops a framework for developing forecasts of future mortality rates. We discuss the suitability of six stochastic
mortality models for forecasting future mortality and estimating the density of mortality rates at different ages. In particular, the
models are assessed individually with reference to the following qualitative criteria that focus on the plausibility of their forecasts:
biological reasonableness; the plausibility of predicted levels of uncertainty in forecasts at different ages; and the robustness of the
forecasts relative to the sample period used to fit the model. An important, though unsurprising, conclusion is that a good fit to
historical data does not guarantee sensible forecasts. We also discuss the issue of model risk, common to many modelling
situations in demography and elsewhere. We find that even for those models satisfying our qualitative criteria, there are significant
differences among central forecasts of mortality rates at different ages and among the distributions surrounding those central
forecasts.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

FOREIGN EXCHANGE

The stochastic intrinsic currency volatility model : A consistent framework for multiple FX rates and their volatilities. Doust, Paul
Routledge, [RKN: 45876]
Shelved at: Per. AMF
Applied Mathematical Finance (2012) 19 (5-6) : 381-445.
The SABR and the Heston stochastic volatility models are widely used for foreign exchange (FX) option pricing. Although they are
able to reproduce the market's volatility smiles and skews for single FX rates, they cannot be extended to model multiple FX rates
in a consistent way. This is because when two FX rates with a common currency are described by either SABR or Heston
processes, the stochastic process for the third FX rate associated with the three currencies will not be a SABR or a Heston
process. A consistent description of the FX market should be symmetric so that all FX rates are described by the same type of
stochastic process. This article presents a way of doing that using the concept of intrinsic currency values. To model FX volatility
curves, the intrinsic currency framework is extended by allowing the volatility of each intrinsic currency value to be stochastic. This
makes the framework more realistic, while preserving all FX market symmetries. The result is a new SABR-style option pricing
formula and a model that can simultaneously be calibrated to the volatility smiles of all possible currency pairs under
consideration. Consequently, it is more powerful than modelling the volatility curves of each currency pair individually and offers a
methodology for comparing the volatility curves of different currency pairs against each other. One potential application of this is in
the pricing of FX options, such as vanilla options on less liquid currency pairs. Given the volatilities of less liquid currencies
against, for example, USD and EUR, the model could then be used to calculate the volatility smiles of those less liquid currencies
against all other currencies.

Available via Athens: Taylor & Francis Online
http://www.openathens.net

FOREIGN EXCHANGE MARKETS

The stochastic intrinsic currency volatility model : A consistent framework for multiple FX rates and their volatilities. Doust, Paul
Routledge, [RKN: 45876]
Shelved at: Per. AMF
Applied Mathematical Finance (2012) 19 (5-6) : 381-445.
The SABR and the Heston stochastic volatility models are widely used for foreign exchange (FX) option pricing. Although they are
able to reproduce the market's volatility smiles and skews for single FX rates, they cannot be extended to model multiple FX rates
in a consistent way. This is because when two FX rates with a common currency are described by either SABR or Heston
processes, the stochastic process for the third FX rate associated with the three currencies will not be a SABR or a Heston
process. A consistent description of the FX market should be symmetric so that all FX rates are described by the same type of
stochastic process. This article presents a way of doing that using the concept of intrinsic currency values. To model FX volatility
curves, the intrinsic currency framework is extended by allowing the volatility of each intrinsic currency value to be stochastic. This
makes the framework more realistic, while preserving all FX market symmetries. The result is a new SABR-style option pricing
formula and a model that can simultaneously be calibrated to the volatility smiles of all possible currency pairs under
consideration. Consequently, it is more powerful than modelling the volatility curves of each currency pair individually and offers a
methodology for comparing the volatility curves of different currency pairs against each other. One potential application of this is in
the pricing of FX options, such as vanilla options on less liquid currency pairs. Given the volatilities of less liquid currencies
against, for example, USD and EUR, the model could then be used to calculate the volatility smiles of those less liquid currencies
against all other currencies.

Available via Athens: Taylor & Francis Online
http://www.openathens.net

FORMULAE

A general formula for option prices in a stochastic volatility model. Chin, Stephen; Dufresne, Daniel Routledge, [RKN: 45842]
Shelved at: Per. AMF
Applied Mathematical Finance (2012) 19 (3-4) : 313-340.
We consider the pricing of European derivatives in a Black–Scholes model with stochastic volatility. We show how Parseval's
theorem may be used to express those prices as Fourier integrals. This is a significant improvement over Monte Carlo simulation.
The main ingredient in our method is the Laplace transform of the ordinary (constant volatility) price of a put or call in the
Black–Scholes model, where the transform is taken with respect to maturity (T); this does not appear to have been used before in
pricing options under stochastic volatility. We derive these formulas and then apply them to the case where volatility is modelled as
a continuous-time Markov chain, the so-called Markov regime-switching model. This model has been used previously in stochastic
volatility modelling, but mostly with only a few states. We show how a larger number of states can be handled without difficulty.
Numerical illustrations are given, including the implied volatility curve in two- and three-state models. The curves
have the smile shape observed in practice.

Available via Athens: Taylor & Francis Online
http://www.openathens.net

The Solvency II square-root formula for systematic biometric risk. Christiansen, Marcus C; Denuit, Michel M; Lazar, Dorina [RKN:
45599]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 50 (2) : 257-265.
In this paper, we develop a model supporting the so-called square-root formula used in Solvency II to aggregate the modular life
SCR. Describing the insurance policy by a Markov jump process, we can obtain expressions similar to the square-root formula in
Solvency II by means of limited expansions around the best estimate. Numerical illustrations are given, based on German
population data. Even if the square-root formula can be supported by theoretical considerations, it is shown that the QIS
correlation matrix is highly questionable.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
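
The square-root aggregation formula referred to in this entry combines the modular capital charges SCR_i into an overall requirement via

    \[ \mathrm{SCR} \;=\; \sqrt{\sum_i \sum_j \mathrm{Corr}_{ij}\,\mathrm{SCR}_i\,\mathrm{SCR}_j\,}, \]

where Corr is the prescribed correlation matrix. The paper derives conditions under which expressions of this form arise from a Markov jump process model of the policy, and questions the QIS correlation entries.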

Stochastic expansion for the pricing of call options with discrete dividends. Etore, Pierre; Gobet, Emmanuel Routledge, [RKN:
45839]
Shelved at: Per. AMF
Applied Mathematical Finance (2012) 19 (3-4) : 233-264.
In the context of an asset paying affine-type discrete dividends, we present closed analytical approximations for the pricing of
European vanilla options in the Black–Scholes model with time-dependent parameters. They are obtained using a stochastic
Taylor expansion around a shifted lognormal proxy model. The final formulae are, respectively, first-, second- and third-order
approximations w.r.t. the fixed part of the dividends. Using Cameron–Martin transformations, we provide explicit representations
of the correction terms as Greeks in the Black–Scholes model. The use of Malliavin calculus enables us to provide tight error
estimates for our approximations. Numerical experiments show that this approach yields very accurate results, in particular
compared with known approximations of Bos, Gairat and Shepeleva (2003, Dealing with discrete dividends, Risk Magazine, 16:
109–112) and Veiga and Wystup (2009, Closed formula for options with discrete dividends and its derivatives, Applied
Mathematical Finance, 16(6): 517–531), and quicker than the iterated integration procedure of Haug, Haug and Lewis (2003, Back
to basics: a new approach to the discrete dividend problem, Wilmott Magazine, pp. 37–47) or than the binomial tree method of
Vellekoop and Nieuwenhuis (2006, Efficient pricing of derivatives on assets with discrete dividends, Applied Mathematical
Finance, 13(3), pp. 265–284).

Available via Athens: Taylor & Francis Online
http://www.openathens.net

TVaR-based capital allocation for multivariate compound distributions with positive continuous claim amounts. Cossette,
Helene; Mailhot, Melina; Marceau, Etienne [RKN: 45598]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 50 (2) : 247-256.
In this paper, we consider a portfolio of n dependent risks X1, …, Xn and we study the stochastic behavior of the aggregate claim
amount S = X1 + … + Xn. Our objective is to determine the amount of economic capital needed for the whole portfolio and to
compute the amount of capital to be allocated to each risk X1, …, Xn. To do so, we use a top-down approach. For (X1, …, Xn), we
consider risk models based on multivariate compound distributions defined with a multivariate counting distribution. We use the
TVaR to evaluate the total capital requirement of the portfolio based on the distribution of S, and we use the TVaR-based capital
allocation method to quantify the contribution of each risk. To simplify the presentation, the claim amounts are assumed to be
continuously distributed. For multivariate compound distributions with continuous claim amounts, we provide general formulas for
the cumulative distribution function of S, for the TVaR of S and the contribution to each risk. We obtain closed-form expressions for
those quantities for multivariate compound distributions with gamma and mixed Erlang claim amounts. Finally, we treat in detail
the multivariate compound Poisson distribution case. Numerical examples are provided in order to examine the impact of the
dependence relation on the TVaR of S, the contribution to each risk of the portfolio, and the benefit of the aggregation of several
risks.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
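
For a continuously distributed aggregate claim amount S, the TVaR and the TVaR-based contribution of an individual risk X_i used in this line of work are commonly defined as

    \[ \mathrm{TVaR}_\kappa(S) \;=\; \frac{1}{1-\kappa}\int_\kappa^1 \mathrm{VaR}_u(S)\,du, \qquad \mathrm{TVaR}_\kappa(X_i;S) \;=\; \mathbb{E}\big[\,X_i \mid S > \mathrm{VaR}_\kappa(S)\,\big], \]

so that the contributions add up to the total requirement when S is continuous. The paper provides closed-form expressions for these quantities under multivariate compound distributions with gamma and mixed Erlang claim amounts.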

FUND MANAGEMENT

Pension fund management and conditional indexation. Kleinow, Torsten [RKN: 45300]
Shelved at: Per: Astin Bull (Oxf) Shelved at: JOU/AST
ASTIN Bulletin (2011) 41 (1) : 61-86.
Conditional indexation offers a middle way between defined benefit and defined contribution pension schemes. In this paper, we
consider a fully-funded pension scheme with conditional indexation. We show how the pension fund can be managed to reduce
the risks associated with promised pension benefits when declared benefits are adjusted regularly during the working life. In
particular, we derive an investment strategy that provides protection against underfunding at retirement and which is self-financing
on average. Our results are illustrated in an extensive simulation study.
online access via International Actuarial Association:
http://www.actuaries.org/index.cfm?lang=EN&DSP=PUBLICATIONS&ACT=ASTIN BULLETIN

GAME THEORY

The limitations of game theory - response to A. Smith, May 2011 : Letter to the editor. Clarkson, Robert Staple Inn Actuarial
Society, - 1 pages. [RKN: 73894]
Shelved at: Per: Actuary (Oxf) Per: Actuary (Lon) Shelved at: JOU
The Actuary (2011) June : 6.
Response to article in the May 2011 Actuary regarding the limitations of game theory.
http://www.theactuary.com/

GAMES

On 1-convexity and nucleolus of co-insurance games. Driessen, Theo S H; Vito, Fragnelli; Katsev, Ilya V; Khmelnitskaya, Anna B
[RKN: 40016]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (2) : 217-225.
The insurance situation in which an enormous risk is insured by a number of insurance companies is modeled through a
cooperative TU game, the so-called co-insurance game, first introduced in Fragnelli and Marina (2004). In this paper we present
certain conditions on the parameters of the model that guarantee the 1-convexity property of co-insurance games which in turn
ensures the nonemptiness of the core and the linearity of the nucleolus as a function of the variable premium. Further we reveal
conditions when a co-insurance game is representable in the form of a veto-removed game and present an efficient final algorithm
for computing the nucleolus of a veto-removed game.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

GARCH

Viterbi-based estimation for Markov switching GARCH model. Routledge, [RKN: 45838]
Shelved at: Per. AMF
Applied Mathematical Finance (2012) 19 (3-4) : 219-231.
We outline a two-stage estimation method for a Markov-switching Generalized Autoregressive Conditional Heteroscedastic
(GARCH) model modulated by a hidden Markov chain. The first stage involves the estimation of a hidden Markov chain using the
Viterbi algorithm given the model parameters. The second stage uses the maximum likelihood method to estimate the model
parameters given the estimated hidden Markov chain. Applications to financial risk management are discussed through simulated
data.

Available via Athens: Taylor & Francis Online
http://www.openathens.net

GENERAL INSURANCE

Bayesian multivariate Poisson models for insurance ratemaking. Bermudez, Lluis; Karlis, Dimitris [RKN: 40017]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (2) : 226-236.
When actuaries face the problem of pricing an insurance contract that contains different types of coverage, such as a motor
insurance or a homeowners insurance policy, they usually assume that types of claim are independent. However, this assumption
may not be realistic: several studies have shown that there is a positive correlation between types of claim. Here we introduce
different multivariate Poisson regression models in order to relax the independence assumption, including zero-inflated models to
account for excess of zeros and overdispersion. These models have been largely ignored to date, mainly because of their
computational difficulties. Bayesian inference based on MCMC helps to resolve this problem (and also allows us to derive, for
several quantities of interest, posterior summaries to account for uncertainty). Finally, these models are applied to an automobile
insurance claims database with three different types of claim. We analyse the consequences for pure and loaded premiums when
the independence assumption is relaxed by using different multivariate Poisson regression models together with their zero-inflated
versions.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
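
To make the dependence structure concrete, the snippet below simulates the classical common-shock construction of a bivariate Poisson pair, which is the kind of multivariate Poisson structure that the regression models in this paper generalise; the claim rates are illustrative placeholders, and the paper itself works with covariate-dependent rates, zero-inflation and MCMC estimation rather than simulation.

    import numpy as np

    rng = np.random.default_rng(42)
    lam1, lam2, lam0 = 0.08, 0.05, 0.02   # placeholder marginal and common-shock rates
    n = 100_000

    y1 = rng.poisson(lam1, n)
    y2 = rng.poisson(lam2, n)
    y0 = rng.poisson(lam0, n)             # common shock inducing positive correlation

    x1 = y1 + y0                          # claim counts of type 1
    x2 = y2 + y0                          # claim counts of type 2

    # The theoretical covariance equals lam0; the sample estimate should be close to it
    print(np.cov(x1, x2)[0, 1], lam0)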




Diagonal effects in claims reserving. Jessen, Anders Hedegaard; Rietdorf, Niels [RKN: 45147]
Shelved at: Per: SAJ Shelved at: SCA/ACT
Scandinavian Actuarial Journal (2011) 1 : 21-37.
In this paper we present two different approaches to how one can include diagonal effects in non-life claims reserving based on
run-off triangles. Empirical analyses suggest that the approaches in Zehnwirth (2003) and Kuang et al. (2008a, 2008b) do not work
well with low-dimensional run-off triangles because estimation uncertainty is too large. To overcome this problem we consider
similar models with a smaller number of parameters. These are closely related to the framework considered in Verbeek (1972) and
Taylor (1977, 2000); the separation method. We explain that these models can be interpreted as extensions of the multiplicative
Poisson models introduced by Hachemeister & Stanard (1975) and Mack (1991).

Folded and log-folded-t distributions as models for insurance loss data. Brazauskas, Vytaras; Kleefeld, Andreas [RKN: 45149]
Shelved at: Per: SAJ Shelved at: SCA/ACT
Scandinavian Actuarial Journal (2011) 1 : 59-74.
A rich variety of probability distributions has been proposed in the actuarial literature for fitting of insurance loss data. Examples
include: lognormal, log-t, various versions of Pareto, loglogistic, Weibull, gamma and its variants, and generalized beta of the
second kind distributions, among others. In this paper, we supplement the literature by adding the log-folded-normal and
log-folded-t families. Shapes of the density function and key distributional properties of the 'folded' distributions are presented
along with three methods for the estimation of parameters: method of maximum likelihood; method of moments; and method of
trimmed moments. Further, large and small-sample properties of these estimators are studied in detail. Finally, we fit the newly
proposed distributions to data which represent the total damage done by 827 fires in Norway for the year 1988. The fitted models
are then employed in a few quantitative risk management examples, where point and interval estimates for several value-at-risk
measures are calculated.

Higher moments of the claims development result in general insurance. Salzmann, Robert; Wüthrich, Mario V; Merz, Michael - 30
pages. [RKN: 70755]
Shelved at: Per: Astin Bull (Oxf) Shelved at: JOU
ASTIN Bulletin (2012) 42 (1) : 355-384.
The claims development result (CDR) is one of the major risk drivers in the profit and loss statement of a general insurance
company. Therefore, the CDR has become a central object of interest under new solvency regulation. In current practice, simple
methods based on the first two moments of the CDR are implemented to find a proxy for the distribution of the CDR. Such
approximations based on the first two moments are rather rough and may fail to appropriately describe the shape of the
distribution of the CDR. In this paper we provide an analysis of higher moments of the CDR. Within a Bayes chain ladder
framework we consider two different models for which it is possible to derive analytical solutions for the higher moments of the
CDR. Based on higher moments we can e.g. calculate the skewness and the excess kurtosis of the distribution of the CDR and
obtain refined approximations. Moreover, a case study investigates and answers questions raised in IASB [9].
http://www.actuaries.org/index.cfm?lang=EN&DSP=PUBLICATIONS&ACT=ASTIN BULLETIN

Impact of insurance for operational risk: Is it worthwhile to insure or be insured for severe losses?. Peters, Gareth W; Byrnes,
Aaron D; Shevchenko, Pavel V [RKN: 40024]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (2) : 287-303.
Under the Basel II standards, the Operational Risk (OpRisk) advanced measurement approach allows a provision for reduction of
capital as a result of insurance mitigation of up to 20%. This paper studies different insurance policies in the context of capital
reduction for a range of extreme loss models and insurance policy scenarios in a multi-period, multiple risk setting. A Loss
Distributional Approach (LDA) for modeling of the annual loss process, involving homogeneous compound Poisson processes for
the annual losses, with heavy-tailed severity models comprised of α-stable severities is considered. There has been little analysis
of such models to date and it is believed insurance models will play more of a role in OpRisk mitigation and capital reduction in
future. The first question of interest is when would it be equitable for a bank or financial institution to purchase insurance for
heavy-tailed OpRisk losses under different insurance policy scenarios? The second question pertains to Solvency II and
addresses quantification of insurer capital for such operational risk scenarios. Considering fundamental insurance policies
available, in several two risk scenarios, we can provide both analytic results and extensive simulation studies of insurance
mitigation for important basic policies, the intention being to address questions related to VaR reduction under Basel II, SCR under
Solvency II and fair insurance premiums in OpRisk for different extreme loss scenarios. In the process we provide closed-form
solutions for the distribution of loss processes and claims processes in an LDA structure as well as closed-form analytic solutions
for the Expected Shortfall, SCR and MCR under Basel II and Solvency II. We also provide closed-form analytic solutions for the
annual loss distribution of multiple risks including insurance mitigation.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Log-supermodularity of weight functions, ordering weighted losses, and the loading monotonicity of weighted premiums.
Sendov, Hristo S; Wang, Ying; Zitikis, Ricardas [RKN: 40020]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (2) : 257-264.
The paper is motivated by a problem concerning the monotonicity of insurance premiums with respect to their loading parameter:
the larger the parameter, the larger the insurance premium is expected to be. This property, usually called the loading
monotonicity, is satisfied by premiums that appear in the literature. The increased interest in constructing new insurance
premiums has raised a question as to what weight functions would produce loading-monotonic premiums. In this paper, we
demonstrate a decisive role of log-supermodularity or, equivalently, of total positivity of order 2 (TP2) in answering this question.
As a consequence, we establish, at a stroke, the loading monotonicity of a number of well-known insurance premiums, and offer a
host of further weight functions, and consequently of premiums, thus illustrating the power of the herein suggested methodology
for constructing loading-monotonic insurance premiums.
Available via Athens: Palgrave MacMillan
http://www.openathens.net



On 1-convexity and nucleolus of co-insurance games. Driessen, Theo S H; Vito, Fragnelli; Katsev, Ilya V; Khmelnitskaya, Anna B
[RKN: 40016]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (2) : 217-225.
The insurance situation in which an enormous risk is insured by a number of insurance companies is modeled through a
cooperative TU game, the so-called co-insurance game, first introduced in Fragnelli and Marina (2004). In this paper we present
certain conditions on the parameters of the model that guarantee the 1-convexity property of co-insurance games which in turn
ensures the nonemptiness of the core and the linearity of the nucleolus as a function of the variable premium. Further we reveal
conditions when a co-insurance game is representable in the form of a veto-removed game and present an efficient final algorithm
for computing the nucleolus of a veto-removed game.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Optimality of general reinsurance contracts under CTE risk measure. Tan, Ken Seng; Weng, Chengguo; Zhang, Yi [RKN: 44960]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (2) : 175-187.
By formulating a constrained optimization model, we address the problem of optimal reinsurance design using the criterion of
minimizing the conditional tail expectation (CTE) risk measure of the insurer's total risk. For completeness, we analyze the optimal
reinsurance model under both binding and unbinding reinsurance premium constraints. By resorting to the Lagrangian approach
based on the concept of directional derivative, explicit and analytical optimal solutions are obtained in each case under some mild
conditions. We show that pure stop-loss ceded loss function is always optimal. More interestingly, we demonstrate that ceded loss
functions, that are not always non-decreasing, could be optimal. We also show that, in some cases, it is optimal to exhaust the
entire reinsurance premium budget to determine the optimal reinsurance, while in other cases, it is rational to spend less than the
prescribed reinsurance premium budget.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Stochastic comparisons for allocations of policy limits and deductibles with applications. Lu, ZhiYi; Meng, LiLi [RKN: 45127]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (3) : 338-343.
In this paper, we study the problem of comparing losses of a policyholder who has an increasing utility function when the form of
coverage is policy limit and deductible. The total retained losses of a policyholder [formula] are ordered in the usual stochastic
order sense when Xi (i = 1, …, n) are ordered with respect to the likelihood ratio order. The parallel results for the case of deductibles
are obtained in the same way. It is shown that the ordering of the losses are related to the characteristics (log-concavity or
log-convexity) of distributions of the risks. As an application of the comparison results, the optimal problems of allocations of policy
limits and deductibles are studied in usual stochastic order sense and the closed-form optimal solutions are obtained in some
special cases.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

GENERAL INSURANCE COMPANY

Detection and correction of outliers in the bivariate chainladder method. Verdonck, T; Van Wouwe, M [RKN: 44961]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (2) : 188-193.
The expected profit or loss of a non-life insurance company is determined for the whole of its multiple business lines. This implies
the study of the claims reserving problem for a portfolio consisting of several correlated run-off triangles. A popular technique to
deal with such a portfolio is the multivariate chainladder method. However, it is well known that the chainladder method is
very sensitive to outlying data. For the univariate case, we have already developed a robust version of the chainladder method.
In this article we propose two techniques to detect and correct outlying values in a bivariate situation. The methodologies are
illustrated and compared on real examples from practice.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

GENERALISED LINEAR MODELS

A dynamic parameterization modeling for the age-period-cohort mortality. Hatzopoulos, P; Haberman, Steven [RKN: 44959]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (2) : 155-174.
An extended version of dynamic parametric model is proposed for analyzing mortality structures, incorporating the cohort effect. A
one-factor parameterized exponential polynomial in age effects within the generalized linear models (GLM) framework is used.
Sparse principal component analysis (SPCA) is then applied to time-dependent GLM parameter estimates and provides
(marginal) estimates for a two-factor principal component (PC) approach structure. Modeling the two-factor residuals in the same
way, in age-cohort effects, provides estimates for the (conditional) three-factor age–period–cohort model. The age-time and
cohort related components are extrapolated using dynamic linear regression (DLR) models. An application is presented for
England & Wales males (1841–2006).
Available via Athens: Palgrave MacMillan
http://www.openathens.net


A generalized linear model with smoothing effects for claims reserving. Björkwall, Susanna; Hössjer, Ola; Ohlsson, Esbjörn; Verrall,
Richard [RKN: 44972]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (1) : 27-37.
In this paper, we continue the development of the ideas introduced in England and Verrall (2001) by suggesting the use of a
reparameterized version of the generalized linear model (GLM) which is frequently used in stochastic claims reserving. This model
enables us to smooth the origin, development and calendar year parameters in a similar way as is often done in practice, but still
keep the GLM structure. Specifically, we use this model structure in order to obtain reserve estimates and to systemize the model
selection procedure that arises in the smoothing process. Moreover, we provide a bootstrap procedure to achieve a full predictive
distribution.


Available via Athens: Palgrave MacMillan
http://www.openathens.net
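
As background, the GLM most commonly used in stochastic claims reserving, and reparameterised in this paper, is the over-dispersed Poisson cross-classified model for incremental claims C_{ij} in origin year i and development year j,

    \[ \mathbb{E}[C_{ij}] = m_{ij}, \qquad \log m_{ij} = c + \alpha_i + \beta_j, \qquad \operatorname{Var}(C_{ij}) = \phi\, m_{ij}, \]

with calendar year parameters added and the parameter sequences smoothed in the paper's extension.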

A statistical basis for claims experience monitoring. Taylor, Greg Society of Actuaries, - 18 pages. [RKN: 74921]
Shelved at: Per: NAAJ (Oxf) Per NAAJ (Lon) Shelved at: JOU
North American Actuarial Journal (2011) 15 (4) : 535-552.
By claims experience monitoring is meant the systematic comparison of the forecasts from a claims model with claims experience
as it emerges subsequently. In the event that the stochastic properties of the forecasts are known, the comparison can be
represented as a collection of probabilistic statements. This is stochastic monitoring. This paper defines this process rigorously in
terms of statistical hypothesis testing. If the model is a regression model (which is the case for most stochastic claims models),
then the natural form of hypothesis test is a number of likelihood ratio tests, one for each parameter in the valuation model. Such
testing is shown to be very easily implemented by means of generalized linear modeling software. This tests the formal structure
of the claims model and is referred to as microtesting. There may be other quantities (e.g., amount of claim payments in a defined
interval) that require testing for practical reasons. This sort of testing is referred to as macrotesting, and its formulation is also
discussed.
http://www.soa.org/news-and-publications/publications/journals/naaj/naaj-detail.aspx

GERBER-SHIU FUNCTION

A generalized penalty function in Sparre Andersen risk models with surplus-dependent premium. Cheung, Eric C K [RKN: 45133]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (3) : 384-397.
In a general Sparre Andersen risk model with surplus-dependent premium income, the generalization of the Gerber–Shiu function
proposed by Cheung et al. (2010a) is studied. A general expression for this Gerber–Shiu function is derived, and it is shown that
its determination reduces to the evaluation of a transition function which is independent of the penalty function. Properties of and
explicit expressions for such a transition function are derived when the surplus process is subject to (i) constant premium; (ii) a
threshold dividend strategy; or (iii) credit interest. Extension of the approach is discussed for an absolute ruin model with debit
interest.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
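
For reference, the Gerber–Shiu expected discounted penalty function appearing throughout this section is, for a surplus process U with initial surplus u, ruin time T, force of interest delta >= 0 and penalty function w,

    \[ \phi(u) \;=\; \mathbb{E}\Big[ e^{-\delta T}\, w\big(U(T^-),\,|U(T)|\big)\, \mathbf{1}_{\{T<\infty\}} \;\Big|\; U(0)=u \Big], \]

where U(T^-) is the surplus immediately prior to ruin and |U(T)| is the deficit at ruin. The papers in this section study generalisations of this quantity under various premium, interest and dependence structures.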

Mathematical investigation of the Gerber–Shiu function in the case of dependent inter-claim time and claim size. Mihalyko, Eva
Orban; Mihalyko, Csaba [RKN: 45132]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (3) : 378-383.
In this paper we investigate the well-known Gerber–Shiu expected discounted penalty function in the case of dependence
between the inter-claim times and the claim amounts. We set up an integral equation for it and we prove the existence and
uniqueness of its solution in the set of bounded functions. We show that if d>0, the limit property of the solution is not a regularity
condition, but the characteristic of the solution even in the case when the net profit condition is not fulfilled. It is the consequence
of the choice of the penalty function for a given density function. We present an example when the Gerber–Shiu function is not
bounded, consequently, it does not tend to zero. Using an operator technique we also prove exponential boundedness.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

On a multi-threshold compound Poisson surplus process with interest. Mitric, Ilie-Radu [RKN: 45353]
Shelved at: Per: SAJ Shelved at: SCA/ACT
Scandinavian Actuarial Journal (2011) 2 : 75-95.
We consider a multi-threshold compound Poisson surplus process. When the initial surplus is between any two consecutive
thresholds, the insurer has the option to choose the respective premium rate and interest rate. Also, the model allows for
borrowing the current amount of deficit whenever the surplus falls below zero. Starting from the integro-differential equations
satisfied by the Gerber-Shiu function that appear in Yang et al. (2008), we consider exponentially and phase-type(2) distributed
claim sizes, in which cases we are able to transform the integro-differential equations into ordinary differential equations. As a
result, we obtain explicit expressions for the Gerber-Shiu function.

The proper distribution function of the deficit in the delayed renewal risk model. Kim, So-Yeun; Willmot, Gordon E [RKN: 45355]
Shelved at: Per: SAJ Shelved at: SCA/ACT
Scandinavian Actuarial Journal (2011) 2 : 118-137.
The main focus of this paper is to extend the analysis of ruin-related quantities to the delayed renewal risk models. First, the
background for the delayed renewal risk model is introduced and a general equation that is used as a framework is derived. The
equation is obtained by conditioning on the first drop below the initial surplus level. Then, we consider the deficit at ruin among
many random variables associated with ruin. The properties of the distribution function (DF) of the proper deficit are examined in
particular.

GLOBAL WARMING

An extension of the Whittaker-Henderson method of graduation. Nocon, Alicja S; Scott, William F [RKN: 44882]
Shelved at: Per: SAJ Shelved at: SCA/ACT
Scandinavian Actuarial Journal (2012) 1 : 70-79.
We present an outline and historical summary of the Whittaker-Henderson method of graduation (or data smoothing), together
with an extension of the method in which the graduated values are obtained by minimising a Whittaker-Henderson criterion
subject to constraints. Examples are given, using data for the global average temperature anomaly and for a set of share prices, in
which the proposed method appears to give good results.

Global warming, extreme weather events, and forecasting tropical cyclones. Chang, Carolyn W; Chang, Jack S K; Guan Lim, Kian -
25 pages. [RKN: 70744]
Shelved at: Per: Astin Bull (Oxf) Shelved at: JOU
ASTIN Bulletin (2012) 42 (1) : 77-101.
Global warming has more than doubled the likelihood of extreme weather events, e.g. the 2003 European heat wave, the growing
intensity of rain and snow in the Northern Hemisphere, and the increasing risk of flooding in the United Kingdom. It has also
induced an increasing number of deadly tropical cyclones with a continuing trend. Many individual meteorological dynamic
simulations and statistical models are available for forecasting hurricanes, but they neither forecast hurricane intensity well nor
produce a clear-cut consensus. We develop a novel hurricane forecasting model by straddling two seemingly unrelated disciplines,
physical science and finance, based on the well-known price discovery function of trading in financial markets. Traders of
hurricane derivative contracts employ all available forecasting models, public or proprietary, to forecast hurricanes in order to
make their pricing and trading decisions. By using transactional price changes of these contracts that continuously clear the
market supply and demand as the predictor, and with calibration to extract the embedded hurricane information by developing
hurricane futures and futures option pricing models, one can gain a forward-looking market-consensus forecast out of all of the
individual forecasting models employed. Our model can forecast when a hurricane will make landfall, how destructive it will be,
and how this destructive power will evolve from inception to landing. While the NHC (National Hurricane Center) blends 50-plus
individual forecasting results for its consensus model forecasts using a subjective approach, our aggregate is market-based.
Believing their proprietary forecasts are sufficiently different from our market-based forecasts, traders could also examine the
discrepancy for a potential trading opportunity using hurricane derivatives. We also provide a real case analysis of Hurricane Irene
in 2011 using our methodology.
http://www.actuaries.org/index.cfm?lang=EN&DSP=PUBLICATIONS&ACT=ASTIN BULLETIN

GRADUATION

Automated graduation using Bayesian trans-dimensional models. Verrall, Richard J; Haberman, S Institute and Faculty of
Actuaries; Cambridge University Press, - 21 pages. [RKN: 74953]
Shelved at: Per: AAS (Oxf) Per: AAS (Lon)
Annals of Actuarial Science (2011) 5(2) : 231-251.
This paper presents a new method of graduation which uses parametric formulae together with Bayesian reversible jump Markov
chain Monte Carlo methods. The aim is to provide a method which can be applied to a wide range of data, and which does not
require a lot of adjustment or modification. The method also does not require one particular parametric formula to be selected:
instead, the graduated values are a weighted average of the values from a range of formulae. In this way, the new method can be
seen as an automatic graduation method which we believe can be applied in many cases without any adjustments and provide
satisfactory graduated values. An advantage of a Bayesian approach is that it allows for model uncertainty unlike standard
methods of graduation.
http://www.actuaries.org.uk/research-and-resources/pages/access-journals

An extension of the Whittaker-Henderson method of graduation. Nocon, Alicja S; Scott, William F [RKN: 44882]
Shelved at: Per: SAJ Shelved at: SCA/ACT
Scandinavian Actuarial Journal (2012) 1 : 70-79.
We present an outline and historical summary of the Whittaker-Henderson method of graduation (or data smoothing), together
with an extension of the method in which the graduated values are obtained by minimising a Whittaker-Henderson criterion
subject to constraints. Examples are given, using data for the global average temperature anomaly and for a set of share prices, in
which the proposed method appears to give good results.
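
As background to the two Whittaker-Henderson entries above: in its classical unconstrained form the method chooses graduated
values z that minimise a weighted fidelity term plus a roughness penalty on differences of z, which reduces to a single linear
solve. A minimal sketch (Python; the weights, smoothing parameter and data below are illustrative, and the constrained extension
proposed in the paper is not reproduced):

    import numpy as np

    def whittaker_henderson(y, w, lam, order=3):
        """Classical (unconstrained) Whittaker-Henderson graduation.

        Minimises sum_i w_i (z_i - y_i)^2 + lam * sum((order-th difference of z)^2),
        i.e. solves the linear system (W + lam * D'D) z = W y.
        """
        n = len(y)
        D = np.diff(np.eye(n), n=order, axis=0)   # order-th difference operator
        W = np.diag(np.asarray(w, dtype=float))
        return np.linalg.solve(W + lam * D.T @ D, W @ np.asarray(y, dtype=float))

    # illustrative use on a short noisy series
    y = [10.2, 11.0, 13.5, 12.8, 15.1, 16.0, 18.4, 17.9, 20.3, 21.1]
    z = whittaker_henderson(y, w=np.ones(len(y)), lam=100.0, order=3)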

GRAM-CHARLIER

Gram-Charlier processes and equity-indexed annuities. Chateau, Jean-Pierre; Dufresne, Daniel (2012). - Victoria: University of
Melbourne, 2012. - 32 pages. [RKN: 73948]
A Gram-Charlier distribution has a density that is a polynomial times a normal density. The historical connection between actuarial
science and the Gram-Charlier expansions goes back to the 19th century. A critical review of the financial literature on the
Gram-Charlier distribution is made. Properties of the Gram-Charlier distributions are derived, including moments, tail estimates,
moment indeterminacy of the exponential of a Gram-Charlier distributed variable, non-existence of a continuous-time Lévy process
with Gram-Charlier increments, as well as formulas for option prices and their sensitivities. A procedure for simulating
Gram-Charlier distributions is given. Multiperiod Gram-Charlier modelling of asset returns is described, apparently for the first
time. Formulas for equity indexed annuities premium option values are given, and a numerical illustration shows the importance of
skewness and kurtosis of the risk neutral density.
No.1 (1993) onwards available online. Download as PDF.
http://www.economics.unimelb.edu.au/ACT/papers.shtml
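
For readers who want to experiment with this family, the Gram-Charlier (A-series) density adjusts a standard normal density by
skewness and excess-kurtosis terms through Hermite polynomials. A minimal evaluation sketch (Python; the standardised form
with coefficients skewness/6 and excess kurtosis/24 is a textbook convention and not necessarily the paper's exact
parameterisation):

    import numpy as np
    from numpy.polynomial.hermite_e import hermeval

    def gram_charlier_pdf(x, skew=0.0, ex_kurt=0.0):
        """Gram-Charlier A-series density: a polynomial times a standard normal density.
        Note the polynomial factor can turn negative for large |skew| or |ex_kurt|."""
        x = np.asarray(x, dtype=float)
        phi = np.exp(-0.5 * x**2) / np.sqrt(2.0 * np.pi)
        he3 = hermeval(x, [0, 0, 0, 1])        # He3(x) = x^3 - 3x
        he4 = hermeval(x, [0, 0, 0, 0, 1])     # He4(x) = x^4 - 6x^2 + 3
        return phi * (1.0 + skew / 6.0 * he3 + ex_kurt / 24.0 * he4)

    densities = gram_charlier_pdf(np.linspace(-4, 4, 9), skew=0.5, ex_kurt=1.0)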

GUARANTEES

Equity-linked pension schemes with guarantees. Nielsen, J Aase; Sandmann, Klaus; Schlögl, Erik [RKN: 44956]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (3) : 547-564.
This paper analyses the relationship between the level of a return guarantee in an equity-linked pension scheme and the
proportion of an investor's contribution needed to finance this guarantee. Three types of schemes are considered: investment
guarantee, contribution guarantee and surplus participation. The evaluation of each scheme involves pricing an Asian option, for
which relatively tight upper and lower bounds can be calculated in a numerically efficient manner. The authors find a negative (and
for two contract specifications also concave) relationship between the participation in the surplus return of the investment strategy
and the guarantee level in terms of a minimum rate of return. Furthermore, the introduction of the possibility of early termination of
the contract (e.g. due to the death of the investor) has no qualitative and very little quantitative impact on this relationship.

Available via Athens: Palgrave MacMillan
http://www.openathens.net

Optimal asset allocation for DC pension plans under inflation. Han, Nan-wei; Hung, Mao-Wei [RKN: 45734]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 172-181.
In this paper, the stochastic dynamic programming approach is used to investigate the optimal asset allocation for a
defined-contribution pension plan with downside protection under stochastic inflation. The plan participant invests the fund wealth
and the stochastic interim contribution flows into the financial market. The nominal interest rate model is described by the
Cox-Ingersoll-Ross (Cox et al., 1985) dynamics. To cope with the inflation risk, the inflation-indexed bond is included in the asset
menu. The retired individuals receive an annuity that is indexed by inflation and a downside protection on the amount of this
annuity is considered. The closed-form solution is derived under the CRRA utility function. Finally, a numerical application is
presented to characterize the dynamic behavior of the optimal investment strategy.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

A performance analysis of participating life insurance contracts. Faust, Roger; Schmeiser, Hato; Zemp, Alexandra [RKN: 45733]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 158-171.
Participating life insurance contracts are one of the most important products in the European life insurance market. Even though
these contract forms are very common, only very little research has been conducted with respect to their performance. Hence, we
conduct a performance analysis to provide decision support for policyholders. We decompose a participating life insurance
contract into a term life insurance and a savings part and simulate the cash flow distribution of the latter. Simulation results are
compared with cash flows resulting from two benchmarks investing in the same portfolio of assets but without investment
guarantees and bonus distribution schemes, in order to measure the impact of these two product features. To provide a realistic
picture within the two alternatives, we take transaction costs and wealth transfers between different groups of policyholders into
account. We show that the payoff distribution strongly depends on the initial reserve situation and managerial discretion. Results
indicate that policyholders will in general profit from a better payoff distribution of the participating life insurance compared to a
mutual fund benchmark but not compared to an exchange-traded fund benchmark portfolio.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

HASTINGS ALGORITHM

A Bayesian approach to parameter estimation for kernel density estimation via transformations. Liu, Qing; Pitt, David; Zhang,
Xibin; Wu, Xueyuan Institute and Faculty of Actuaries; Cambridge University Press, - 13 pages. [RKN: 74950]
Shelved at: Per: AAS (Oxf) Per: AAS (Lon)
Annals of Actuarial Science (2011) 5(2) : 181-193.
In this paper, we present a Markov chain Monte Carlo (MCMC) simulation algorithm for estimating parameters in the kernel
density estimation of bivariate insurance claim data via transformations. Our data set consists of two types of auto insurance claim
costs and exhibits a high level of skewness in the marginal empirical distributions. Therefore, the kernel density estimator based
on original data does not perform well. However, the density of the original data can be estimated through estimating the density of
the transformed data using kernels. It is well known that the performance of a kernel density estimator is mainly determined by the
bandwidth, and only in a minor way by the kernel. In the current literature, there have been some developments in the area of
estimating densities based on transformed data, where bandwidth selection usually depends on pre-determined transformation
parameters. Moreover, in the bivariate situation, the transformation parameters were estimated for each dimension individually.
We use a Bayesian sampling algorithm and present a Metropolis-Hastings sampling procedure to sample the bandwidth and
transformation parameters from their posterior density. Our contribution is to estimate the bandwidths and transformation
parameters simultaneously within a Metropolis-Hastings sampling procedure. Moreover, we demonstrate that the correlation
between the two dimensions is better captured through the bivariate density estimator based on transformed data.
http://www.actuaries.org.uk/research-and-resources/pages/access-journals
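
For readers unfamiliar with the sampling machinery referred to above, a generic random-walk Metropolis-Hastings sampler has
the following shape (a minimal sketch with a hypothetical one-dimensional log-posterior; it is not the bandwidth and
transformation-parameter sampler developed in the paper):

    import numpy as np

    def metropolis_hastings(log_post, theta0, n_iter=5000, step=0.1, seed=0):
        """Random-walk Metropolis-Hastings sampler for a generic log-posterior."""
        rng = np.random.default_rng(seed)
        theta = np.asarray(theta0, dtype=float)
        current_lp = log_post(theta)
        draws = np.empty((n_iter, theta.size))
        for i in range(n_iter):
            proposal = theta + step * rng.standard_normal(theta.size)
            proposal_lp = log_post(proposal)
            # accept with probability min(1, posterior ratio)
            if np.log(rng.uniform()) < proposal_lp - current_lp:
                theta, current_lp = proposal, proposal_lp
            draws[i] = theta
        return draws

    # hypothetical target: a log-posterior that is quadratic around 1.0
    draws = metropolis_hastings(lambda t: -0.5 * ((t[0] - 1.0) / 0.3) ** 2, [1.0])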

HEDGING

Assessing the costs of protection in a context of switching stochastic regimes. Barrieu, Pauline; Bellamy, Nadine; Sahut,
Jean-Michel [RKN: 45880]
Shelved at: Per. AMF
Applied Mathematical Finance (2012) 19 (5-6) : 495-511.
We consider the problem of cost assessment in the context of switching stochastic regimes. The dynamics of a given asset include
a background noise, described by a Brownian motion, and a random shock, the impact of which is characterized by changes in the
diffusion coefficients. A particular economic agent that is directly exposed to variations in the underlying asset price incurs some
costs, F(L), when the underlying asset price reaches a certain threshold, L. Ideally, the agent would make advance provision, or
hedge, for these costs at time 0. We evaluate the amount of provision, or the hedging premium, M(L), for these costs in the
disrupted environment, with changes in the regime for a given time horizon, and analyse the sensitivity of this amount to possible
model misspecifications.

Available via Athens: Taylor & Francis Online
http://www.openathens.net

A computationally efficient algorithm for estimating the distribution of future annuity values under interest-rate and longevity
risks. Dowd, Kevin; Blake, David; Cairns, Andrew J G Society of Actuaries, - 11 pages. [RKN: 74836]
Shelved at: Per: NAAJ (Oxf) Per NAAJ (Lon) Shelved at: JOU
North American Actuarial Journal (2011) 15 (2) : 237-247.
This paper proposes a computationally efficient algorithm for quantifying the impact of interest-rate risk and longevity risk on the
distribution of annuity values in the distant future. The algorithm simulates the state variables out to the end of the horizon period
and then uses a Taylor series approximation to compute approximate annuity values at the end of that period, thereby avoiding a
computationally expensive simulation-within-simulation problem. Illustrative results suggest that annuity values are likely to rise
considerably but are also quite uncertain. These findings have some unpleasant implications both for defined contribution pension
plans and for defined benefit plan sponsors considering using annuities to hedge their exposure to these risks at some point in the
future.
http://www.soa.org/news-and-publications/publications/journals/naaj/naaj-detail.aspx
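
Schematically, the shortcut described above replaces the inner (nested) valuation by a Taylor expansion of the annuity value a
in the simulated state variables X_T around a reference point \bar{X} (a generic second-order expansion, not the paper's exact
formula):

    a(X_T) \approx a(\bar{X}) + \nabla a(\bar{X})^{\top}(X_T - \bar{X})
            + \tfrac{1}{2}(X_T - \bar{X})^{\top} H(\bar{X}) (X_T - \bar{X}),

so that only the outer simulation of X_T is required, with the gradient and Hessian of the annuity value computed once.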

Delta-Gamma hedging of mortality and interest rate risk. Luciano, Elisa; Regis, Luca; Vigna, Elena [RKN: 45643]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 50 (3) : 402-412.
One of the major concerns of life insurers and pension funds is the increasing longevity of their beneficiaries. This paper studies
the hedging problem of annuity cash flows when mortality and interest rates are stochastic. We first propose a Delta-Gamma
hedging technique for mortality risk. The risk factor against which to hedge is the difference between the actual mortality intensity
in the future and its forecast today, the forward intensity. We specialize the hedging technique first to the case in which mortality
intensities are affine, then to Ornstein-Uhlenbeck and Feller processes, providing actuarial justifications for this selection. We
show that, without imposing no arbitrage, we can get equivalent probability measures under which the HJM condition for no
arbitrage is satisfied. Last, we extend our results to the presence of both interest rate and mortality risk. We provide a UK
calibrated example of Delta-Gamma hedging of both mortality and interest rate risk.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Dynamic hedging of conditional value-at-risk. Melnikov, Alexander; Smirnov, Ivan [RKN: 45735]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 182-190.
In this paper, the problem of partial hedging is studied by constructing hedging strategies that minimize conditional value-at-risk
(CVaR) of the portfolio. Two dual versions of the problem are considered: minimization of CVaR with the initial wealth bounded
from above, and minimization of hedging costs subject to a CVaR constraint. The Neyman-Pearson lemma approach is used to
deduce semi-explicit solutions. Our results are illustrated by constructing CVaR-efficient hedging strategies for a call option in the
Black-Scholes model and also for an embedded call option in an equity-linked life insurance contract.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
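
For orientation, conditional value-at-risk at level alpha is the expected loss beyond the alpha-quantile of the loss distribution, and
on a simulated loss sample it can be estimated directly (a minimal empirical sketch, unrelated to the specific hedging construction
of the paper; the lognormal losses are purely illustrative):

    import numpy as np

    def var_cvar(losses, alpha=0.99):
        """Empirical value-at-risk and conditional value-at-risk of a loss sample."""
        losses = np.asarray(losses, dtype=float)
        var = np.quantile(losses, alpha)
        cvar = losses[losses >= var].mean()   # average loss beyond the alpha-quantile
        return var, cvar

    sample = np.random.default_rng(1).lognormal(mean=0.0, sigma=1.0, size=100_000)
    var99, cvar99 = var_cvar(sample, alpha=0.99)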

Hedging of spatial temperature risk with market-traded futures. Barth, Andrea; Benth, Fred Espen; Potthoff, Jürgen [RKN: 45255]
Applied Mathematical Finance (2011) 18 (1-2) : 93-117.
The main objective of this work is to construct optimal temperature futures from available market-traded contracts to hedge spatial
risk. Temperature dynamics are modelled by a stochastic differential equation with spatial dependence. Optimal positions in
market-traded futures minimizing the variance are calculated. Examples with numerical simulations based on a fast algorithm for
the generation of random fields are presented.

Available via Athens: Taylor & Francis Online
http://www.openathens.net



The impact of stochastic volatility on pricing, hedging, and hedge efficiency of withdrawal benefit guarantees in variable
annuities. Kling, Alexander; Ruez, Frederik; Russ, Jochen - 35 pages. [RKN: 74745]
Shelved at: Per: Astin Bull (Oxf) Shelved at: JOU
ASTIN Bulletin (2011) 41 (2) : 511-545.
We analyze different types of guaranteed withdrawal benefits for life, the latest guarantee feature within variable annuities.
Besides an analysis of the impact of different product features on the clients' payoff profile, we focus on pricing and hedging of the
guarantees. In particular, we investigate the impact of stochastic equity volatility on pricing and hedging. We consider different
dynamic hedging strategies for delta and vega risks and compare their performance. We also examine the effects if the hedging
model (with deterministic volatility) differs from the data-generating model (with stochastic volatility). This is an indication for the
model risk an insurer takes by assuming constant equity volatilities for risk management purposes, whereas in the real world
volatilities are stochastic.
http://www.actuaries.org/index.cfm?lang=EN&DSP=PUBLICATIONS&ACT=ASTIN BULLETIN

Longevity risk management for life and variable annuities: the effectiveness of static hedging using longevity bonds and
derivatives. Ngai, Andrew; Sherris, Michael [RKN: 44980]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (1) : 100-114.
For many years, the longevity risk of individuals has been underestimated, as survival probabilities have improved across the
developed world. The uncertainty and volatility of future longevity has posed significant risk issues for both individuals and product
providers of annuities and pensions. This paper investigates the effectiveness of static hedging strategies for longevity risk
management using longevity bonds and derivatives (q-forwards) for the retail products: life annuity, deferred life annuity, indexed
life annuity, and variable annuity with guaranteed lifetime benefits. Improved market and mortality models are developed for the
underlying risks in annuities. The market model is a regime-switching vector error correction model for GDP, inflation, interest
rates, and share prices. The mortality model is a discrete-time logit model for mortality rates with age dependence. Models were
estimated using Australian data. The basis risk between annuitant portfolios and population mortality was based on UK
experience. Results show that static hedging using q-forwards or longevity bonds reduces the longevity risk substantially for life
annuities, but significantly less for deferred annuities. For inflation-indexed annuities, static hedging of longevity is less effective
because of the inflation risk. Variable annuities provide limited longevity protection compared to life annuities and indexed
annuities, and as a result longevity risk hedging adds little value for these products.


Available via Athens: Palgrave MacMillan
http://www.openathens.net

No-good-deal, local mean-variance and ambiguity risk pricing and hedging for an insurance payment process. Delong, Lukasz -
30 pages. [RKN: 70749]
Shelved at: Per: Astin Bull (Oxf) Shelved at: JOU
ASTIN Bulletin (2012) 42 (1) : 203-232.
We study pricing and hedging for an insurance payment process. We investigate a Black-Scholes financial model with stochastic
coefficients and a payment process with death, survival and annuity claims driven by a point process with a stochastic intensity.
The dependence of the claims and the intensity on the financial market and on an additional background noise (correlated index,
longevity risk) is allowed. We establish a general modeling framework for no-good-deal, local mean-variance and ambiguity risk
pricing and hedging. We show that these three valuation approaches are equivalent under appropriate formulations. We
characterize the price and the hedging strategy as a solution to a backward stochastic differential equation. The results could be
applied to pricing and hedging of variable annuities, surrender options under an irrational lapse behavior and mortality derivatives.
http://www.actuaries.org/index.cfm?lang=EN&DSP=PUBLICATIONS&ACT=ASTIN BULLETIN

Optimal strategies for hedging portfolios of unit-linked life insurance contracts with minimum death guarantee. Nteukam T,
Oberlain; Planchet, Frédéric; Thérond, Pierre [RKN: 40010]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (2) : 161-175.
In this paper, we are interested in hedging strategies which allow the insurer to reduce the risk to their portfolio of unit-linked life
insurance contracts with minimum death guarantee. Hedging strategies are developed in the Black and Scholes model and in the
Merton jump-diffusion model. According to the new frameworks (IFRS, Solvency II and MCEV), risk premium is integrated into our
valuations. We will study the optimality of hedging strategies by comparing risk indicators (Expected loss, volatility, VaR and CTE)
in relation to transaction costs and costs generated by the re-hedging error. We will analyze the robustness of hedging strategies
by stress-testing the effect of a sharp rise in future mortality rates and a severe depreciation in the price of the underlying asset.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Pension fund management and conditional indexation. Kleinow, Torsten [RKN: 45300]
Shelved at: Per: Astin Bull (Oxf) Shelved at: JOU/AST
ASTIN Bulletin (2011) 41 (1) : 61-86.
Conditional indexation offers a middle way between defined benefit and defined contribution pension schemes. In this paper, we
consider a fully-funded pension scheme with conditional indexation. We show how the pension fund can be managed to reduce
the risks associated with promised pension benefits when declared benefits are adjusted regularly during the working life. In
particular, we derive an investment strategy that provides protection against underfunding at retirement and which is self-financing
on average. Our results are illustrated in an extensive simulation study.
online access via International Actuarial Association:
http://www.actuaries.org/index.cfm?lang=EN&DSP=PUBLICATIONS&ACT=ASTIN BULLETIN

Quantile hedging for equity-linked contracts. Klusik, Przemyslaw; Palmowski, Zbigniew [RKN: 40023]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (2) : 280-286.
We consider an equity-linked contract whose payoff depends on the lifetime of the policy holder and the stock price. We provide
the best strategy for an insurance company assuming limited capital for the hedging. The main idea of the proof consists in
reducing the construction of such strategies for a given claim to a problem of superhedging for a modified claim, which is the
solution to a static optimization problem of the Neyman-Pearson type. This modified claim is given via some sets constructed in
an iterative way. Some numerical examples are also given.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Variance-optimal hedging for time-changed Lévy processes. Kallsen, Jan; Pauwels, Arnd [RKN: 45251]
Applied Mathematical Finance (2011) 18 (1-2) : 1-28.
In this article, we solve the variance-optimal hedging problem in stochastic volatility (SV) models based on time-changed Lévy
processes, that is, in the setup of Carr et al. (2003). The solution is derived using results for general affine models in the
companion article [Kallsen and Pauwels (2009)].

Available via Athens: Taylor & Francis Online
http://www.openathens.net

HURRICANES

Global warming, extreme weather events, and forecasting tropical cyclones. Chang, Carolyn W; Chang, Jack S K; Guan Lim, Kian -
25 pages. [RKN: 70744]
Shelved at: Per: Astin Bull (Oxf) Shelved at: JOU
ASTIN Bulletin (2012) 42 (1) : 77-101.
Global warming has more than doubled the likelihood of extreme weather events, e.g. the 2003 European heat wave, the growing
intensity of rain and snow in the Northern Hemisphere, and the increasing risk of flooding in the United Kingdom. It has also
induced an increasing number of deadly tropical cyclones with a continuing trend. Many individual meteorological dynamic
simulations and statistical models are available for forecasting hurricanes, but they neither forecast hurricane intensity well nor
produce a clear-cut consensus. We develop a novel hurricane forecasting model by straddling two seemingly unrelated disciplines,
physical science and finance, based on the well-known price discovery function of trading in financial markets. Traders of
hurricane derivative contracts employ all available forecasting models, public or proprietary, to forecast hurricanes in order to
make their pricing and trading decisions. By using transactional price changes of these contracts that continuously clear the
market supply and demand as the predictor, and with calibration to extract the embedded hurricane information by developing
hurricane futures and futures option pricing models, one can gain a forward-looking market-consensus forecast out of all of the
individual forecasting models employed. Our model can forecast when a hurricane will make landfall, how destructive it will be,
and how this destructive power will evolve from inception to landing. While the NHC (National Hurricane Center) blends 50-plus
individual forecasting results for its consensus model forecasts using a subjective approach, our aggregate is market-based.
Believing their proprietary forecasts are sufficiently different from our market-based forecasts, traders could also examine the
discrepancy for a potential trading opportunity using hurricane derivatives. We also provide a real case analysis of Hurricane Irene
in 2011 using our methodology.
http://www.actuaries.org/index.cfm?lang=EN&DSP=PUBLICATIONS&ACT=ASTIN BULLETIN

A nonhomogeneous Poisson hidden Markov model for claim counts. Lu, Yi; Zeng, Leilei - 22 pages. [RKN: 70748]
Shelved at: Per: Astin Bull (Oxf) Shelved at: JOU
ASTIN Bulletin (2012) 42 (1) : 181-202.
We propose a nonhomogeneous Poisson hidden Markov model for a time series of claim counts that accounts for both seasonal
variations and random fluctuations in the claims intensity. It assumes that the parameters of the intensity function for the
nonhomogeneous Poisson distribution vary according to an (unobserved) underlying Markov chain. This can apply to natural
phenomena that evolve in a seasonal environment. For example, hurricanes that are subject to random fluctuations (El Niño-La
Niña cycles) affect insurance claims. The Expectation-Maximization (EM) algorithm is used to calculate the maximum likelihood
estimators for the parameters of this dynamic Poisson hidden Markov model. Statistical applications of this model to Atlantic
hurricanes and tropical storms data are discussed.
http://www.actuaries.org/index.cfm?lang=EN&DSP=PUBLICATIONS&ACT=ASTIN BULLETIN
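
To make the estimation problem above concrete, the likelihood that the EM iterations maximise can be evaluated with the
forward algorithm; a homogeneous two-state sketch is shown below (Python; the transition matrix, intensities and counts are
made up, and the paper's nonhomogeneous seasonal intensity function is not reproduced):

    import numpy as np
    from scipy.stats import poisson

    def poisson_hmm_loglik(counts, trans, lambdas, init):
        """Log-likelihood of a homogeneous Poisson hidden Markov model via the scaled forward algorithm.
        trans: (m, m) transition matrix; lambdas: (m,) Poisson intensities; init: (m,) initial distribution."""
        alpha = init * poisson.pmf(counts[0], lambdas)
        loglik = np.log(alpha.sum())
        alpha = alpha / alpha.sum()
        for y in counts[1:]:
            alpha = (alpha @ trans) * poisson.pmf(y, lambdas)
            loglik += np.log(alpha.sum())
            alpha = alpha / alpha.sum()
        return loglik

    ll = poisson_hmm_loglik(np.array([3, 1, 0, 5, 7, 2]),
                            trans=np.array([[0.9, 0.1], [0.2, 0.8]]),
                            lambdas=np.array([1.0, 6.0]),
                            init=np.array([0.5, 0.5]))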

IMPERFECT INFORMATION

Updating beliefs with imperfect signals: Experimental evidence. Poinas, François; Rosaz, Julie; Roussillon, Béatrice Springer, - 23
pages. [RKN: 73972]
Shelved at: Per: J Risk Uncrtnty
Journal of Risk and Uncertainty (2012) 44 (3) : 219-241.
We conduct an experiment on individual choice under risk in which we study belief updating when an agent receives a signal that
restricts the number of possible states of the world. Subjects observe a sample drawn from an urn and form initial beliefs about the
urn's composition. We then elicit how beliefs are modified after subjects receive a signal that restricts the set of the possible urns
from which the observed sample could have been drawn. We find that this type of signal increases the frequency of correct
assessments and that prediction accuracy is higher for lower levels of risk. We also show that prediction accuracy is higher after
invalidating signals (i.e. signals that contradict the initial belief). This pattern is explained by the lower level of risk associated with
invalidating signals. Finally, we find evidence for a lack of persistence of choices under high risk.
http://www.openathens.net

INCOME

Pricing fixed-income securities in an information-based framework. Hughston, Lane P; Macrina, Andrea Routledge, [RKN: 45844]
Shelved at: Per. AMF
Applied Mathematical Finance (2012) 19 (3-4) : 361-379.
The purpose of this article is to introduce a class of information-based models for the pricing of fixed-income securities. We
consider a set of continuous-time processes that describe the flow of information concerning market factors in a monetary
economy. The nominal pricing kernel is assumed to be given at any specified time by a function of the values of information
processes at that time. Using a change-of-measure technique, we derive explicit expressions for the prices of nominal discount
bonds and deduce the associated dynamics of the short rate of interest and the market price of risk. The interest rate positivity
condition is expressed as a differential inequality. An example that shows how the model can be calibrated to an arbitrary initial
yield curve is presented. We proceed to model the price level, which is also taken at any specified time to be given by a function of
the values of the information processes at that time. A simple model for a stochastic monetary economy is introduced in which the
prices of the nominal discount bonds and inflation-linked notes can be expressed in terms of aggregate consumption and the
liquidity benefit generated by the money supply.
Available via Athens: Taylor & Francis Online
http://www.openathens.net

INCOME PROTECTION

The genetics of breast and ovarian cancer IV: a model of breast cancer progression. Lu, Baopeng; Macdonald, Angus S; Waters,
Howard R [RKN: 44921]
Shelved at: Per: SAJ Shelved at: SCA/ACT
Scandinavian Actuarial Journal (2011) 4 : 239-266.
Gui et al. (2006), in Part III of a series of papers, proposed a dynamic family history model of breast cancer and ovarian cancer in
which the development of a family history was represented explicitly as a transition between states, and then applied this model to
life insurance and critical illness insurance. In this study, the authors extend the model to income protection insurance. In this
paper, Part IV of the series, the authors construct and parameterise a semi-Markov model for the life history of a woman with
breast cancer, in which events such as diagnosis, treatment, recovery and recurrence are incorporated. In Part V, we then show:
(a) estimates of premium ratings depending on genotype or family history; and (b) the impact of adverse selection under various
moratoria on the use of genetic information.

INCOME PROTECTION INSURANCE

The genetics of breast and ovarian cancer V: application to income protection insurance. Lu, Baopeng; Macdonald, Angus S;
Waters, Howard R; Yu, Fei [RKN: 44922]
Shelved at: Per: SAJ Shelved at: SCA/ACT
Scandinavian Actuarial Journal (2011) 4 : 267-291.
In part IV, we presented a comprehensive model of a life history of a woman at risk of breast cancer (BC), in which relevant events
such as diagnosis, treatment, recovery and recurrence were represented explicitly, and corresponding transition intensities were
estimated. In this part, the authors study some applications to income protection insurance (IPI) business. The authors calculate
premiums based either on genetic test results or more practically on a family history of breast cancer. They then extend the model
into an Income Protection Insurance model by incorporating rates of insurance-buying behaviour, in order to estimate the possible
costs of adverse selection, in terms of increased premiums, under various moratoria on the use of genetic information.

Survival analysis of left truncated income protection insurance data. Liu, Qing; Pitt, David; Wang, Yan; Wu, Xueyuan (2012). -
Victoria: University of Melbourne, 2012. - 19 pages. [RKN: 73987]
One of the main characteristics of Income Protection Insurance (IPI) claim duration data, which has not been considered in the
actuarial literature on the topic, is left-truncation. Claimants that are observed are those whose sickness durations are longer than
the deferred periods specified in the policies, and hence left-truncation exists in these data. This paper investigates a series of
conditional mixture models when applying survival analysis to model sickness durations of IPI claimants, and examines the
consequence of treating the IPI data with lengthy deferred periods as complete data and therefore ignoring the left truncation by
fitting the corresponding unconditional distributions. It also quantifies the extent of the bias in the resulting parameter estimates
when ignoring the left-truncation in the data. Using the UK Continuous Mortality Investigation (CMI) sickness duration data, some
well-fitting survival model results are estimated. It is demonstrated that ignoring the left-truncation in certain IPI data can lead to
substantially different statistical estimates. We therefore suggest taking left-truncation into account by fitting conditional mixture
distributions to IPI data. Furthermore, the best fitting model is extended by introducing a number of covariates into the conditional
part to do regression analysis.
No.1 (1993) onwards available online. Download as PDF.
http://www.economics.unimelb.edu.au/ACT/papers.shtml
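
The left-truncation issue described above enters the likelihood directly: a claimant is observed only because the sickness
duration exceeded the deferred period d, so each observation contributes a conditional rather than an unconditional density. For a
duration T with density f and survival function S, schematically,

    L(\theta) = \prod_i \frac{f(t_i; \theta)}{S(d_i; \theta)},

whereas treating the data as complete replaces the denominator by 1, which is the source of the bias the paper quantifies.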




INDEPENDENCE

Approximation of bivariate copulas by patched bivariate Fréchet copulas. Zheng, Yanting; Yang, Jingping; Huang, Jianhua Z [RKN:
40019]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (2) : 246-256.
Bivariate Fréchet (BF) copulas characterize dependence as a mixture of three simple structures: comonotonicity, independence
and countermonotonicity. They are easily interpretable but have limitations when used as approximations to general dependence
structures. To improve the approximation property of the BF copulas and keep the advantage of easy interpretation, we develop a
new copula approximation scheme by using BF copulas locally and patching the local pieces together. Error bounds and a
probabilistic interpretation of this approximation scheme are developed. The new approximation scheme is compared with several
existing copula approximations, including shuffle of min, checkmin, checkerboard and Bernstein approximations and exhibits
better performance, especially in characterizing the local dependence. The utility of the new approximation scheme in insurance
and finance is illustrated in the computation of the rainbow option prices and stop-loss premiums.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
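
For reference, a bivariate Fréchet copula is a convex combination of the three structures named above (comonotonicity,
independence, countermonotonicity); a minimal sketch of the copula itself (Python; the local patching scheme developed in the
paper is not shown):

    import numpy as np

    def bivariate_frechet_copula(u, v, p_plus, p_minus):
        """Bivariate Frechet copula: mixture of the upper Frechet bound, independence
        and the lower Frechet bound, with weights p_plus, 1 - p_plus - p_minus, p_minus."""
        p_ind = 1.0 - p_plus - p_minus
        if p_ind < 0:
            raise ValueError("p_plus + p_minus must not exceed 1")
        return (p_plus * np.minimum(u, v)                     # comonotonic part M(u, v)
                + p_ind * u * v                               # independence part
                + p_minus * np.maximum(u + v - 1.0, 0.0))     # countermonotonic part W(u, v)

    c = bivariate_frechet_copula(0.3, 0.7, p_plus=0.5, p_minus=0.1)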

INDEXATION

Pension fund management and conditional indexation. Kleinow, Torsten [RKN: 45300]
Shelved at: Per: Astin Bull (Oxf) Shelved at: JOU/AST
ASTIN Bulletin (2011) 41 (1) : 61-86.
Conditional indexation offers a middle way between defined benefit and defined contribution pension schemes. In this paper, we
consider a fully-funded pension scheme with conditional indexation. We show how the pension fund can be managed to reduce
the risks associated with promised pension benefits when declared benefits are adjusted regularly during the working life. In
particular, we derive an investment strategy that provides protection against underfunding at retirement and which is self-financing
on average. Our results are illustrated in an extensive simulation study.
online access via International Actuarial Association:
http://www.actuaries.org/index.cfm?lang=EN&DSP=PUBLICATIONS&ACT=ASTIN BULLETIN

INFLATION

An affine two-factor heteroskedastic macro-finance term structure model. Spreij, Peter; Veerman, Enno; Vlaar, Peter [RKN: 45462]
Applied Mathematical Finance (2011) 18 (3-4) : 331-352.
We propose an affine macro-finance term structure model for interest rates that allows for both constant volatilities
(homoskedastic model) and state-dependent volatilities (heteroskedastic model). In a homoskedastic model, interest rates are
symmetric, which means that either very low interest rates are predicted too often or very high interest rates not often enough. This
undesirable symmetry for constant volatility models motivates the use of heteroskedastic models where the volatility depends on
the driving factors.

For a truly heteroskedastic model in continuous time, which involves a multivariate square root process, the so-called Feller
conditions are usually imposed to ensure that the roots have non-negative arguments. For a discrete time approximate model, the
Feller conditions do not give this guarantee. Moreover, in a macro-finance context, the restrictions imposed might be economically
unappealing. It has also been observed that even without the Feller conditions imposed, for a practically relevant term structure
model, negative arguments rarely occur.

Using models estimated on German data, we compare the yields implied by (approximate) analytic exponentially affine
expressions to those obtained through Monte Carlo simulations of very high numbers of sample paths. It turns out that the
differences are rarely statistically significant, whether the Feller conditions are imposed or not. Moreover, economically, the
differences are negligible, as they are always below one basis point.

Available via Athens: Taylor & Francis Online
http://www.openathens.net

Optimal asset allocation for DC pension plans under inflation. Han, Nan-wei; Hung, Mao-Wei [RKN: 45734]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 172-181.
In this paper, the stochastic dynamic programming approach is used to investigate the optimal asset allocation for a
defined-contribution pension plan with downside protection under stochastic inflation. The plan participant invests the fund wealth
and the stochastic interim contribution flows into the financial market. The nominal interest rate model is described by the
Cox-Ingersoll-Ross (Cox et al., 1985) dynamics. To cope with the inflation risk, the inflation-indexed bond is included in the asset
menu. The retired individuals receive an annuity that is indexed by inflation and a downside protection on the amount of this
annuity is considered. The closed-form solution is derived under the CRRA utility function. Finally, a numerical application is
presented to characterize the dynamic behavior of the optimal investment strategy.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

INFORMATION

Insurance pricing with complete information, state-dependent utility, and production costs. Ramsay, Colin M; Oguledo, Victor I
[RKN: 45649]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 50 (3) : 462-469.
We consider a group of identical risk-neutral insurers selling single-period indemnity insurance policies. The insurance market
consists of individuals with common state-dependent utility function who are identical except for their known accident probability q.
Insurers incur production costs (commonly called expenses or transaction costs by actuaries) that are proportional to the amount
of insurance purchased and to the premium charged. By introducing the concept of insurance desirability, we prove that the
existence of insurer expenses generates a pair of constants qmin and qmax that naturally partitions the applicant pool into three
mutually exclusive and exhaustive groups of individuals: those individuals with accident probability q ∈ [0, qmin) are insurable but do
not desire insurance, those individuals with accident probability q ∈ [qmin, qmax] are insurable and desire insurance, and those
individuals with accident probability q ∈ (qmax, 1] desire insurance but are uninsurable. We also prove that, depending on the level of
q and the marginal rate of substitution between states, it may be optimal for individuals to buy complete (full) insurance, partial
insurance, or no insurance at all. Finally, we prove that when q is known in monopolistic markets (i.e., markets with a single
insurer), applicants may be induced to over-insure whenever partial insurance is bought.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Pricing fixed-income securities in an information-based framework. Hughston, Lane P; Macrina, Andrea Routledge, [RKN: 45844]
Shelved at: Per. AMF
Applied Mathematical Finance (2012) 19 (3-4) : 361-379.
The purpose of this article is to introduce a class of information-based models for the pricing of fixed-income securities. We
consider a set of continuous-time processes that describe the flow of information concerning market factors in a monetary
economy. The nominal pricing kernel is assumed to be given at any specified time by a function of the values of information
processes at that time. Using a change-of-measure technique, we derive explicit expressions for the prices of nominal discount
bonds and deduce the associated dynamics of the short rate of interest and the market price of risk. The interest rate positivity
condition is expressed as a differential inequality. An example that shows how the model can be calibrated to an arbitrary initial
yield curve is presented. We proceed to model the price level, which is also taken at any specified time to be given by a function of
the values of the information processes at that time. A simple model for a stochastic monetary economy is introduced in which the
prices of the nominal discount bonds and inflation-linked notes can be expressed in terms of aggregate consumption and the
liquidity benefit generated by the money supply.
Available via Athens: Taylor & Francis Online
http://www.openathens.net

INSURANCE

Ambiguity aversion: a new perspective on insurance pricing. Zhao, Lin; Zhu, Wei [RKN: 45304]
Shelved at: Per: Astin Bull (Oxf) Shelved at: JOU/AST
ASTIN Bulletin (2011) 41 (1) : 157-189.
This paper intends to develop a feasible framework which incorporates ambiguity aversion into the pricing of insurance products
and investigate the implications of ambiguity aversion on the pricing by comparing it with risk aversion. As applications of the
framework, we present the closed-form pricing formulae for some insurance products appearing in life insurance and property
insurance. Our model confirms that the effects of ambiguity aversion on the pricing of insurance do differ from those of risk
aversion. Implications of our model are consistent with some empirical evidence documented in the literature. Our results
suggest that taking advantage of natural hedge mechanism can help us control the effects of model uncertainty.
online access via International Actuarial Association:
http://www.actuaries.org/index.cfm?lang=EN&DSP=PUBLICATIONS&ACT=ASTIN BULLETIN

Behavioral optimal insurance. Sung, K C; Yam, S C P; Yung, S P; Zhou, J H [RKN: 44944]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (3) : 418-428.
The present work studies the optimal insurance policy offered by an insurer adopting a proportional premium principle to an
insured whose decision-making behavior is modeled by Kahneman and Tversky's Cumulative Prospect Theory with convex
probability distortions. We show that, under a fixed premium rate, the optimal insurance policy is a generalized insurance layer
(that is, either an insurance layer or a stop-loss insurance). This optimal insurance decision problem is resolved by first converting
it into three different sub-problems similar to those in ; however, as we now demand a more regular optimal solution, a completely
different approach has been developed to tackle them. When the premium is regarded as a decision variable and there is no risk
loading, the optimal indemnity schedule in this form has no deductibles but a cap; further results also suggest that the deductible
amount will be reduced if the risk loading is decreased. As a whole, our paper provides a theoretical explanation for the popularity
of limited coverage insurance policies in the market as observed by many socio-economists, which serves as a mathematical
bridge between behavioral finance and actuarial science.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Competitive insurance market in the presence of ambiguity. Anwar, Sajid; Zheng, Mingli [RKN: 44992]
Insurance: Mathematics & Economics (2012) 50 (1) : 79-84.
Within the context of a competitive insurance market, this paper examines the impact of ambiguity on the behavior of buyers and
sellers. Ambiguity is described through a probability measure on an extended state space that includes extra ambiguous states. It
is shown that if insurers face the same or less ambiguity than their customers, a unique equilibrium exists where customers are
fully insured. On the other hand, if insurers face more ambiguity than their customers, customers will be under-insured and it is
even possible that customers may not purchase any insurance.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Copula models for insurance claim numbers with excess zeros and time-dependence. Zhao, XiaoBing; Zhou, Xian [RKN: 45535]
Insurance: Mathematics & Economics (2012) 50 (1) : 191-199.
This paper develops two copula models for fitting the insurance claim numbers with excess zeros and time-dependence. The joint
distribution of the claims in two successive periods is modeled by a copula with discrete or continuous marginal distributions. The
first model fits two successive claims by a bivariate copula with discrete marginal distributions. In the second model, a copula is
used to model the random effects of the conjoint numbers of successive claims with continuous marginal distributions.
The zero-inflation phenomenon is taken into account in the above copula models. Maximum likelihood is applied to estimate the
parameters of the discrete copula model. A two-step procedure is proposed to estimate the parameters in the second model, with
the first step to estimate the marginals, followed by the second step to estimate the unobserved random effect variables and the
copula parameter. Simulations are performed to assess the proposed models and methodologies.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
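
The excess zeros handled above are typically captured through zero-inflated marginals; for instance, the zero-inflated Poisson
probability mass function mixes a point mass at zero with an ordinary Poisson (a standard form, independent of the copula
construction in the paper):

    import numpy as np
    from scipy.stats import poisson

    def zip_pmf(k, lam, pi0):
        """Zero-inflated Poisson pmf: with probability pi0 the count is forced to zero,
        otherwise it is Poisson(lam)."""
        k = np.asarray(k)
        base = (1.0 - pi0) * poisson.pmf(k, lam)
        return np.where(k == 0, pi0 + base, base)

    probs = zip_pmf([0, 1, 2, 3], lam=1.5, pi0=0.3)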

Dependence modeling in non-life insurance using the Bernstein copula. Diers, Dorothea; Eling, Martin; Marek, Sebastian D [RKN:
45646]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 50 (3) : 430-436.
This paper illustrates the modeling of dependence structures of non-life insurance risks using the Bernstein copula. We conduct a
goodness-of-fit analysis and compare the Bernstein copula with other widely used copulas. Then, we illustrate the use of the
Bernstein copula in a value-at-risk and tail-value-at-risk simulation study. For both analyses we utilize German claims data on
storm, flood, and water damage insurance for calibration. Our results highlight the advantages of the Bernstein copula, including
its flexibility in mapping inhomogeneous dependence structures and its easy use in a simulation context due to its representation
as mixture of independent Beta densities. Practitioners and regulators working toward appropriate modeling of dependences in a
risk management and solvency context can benefit from our results.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Fitting insurance claims to skewed distributions: are the skew-normal and skew-student good models?. Eling, Martin [RKN:
44782]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51(2) : 239-248.
This paper analyzes whether the skew-normal and skew-student distributions recently discussed in the finance literature are
reasonable models for describing claims in property-liability insurance. We consider two well-known datasets from actuarial
science and fit a number of parametric distributions to these data. Also the non-parametric transformation kernel approach is
considered as a benchmark model. We find that the skew-normal and skew-student are reasonably competitive compared to other
models in the literature when describing insurance data. In addition to goodness-of-fit tests, tail risk measures such as value at risk
and tail value at risk are estimated for the datasets under consideration.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
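
For readers wishing to repeat this kind of exercise, the skew-normal family is available in standard statistical libraries; a minimal
fitting and tail-risk sketch (Python, using scipy's parameterisation, which is not necessarily the one used in the paper; the
simulated claim data are a stand-in for a real dataset):

    import numpy as np
    from scipy.stats import skewnorm

    claims = np.random.default_rng(0).gamma(shape=2.0, scale=5_000.0, size=5_000)  # stand-in claim data

    # maximum-likelihood fit of shape (skewness), location and scale
    a, loc, scale = skewnorm.fit(claims)

    # tail risk measures implied by the fitted distribution
    alpha = 0.995
    var = skewnorm.ppf(alpha, a, loc=loc, scale=scale)
    tvar = skewnorm.expect(lambda x: x, args=(a,), loc=loc, scale=scale,
                           lb=var, conditional=True)   # expected loss beyond the VaR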

Hierarchical structures in the aggregation of premium risk for insurance underwriting. Savelli, Nino; Clemente, Gian Paolo [RKN:
45489]
Shelved at: Per: SAJ Shelved at: SCA/ACT
Scandinavian Actuarial Journal (2011) 3 : 193-213.
In the valuation of the Solvency II capital requirement, the correct appraisal of risk dependencies acquires particular relevance.
These dependencies refer to the recognition of risk diversification in the aggregation process and there are different levels of
aggregation and hence different types of diversification. For instance, for a non-life company at the first level the risk components
of each single line of business (e.g. premium, reserve, and CAT risks) need to be combined in the overall portfolio, the second
level regards the aggregation of different kind of risks as, for example, market and underwriting risk, and finally various solo legal
entities could be joined together in a group.

Solvency II allows companies to capture these diversification effects in capital requirement assessment, but the identification of a
proper methodology can represent a delicate issue. Indeed, while internal models based on simulation approaches usually permit
the portfolio multivariate distribution to be obtained only in the independence case, the use of copula functions generally allows
the multivariate distribution to be obtained under dependence assumptions too.

However, the choice of the copula and the parameter estimation could be very problematic when only few data are available. So it
could be useful to find a closed formula based on Internal Models independence results with the aim to obtain the capital
requirement under dependence assumption.

A simple technique for measuring the diversification effect in capital requirement assessment is the formula proposed in the
Solvency II quantitative impact studies, which aggregates capital charges (each equal to a percentile minus the average of the
total claims amount distribution of a single line of business (LoB)) using a linear correlation matrix.

On the other hand, this formula produces the correct result only for a restricted class of distributions, while it may underestimate
the diversification effect.

In this paper we present an alternative method, based on the idea of adjusting that formula with proper calibration factors (proposed
by Sandström (2007)) and appropriately extended with the aim of handling very skewed distributions too.

In the last part, considering different non-life multi-line insurers, we compare the capital requirements for premium risk only,
obtained by applying the aggregation formula, with the results derived from elliptical copulas and hierarchical Archimedean copulas.
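
The aggregation formula referred to above (and used in the Solvency II standard formula) combines stand-alone capital charges
c through a linear correlation matrix R as sqrt(c' R c); in code (Python; the charges and correlations below are purely
illustrative):

    import numpy as np

    def aggregate_capital(charges, corr):
        """Square-root aggregation of stand-alone capital charges with a linear correlation matrix."""
        c = np.asarray(charges, dtype=float)
        R = np.asarray(corr, dtype=float)
        return float(np.sqrt(c @ R @ c))

    scr = aggregate_capital([100.0, 60.0, 40.0],
                            [[1.00, 0.50, 0.25],
                             [0.50, 1.00, 0.25],
                             [0.25, 0.25, 1.00]])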

Insurance pricing with complete information, state-dependent utility, and production costs. Ramsay, Colin M; Oguledo, Victor I
[RKN: 45649]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 50 (3) : 462-469.
We consider a group of identical risk-neutral insurers selling single-period indemnity insurance policies. The insurance market
consists of individuals with common state-dependent utility function who are identical except for their known accident probability q.
Insurers incur production costs (commonly called expenses or transaction costs by actuaries) that are proportional to the amount
of insurance purchased and to the premium charged. By introducing the concept of insurance desirability, we prove that the
existence of insurer expenses generates a pair of constants qmin and qmax that naturally partitions the applicant pool into three
mutually exclusive and exhaustive groups of individuals: those individuals with accident probability q ∈ [0, qmin) are insurable but do
not desire insurance, those individuals with accident probability q ∈ [qmin, qmax] are insurable and desire insurance, and those
individuals with accident probability q ∈ (qmax, 1] desire insurance but are uninsurable. We also prove that, depending on the level of
q and the marginal rate of substitution between states, it may be optimal for individuals to buy complete (full) insurance, partial
insurance, or no insurance at all. Finally, we prove that when q is known in monopolistic markets (i.e., markets with a single
insurer), applicants may be induced to over-insure whenever partial insurance is bought.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Multivariate insurance models: an overview. Anastasiadis, Simon; Chukova, Stefanka [RKN: 45739]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 222-227.
This literature review summarizes the results from a collection of research papers that relate to modeling insurance claims and the
processes associated with them. We consider work by more than 55 authors, published or presented between 1971 and 2008.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

On 1-convexity and nucleolus of co-insurance games. Driessen, Theo S H; Fragnelli, Vito; Katsev, Ilya V; Khmelnitskaya, Anna B
[RKN: 40016]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (2) : 217-225.
The insurance situation in which an enormous risk is insured by a number of insurance companies is modeled through a
cooperative TU game, the so-called co-insurance game, first introduced in Fragnelli and Marina (2004). In this paper we present
certain conditions on the parameters of the model that guarantee the 1-convexity property of co-insurance games which in turn
ensures the nonemptiness of the core and the linearity of the nucleolus as a function of the variable premium. Further we reveal
conditions when a co-insurance game is representable in the form of a veto-removed game and present an efficient final algorithm
for computing the nucleolus of a veto-removed game.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

On allocation of upper limits and deductibles with dependent frequencies and comonotonic severities. Li, Xiaohu; You, Yinping
[RKN: 45645]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 50 (3) : 423-429.
With the assumption of Archimedean copula for the occurrence frequencies of the risks covered by an insurance policy, this note
further investigates the allocation problem of upper limits and deductibles addressed in Hua and Cheung (2008a). Sufficient
conditions for a risk-averse policyholder to allocate the upper limits and the deductibles well are established, respectively.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Optimal control and dependence modeling of insurance portfolios with Lévy dynamics. Bäuerle, Nicole; Blatter, Anja [RKN: 45134]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (3) : 398-405.
In this paper we are interested in optimizing proportional reinsurance and investment policies in a multidimensional Lévy-driven
insurance model. The criterion is that of maximizing exponential utility. Solving the classical Hamilton-Jacobi-Bellman equation
yields that the optimal retention level keeps a constant amount of claims regardless of time and the company's wealth level. A
special feature of our construction is to allow for dependencies of the risk reserves in different business lines. Dependence is
modeled via an Archimedean Lévy copula. We derive a necessary and sufficient condition for an Archimedean Lévy generator to
create a multidimensional positive Lévy copula in arbitrary dimension. Based on these results we identify structure conditions for
the generator and the Lévy measure of an Archimedean Lévy copula under which an insurance company reinsures a larger
fraction of claims from one business line than from another.

Available via Athens: Palgrave MacMillan
http://www.openathens.net

The Solvency II square-root formula for systematic biometric risk. Christiansen, Marcus C; Denuit, Michel M; Lazar, Dorina [RKN:
45599]
Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 50 (2) : 257-265.
In this paper, we develop a model supporting the so-called square-root formula used in Solvency II to aggregate the modular life
SCR. Describing the insurance policy by a Markov jump process, we can obtain expressions similar to the square-root formula in
Solvency II by means of limited expansions around the best estimate. Numerical illustrations are given, based on German
population data. Even if the square-root formula can be supported by theoretical considerations, it is shown that the QIS
correlation matrix is highly questionable.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
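The square-root formula discussed here is the Solvency II standard-formula aggregation rule; in generic notation (a reminder, not a quotation from the paper),

    \mathrm{SCR} = \sqrt{\sum_{i,j} \rho_{ij}\, \mathrm{SCR}_i\, \mathrm{SCR}_j}

where the \mathrm{SCR}_i are the modular capital charges and \rho_{ij} the prescribed correlation parameters whose calibration the paper calls into question.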

INSURANCE COMPANIES

Alarm system for insurance companies : A strategy for capital allocation. Das, S; Kratz, M [RKN: 45723]
Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 53-65.
One possible way of risk management for an insurance company is to develop an early and appropriate alarm system before the
possible ruin. The ruin is defined through the status of the aggregate risk process, which in turn is determined by premium
accumulation as well as claim settlement outgo for the insurance company. The main purpose of this work is to design an effective
alarm system, i.e. to define alarm times and to recommend augmentation of capital of suitable magnitude at those points to reduce
the chance of ruin. To measure the effectiveness of the alarm system fairly, a comparison is drawn between an alarm system, with
capital being added at the sound of every alarm, and the corresponding system without any alarm but with an equivalently higher initial
capital. Analytical results are obtained in a general setup and are backed up by simulated performance with various types of loss
severity distributions. This provides a strategy for suitably spreading out the capital and yet addressing survivability concerns at
factory level.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Multivariate longitudinal modeling of insurance company expenses. Shi, Peng [RKN: 45737]
Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 204-215.
Insurers, investors and regulators are interested in understanding the behaviour of insurance company expenses, due to the high
operating cost of the industry. Expense models can be used for prediction, to identify unusual behavior, and to measure firm
efficiency. Current literature focuses on the study of total expenses that consist of three components: underwriting, investment and
loss adjustment. A joint study of expenses by type delivers more information and is critical to understanding their relationship.

This paper introduces a copula regression model to examine the three types of expenses in a longitudinal context. In our method,
elliptical copulas are employed to accommodate the between-subject contemporaneous and lag dependencies, as well as the
within-subject serial correlations of the three types. Flexible distributions are allowed for the marginals of each type with covariates
incorporated in distribution parameters. A model validation procedure based on a t-plot method is proposed for in-sample and
out-of-sample validation purposes. The multivariate longitudinal model effectively addresses the typical features of expenses
data: the heavy tails, the strong individual effects and the lack of balance.

The analysis is performed using property-casualty insurance company expense data from the National Association of Insurance
Commissioners for the years 2001-2006. A unique set of covariates is determined for each type of expense. We find that
underwriting expenses and loss adjustment expenses are complements rather than substitutes. The model is shown to be
successful in efficiency classification. Also, a multivariate predictive density is derived to quantify the future values of an insurer's
expenses.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Optimal non-proportional reinsurance control and stochastic differential games. Taksar, Michael; Zeng, Xudong [RKN: 38228]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (1) : 64-71.
We study stochastic differential games between two insurance companies who employ reinsurance to reduce risk exposure. We
consider competition between two companies and construct a single payoff function of the two companies' surplus processes. One
company chooses a dynamic reinsurance strategy in order to maximize the payoff function while its opponent is simultaneously
choosing a dynamic reinsurance strategy so as to minimize the same quantity. We describe the Nash equilibrium of the game and
prove a verification theorem for a general payoff function. When the payoff function is the probability that the difference between
the two surpluses reaches an upper bound before it reaches a lower bound, the game is solved explicitly.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Quantile hedging for equity-linked contracts. Klusik, Przemyslaw; Palmowski, Zbigniew [RKN: 40023]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (2) : 280-286.
We consider an equity-linked contract whose payoff depends on the lifetime of the policy holder and the stock price. We provide
the best strategy for an insurance company assuming limited capital for the hedging. The main idea of the proof consists in
reducing the construction of such strategies for a given claim to a problem of superhedging for a modified claim, which is the
solution to a static optimization problem of the Neyman-Pearson type. This modified claim is given via some sets constructed in
an iterative way. Some numerical examples are also given.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

A utility-based comparison of pension funds and life insurance companies under regulatory constraints. Broeders, Dirk; Chen,
An; Koos, Birgit [RKN: 44969]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (1) : 1-10.
This paper compares two different types of annuity providers, i.e. defined benefit pension funds and life insurance companies.
One of the key differences is that the residual risk in pension funds is collectively borne by the beneficiaries and the sponsor's
shareholders, while in the case of life insurers it is borne by the external shareholders. First, this paper employs a contingent claim
approach to evaluate the risk-return tradeoff for annuitants. For that, we take into account the differences in contract specifications
and in regulatory regimes. Second, a welfare analysis is conducted to examine whether a consumer with power utility experiences
utility gains if she chooses a defined benefit plan or a life annuity contract over a defined contribution plan. We demonstrate that
regulation can be designed to support a level playing field amongst different financial institutions.

Available via Athens: Palgrave MacMillan
http://www.openathens.net

INSURANCE COMPANY

Optimal dividend and investing control of an insurance company with higher solvency constraints. Liang, Zongxia; Huang,
Jianping [RKN: 44952]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (3) : 501-511.
This paper considers the optimal control problem of a large insurance company under a fixed insolvency probability. The company
controls its proportional reinsurance rate, dividend pay-outs and investment process to maximize the expected present value of the
dividend pay-outs until the time of bankruptcy. This paper aims at describing the optimal return function as well as the optimal
policy. As a by-product, the paper theoretically sets a risk-based capital standard to ensure the capital requirement that can cover
the total risk.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

INSURANCE INDUSTRY

Future building water loss projections posed by climate change. Haug, Ola; Dimakos, Xeni K; Vardal, Jofrid F; Aldrin, Magne;
Meze-Hausken, Elisabeth [RKN: 45146]
Shelved at: Per: SAJ Shelved at: SCA/ACT
Scandinavian Actuarial Journal (2011) 1 : 1-20.
The insurance industry, like other parts of the financial sector, is vulnerable to climate change. Life as well as non-life products are
affected and knowledge of future loss levels is valuable. Risk and premium calculations may be updated accordingly, and
dedicated loss-preventive measures can be communicated to customers and regulators. We have established statistical claims
models for the coherence between externally inflicted water damage to private buildings in Norway and selected meteorological
variables. Based on these models and downscaled climate predictions from the Hadley centre HadAM3H climate model, the
estimated loss level of a future scenario period (2071-2100) is compared to that of a control period (1961-1990). In spite of
substantial estimation uncertainty, our analyses identify an incontestable increase in the claims level along with some regional
variability. Of the uncertainties inherently involved in such predictions, only the error due to model fit is quantifiable.

INTEREST

On the absolute ruin problem in a Sparre Andersen risk model with constant interest. Mitric, Ilie-Radu; Badescu, Andrei L; Stanford,
David A [RKN: 45533]
Insurance: Mathematics & Economics (2012) 50 (1) : 167-178.
In this paper, we extend the work of Mitric and Sendova (2010), which considered the absolute ruin problem in a risk model with
debit and credit interest, to renewal and non-renewal structures. Our first results apply to MAP processes, which we later restrict to
the Sparre Andersen renewal risk model with interclaim times that are generalized Erlang (n) distributed and claim amounts
following a Matrix-Exponential (ME) distribution (see e.g. Asmussen and O'Cinneide (1997)). Under this scenario, we present
a general methodology to analyze the Gerber-Shiu discounted penalty function defined at absolute ruin, as a solution of
high-order linear differential equations with non-constant coefficients. Closed-form solutions for some absolute ruin related
quantities in the generalized Erlang (2) case complement the results obtained under the classical risk model by Mitric and
Sendova (2010).
Available via Athens: Palgrave MacMillan
http://www.openathens.net

INTEREST RATES

Bonds and options in exponentially affine bond models. Bermin, Hans-Peter [RKN: 45881]
Shelved at: Per. AMF
Applied Mathematical Finance (2012) 19 (5-6) : 513-534.
In this article we apply the Flesaker-Hughston approach to invert the yield curve and to price various options by letting the
randomness in the economy be driven by a process closely related to the short rate, called the abstract short rate. This process is
a pure deterministic translation of the short rate itself, and we use the deterministic shift to calibrate the models to the initial yield
curve. We show that we can solve for the shift needed in closed form by transforming the problem to a new probability measure.
Furthermore, when the abstract short rate follows a Cox-Ingersoll-Ross (CIR) process we compute bond option and swaption
prices in closed form. We also propose a short-rate specification under the risk-neutral measure that allows the yield curve to be
inverted and is consistent with the CIR dynamics for the abstract short rate, thus giving rise to closed form bond option and
swaption prices.
Available via Athens: Taylor & Francis Online
http://www.openathens.net
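As a reminder of the dynamics referred to in this entry, the Cox-Ingersoll-Ross model assumes the square-root diffusion (standard notation)

    dr_t = \kappa(\theta - r_t)\, dt + \sigma \sqrt{r_t}\, dW_t

which keeps the rate non-negative and admits closed-form zero-coupon bond prices.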

Covariance of discounted compound renewal sums with a stochastic interest rate. Léveillé, Ghislain; Adékambi, Franck [RKN:
45356]
Shelved at: Per: SAJ Shelved at: SCA/ACT
Scandinavian Actuarial Journal (2011) 2 : 138-153.
Formulas have been obtained for the moments of the discounted aggregate claims process, for a constant instantaneous interest
rate, and for a claims number process that is an ordinary or a delayed renewal process. In this paper, we present explicit formulas
on the first two moments and the joint moment of this risk process, for a non-trivial extension to a stochastic instantaneous interest
rate. Examples are given for Erlang claims number processes, and for the Ho-Lee-Merton and the Vasicek interest rate models.

Delta-Gamma hedging of mortality and interest rate risk. Luciano, Elisa; Regis, Luca; Vigna, Elena [RKN: 45643]
Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 50 (3) : 402-412.
One of the major concerns of life insurers and pension funds is the increasing longevity of their beneficiaries. This paper studies
the hedging problem of annuity cash flows when mortality and interest rates are stochastic. We first propose a Delta-Gamma
hedging technique for mortality risk. The risk factor against which to hedge is the difference between the actual mortality intensity
in the future and its forecast today, the forward intensity. We specialize the hedging technique first to the case in which mortality
intensities are affine, then to Ornstein-Uhlenbeck and Feller processes, providing actuarial justifications for this selection. We
show that, without imposing no arbitrage, we can get equivalent probability measures under which the HJM condition for no
arbitrage is satisfied. Last, we extend our results to the presence of both interest rate and mortality risk. We provide a UK
calibrated example of Delta-Gamma hedging of both mortality and interest rate risk.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Joint moments of discounted compound renewal sums. Léveillé, Ghislain; Adékambi, Franck [RKN: 44930]
Shelved at: Per: SAJ Shelved at: SCA/ACT
Scandinavian Actuarial Journal (2012) 1 : 40-55.
The first two moments and the covariance of the aggregate discounted claims have been found for a stochastic interest rate, from
which the inflation rate has been subtracted, and for a claims number process that is an ordinary or a delayed renewal
process. Hereafter we extend the preceding results by presenting recursive formulas for the joint moments of this risk process, for
a constant interest rate, and non-recursive formulas for higher joint moments when the interest rate is stochastic. Examples are
given for exponential claims inter-arrival times and for the Ho-Lee-Merton interest rate model.

On cross-currency models with stochastic volatility and correlated interest rates. Grzelak, Lech A; Oosterlee, Cornelis W
Routledge, [RKN: 45793]
Shelved at: Per. AMF
Applied Mathematical Finance (2012) 19 (1-2) : 1-35.
We construct multi-currency models with stochastic volatility (SV) and correlated stochastic interest rates with a full matrix of
correlations. We first deal with a foreign exchange (FX) model of Heston-type, in which the domestic and foreign interest rates are
generated by the short-rate process of Hull-White (Hull, J. and White, A. [1990] Pricing interest-rate derivative securities, Review
of Financial Studies, 3, pp. 573-592). We then extend the framework by modelling the interest rate by an SV displaced-diffusion
(DD) Libor Market Model (Andersen, L. B. G. and Andreasen, J. [2000] Volatility skews and extensions of the Libor market model,
Applied Mathematical Finance, 7(1), pp. 1-32), which can model an interest rate smile. We provide semi-closed-form
approximations which lead to efficient calibration of the multi-currency models. Finally, we add a correlated stock to the framework
and discuss the construction, model calibration and pricing of equity-FX-interest rate hybrid pay-offs.

Available via Athens: Taylor & Francis Online
http://www.openathens.net

On the moments of aggregate discounted claims with dependence introduced by a FGM copula. Bargès, Mathieu; Cossette,
Hélène; Loisel, Stéphane; Marceau, Etienne [RKN: 45306]
Shelved at: Per: Astin Bull (Oxf) Shelved at: JOU/AST
ASTIN Bulletin (2011) 41 (1) : 215-238.
In this paper, we investigate the computation of the moments of the compound Poisson sums with discounted claims when
introducing dependence between the interclaim time and the subsequent claim size. The dependence structure between the two
random variables is defined by a Farlie-Gumbel-Morgenstern copula. Assuming that the claim distribution has finite moments, we
give expressions for the first and the second moments and then we obtain a general formula for any mth order moment. The
results are illustrated with applications to premium calculation and approximations based on moment matching methods.
online access via International Actuarial Association:
http://www.actuaries.org/index.cfm?lang=EN&DSP=PUBLICATIONS&ACT=ASTIN BULLETIN
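The Farlie-Gumbel-Morgenstern copula used in this entry to couple the interclaim time and the subsequent claim size is, in its standard bivariate form,

    C_\theta(u, v) = uv\,\bigl[1 + \theta(1 - u)(1 - v)\bigr], \qquad \theta \in [-1, 1]

a family that admits only weak dependence but keeps the moment calculations tractable.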

A performance analysis of participating life insurance contracts. Faust, Roger; Schmeiser, Hato; Zemp, Alexandra [RKN: 45733]
Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 158-171.
Participating life insurance contracts are one of the most important products in the European life insurance market. Even though
these contract forms are very common, very little research has been conducted on their performance. Hence, we
conduct a performance analysis to provide a decision support for policyholders. We decompose a participating life insurance
contract in a term life insurance and a savings part and simulate the cash flow distribution of the latter. Simulation results are
compared with cash flows resulting from two benchmarks investing in the same portfolio of assets but without investment
guarantees and bonus distribution schemes, in order to measure the impact of these two product features. To provide a realistic
picture within the two alternatives, we take transaction costs and wealth transfers between different groups of policyholders into
account. We show that the payoff distribution strongly depends on the initial reserve situation and managerial discretion. Results
indicate that policyholders will in general profit from a better payoff distribution of the participating life insurance compared to a
mutual fund benchmark but not compared to an exchange-traded fund benchmark portfolio.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Pricing fixed-income securities in an information-based framework. Hughston, Lane P; Macrina, Andrea Routledge, [RKN: 45844]
Shelved at: Per. AMF
Applied Mathematical Finance (2012) 19 (3-4) : 361-379.
The purpose of this article is to introduce a class of information-based models for the pricing of fixed-income securities. We
consider a set of continuous-time processes that describe the flow of information concerning market factors in a monetary
economy. The nominal pricing kernel is assumed to be given at any specified time by a function of the values of information
processes at that time. Using a change-of-measure technique, we derive explicit expressions for the prices of nominal discount
bonds and deduce the associated dynamics of the short rate of interest and the market price of risk. The interest rate positivity
condition is expressed as a differential inequality. An example that shows how the model can be calibrated to an arbitrary initial
yield curve is presented. We proceed to model the price level, which is also taken at any specified time to be given by a function of
the values of the information processes at that time. A simple model for a stochastic monetary economy is introduced in which the
prices of the nominal discount bonds and inflation-linked notes can be expressed in terms of aggregate consumption and the
liquidity benefit generated by the money supply.
Available via Athens: Taylor & Francis Online
http://www.openathens.net

Valuation of two-factor interest rate contingent claims using Green's theorem. Sorwar, Ghulam; Barone-Adesi, Giovanni [RKN:
45460]
Applied Mathematical Finance (2011) 18 (3-4) : 277-289.
Over the years a number of two-factor interest rate models have been proposed that have formed the basis for the valuation of
interest rate contingent claims. This valuation equation often takes the form of a partial differential equation that is solved using the
finite difference approach. In the case of two-factor models this has resulted in solving two second-order partial derivatives leading
to boundary errors, as well as numerous first-order derivatives. In this article we demonstrate that using Green's theorem,
second-order derivatives can be reduced to first-order derivatives that can be easily discretized; consequently, two-factor partial
differential equations are easier to discretize than one-factor partial differential equations. We illustrate our approach by applying
it to value contingent claims based on the two-factor CIR model. We provide numerical examples that illustrate that our approach
shows excellent agreement with analytical prices and the popular Crank-Nicolson method.

Available via Athens: Taylor & Francis Online
http://www.openathens.net

INTERNATIONAL MARKETING

Modeling dependence dynamics through copulas with regime switching. Silva Filho, Osvaldo Candido da; Ziegelmann, Flavio
Augusto; Dueker, Michael J [RKN: 45638]
Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 50 (3) : 346-356.
Measuring dynamic dependence between international financial markets has recently attracted great interest in financial
econometrics because the observed correlations rose dramatically during the 2008-09 global financial crisis. Here, we propose a
novel approach for measuring dependence dynamics. We include a hidden Markov chain (MC) in the equation describing
dependence dynamics, allowing the unobserved time-varying dependence parameter to vary according to both a restricted ARMA
process and an unobserved two-state MC. Estimation is carried out via the inference for the margins in conjunction with
filtering/smoothing algorithms. We use block bootstrapping to estimate the covariance matrix of our estimators. Monte Carlo
simulations compare the performance of regime switching and no switching models, supporting the regime-switching
specification. Finally the proposed approach is applied to empirical data, through the study of the S&P500 (USA), FTSE100 (UK)
and BOVESPA (Brazil) stock market indexes.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

INVESTMENT

Extreme Events: Robust Portfolio Construction in the Presence of Fat Tails. Kemp, Malcolm H D (2011). - Chichester: John Wiley
& Sons Ltd, 2011. - 312 pages. [RKN: 13141]
Shelved at: EF/JNH (Lon) Shelved at: 368.01 KEM
Markets are fat-tailed; extreme outcomes occur more often than many might hope, or indeed the statistics or normal distributions
might indicate. In this book, the author provides readers with the latest tools and techniques on how best to adapt portfolio
construction techniques to cope with extreme events. Beginning with an overview of portfolio construction and market drivers, the
book will analyze fat tails, what they are, their behavior, how they can differ and what their underlying causes are. The book will
then move on to look at portfolio construction techniques which take into account fat tailed behavior, and how to stress test your
portfolio against extreme events. Finally, the book will analyze really extreme events in the context of portfolio choice and
problems. The book will offer readers: Ways of understanding and analyzing sources of extreme events, Tools for analyzing the
key drivers of risk and return, their potential magnitude and how they might interact, Methodologies for achieving efficient portfolio
construction and risk budgeting, Approaches for catering for the time-varying nature of the world in which we live, Back-stop
approaches for coping with really extreme events, Illustrations and real life examples of extreme events across asset classes. This
will be an indispensable guide for portfolio and risk managers who will need to better protect their portfolios against extreme events
which, within the financial markets, occur more frequently than we might expect.

A hybrid estimate for the finite-time ruin probability in a bivariate autoregressive risk model with application to portfolio
optimization. Tang, Qihe; Yuan, Zhongyi Society of Actuaries, - 20 pages. [RKN: 70667]
Shelved at: Per: NAAJ (Oxf) Per NAAJ (Lon) Shelved at: JOU
North American Actuarial Journal (2012) 16 (3) : 378-397.
Consider a discrete-time risk model in which the insurer is allowed to invest a proportion of its wealth in a risky stock and keep the
rest in a risk-free bond. Assume that the claim amounts within individual periods follow an autoregressive process with
heavy-tailed innovations and that the log-returns of the stock follow another autoregressive process, independent of the former
one. We derive an asymptotic formula for the finite-time ruin probability and propose a hybrid method, combining simulation with
asymptotics, to compute this ruin probability more efficiently. As an application, we consider a portfolio optimization problem in
which we determine the proportion invested in the risky stock that maximizes the expected terminal wealth subject to a constraint
on the ruin probability.
http://www.soa.org/news-and-publications/publications/journals/naaj/naaj-detail.aspx
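Independently of the asymptotic formula derived in the paper, finite-time ruin probabilities in discrete-time models of this kind can always be approximated by brute-force simulation. The sketch below is purely illustrative: it uses i.i.d. Pareto claims and a fixed stock proportion with made-up parameter values, rather than the autoregressive claim and return structures of the paper.

    import numpy as np

    rng = np.random.default_rng(2)

    def ruin_probability(u0=100.0, premium=12.0, pi=0.3, r=0.02,
                         n_years=10, n_paths=100_000):
        """Monte Carlo estimate of the probability that surplus goes negative
        within n_years, investing proportion pi in a lognormal stock and the
        rest in a bond earning r per year (all parameters hypothetical)."""
        ruined = np.zeros(n_paths, dtype=bool)
        wealth = np.full(n_paths, u0)
        for _ in range(n_years):
            stock = np.exp(rng.normal(0.05, 0.2, n_paths))    # gross stock return
            claims = (rng.pareto(2.5, n_paths) + 1.0) * 4.0    # heavy-tailed annual claims
            wealth = wealth * (pi * stock + (1 - pi) * (1 + r)) + premium - claims
            ruined |= wealth < 0                               # flag passage below zero
        return ruined.mean()

    print("estimated 10-year ruin probability:", ruin_probability())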

Optimal asset allocation for passive investing with capital loss harvesting. Ostrov, Daniel N; Wong, Thomas G [RKN: 45461]
Applied Mathematical Finance (2011) 18 (3-4) : 291-329.
This article examines how to quantify and optimally utilize the beneficial effect that capital loss harvesting generates in a taxable
portfolio. We explicitly determine the optimal initial asset allocation for an investor who follows the continuous time dynamic trading
strategy of Constantinides (1983). This strategy sells and re-buys all stocks with losses, but is otherwise passive. Our model
allows the use of the stock position's full purchase history for computing the cost basis. The method can also be used to rebalance
at later times. For portfolios with one stock position and cash, the probability density function for the portfolio's state corresponds
to the solution of a 3-space + 1-time dimensional partial differential equation (PDE) with an oblique reflecting boundary condition.
Extensions of this PDE, including to the case of multiple correlated stocks, are also presented.

We detail a numerical algorithm for the PDE in the single stock case. The algorithm shows the significant effect capital loss
harvesting can have on the optimal stock allocation, and it also allows us to compute the expected additional return that capital
loss harvesting generates. Our PDE-based algorithm, compared with Monte Carlo methods, is shown to generate much more
precise results in a fraction of the time. Finally, we employ Monte Carlo methods to approximate the impact of many of our model's
assumptions.

Available via Athens: Taylor & Francis Online
http://www.openathens.net

Optimal investment and consumption decision of a family with life insurance. Kwak, Minsuk; Shin, Yong Hyun; Choi, U Jin [RKN:
40011]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (2) : 176-188.
We study an optimal portfolio and consumption choice problem of a family that combines life insurance for parents who receive
deterministic labor income until the fixed time T. We consider utility functions of parents and children separately and assume that
parents have an uncertain lifetime. If parents die before time T, children have no labor income and they choose the optimal
consumption and portfolio with remaining wealth and life insurance benefit. The object of the family is to maximize the weighted
average of utility of parents and that of children. We obtain analytic solutions for the value function and the optimal policies, and
then analyze how changes in the weight of the parents' utility function and other factors affect the optimal policies.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Optimal time-consistent investment and reinsurance strategies for insurers under Heston's SV model. Li, Zhongfei; Zeng, Yan;
Lai, Yongzeng [RKN: 45736]
Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 191-203.
This paper considers the optimal time-consistent investment and reinsurance strategies for an insurer under Heston's stochastic
volatility (SV) model. Such an SV model applied to insurers' portfolio problems has not yet been discussed as far as we know. The
surplus process of the insurer is approximated by a Brownian motion with drift. The financial market consists of one risk-free asset
and one risky asset whose price process satisfies Hestons SV model. Firstly, a general problem is formulated and a verification
theorem is provided. Secondly, the closed-form expressions of the optimal strategies and the optimal value functions for the
mean-variance problem without precommitment are derived under two cases: one is the investment-reinsurance case and the
other is the investment-only case. Thirdly, economic implications and numerical sensitivity analysis are presented for our results.
Finally, some interesting phenomena are found and discussed.
Available via Athens: Palgrave MacMillan
http://www.openathens.net






INVESTMENT MANAGEMENT

Accounting for regime and parameter uncertainty in regime-switching models. Hartman, Brian M; Heaton, Matthew J [RKN: 44945]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (3) : 429-437.
As investment guarantees become increasingly complex, realistic simulation of the price becomes more critical. Currently,
regime-switching models are commonly used to simulate asset returns. Under a regime switching model, simulating random asset
streams involves three steps: (i) estimate the model parameters given the number of regimes using maximum likelihood, (ii)
choose the number of regimes using a model selection criterion, and (iii) simulate the streams using the optimal number of regimes
and parameter values. This method, however, does not properly incorporate regime or parameter uncertainty into the generated
asset streams and therefore into the price of the guarantee. To remedy this, this article adopts a Bayesian approach to properly
account for those two sources of uncertainty and improve pricing.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
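The estimate-select-simulate workflow that this entry criticises starts from a fitted regime-switching return model. Purely for orientation, the sketch below simulates such a model with two regimes and made-up parameter values; it deliberately ignores the regime and parameter uncertainty that the paper's Bayesian approach is designed to capture.

    import numpy as np

    rng = np.random.default_rng(0)

    # Illustrative (made-up) parameters: regime 0 = calm, regime 1 = turbulent.
    mu    = np.array([0.008, -0.010])   # monthly mean log-returns
    sigma = np.array([0.035,  0.080])   # monthly volatilities
    P     = np.array([[0.95, 0.05],     # P[i, j] = Pr(next regime = j | current = i)
                      [0.20, 0.80]])

    def simulate_regime_switching(n_months, start_regime=0):
        """Simulate one path of regimes and log-returns."""
        regimes = np.empty(n_months, dtype=int)
        returns = np.empty(n_months)
        state = start_regime
        for t in range(n_months):
            regimes[t] = state
            returns[t] = rng.normal(mu[state], sigma[state])
            state = rng.choice(2, p=P[state])
        return regimes, returns

    regimes, log_returns = simulate_regime_switching(240)
    print("months spent in the turbulent regime:", (regimes == 1).sum())
    print("annualised mean log-return: %.3f" % (log_returns.mean() * 12))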

A numerical method for the expected penalty-reward function in a Markov-modulated jump-diffusion process. Diko, Peter;
Usábel, Miguel [RKN: 44982]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (1) : 126-131.
A generalization of the Cramér-Lundberg risk model perturbed by a diffusion is proposed. Aggregate claims of an insurer follow a
compound Poisson process and premiums are collected at a constant rate with additional random fluctuation. The insurer is
allowed to invest the surplus into a risky asset with volatility dependent on the level of the investment, which permits the
incorporation of rational investment strategies as proposed by Berk and Green (2004). The return on investment is modulated by
a Markov process which generalizes previously studied settings for the evolution of the interest rate in time. The Gerber-Shiu
expected penalty-reward function is studied in this context, including ruin probabilities (a first-passage problem) as a special case.
The second order integro-differential system of equations that characterizes the function of interest is obtained. As a closed-form
solution does not exist, a numerical procedure based on the Chebyshev polynomial approximation through a collocation method is
proposed. Finally, some examples illustrating the procedure are presented.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Optimal time-consistent investment and reinsurance policies for mean-variance insurers. Zeng, Yan; Li, Zhongfei [RKN: 44984]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (1) : 145-154.
This paper investigates the optimal time-consistent policies of an investment-reinsurance problem and an investment-only
problem under the mean-variance criterion for an insurer whose surplus process is approximated by a Brownian motion with drift.
The financial market considered by the insurer consists of one risk-free asset and multiple risky assets whose price processes
follow geometric Brownian motions. A general verification theorem is developed, and explicit closed-form expressions of the
optimal policies and the optimal value functions are derived for the two problems. Economic implications and numerical sensitivity
analysis are presented for our results. Our main findings are: (i) the optimal time-consistent policies of both problems are
independent of their corresponding wealth processes; (ii) the two problems have the same optimal investment policies; (iii) the
parameters of the risky assets (the insurance market) have no impact on the optimal reinsurance (investment) policy; (iv) the
premium return rate of the insurer does not affect the optimal policies but affects the optimal value functions; (v) reinsurance can
increase the mean-variance utility.


Available via Athens: Palgrave MacMillan
http://www.openathens.net

INVESTMENT PERFORMANCE

Accounting for regime and parameter uncertainty in regime-switching models. Hartman, Brian M; Heaton, Matthew J [RKN: 44945]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (3) : 429-437.
As investment guarantees become increasingly complex, realistic simulation of the price becomes more critical. Currently,
regime-switching models are commonly used to simulate asset returns. Under a regime switching model, simulating random asset
streams involves three steps: (i) estimate the model parameters given the number of regimes using maximum likelihood, (ii)
choose the number of regimes using a model selection criterion, and (iii) simulate the streams using the optimal number of regimes
and parameter values. This method, however, does not properly incorporate regime or parameter uncertainty into the generated
asset streams and therefore into the price of the guarantee. To remedy this, this article adopts a Bayesian approach to properly
account for those two sources of uncertainty and improve pricing.
Available via Athens: Palgrave MacMillan
http://www.openathens.net




INVESTMENT POLICY

Exponential change of measure applied to term structures of interest rates and exchange rates. Bo, Lijun [RKN: 44964]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (2) : 216-225.
In this paper, we study the term structures of interest rates and foreign exchange rates through establishing a state-price deflator.
The state-price deflator considered here can be viewed as an extension to the potential representation of the state-price density in
[Rogers, L.C.G., 1997. The potential approach to the term structure of interest rates and foreign exchange rates. Mathematical
Finance 7(2), 157164]. We identify a risk-neutral probability measure from the state-price deflator by a technique of exponential
change of measure for Markov processes proposed by [Palmowski, Z., Rolski, T., 2002. A technique for exponential change of
measure for Markov processes. Bernoulli 8(6), 767785] and present examples of several classes of diffusion processes
(jumpdiffusions and diffusions with regime-switching) to illustrate the proposed theory. A comparison between the exponential
change of measure and the Esscher transform for identifying risk-neutral measures is also presented. Finally, we consider the
exchange rate dynamics by virtue of the ratio of the current state-price deflators between two economies and then discuss the
pricing of currency options.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Optimal proportional reinsurance and investment in a stock market with Ornstein-Uhlenbeck process. Liang, Zhibin; Yuen, Kam
Chuen; Guo, Junyi [RKN: 44963]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (2) : 207-215.
In this paper, the authors study the optimal investment and proportional reinsurance strategy when an insurance company wishes
to maximize the expected exponential utility of the terminal wealth. It is assumed that the instantaneous rate of investment return
follows an Ornstein-Uhlenbeck process. Using stochastic control theory and Hamilton-Jacobi-Bellman equations, explicit
expressions for the optimal strategy and value function are derived not only for the compound Poisson risk model but also for the
Brownian motion risk model. Further, we investigate the partially observable optimization problem, and also obtain explicit
expressions for the optimal results.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Reactive investment strategies. Leung, Andrew P [RKN: 44979]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (1) : 89-99.
Asset liability management is a key aspect of the operation of all financial institutions. In this endeavor, asset allocation is
considered the most important element of investment management. Asset allocation strategies may be static, and as such are
usually assessed under asset models of various degrees of complexity and sophistication. In recent years attention has turned to
dynamic strategies, which promise to control risk more effectively. In this paper we present a new class of dynamic asset strategies,
which respond to actual events. Hence they are referred to as reactive strategies. They cannot be characterized as a series of
specific asset allocations over time, but comprise rules for determining such allocations as the world evolves. Though they depend
on how asset returns and other financial variables are modeled, they are otherwise objective in nature. The resulting strategies are
optimal, in the sense that they can be shown to outperform all other strategies of their type when no asset allocation constraints
are imposed. Where such constraints are imposed, the strategies may be demonstrated to be almost optimal, and dramatically
more effective than static strategies.

Available via Athens: Palgrave MacMillan
http://www.openathens.net

IRELAND

Mortality projections using generalized additive models with applications to annuity values for the Irish population. Hall, M; Friel,
N Faculty of Actuaries and Institute of Actuaries; Cambridge University Press, [RKN: 39999]
Shelved at: Per: AAS (Oxf) Per: AAS (Lon)
Annals of Actuarial Science (2011) 5(1) : 19-32.
Generalized Additive Models (GAMs) with age, period and cohort as possible covariates are used to predict future mortality
improvements for the Irish population. The GAMs considered are the 1-dimensional age + period and age + cohort models and the
2-dimensional age-period and age-cohort models. In each case thin plate regression splines are used as the smoothing functions.
The generalized additive models are compared with the P-Spline (Currie et al., 2004) and Lee-Carter (Lee & Carter, 1992) models
included in version 1.0 of the Continuous Mortality Investigation (CMI) library of mortality projections. Using the Root Mean Square
Error to assess the accuracy of future predictions, the GAMs outperform the P-Spline and Lee-Carter models over intervals of 25
and 35 years in the age range 60 to 90. The GAMs allow intuitively simple models of mortality to be specified whilst also providing
the flexibility to model complex relationships between the covariates. The majority of mortality improvements derived from the
projections of future Irish mortality yield annuity values at ages 60, 65, 70 and 80 in 2007 within the range of annuity values calculated
assuming a 2 to 4 percent annual compound improvement in mortality rates for both males and females.
http://www.actuaries.org.uk/research-and-resources/pages/access-journals


JUMP DIFFUSION

An operator-based approach to the analysis of ruin-related quantities in jump diffusion risk models. Feng, Runhuan [RKN: 40025]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (2) : 304-313.
Recent developments in ruin theory have seen the growing popularity of jump diffusion processes in modeling an insurer's assets
and liabilities. Despite the variations of technique, the analysis of ruin-related quantities mostly relies on solutions to certain
differential equations. In this paper, we propose in the context of Lévy-type jump diffusion risk models a solution method to a
general class of ruin-related quantities. Then we present a novel operator-based approach to solving a particular type of
integro-differential equations. Explicit expressions for resolvent densities for jump diffusion processes killed on exit below zero are
obtained as by-products of this work.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

KALMAN FILTER

Evolutionary credibility theory: A generalized linear mixed modeling approach. Lai, Tze Leung; Sun, Kevin Haoyu Society of
Actuaries, - 12 pages. [RKN: 70146]
Shelved at: Per: NAAJ (Oxf) Per NAAJ (Lon) Shelved at: JOU
North American Actuarial Journal (2012) 16 (2) : 273-284.
The conventional approach to evolutionary credibility theory assumes a linear state-space model for the longitudinal claims data
so that Kalman filters can be used to estimate the claims' expected values, which are assumed to form an autoregressive time
series. We propose a class of linear mixed models as an alternative to linear state-space models for evolutionary credibility and
show that the predictive performance is comparable to that of the Kalman filter when the claims are generated by a linear
state-space model. More importantly, this approach can be readily extended to generalized linear mixed models for the
longitudinal claims data. We illustrate its applications by addressing the excess zeros issue that a substantial fraction of policies
does not have claims at various times in the period under consideration.
http://www.soa.org/news-and-publications/publications/journals/naaj/naaj-detail.aspx

KERNEL DENSITY ESTIMATOR

A Bayesian approach to parameter estimation for kernel density estimation via transformations. Liu, Qing; Pitt, David; Zhang,
Xibin; Wu, Xueyuan Institute and Faculty of Actuaries; Cambridge University Press, - 13 pages. [RKN: 74950]
Shelved at: Per: AAS (Oxf) Per: AAS (Lon)
Annals of Actuarial Science (2011) 5(2) : 181-193.
In this paper, we present a Markov chain Monte Carlo (MCMC) simulation algorithm for estimating parameters in the kernel
density estimation of bivariate insurance claim data via transformations. Our data set consists of two types of auto insurance claim
costs and exhibits a high level of skewness in the marginal empirical distributions. Therefore, the kernel density estimator based
on original data does not perform well. However, the density of the original data can be estimated through estimating the density of
the transformed data using kernels. It is well known that the performance of a kernel density estimator is mainly determined by the
bandwidth, and only in a minor way by the kernel. In the current literature, there have been some developments in the area of
estimating densities based on transformed data, where bandwidth selection usually depends on pre-determined transformation
parameters. Moreover, in the bivariate situation, the transformation parameters were estimated for each dimension individually.
We use a Bayesian sampling algorithm and present a Metropolis-Hastings sampling procedure to sample the bandwidth and
transformation parameters from their posterior density. Our contribution is to estimate the bandwidths and transformation
parameters simultaneously within a Metropolis-Hastings sampling procedure. Moreover, we demonstrate that the correlation
between the two dimensions is better captured through the bivariate density estimator based on transformed data.
http://www.actuaries.org.uk/research-and-resources/pages/access-journals
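The transformation idea at the heart of this entry can be illustrated, in one dimension and without the authors' Metropolis-Hastings sampling of bandwidth and transformation parameters, by a log-transform kernel estimate with an arbitrarily chosen bandwidth (hypothetical data; a sketch only).

    import numpy as np
    from scipy.stats import gaussian_kde

    rng = np.random.default_rng(1)
    claims = rng.lognormal(mean=8.0, sigma=1.2, size=500)   # hypothetical right-skewed claim costs

    # Estimate the density of the log-transformed data ...
    kde_log = gaussian_kde(np.log(claims), bw_method=0.3)   # bandwidth chosen arbitrarily here

    # ... and map it back with the change-of-variables formula f_X(x) = f_Y(log x) / x.
    def claim_density(x):
        x = np.asarray(x, dtype=float)
        return kde_log(np.log(x)) / x

    grid = np.linspace(claims.min(), claims.max(), 5)
    print(np.round(claim_density(grid), 8))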

LAPLACE TRANSFORM

The distributions of some quantities for Erlang(2) risk models. Dickson, David C M; Li, Shuanming (2012). - Victoria: University of
Melbourne, 2012. - 18 pages. [RKN: 73947]
We study the distributions of [1] the first time that the surplus reaches a given level and [2] the duration of negative surplus in a
Sparre Andersen risk process with the inter-claim times being Erlang(2) distributed. These distributions can be obtained through
the inversion of Laplace transforms using the inversion relationship for the Erlang(2) risk model given by Dickson and Li (2010).
No.1 (1993) onwards available online. Download as PDF.
http://www.economics.unimelb.edu.au/ACT/papers.shtml
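For reference, Erlang(2) inter-claim times with rate parameter \beta > 0 have density

    f(t) = \beta^2\, t\, e^{-\beta t}, \qquad t > 0

that is, each inter-claim time is the sum of two independent exponential(\beta) waiting times.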


LIABILITIES

Cash flow simulation for a model of outstanding liabilities based on claim amounts and claim numbers. Martinez Miranda, Maria
Dolores; Nielsen, Bent; Nielsen, Jens Perch; Verrall, Richard [RKN: 45302]
Shelved at: Per: Astin Bull (Oxf) Shelved at: JOU/AST
ASTIN Bulletin (2011) 41 (1) : 107-129.
In this paper we develop a full stochastic cash flow model of outstanding liabilities for the model developed in Verrall, Nielsen and
Jessen (2010). This model is based on the simple triangular data available in most non-life insurance companies. By using more
data, it is expected that the method will have less volatility than the celebrated chain ladder method. Eventually, our method will
lead to lower solvency requirements for those insurance companies that decide to collect counts data and replace their
conventional chain ladder method.
online access via International Actuarial Association:
http://www.actuaries.org/index.cfm?lang=EN&DSP=PUBLICATIONS&ACT=ASTIN BULLETIN

Repeat performance. Hursey, Chris Staple Inn Actuarial Society, - 2 pages. [RKN: 74930]
Shelved at: Per: Actuary (Oxf) Per: Actuary (Lon) Shelved at: JOU
The Actuary (2012) January/February : 32-33.
Chris Hursey describes a new and original theory on determining optimal calibration nodes for replicating formulae
http://www.theactuary.com/

LIFE ASSURANCE

Household consumption, investment and life insurance. Bruhn, Kenneth; Steffensen, Mogens [RKN: 45125]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (3) : 315-325.
This paper develops a continuous-time Markov model for utility optimization of households. The household optimizes expected
future utility from consumption by controlling consumption, investments and purchase of life insurance for each person in the
household. The optimal controls are investigated in the special case of a two-person household, and we present graphics
illustrating how differences between the two persons affect the controls.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

A joint valuation of premium payment and surrender options in participating life insurance contracts. Schmeiser, H; Wagner, J
[RKN: 44958]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (3) : 580-596.
In addition to an interest rate guarantee and annual surplus participation, life insurance contracts typically embed the right to stop
premium payments during the term of the contract (paid-up option), to resume payments later (resumption option), or to terminate
the contract early (surrender option). Terminal guarantees are on benefits payable upon death, survival and surrender. The latter
are adapted after exercising the options. A model framework including these features and an algorithm to jointly value the
premium payment and surrender options is presented. In a first step, the standard principles of risk-neutral evaluation are applied
and the policyholder is assumed to use an economically rational exercise strategy. In a second step, the sensitivity of the option value to
different contract parameters, benefit adaptation mechanisms, and exercise behavior is analyzed numerically. The latter two are
the main drivers for the option value.

Available via Athens: Palgrave MacMillan
http://www.openathens.net

Optimal investment and consumption decision of a family with life insurance. Kwak, Minsuk; Shin, Yong Hyun; Choi, U Jin [RKN:
40011]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (2) : 176-188.
We study an optimal portfolio and consumption choice problem of a family that combines life insurance for parents who receive
deterministic labor income until the fixed time T. We consider utility functions of parents and children separately and assume that
parents have an uncertain lifetime. If parents die before time T, children have no labor income and they choose the optimal
consumption and portfolio with remaining wealth and life insurance benefit. The object of the family is to maximize the weighted
average of utility of parents and that of children. We obtain analytic solutions for the value function and the optimal policies, and
then analyze how changes in the weight of the parents' utility function and other factors affect the optimal policies.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Risk analysis and valuation of life insurance contracts: combining actuarial and financial approaches. Graf, Stefan; Kling,
Alexander; Ruß, Jochen [RKN: 44981]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (1) : 115-125.
In this paper, we analyze traditional (i.e. not unit-linked) participating life insurance contracts with a guaranteed interest rate and
surplus participation. We consider three different surplus distribution models and an asset allocation that consists of money
market, bonds with different maturities, and stocks. In this setting, we combine actuarial and financial approaches by selecting a
risk minimizing asset allocation (under the real world measure P) and distributing terminal surplus such that the contract value
(under the pricing measure Q) is fair. We prove that this strategy is always possible unless the insurance contracts introduce
arbitrage opportunities in the market. We then analyze differences between the different surplus distribution models and
investigate the impact of the selected risk measure on the risk minimizing portfolio.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Risk comparison of different bonus distribution approaches in participating life insurance. Zemp, Alexandra [RKN: 44967]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (2) : 249-264.
The fair pricing of explicit and implicit options in life insurance products has received broad attention in the academic literature over
the past years. Participating life insurance (PLI) contracts have been the focus especially. These policies are typically
characterized by a term life insurance, a minimum interest rate guarantee, and bonus participation rules with regard to the
insurer's asset returns or reserve situation. Researchers replicate these bonus policies quite differently. We categorize and
formally present the most common PLI bonus distribution mechanisms. These bonus models closely mirror the Danish, German,
British, and Italian regulatory framework. Subsequently, we perform a comparative analysis of the different bonus models with
regard to risk valuation. We calibrate contract parameters so that the compared contracts have a net present value of zero and the
same safety level as the initial position, using risk-neutral valuation. Subsequently, we analyze the effect of changes in the asset
volatility and in the initial reserve amount (per contract) on the value of the default put option (DPO), while keeping all other
parameters constant. Our results show that DPO values obtained with the PLI bonus distribution model of Bacinello (2001), which
replicates the Italian regulatory framework, are most sensitive to changes in volatility and initial reserves.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

LIFE CONTINGENCIES

Actuarial applications of the linear hazard transform in life contingencies. Tsai, Cary Chi-Liang; Jiang, Lingzhi [RKN: 44977]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (1) : 70-80.
In this paper, we study the linear hazard transform and its applications in life contingencies. Under the linear hazard transform, the
survival function of a risk is distorted, which provides a safety margin for pricing insurance products. Combining the assumption of
a-power approximation with the linear hazard transform, the net single premium of a continuous life insurance policy can be
approximated in terms of the net single premiums of discrete ones. Moreover, Macaulay duration, modified duration and dollar
duration, all measuring the sensitivity of the price of a life insurance policy to force of mortality movements under the linear hazard
transform, are defined and investigated. Some examples are given for illustration.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

LIFE EXPECTATION

Entropy, longevity and the cost of annuities. Haberman, Steven; Khalaf-Allah, Marwa; Verrall, Richard [RKN: 40013]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (2) : 197-204.
This paper presents an extension of the application of the concept of entropy to annuity costs. Keyfitz (1985) introduced the
concept of entropy, and analysed this in the context of continuous changes in life expectancy. He showed that a higher level of
entropy indicates that the life expectancy has a greater propensity to respond to a change in the force of mortality than a lower
level of entropy. In other words, a high level of entropy means that further reductions in mortality rates would have an impact on
measures like life expectancy. In this paper, we apply this to the cost of annuities and show how it allows the sensitivity of the cost
of a life annuity contract to changes in longevity to be summarized in a single-figure index.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
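Keyfitz's entropy, which this paper carries over from life expectancy to annuity costs, is conventionally defined from the survival function \ell(x) (with \ell(0) = 1) as

    H = \frac{-\int_0^{\infty} \ell(x)\, \ln \ell(x)\, dx}{\int_0^{\infty} \ell(x)\, dx}

so that a small uniform proportional reduction in the force of mortality raises life expectancy by roughly H times that proportion.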

Time-simultaneous prediction bands: a new look at the uncertainty involved in forecasting mortality. Li, Johnny Siu-Hang; Chan,
Wai-Sum [RKN: 44978]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (1) : 81-88.
Conventionally, isolated (point-wise) prediction intervals are used to quantify the uncertainty in future mortality rates and other
demographic quantities such as life expectancy. A pointwise interval reflects uncertainty in a variable at a single time point, but it
does not account for any dynamic property of the time-series. As a result, in situations when the path or trajectory of future
mortality rates is important, a band of pointwise intervals might lead to an invalid inference. To improve the communication of
uncertainty, a simultaneous prediction band can be used. The primary objective of this paper is to demonstrate how simultaneous
prediction bands can be created for prevalent stochastic models, including the Cairns-Blake-Dowd and Lee-Carter models. The
illustrations in this paper are based on mortality data from the general population of England and Wales.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

LIFE INSURANCE

Analyzing surplus appropriation schemes in participating life insurance from the insurer's and the policyholder's perspective.
Bohnert, Alexander; Gatzert, Nadine [RKN: 44991]
Insurance: Mathematics & Economics (2012) 50 (1) : 64-78.
This paper examines the impact of three surplus appropriation schemes often inherent in participating life insurance contracts on
the insurer's shortfall risk and the net present value from an insured's viewpoint. (1) In the case of the bonus system, surplus is used
to increase the guaranteed death and survival benefit, leading to higher reserves; (2) the interest-bearing accumulation increases
only the survival benefit by accumulating the surplus on a separate account; and (3) surplus can also be used to shorten the
contract term, which results in an earlier payment of the survival benefit and a reduced sum of premium payments. The pool of
participating life insurance contracts with death and survival benefit is modeled actuarially with annual premium payments;
mortality rates are generated based on an extension of the Lee-Carter (1992) model, and the asset process follows a geometric
Brownian motion. In a simulation analysis, we then compare the influence of different asset portfolios and shocks to mortality on
the insurer's risk situation and the policyholder's net present value for the three surplus schemes. Our findings demonstrate that, even though the surplus distribution and thus the amount of surplus is calculated the same way, the type of surplus appropriation scheme has a substantial impact on the insurer's risk exposure and the policyholder's net present value.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Household consumption, investment and life insurance. Bruhn, Kenneth; Steffensen, Mogens [RKN: 45125]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (3) : 315-325.
This paper develops a continuous-time Markov model for utility optimization of households. The household optimizes expected
future utility from consumption by controlling consumption, investments and purchase of life insurance for each person in the
household. The optimal controls are investigated in the special case of a two-person household, and we present graphics
illustrating how differences between the two persons affect the controls.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Multidimensional Lee–Carter model with switching mortality processes. Hainaut, Donatien [RKN: 45597]
Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 50 (2) : 236-246.
This paper proposes a multidimensional Lee–Carter model, in which the time-dependent components are ruled by switching
regime processes. The main feature of this model is its ability to replicate the changes of regimes observed in the mortality
evolution. Changes of measure, preserving the dynamics of the mortality process under a pricing measure, are also studied. After
a review of the calibration method, a 2D, 2-regimes model is fitted to the male and female French population, for the period
1946–2007. Our analysis reveals that one regime corresponds to longevity conditions observed during the decade following the
second world war, while the second regime is related to longevity improvements observed during the last 30 years. To conclude,
we analyze, in a numerical application, the influence of changes of measure affecting transition probabilities, on prices of life and
death insurances.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Optimal investment and consumption decision of a family with life insurance. Kwak, Minsuk; Shin, Yong Hyun; Choi, U Jin [RKN:
40011]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (2) : 176-188.
We study an optimal portfolio and consumption choice problem of a family that combines life insurance for parents who receive
deterministic labor income until the fixed time T. We consider utility functions of parents and children separately and assume that
parents have an uncertain lifetime. If parents die before time T, children have no labor income and they choose the optimal
consumption and portfolio with remaining wealth and life insurance benefit. The object of the family is to maximize the weighted
average of utility of parents and that of children. We obtain analytic solutions for the value function and the optimal policies, and
then analyze how changes in the weight of the parents' utility function and other factors affect the optimal policies.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

A performance analysis of participating life insurance contracts. Faust, Roger; Schmeiser, Hato; Zemp, Alexandra [RKN: 45733]
Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 158-171.
Participating life insurance contracts are one of the most important products in the European life insurance market. Even though
these contract forms are very common, only very little research has been conducted in respect to their performance. Hence, we
conduct a performance analysis to provide a decision support for policyholders. We decompose a participating life insurance
contract into a term life insurance and a savings part and simulate the cash flow distribution of the latter. Simulation results are
compared with cash flows resulting from two benchmarks investing in the same portfolio of assets but without investment
guarantees and bonus distribution schemes, in order to measure the impact of these two product features. To provide a realistic
picture within the two alternatives, we take transaction costs and wealth transfers between different groups of policyholders into
account. We show that the payoff distribution strongly depends on the initial reserve situation and managerial discretion. Results
indicate that policyholders will in general profit from a better payoff distribution of the participating life insurance compared to a
mutual fund benchmark but not compared to an exchange-traded fund benchmark portfolio.
Available via Athens: Palgrave MacMillan
http://www.openathens.net


LIFE PRODUCTS

Managing longevity and disability risks in life annuities with long term care. Levantesi, Susanna; Menzietti, Massimiliano [RKN:
45642]
Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 50 (3) : 391-401.
The aim of the paper is twofold. Firstly, it develops a model for risk assessment in a portfolio of life annuities with long term care
benefits. These products are usually represented by a Markovian Multi-State model and are affected by both longevity and
disability risks. Here, a stochastic projection model is proposed in order to represent the future evolution of mortality and disability
transition intensities. Data from the Italian National Institute of Social Security (INPS) and from Human Mortality Database (HMD)
are used to estimate the model parameters. Secondly, it investigates the solvency in a portfolio of enhanced pensions. To this aim
a risk model based on the portfolio risk reserve is proposed and different rules to calculate solvency capital requirements for life
underwriting risk are examined. Such rules are then compared with the standard formula proposed by the Solvency II project.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

LIFE TABLES

Comparison and bounds for functionals of future lifetimes consistent with life tables. Barz, Christiane; Müller, Alfred [RKN: 45596]
Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 50 (2) : 229-235.
We derive a new crossing criterion of hazard rates to identify a stochastic order relation between two random variables. We apply
this crossing criterion in the context of life tables to derive stochastic ordering results among three families of fractional age
assumptions: the family of linear force of mortality functions, the family of quadratic survival functions and the power family.
Further, this criterion is used to derive tight bounds for functionals of future lifetimes that exhibit an increasing force of mortality
with given one-year survival probabilities. Numerical examples illustrate our findings.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

LINEAR EQUATIONS

A maximum-entropy approach to the linear credibility formula. Najafabadi, Amir T. Payandeh; Hatami, Hamid; Najafabadi, Maryam
Omidi [RKN: 45738]
Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 216-221.
Payandeh [Payandeh Najafabadi, A.T., 2010. A new approach to credibility formula. Insurance: Mathematics and Economics 46, 334–338] introduced a new technique to approximate a Bayes estimator with the exact credibility's form. This article employs the well-known and powerful maximum-entropy method (MEM) to extend the results of Payandeh Najafabadi (2010) to a class of linear credibility, whenever claim sizes are distributed according to log-concave distributions. Namely, (i) it employs the maximum-entropy method to approximate an appropriate Bayes estimator (with respect to either the square-error or the Linex loss function and a general increasing and bounded prior distribution) by a linear combination of claim sizes; (ii) it establishes that such an approximation coincides with the exact credibility formula whenever the required conditions for exact credibility (see below) hold. Some properties of such an approximation are discussed. An application to crop insurance is given.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
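For context, the linear credibility form that such approximations target is the classical Bühlmann premium (a standard result, stated here for reference rather than taken from the paper):
\[
\hat{P} \;=\; Z\,\bar{X} + (1-Z)\,\mu_{0}, \qquad
Z = \frac{n}{n+k}, \qquad
k = \frac{E\big[\operatorname{Var}(X\mid\Theta)\big]}{\operatorname{Var}\big(E[X\mid\Theta]\big)},
\]
with \(\bar{X}\) the policyholder's observed mean claim over \(n\) periods and \(\mu_{0}\) the collective (prior) mean; the maximum-entropy construction in the paper seeks a Bayes estimator that can be written in this linear form.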

LINEAR PROGRAMMING

A coherent aggregation framework for stress testing and scenario analysis. Kwiatkowski, Jan; Rebonato, Riccardo [RKN: 45257]
Applied Mathematical Finance (2011) 18 (1-2) : 139-154.
We present a methodology to aggregate in a coherent manner conditional stress losses in a trading or banking book. The
approach bypasses the specification of unconditional probabilities of the individual stress events and ensures, by a linear programming approach, that the (subjective or frequentist) conditional probabilities chosen by the risk manager are internally consistent. The admissibility requirement greatly reduces the degree of arbitrariness in the conditional probability matrix if this is
assigned subjectively. The approach can be used to address the requirements of the regulators on the Instantaneous Risk
Charge.
Available via Athens: Taylor & Francis Online
http://www.openathens.net



LONG-TAIL BUSINESS

Multivariate longitudinal modeling of insurance company expenses. Shi, Peng [RKN: 45737]
Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 204-215.
Insurers, investors and regulators are interested in understanding the behaviour of insurance company expenses, due to the high
operating cost of the industry. Expense models can be used for prediction, to identify unusual behavior, and to measure firm
efficiency. The current literature focuses on the study of total expenses, which consist of three components: underwriting, investment and loss adjustment. A joint study of expenses by type delivers more information and is critical to understanding their relationship.

This paper introduces a copula regression model to examine the three types of expenses in a longitudinal context. In our method,
elliptical copulas are employed to accommodate the between-subject contemporaneous and lag dependencies, as well as the
within-subject serial correlations of the three types. Flexible distributions are allowed for the marginals of each type with covariates
incorporated in distribution parameters. A model validation procedure based on a t-plot method is proposed for in-sample and
out-of-sample validation purposes. The multivariate longitudinal model effectively addresses the typical features of expenses
data: the heavy tails, the strong individual effects and the lack of balance.

The analysis is performed using property–casualty insurance company expenses data from the National Association of Insurance Commissioners for the years 2001–2006. A unique set of covariates is determined for each type of expenses. We found that underwriting expenses and loss adjustment expenses are complements rather than substitutes. The model is shown to be successful in efficiency classification. Also, a multivariate predictive density is derived to quantify the future values of an insurer's expenses.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

LONG-TAIL LIABILITIES

Reserving long-tail liabilities : United we stand, divided we fall. Odell, David Staple Inn Actuarial Society, - 3 pages. [RKN: 74945]
Shelved at: Per: Actuary (Oxf) Per: Actuary (Lon) Shelved at: JOU
The Actuary (2012) March : 32-34.
David Odell puts forward a method to span the rift between uncertainty and best estimates
http://www.theactuary.com/

LONG TERM CARE COVER

Managing longevity and disability risks in life annuities with long term care. Levantesi, Susanna; Menzietti, Massimiliano [RKN:
45642]
Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 50 (3) : 391-401.
The aim of the paper is twofold. Firstly, it develops a model for risk assessment in a portfolio of life annuities with long term care
benefits. These products are usually represented by a Markovian Multi-State model and are affected by both longevity and
disability risks. Here, a stochastic projection model is proposed in order to represent the future evolution of mortality and disability
transition intensities. Data from the Italian National Institute of Social Security (INPS) and from Human Mortality Database (HMD)
are used to estimate the model parameters. Secondly, it investigates the solvency in a portfolio of enhanced pensions. To this aim
a risk model based on the portfolio risk reserve is proposed and different rules to calculate solvency capital requirements for life
underwriting risk are examined. Such rules are then compared with the standard formula proposed by the Solvency II project.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

LONGEVITY

Entropy, longevity and the cost of annuities. Haberman, Steven; Khalaf-Allah, Marwa; Verrall, Richard [RKN: 40013]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (2) : 197-204.
This paper presents an extension of the application of the concept of entropy to annuity costs. Keyfitz (1985) introduced the
concept of entropy, and analysed this in the context of continuous changes in life expectancy. He showed that a higher level of
entropy indicates that the life expectancy has a greater propensity to respond to a change in the force of mortality than a lower
level of entropy. In other words, a high level of entropy means that further reductions in mortality rates would have an impact on
measures like life expectancy. In this paper, we apply this to the cost of annuities and show how it allows the sensitivity of the cost
of a life annuity contract to changes in longevity to be summarized in a single figure index.
Available via Athens: Palgrave MacMillan
http://www.openathens.net


LONGEVITY RISK

A comparison of the Lee–Carter model and AR–ARCH model for forecasting mortality rates. Giacometti, Rosella; Bertocchi,
Marida; Rachev, Svetlozar T; Fabozzi, Frank J [RKN: 44993]
Insurance: Mathematics & Economics (2012) 50 (1) : 85-93.
With the decline in the mortality level of populations, national social security systems and insurance companies of most developed
countries are reconsidering their mortality tables taking into account the longevity risk. The Lee and Carter model is the first
discrete-time stochastic model to consider the increased life expectancy trends in mortality rates and is still broadly used today. In
this paper, we propose an alternative to the Lee–Carter model: an AR(1)–ARCH(1) model. More specifically, we compare the performance of these two models with respect to forecasting age-specific mortality in Italy. We fit the two models, with Gaussian and t-student innovations, for the matrix of Italian death rates from 1960 to 2003. We compare the forecast ability of the two approaches in out-of-sample analysis for the period 2004–2006 and find that the AR(1)–ARCH(1) model with t-student innovations
provides the best fit among the models studied in this paper.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
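As background to the comparison, the Lee–Carter structure and the two candidate dynamics for its time index can be sketched as follows (illustrative notation; the authors' exact specification may differ):
\[
\ln m_{x,t} = a_x + b_x\,k_t + \varepsilon_{x,t},
\]
with either a random walk with drift, \(k_t = k_{t-1} + \theta + e_t\), or an AR(1)–ARCH(1) process,
\[
k_t = c + \phi\,k_{t-1} + e_t, \qquad e_t = \sigma_t z_t, \qquad \sigma_t^{2} = \omega + \gamma\,e_{t-1}^{2},
\]
where the innovations \(z_t\) may be Gaussian or t-distributed as in the paper.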

Evolution of coupled lives' dependency across generations and pricing impact. Luciano, Elisa; Spreeuw, Jaap; Vigna, Elena
(2012). - London: Cass Business School, 2012. - 19 pages. [RKN: 73995]
This paper studies the dependence between coupled lives - both within and across generations - and its effects on prices of
reversionary annuities in the presence of longevity risk. Longevity risk is represented via a stochastic mortality intensity.
Dependence is modelled through copula functions. We consider Archimedean single and multi-parameter copulas. We find that
dependence decreases when passing from older generations to younger generations. Not only the level of dependence but also
its features - as measured by the copula - change across generations: the best-fit Archimedean copula is not the same across
generations. Moreover, for all the generations under examination the single-parameter copula is dominated by the two-parameter one.
The independence assumption produces quantifiable mispricing of reversionary annuities. The misspecification of the copula
produces different mispricing effects on different generations. The research is conducted using a well-known dataset of double life
contracts.
1995 onwards available online. Download as PDF.
http://www.cass.city.ac.uk/research-and-faculty/faculties/faculty-of-actuarial-science-and-insurance/publications/actuarial-research-reports

Longevity/mortality risk modeling and securities pricing. Deng, Yinglu; Brockett, Patrick L; MacMinn, Richard D - 25 pages. [RKN:
70413]
Shelved at: Per: J.Risk Ins (Oxf) Shelved at: JOU
Journal of Risk and Insurance (2012) 79 (3) : 697-721.
Securitizing longevity/mortality risk can transfer longevity/mortality risk to capital markets. Modeling and forecasting mortality rate
is key to pricing mortality-linked securities. Catastrophic mortality and longevity jumps occur in historical data and have an
important impact on security pricing. This article introduces a stochastic diffusion model with a double-exponential jump diffusion
process that captures both asymmetric rate jumps up and down and also cohort effect in mortality trends. The model exhibits
calibration advantages and mathematical tractability while better fitting the data. The model provides a closed-form pricing solution
for J.P. Morgan's q-forward contract, usable as a building block for hedging.
Available online via Athens
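For reference, a q-forward of the kind mentioned exchanges a fixed mortality rate for the realised rate at maturity; in generic form (details of the J.P. Morgan contract are not reproduced here) the payoff to the fixed-rate receiver is
\[
N \times \big(q^{\text{fixed}} - q^{\text{realised}}\big),
\]
which is positive when mortality turns out lighter than anticipated and can therefore offset longevity losses on annuity or pension liabilities.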

Longevity risk management for life and variable annuities: the effectiveness of static hedging using longevity bonds and
derivatives. Ngai, Andrew; Sherris, Michael [RKN: 44980]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (1) : 100-114.
For many years, the longevity risk of individuals has been underestimated, as survival probabilities have improved across the
developed world. The uncertainty and volatility of future longevity has posed significant risk issues for both individuals and product
providers of annuities and pensions. This paper investigates the effectiveness of static hedging strategies for longevity risk
management using longevity bonds and derivatives (q-forwards) for the retail products: life annuity, deferred life annuity, indexed
life annuity, and variable annuity with guaranteed lifetime benefits. Improved market and mortality models are developed for the
underlying risks in annuities. The market model is a regime-switching vector error correction model for GDP, inflation, interest
rates, and share prices. The mortality model is a discrete-time logit model for mortality rates with age dependence. Models were
estimated using Australian data. The basis risk between annuitant portfolios and population mortality was based on UK
experience. Results show that static hedging using q-forwards or longevity bonds reduces the longevity risk substantially for life
annuities, but significantly less for deferred annuities. For inflation-indexed annuities, static hedging of longevity is less effective
because of the inflation risk. Variable annuities provide limited longevity protection compared to life annuities and indexed
annuities, and as a result longevity risk hedging adds little value for these products.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Managing longevity and disability risks in life annuities with long term care. Levantesi, Susanna; Menzietti, Massimiliano [RKN:
45642]
Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 50 (3) : 391-401.
The aim of the paper is twofold. Firstly, it develops a model for risk assessment in a portfolio of life annuities with long term care
benefits. These products are usually represented by a Markovian Multi-State model and are affected by both longevity and
disability risks. Here, a stochastic projection model is proposed in order to represent the future evolution of mortality and disability
transition intensities. Data from the Italian National Institute of Social Security (INPS) and from Human Mortality Database (HMD)
are used to estimate the model parameters. Secondly, it investigates the solvency in a portfolio of enhanced pensions. To this aim
a risk model based on the portfolio risk reserve is proposed and different rules to calculate solvency capital requirements for life
underwriting risk are examined. Such rules are then compared with the standard formula proposed by the Solvency II project.
Available via Athens: Palgrave MacMillan
http://www.openathens.net


Modelling and management of longevity risk: approximations to survivor functions and dynamic hedging. Cairns, Andrew J G
[RKN: 44946]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (3) : 438-453.
This paper looks at the development of dynamic hedging strategies for typical pension plan liabilities using longevity-linked
hedging instruments. Progress in this area has been hindered by the lack of closed-form formulae for the valuation of
mortality-linked liabilities and assets, and the consequent requirement for simulations within simulations. We propose the use of
the probit function along with a Taylor expansion to approximate longevity-contingent values. This makes it possible to develop
and implement computationally efficient, discrete-time delta hedging strategies using q-forwards as hedging instruments. The methods are tested using the Cairns–Blake–Dowd (CBD) model. We find that the probit approximations are generally very accurate, and
that the discrete-time hedging strategy is very effective at reducing risk.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

No-good-deal, local mean-variance and ambiguity risk pricing and hedging for an insurance payment process. Delong, Lukasz -
30 pages. [RKN: 70749]
Shelved at: Per: Astin Bull (Oxf) Shelved at: JOU
ASTIN Bulletin (2012) 42 (1) : 203-232.
We study pricing and hedging for an insurance payment process. We investigate a Black-Scholes financial model with stochastic
coefficients and a payment process with death, survival and annuity claims driven by a point process with a stochastic intensity.
The dependence of the claims and the intensity on the financial market and on an additional background noise (correlated index,
longevity risk) is allowed. We establish a general modeling framework for no-good-deal, local mean-variance and ambiguity risk
pricing and hedging. We show that these three valuation approaches are equivalent under appropriate formulations. We
characterize the price and the hedging strategy as a solution to a backward stochastic differential equation. The results could be
applied to pricing and hedging of variable annuities, surrender options under an irrational lapse behavior and mortality derivatives.
http://www.actuaries.org/index.cfm?lang=EN&DSP=PUBLICATIONS&ACT=ASTIN BULLETIN

One-year Value-at-Risk for longevity and mortality. Plat, Richard [RKN: 44948]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (3) : 462-470.
Upcoming new regulation on regulatory required solvency capital for insurers will be predominantly based on a one-year
Value-at-Risk measure. This measure aims at covering the risk of the variation in the projection year as well as the risk of changes
in the best estimate projection for future years. This paper addresses the issue how to determine this Value-at-Risk for longevity
and mortality risk. Naturally, this requires stochastic mortality rates. In the past decennium, a vast literature on stochastic mortality
models has been developed. However, very few of them are suitable for determining the one-year Value-at-Risk. This requires a
model for mortality trends instead of mortality rates. Therefore, we will introduce a stochastic mortality trend model that fits this
purpose. The model is transparent, easy to interpret and based on well known concepts in stochastic mortality modeling.
Additionally, we introduce an approximation method based on duration and convexity concepts to apply the stochastic mortality
rates to specific insurance portfolios.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

A quantitative comparison of the Lee-Carter Model under different types of non-Gaussian innovations. Wang, Chou-Wen; Huang,
Hong-Chih; Liu, I-Chien Palgrave Macmillan, [RKN: 44909]
Shelved at: Per: Geneva (Oxf)
Geneva Papers on Risk and Insurance (2011) 36(4) : 675-696.
In the classical Lee-Carter model, the mortality indices that are assumed to be a random walk model with drift are normally
distributed. However, for the long-term mortality data, the error terms of the Lee-Carter model and the mortality indices have tails
thicker than those of a normal distribution and appear to be skewed. This study therefore adopts five non-Gaussian
distributions: Student's t-distribution and its skew extension (i.e., generalised hyperbolic skew Student's t-distribution), one finite-activity Lévy model (jump diffusion distribution), and two infinite-activity or pure jump models (variance gamma and normal inverse Gaussian) to model the error terms of the Lee-Carter model. With mortality data from six countries over the period 1900–2007, both in-sample model selection criteria (e.g., Bayesian information criterion, Kolmogorov–Smirnov test, Anderson–Darling test, Cramér–von Mises test) and out-of-sample projection errors indicate a preference for modelling the
Lee-Carter model with non-Gaussian innovations.

Available via Athens: Palgrave MacMillan
http://www.openathens.net

A recursive approach to mortality-linked derivative pricing. Shang, Zhaoning; Goovaerts, Marc; Dhaene, Jan [RKN: 44966]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (2) : 240-248.
In this paper, we develop a recursive method to derive an exact numerical and nearly analytical representation of the Laplace
transform of the transition density function with respect to the time variable for time-homogeneous diffusion processes. We further
apply this recursion algorithm to the pricing of mortality-linked derivatives. Given an arbitrary stochastic future lifetime T, the
probability distribution function of the present value of a cash flow depending on T can be approximated by a mixture of
exponentials, based on Jacobi polynomial expansions. In case of mortality-linked derivative pricing, the required Laplace inversion
can be avoided by introducing this mixture of exponentials as an approximation of the distribution of the survival time T in the
recursion scheme. This approximation significantly improves the efficiency of the algorithm.

Available via Athens: Palgrave MacMillan
http://www.openathens.net


LONGITUDINAL STUDIES

Multivariate longitudinal modeling of insurance company expenses. Shi, Peng [RKN: 45737]
Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 204-215.
Insurers, investors and regulators are interested in understanding the behaviour of insurance company expenses, due to the high
operating cost of the industry. Expense models can be used for prediction, to identify unusual behavior, and to measure firm
efficiency. The current literature focuses on the study of total expenses, which consist of three components: underwriting, investment and loss adjustment. A joint study of expenses by type delivers more information and is critical to understanding their relationship.

This paper introduces a copula regression model to examine the three types of expenses in a longitudinal context. In our method,
elliptical copulas are employed to accommodate the between-subject contemporaneous and lag dependencies, as well as the
within-subject serial correlations of the three types. Flexible distributions are allowed for the marginals of each type with covariates
incorporated in distribution parameters. A model validation procedure based on a t-plot method is proposed for in-sample and
out-of-sample validation purposes. The multivariate longitudinal model effectively addresses the typical features of expenses
data: the heavy tails, the strong individual effects and the lack of balance.

The analysis is performed using property–casualty insurance company expenses data from the National Association of Insurance Commissioners for the years 2001–2006. A unique set of covariates is determined for each type of expenses. We found that underwriting expenses and loss adjustment expenses are complements rather than substitutes. The model is shown to be successful in efficiency classification. Also, a multivariate predictive density is derived to quantify the future values of an insurer's expenses.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

LOSS

Analytic loss distributional approach models for operational risk from the α-stable doubly stochastic compound processes and
implications for capital allocation. Peters, Gareth W; Shevchenko, Pavel V; Young, Mark; Yip, Wendy [RKN: 44957]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (3) : 565-579.
Under the Basel II standards, the Operational Risk (OpRisk) advanced measurement approach is not prescriptive regarding the
class of statistical model utilized to undertake capital estimation. It has however become well accepted to utilize a Loss
Distributional Approach (LDA) paradigm to model the individual OpRisk loss processes corresponding to the Basel II Business
line/event type. In this paper we derive a novel class of doubly stochastic α-stable family LDA models. These models provide the ability to capture the heavy-tailed loss processes typical of OpRisk, whilst also providing analytic expressions for the compound processes' annual loss density and distributions, as well as the aggregated compound processes' annual loss models. In particular
we develop models of the annual loss processes in two scenarios. The first scenario considers the loss processes with a
stochastic intensity parameter, resulting in inhomogeneous compound Poisson processes annually. The resulting arrival
processes of losses under such a model will have independent counts over increments within the year. The second scenario
considers discretization of the annual loss processes into monthly increments with dependent time increments as captured by a
Binomial process with a stochastic probability of success changing annually. Each of these models will be coupled under an LDA framework with heavy-tailed severity models comprised of α-stable severities for the loss amounts per loss event. In this paper
we will derive analytic results for the annual loss distribution density and distribution under each of these models and study their
properties.

Available via Athens: Palgrave MacMillan
http://www.openathens.net

Claims development result in the paid-incurred chain reserving method. Happ, Sebastian; Merz, Michael; Wüthrich, Mario V [RKN: 45724]
Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 66-72.
We present the one-year claims development result (CDR) in the paid-incurred chain (PIC) reserving model. The PIC reserving
model presented in Merz and Wüthrich (2010) is a Bayesian stochastic claims reserving model that considers simultaneously
claims payments and incurred losses information and allows for deriving the full predictive distribution of the outstanding loss
liabilities. In this model we study the conditional mean square error of prediction (MSEP) for the one-year CDR uncertainty, which
is the crucial uncertainty view under Solvency II.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Composite Lognormal–Pareto model with random threshold. Pigeon, Mathieu; Denuit, Michel [RKN: 45488]
Shelved at: Per: SAJ Shelved at: SCA/ACT
Scandinavian Actuarial Journal (2011) 3 : 177-192.
This paper further considers the composite Lognormal–Pareto model proposed by Cooray & Ananda (2005) and suitably modified by Scollnik (2007). This model is based on a Lognormal density up to an unknown threshold value and a Pareto density thereafter. Instead of using a single threshold value applying uniformly to the whole data set, the model proposed in the present paper allows for heterogeneity with respect to the threshold and lets it vary among observations. Specifically, the threshold value for a particular observation is seen as the realization of a positive random variable and the mixed composite Lognormal–Pareto model is obtained by averaging over the population of interest. The performance of the composite Lognormal–Pareto model and of its mixed
extension is compared using the well-known Danish fire losses data set.
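A simplified statement of the underlying composite density may help fix ideas (a sketch of the fixed-threshold Cooray–Ananda construction; the paper's contribution is to make the threshold random):
\[
f(x) \;=\;
\begin{cases}
c\,f_{\mathrm{LN}}(x), & 0 < x \le \theta,\\
c\,f_{\mathrm{Pa}}(x), & x > \theta,
\end{cases}
\]
with the normalising constant \(c\) and the parameters tied together by continuity (and, in Scollnik's variant, smoothness) at the threshold \(\theta\), so that small and moderate losses follow a Lognormal shape while the tail is Pareto.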


Folded and log-folded-t distributions as models for insurance loss data. Brazauskas, Vytaras; Kleefeld, Andreas [RKN: 45149]
Shelved at: Per: SAJ Shelved at: SCA/ACT
Scandinavian Actuarial Journal (2011) 1 : 59-74.
A rich variety of probability distributions has been proposed in the actuarial literature for fitting of insurance loss data. Examples
include: lognormal, log-t, various versions of Pareto, loglogistic, Weibull, gamma and its variants, and generalized beta of the
second kind distributions, among others. In this paper, we supplement the literature by adding the log-folded-normal and
log-folded-t families. Shapes of the density function and key distributional properties of the 'folded' distributions are presented
along with three methods for the estimation of parameters: method of maximum likelihood; method of moments; and method of
trimmed moments. Further, large and small-sample properties of these estimators are studied in detail. Finally, we fit the newly
proposed distributions to data which represent the total damage done by 827 fires in Norway for the year 1988. The fitted models
are then employed in a few quantitative risk management examples, where point and interval estimates for several value-at-risk
measures are calculated.
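The folding construction used above can be summarised as follows (standard definition, not quoted from the paper): if \(X\) has density \(f_X\) on the whole real line, the folded variable \(Y = |X|\) has density
\[
f_Y(y) = f_X(y) + f_X(-y), \qquad y \ge 0,
\]
and a log-folded loss is one whose logarithm is folded, i.e. the loss is modelled as \(e^{Y}\) with \(X\) normal or t-distributed.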

Future building water loss projections posed by climate change. Haug, Ola; Dimakos, Xeni K; Vardal, Jofrid F; Aldrin, Magne;
Meze-Hausken, Elisabeth [RKN: 45146]
Shelved at: Per: SAJ Shelved at: SCA/ACT
Scandinavian Actuarial Journal (2011) 1 : 1-20.
The insurance industry, like other parts of the financial sector, is vulnerable to climate change. Life as well as non-life products are
affected and knowledge of future loss levels is valuable. Risk and premium calculations may be updated accordingly, and
dedicated loss-preventive measures can be communicated to customers and regulators. We have established statistical claims
models for the coherence between externally inflicted water damage to private buildings in Norway and selected meteorological
variables. Based on these models and downscaled climate predictions from the Hadley centre HadAM3H climate model, the
estimated loss level of a future scenario period (2071-2100) is compared to that of a control period (1961-1990). In spite of
substantial estimation uncertainty, our analyses identify an incontestable increase in the claims level along with some regional
variability. Of the uncertainties inherently involved in such predictions, only the error due to model fit is quantifiable.

Optimal asset allocation for passive investing with capital loss harvesting. Ostrov, Daniel N; Wong, Thomas G [RKN: 45461]
Applied Mathematical Finance (2011) 18 (3-4) : 291-329.
This article examines how to quantify and optimally utilize the beneficial effect that capital loss harvesting generates in a taxable
portfolio. We explicitly determine the optimal initial asset allocation for an investor who follows the continuous time dynamic trading
strategy of Constantinides (1983). This strategy sells and re-buys all stocks with losses, but is otherwise passive. Our model
allows the use of the stock position's full purchase history for computing the cost basis. The method can also be used to rebalance
at later times. For portfolios with one stock position and cash, the probability density function for the portfolio's state corresponds
to the solution of a 3-space + 1-time dimensional partial differential equation (PDE) with an oblique reflecting boundary condition.
Extensions of this PDE, including to the case of multiple correlated stocks, are also presented.

We detail a numerical algorithm for the PDE in the single stock case. The algorithm shows the significant effect capital loss
harvesting can have on the optimal stock allocation, and it also allows us to compute the expected additional return that capital
loss harvesting generates. Our PDE-based algorithm, compared with Monte Carlo methods, is shown to generate much more
precise results in a fraction of the time. Finally, we employ Monte Carlo methods to approximate the impact of many of our model's
assumptions.

Available via Athens: Taylor & Francis Online
http://www.openathens.net

Pricing compound Poisson processes with the Farlie–Gumbel–Morgenstern dependence structure. Marri, Fouad; Furman,
Edward [RKN: 45732]
Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 151-157.
Convenient expressions for the Esscher pricing functional in the context of the compound Poisson processes with dependent loss
amounts and loss inter-arrival times are developed. To this end, the moment generating function of the aforementioned dependent
processes is derived and studied. Various implications of the dependence are discussed and exemplified numerically.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
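The Farlie–Gumbel–Morgenstern copula that drives the dependence has the standard form (quoted for reference, not from the paper):
\[
C_{\theta}(u,v) \;=\; u\,v\,\big[1 + \theta\,(1-u)(1-v)\big], \qquad \theta \in [-1,1],
\]
which accommodates only moderate dependence (Spearman's rho lies in \([-1/3,\,1/3]\)) but keeps moment generating functions, and hence Esscher-type pricing functionals, analytically tractable.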

LOSS FUNCTIONS

Conditional tail expectation and premium calculation. Heras, Antonio; Balbás, Beatriz; Vilar, José Luis - 18 pages. [RKN: 70753]
Shelved at: Per: Astin Bull (Oxf) Shelved at: JOU
ASTIN Bulletin (2012) 42 (1) : 325-342.
In this paper we calculate premiums which are based on the minimization of the Expected Tail Loss or Conditional Tail Expectation
(CTE) of absolute loss functions. The methodology generalizes well known premium calculation procedures and gives sensible
results in practical applications. The choice of the absolute loss becomes advisable in this context since its CTE is easy to
calculate and to understand in intuitive terms. The methodology also can be applied to the calculation of the VaR and CTE of the
loss associated with a given premium.
http://www.actuaries.org/index.cfm?lang=EN&DSP=PUBLICATIONS&ACT=ASTIN BULLETIN
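For a continuous loss \(X\), the risk measures used above are conventionally defined as (standard definitions, not quoted from the paper)
\[
\mathrm{VaR}_{\alpha}(X) = \inf\{x : P(X \le x) \ge \alpha\}, \qquad
\mathrm{CTE}_{\alpha}(X) = E\big[\,X \mid X > \mathrm{VaR}_{\alpha}(X)\,\big],
\]
and the premium principle of the paper selects the premium that minimises the CTE of an absolute loss function, such as \(|X - P|\).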

A multivariate aggregate loss model. Ren, Jiandong [RKN: 44799]
Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51(2) : 402-408.
In this paper, we introduce a multivariate aggregate loss model, where multiple categories of losses are considered. The model
assumes that different types of claims arrive according to a Marked Markovian arrival process (MMAP) introduced by He and
Neuts (1998) [Q M He, M F Neuts (1998), Markov chains with marked transitions, Stochastic Processes and their Applications, 74: 37–52] in the queuing literature. This approach enables us to allow dependencies among the claim frequencies, and among the
claim sizes, as well as between claim frequencies and claim sizes. This model extends the (univariate) Markov modulated risk
processes (sometimes referred to as regime switching models) widely used in insurance and financial analysis. For the proposed
model, we provide formulas for calculating the joint moments of the present value of aggregate claims occurring in any time
interval (0,t]. Numerical examples are provided to show possible applications of the model.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

LOSSES

Claims development result in the paid-incurred chain reserving method. Happ, Sebastian; Merz, Michael; Wüthrich, Mario V [RKN: 45724]
Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 66-72.
We present the one-year claims development result (CDR) in the paid-incurred chain (PIC) reserving model. The PIC reserving
model presented in Merz and Wüthrich (2010) is a Bayesian stochastic claims reserving model that considers simultaneously
claims payments and incurred losses information and allows for deriving the full predictive distribution of the outstanding loss
liabilities. In this model we study the conditional mean square error of prediction (MSEP) for the one-year CDR uncertainty, which
is the crucial uncertainty view under Solvency II.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Stochastic comparisons for allocations of policy limits and deductibles with applications. Lu, ZhiYi; Meng, LiLi [RKN: 45127]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (3) : 338-343.
In this paper, we study the problem of comparing losses of a policyholder who has an increasing utility function when the form of
coverage is policy limit and deductible. The total retained losses of a policyholder [formula] are ordered in the usual stochastic
order sense when Xi (i=1,…,n) are ordered with respect to the likelihood ratio order. The parallel results for the case of deductibles are obtained in the same way. It is shown that the ordering of the losses is related to the characteristics (log-concavity or
log-convexity) of distributions of the risks. As an application of the comparison results, the optimal problems of allocations of policy
limits and deductibles are studied in usual stochastic order sense and the closed-form optimal solutions are obtained in some
special cases.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Stochastic projection for large individual losses. Drieskens, Damien; Henry, Marc; Walhin, Jean-François; Wielandts, Jürgen [RKN:
44929]
Shelved at: Per: SAJ Shelved at: SCA/ACT
Scandinavian Actuarial Journal (2012) 1 : 1-39.
In this paper we investigate how to estimate ultimate values of large losses. The method is based on the development of individual losses and therefore allows the netting impact of excess of loss reinsurance to be computed. In particular, the index clause is properly
accounted for. A numerical example based on real-life data is provided.

MARINE INSURANCE

On maximum likelihood and pseudo-maximum likelihood estimation in compound insurance models with deductibles. Paulsen,
Jostein; Stubo, Knut [RKN: 45298]
Shelved at: Per: Astin Bull (Oxf) Shelved at: JOU/AST
ASTIN Bulletin (2011) 41 (1) : 1-28.
Non-life insurance payouts consist of two factors: claimsizes and claim frequency. When calculating e.g. next year's premium, it is vital to correctly model these factors and to estimate the unknown parameters. A standard way is to estimate the claimsize and the claim frequency models separately.
Often there is a deductible with each single claim, and this deductible can be quite large, particularly in inhomogeneous cases
such as industrial fire insurance or marine insurance. Not taking the deductibles into account can lead to serious bias in the
estimates and consequent implications when applying the model.
When the deductibles are nonidentical, in a full maximum likelihood estimation all unknown parameters have to be estimated
simultaneously. An alternative is to use pseudo-maximum likelihood, i.e. first estimate the claimsize model, taking the deductibles
into account, and then use the estimated probability that a claim exceeds the deductible as an offset in the claim frequency
estimation. This latter method is less efficient, but due to complexity or time considerations, it may be the preferred option.
In this paper we will provide rather general formulas for the relative efficiency of the pseudo maximum likelihood estimators in the
i.i.d. case. Two special cases will be studied in detail, and we conclude the paper by comparing the methods on some marine
insurance data.
online access via International Actuarial Association:
http://www.actuaries.org/index.cfm?lang=EN&DSP=PUBLICATIONS&ACT=ASTIN BULLETIN


MARKOV CHAIN

A Bayesian approach to parameter estimation for kernel density estimation via transformations. Liu, Qing; Pitt, David; Zhang,
Xibin; Wu, Xueyuan Institute and Faculty of Actuaries; Cambridge University Press, - 13 pages. [RKN: 74950]
Shelved at: Per: AAS (Oxf) Per: AAS (Lon)
Annals of Actuarial Science (2011) 5(2) : 181-193.
In this paper, we present a Markov chain Monte Carlo (MCMC) simulation algorithm for estimating parameters in the kernel
density estimation of bivariate insurance claim data via transformations. Our data set consists of two types of auto insurance claim
costs and exhibits a high level of skewness in the marginal empirical distributions. Therefore, the kernel density estimator based
on original data does not perform well. However, the density of the original data can be estimated through estimating the density of
the transformed data using kernels. It is well known that the performance of a kernel density estimator is mainly determined by the
bandwidth, and only in a minor way by the kernel. In the current literature, there have been some developments in the area of
estimating densities based on transformed data, where bandwidth selection usually depends on pre-determined transformation
parameters. Moreover, in the bivariate situation, the transformation parameters were estimated for each dimension individually.
We use a Bayesian sampling algorithm and present a Metropolis-Hastings sampling procedure to sample the bandwidth and
transformation parameters from their posterior density. Our contribution is to estimate the bandwidths and transformation
parameters simultaneously within a Metropolis-Hastings sampling procedure. Moreover, we demonstrate that the correlation
between the two dimensions is better captured through the bivariate density estimator based on transformed data.
http://www.actuaries.org.uk/research-and-resources/pages/access-journals
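As a very small illustration of the transformation idea (plain kernel density estimation on log-transformed data with a rule-of-thumb bandwidth, not the Bayesian MCMC bandwidth and transformation-parameter sampling of the paper; the data below are synthetic):

import numpy as np
from scipy.stats import gaussian_kde

# Synthetic, heavily skewed "claim cost" data standing in for real auto claims.
claims = np.random.default_rng(1).lognormal(mean=7.0, sigma=1.2, size=500)

# Estimate the density of the log-transformed data, which is far less skewed.
y = np.log(claims)
kde_y = gaussian_kde(y)  # bandwidth via Scott's rule, not the paper's MCMC scheme

def density_original_scale(x):
    """Back-transform: f_X(x) = f_Y(log x) / x for the log transformation."""
    x = np.asarray(x, dtype=float)
    return kde_y(np.log(x)) / x

print(density_original_scale([500.0, 1000.0, 5000.0]))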

Modelling claims run-off with reversible jump Markov chain Monte Carlo methods. Verrall, Richard; Hössjer, Ola; Björkwall,
Susanna - 24 pages. [RKN: 70742]
Shelved at: Per: Astin Bull (Oxf) Shelved at: JOU
ASTIN Bulletin (2012) 42 (1) : 35-58.
In this paper we describe a new approach to modelling the development of claims run-off triangles. This method replaces the usual
ad hoc practical process of extrapolating a development pattern to obtain tail factors with an objective procedure. An example is
given, illustrating the results in a practical context, and the WinBUGS code is supplied.
http://www.actuaries.org/index.cfm?lang=EN&DSP=PUBLICATIONS&ACT=ASTIN BULLETIN

Viterbi-based estimation for Markov switching GARCH model. Routledge, [RKN: 45838]
Shelved at: Per. AMF
Applied Mathematical Finance (2012) 19 (3-4) : 219-231.
We outline a two-stage estimation method for a Markov-switching Generalized Autoregressive Conditional Heteroscedastic
(GARCH) model modulated by a hidden Markov chain. The first stage involves the estimation of a hidden Markov chain using the
Vitberi algorithm given the model parameters. The second stage uses the maximum likelihood method to estimate the model
parameters given the estimated hidden Markov chain. Applications to financial risk management are discussed through simulated
data.

Available via Athens: Taylor & Francis Online
http://www.openathens.net
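A schematic of the regime-switching GARCH dynamics being estimated (illustrative; the paper's exact parametrisation may differ): with a hidden Markov chain \(s_t\),
\[
r_t = \sigma_t\,\varepsilon_t, \qquad
\sigma_t^{2} = \omega_{s_t} + \alpha_{s_t}\,r_{t-1}^{2} + \beta_{s_t}\,\sigma_{t-1}^{2},
\]
stage one recovers the most likely regime path \(\hat{s}_{1:T}\) with the Viterbi algorithm for given parameters, and stage two maximises the likelihood of the GARCH parameters conditional on that path.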

MARKOV PROCESSES

Markov decision processes with applications to finance. Bäuerle, Nicole; Rieder, Ulrich (2011). - London: Springer, 2011. - 388
pages. [RKN: 73684]
Shelved at: 368.01
The theory of Markov decision processes focuses on controlled Markov chains in discrete time. The authors establish the theory
for general state and action spaces and at the same time show its application by means of numerous examples, mostly taken from
the fields of finance and operations research. By using a structural approach many technicalities (concerning measure theory) are
avoided. They cover problems with finite and infinite horizons, as well as partially observable Markov decision processes,
piecewise deterministic Markov decision processes and stopping problems. The book presents Markov decision processes in
action and includes various state-of-the-art applications with a particular view towards finance. It is useful for upper-level
undergraduates, Master's students and researchers in both applied probability and finance, and provides exercises (without
solutions).

Markowitz's mean-variance asset-liability management with regime switching: a multi-period model. Chen, Peng; Yang, Hailiang
[RKN: 45252]
Applied Mathematical Finance (2011) 18 (1-2) : 29-50.
This paper considers an optimal portfolio selection problem under Markowitz's mean-variance portfolio selection problem in a
multi-period regime-switching model. We assume that there are n + 1 securities in the market. Given an economic state which is
modelled by a finite state Markov chain, the return of each security at a fixed time point is a random variable. The return random
variables may be different if the economic state is changed even for the same security at the same time point. We start our
analysis from the no-liability case, in the spirit of Li and Ng (2000), both the optimal investment strategy and the efficient frontier
are derived. Then we add uncontrollable liability into the model. By direct comparison with the no-liability case, the optimal strategy
can be derived explicitly.

Available via Athens: Taylor & Francis Online
http://www.openathens.net




Multi-period mean-variance portfolio selection with regime switching and a stochastic cash flow. Wu, Huiling; Li, Zhongfei [RKN:
45640]
Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 50 (3) : 371-384.
This paper investigates a non-self-financing portfolio optimization problem under the framework of multi-period mean–variance
with Markov regime switching and a stochastic cash flow. The stochastic cash flow can be explained as capital additions or
withdrawals during the investment process. Specifically, the cash flow is the surplus process or the risk process of an insurer at each
period. The returns of assets and amount of the cash flow all depend on the states of a stochastic market which are assumed to
follow a discrete-time Markov chain. We analyze the existence of optimal solutions, and derive the optimal strategy and the
efficient frontier in closed-form. Several special cases are discussed and numerical examples are given to demonstrate the effect
of cash flow.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

A multivariate aggregate loss model. Ren, Jiandong [RKN: 44799]
Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51(2) : 402-408.
In this paper, we introduce a multivariate aggregate loss model, where multiple categories of losses are considered. The model
assumes that different types of claims arrive according to a Marked Markovian arrival process (MMAP) introduced by He and
Neuts (1998) [Q M He, M F Neuts (1998), Markov chains with marked transitions, Stochastic Processes and their Applications, 74:
37–52] in the queuing literature. This approach enables us to allow dependencies among the claim frequencies, and among the
claim sizes, as well as between claim frequencies and claim sizes. This model extends the (univariate) Markov modulated risk
processes (sometimes referred to as regime switching models) widely used in insurance and financial analysis. For the proposed
model, we provide formulas for calculating the joint moments of the present value of aggregate claims occurring in any time
interval (0,t]. Numerical examples are provided to show possible applications of the model.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

On modelling and pricing rainfall derivatives with seasonality. Leobacher, Gunther; Ngare, Philip [RKN: 45254]
Applied Mathematical Finance (2011) 18 (1-2) : 71-91.
We are interested in pricing rainfall options written on precipitation at specific locations. We assume the existence of a tradeable
financial instrument in the market whose price process is affected by the quantity of rainfall. We then construct a suitable
'Markovian gamma' model for the rainfall process which accounts for the seasonal change of precipitation and show how
maximum likelihood estimators can be obtained for its parameters.
Available via Athens: Taylor & Francis Online
http://www.openathens.net

One-dimensional pricing of CPPI. Paulot, Louis; Lacroze, Xavier [RKN: 45457]
Applied Mathematical Finance (2011) 18 (3-4) : 207-225.
Constant Proportion Portfolio Insurance (CPPI) is an investment strategy designed to give participation in the performance of a
risky asset while protecting the invested capital. This protection is, however, not perfect and the gap risk must be quantified. CPPI
strategies are path dependent and may have American exercise which makes their valuation complex. A naive description of the
state of the portfolio would involve three or even four variables. In this article we prove that the system can be described as a
discrete-time Markov process in one single variable if the underlying asset follows a process with independent increments. This
yields an efficient pricing scheme using transition probabilities. Our framework is flexible enough to handle most features of traded
CPPIs including profit lock-in and other kinds of strategies with discrete-time reallocation.
Available via Athens: Taylor & Francis Online
http://www.openathens.net
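The CPPI mechanics behind the dimension reduction can be sketched as follows (generic discrete-time description, not the authors' notation): with floor \(F_t\), multiplier \(m\), portfolio value \(V_t\) and risky-asset price \(S_t\),
\[
C_t = V_t - F_t, \qquad E_t = m\,C_t, \qquad
V_{t+1} = V_t + E_t\Big(\tfrac{S_{t+1}}{S_t} - 1\Big) + \big(V_t - E_t\big)\,r_t,
\]
and when the risky asset has independent increments a single rescaled state (essentially the cushion relative to the floor) is enough to describe the portfolio, which is the one-dimensional Markov representation the article exploits.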

Option valuation with a discrete-time double Markovian regime-switching model. Siu, Tak Kuen; Fung, Eric S; Ng, Michael K
Routledge, [RKN: 45524]
Applied Mathematical Finance (2011) 18 (5-6) : 473-490.
This article develops an option valuation model in the context of a discrete-time double Markovian regime-switching (DMRS)
model with innovations having a generic distribution. The DMRS model is more flexible than the traditional Markovian
regime-switching model in the sense that the drift and the volatility of the price dynamics of the underlying risky asset are
modulated by two observable, discrete-time and finite-state Markov chains, so that they are not perfectly correlated. The states of
each of the chains represent states of proxies of (macro)economic factors. Here we consider the situation that one
(macro)economic factor is caused by the other (macro)economic factor. The market model is incomplete, and so there is more
than one equivalent martingale measure. We employ a discrete-time version of the regime-switching Esscher transform to
determine an equivalent martingale measure for valuation. Different parametric distributions for the innovations of the price
dynamics of the underlying risky asset are considered. Simulation experiments are conducted to illustrate the implementation of
the model and to document the impacts of the macroeconomic factors described by the chains on the option prices under various
different parametric models for the innovations.
Available via Athens: Taylor & Francis Online
http://www.openathens.net

Viterbi-based estimation for Markov switching GARCH model. Routledge, [RKN: 45838]
Shelved at: Per. AMF
Applied Mathematical Finance (2012) 19 (3-4) : 219-231.
We outline a two-stage estimation method for a Markov-switching Generalized Autoregressive Conditional Heteroscedastic
(GARCH) model modulated by a hidden Markov chain. The first stage involves the estimation of a hidden Markov chain using the
Viterbi algorithm given the model parameters. The second stage uses the maximum likelihood method to estimate the model
parameters given the estimated hidden Markov chain. Applications to financial risk management are discussed through simulated
data.
Available via Athens: Taylor & Francis Online
http://www.openathens.net


MARTINGALE METHODS

No-good-deal, local mean-variance and ambiguity risk pricing and hedging for an insurance payment process. Delong, Lukasz -
30 pages. [RKN: 70749]
Shelved at: Per: Astin Bull (Oxf) Shelved at: JOU
ASTIN Bulletin (2012) 42 (1) : 203-232.
We study pricing and hedging for an insurance payment process. We investigate a Black-Scholes financial model with stochastic
coefficients and a payment process with death, survival and annuity claims driven by a point process with a stochastic intensity.
The dependence of the claims and the intensity on the financial market and on an additional background noise (correlated index,
longevity risk) is allowed. We establish a general modeling framework for no-good-deal, local mean-variance and ambiguity risk
pricing and hedging. We show that these three valuation approaches are equivalent under appropriate formulations. We
characterize the price and the hedging strategy as a solution to a backward stochastic differential equation. The results could be
applied to pricing and hedging of variable annuities, surrender options under an irrational lapse behavior and mortality derivatives.
http://www.actuaries.org/index.cfm?lang=EN&DSP=PUBLICATIONS&ACT=ASTIN BULLETIN
MATHEMATICAL MODELS

Calculating catastrophe. Woo, Gordon (2011). - London: Imperial College Press, 2011. - 355 pages. [RKN: 73989]
Shelved at: 363.34
Calculating Catastrophe has been written to explain, to a general readership, the underlying philosophical ideas and scientific
principles that govern catastrophic events, both natural and man-made. Knowledge of the broad range of catastrophes deepens
understanding of individual modes of disaster. This book will be of interest to anyone aspiring to understand catastrophes better,
but will be of particular value to those engaged in public and corporate policy, and the financial markets.

The distributions of some quantities for Erlang(2) risk models. Dickson, David C M; Li, Shuanming (2012). - Victoria: University of
Melbourne, 2012. - 18 pages. [RKN: 73947]
We study the distributions of [1] the first time that the surplus reaches a given level and [2] the duration of negative surplus in a
Sparre Andersen risk process with the inter-claim times being Erlang(2) distributed. These distributions can be obtained through
the inversion of Laplace transforms using the inversion relationship for the Erlang(2) risk model given by Dickson and Li (2010).
No.1 (1993) onwards available online. Download as PDF.
http://www.economics.unimelb.edu.au/ACT/papers.shtml

Gram-Charlier processes and equity-indexed annuities. Chateau, Jean-Pierre; Dufresne, Daniel (2012). - Victoria: University of
Melbourne, 2012. - 32 pages. [RKN: 73948]
A Gram-Charlier distribution has a density that is a polynomial times a normal density. The historical connection between actuarial
science and the Gram-Charlier expansions goes back to the 19th century. A critical review of the financial literature on the
Gram-Charlier distribution is made. Properties of the Gram-Charlier distributions are derived, including moments, tail estimates,
moment indeterminacy of the exponential of a Gram-Charlier distributed variable, non-existence of a continuous-time Lévy process
with Gram-Charlier increments, as well as formulas for option prices and their sensitivities. A procedure for simulating
Gram-Charlier distributions is given. Multiperiod Gram-Charlier modelling of asset returns is described, apparently for the first
time. Formulas for equity-indexed annuity premium option values are given, and a numerical illustration shows the importance of
skewness and kurtosis of the risk neutral density.
No.1 (1993) onwards available online. Download as PDF.
http://www.economics.unimelb.edu.au/ACT/papers.shtml
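
A minimal sketch of a four-term Gram-Charlier (Type A) density, i.e. a polynomial times the standard normal density, with skewness
and excess kurtosis as the only adjustments. The admissible parameter region that keeps the density non-negative is discussed in the
paper and is not enforced here; the numbers below are illustrative.

    import numpy as np

    def gram_charlier_density(x, skew=0.0, ex_kurt=0.0):
        """Standard normal density multiplied by a Hermite-polynomial correction."""
        x = np.asarray(x, dtype=float)
        phi = np.exp(-0.5 * x**2) / np.sqrt(2.0 * np.pi)
        he3 = x**3 - 3.0 * x                 # probabilists' Hermite polynomials
        he4 = x**4 - 6.0 * x**2 + 3.0
        return phi * (1.0 + skew / 6.0 * he3 + ex_kurt / 24.0 * he4)

    print(gram_charlier_density(np.linspace(-3, 3, 7), skew=0.3, ex_kurt=0.5))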

Modeling dependent yearly claim totals including zero claims in private health insurance. Erhardt, Vinzenz; Czado, Claudia [RKN:
45781]
Shelved at: Per: SAJ Shelved at: SCA/ACT
Scandinavian Actuarial Journal (2012) 2 : 106-129.
In insurance applications yearly claim totals of different coverage fields are often dependent. In many cases there are numerous
claim totals which are zero. A marginal claim distribution will have an additional point mass at zero, hence this probability function
(pf) will not be continuous at zero and the cumulative distribution functions will not be uniform. Therefore using a copula approach
to model dependency is not straightforward. We will illustrate how to express the joint pf by copulas with discrete and continuous
margins. A pair-copula construction will be used for the fit of the continuous copula allowing to choose appropriate copulas for
each pair of margins.
http://www.openathens.net/

Optimal loss-carry-forward taxation for the Lévy risk model. Wang, Wen-Yuan; Hu, Yijun [RKN: 44997]
Insurance: Mathematics & Economics (2012) 50 (1) : 121-130.
In the spirit of Albrecher and Hipp (2007), Albrecher et al. (2008b) and Kyprianou and Zhou (2009), we consider the reserve
process of an insurance company which is governed by [formula unable to display], where X is a spectrally negative Lévy process
with the usual exclusion of negative subordinator or deterministic drift, [formula unable to display] the running supremum of X, and
[formula unable to display] the rate of loss-carry-forward tax at time t which is subject to the taxation allocation policy p and is a
function of [formula unable to display]. The objective is to find the optimal policy which maximizes the total (discounted) taxation
pay-out: [formula unable to display], where [formula unable to display] and [formula unable to display] refer to the expectation
corresponding to the law of X such that [formula unable to display], and the time of ruin, respectively. With the scale function of X
denoted by [formula unable to display] and [formula unable to display] allowed to vary in , two situations are considered.

Available via Athens: Palgrave MacMillan: http://www.openathens.net


Pricing of Parisian options for a jump-diffusion model with two-sided jumps. Albrecher, Hansjörg; Kortschak, Dominik; Zhou,
Xiaowen Routledge, [RKN: 45796]
Shelved at: Per. AMF
Applied Mathematical Finance (2012) 19 (1-2) : 97-129.
Using the solution of the one-sided exit problem, a procedure to price Parisian barrier options in a jump-diffusion model with two-sided
exponential jumps is developed. By extending the method developed in Chesney, Jeanblanc-Picqué and Yor (1997; Brownian
excursions and Parisian barrier options, Advances in Applied Probability, 29(1), pp. 165-184) for the diffusion case to the more
general set-up, we arrive at a numerical pricing algorithm that significantly outperforms Monte Carlo simulation for the prices of
such products.
Available via Athens: Taylor & Francis Online
http://www.openathens.net

Reserving long-tail liabilities : United we stand, divided we fall. Odell, David Staple Inn Actuarial Society, - 3 pages. [RKN: 74945]
Shelved at: Per: Actuary (Oxf) Per: Actuary (Lon) Shelved at: JOU
The Actuary (2012) March : 32-34.
David Odell puts forward a method to span the rift between uncertainty and best estimates
http://www.theactuary.com/

A two-dimensional extension of Bougerol's identity in law for the exponential functional of Brownian motion. Dufresne, D; Yor, M
(2011). - Victoria: University of Melbourne, 2011. - 15 pages. [RKN: 74772]
No.1 (1993) onwards available online. Download as PDF.
http://www.economics.unimelb.edu.au/ACT/papers.shtml

Unconditional distributions obtained from conditional specification models with applications in risk theory. Gomez-Deniz, E;
Calderin-Ojeda, E (2012). - Victoria: University of Melbourne, 2012. - 21 pages. [RKN: 73801]
Bivariate distributions, specified in terms of their conditional distributions, provide a powerful tool to obtain flexible distributions.
These distributions play an important role in specifying the conjugate prior in certain multiparameter Bayesian settings. In this
paper, the conditional specification technique is applied to look for more flexible distributions than the traditional ones used in the
actuarial literature, such as the Poisson, negative binomial and others. The new specification draws inferences about parameters of
interest in problems appearing in actuarial statistics. Two unconditional (discrete) distributions obtained are studied and used in
the collective risk model to compute the right-tail probability of the aggregate claim size distribution. Comparisons with the
compound Poisson and compound negative binomial are made.
No.1 (1993) onwards available online. Download as PDF.
http://www.economics.unimelb.edu.au/ACT/papers.shtml

MATHEMATICS

Alarm system for insurance companies : A strategy for capital allocation. Das, S; Kratz, M [RKN: 45723]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 53-65.
One possible way of risk management for an insurance company is to develop an early and appropriate alarm system before the
possible ruin. The ruin is defined through the status of the aggregate risk process, which in turn is determined by premium
accumulation as well as claim settlement outgo for the insurance company. The main purpose of this work is to design an effective
alarm system, i.e. to define alarm times and to recommend augmentation of capital of suitable magnitude at those points to reduce
the chance of ruin. To draw a fair measure of effectiveness of alarm system, comparison is drawn between an alarm system, with
capital being added at the sound of every alarm, and the corresponding system without any alarm, but an equivalently higher initial
capital. Analytical results are obtained in general setup and this is backed up by simulated performances with various types of loss
severity distributions. This provides a strategy for suitably spreading out the capital and yet addressing survivability concerns at
factory level.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Ambiguity aversion, higher-order risk attitude and optimal effort. Huang, Rachel J [RKN: 45637]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 50 (3) : 338-345.
In this paper, we examine whether a more ambiguity-averse individual will invest in more effort to shift her initial starting wealth
distribution toward a better target distribution. We assume that the individual has ambiguous beliefs regarding two target (starting)
distributions and that one distribution is preferred to the other. We find that an increase in ambiguity aversion will decrease
(increase) the optimal effort when the cost of effort is non-monetary. When the cost of effort is monetary, the effect depends on
whether the individual would make more effort when the target (starting) distribution is the preferred distribution than when it is
the inferior one. We further characterize the individual's higher-order risk preferences to examine the
sufficient conditions.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Are quantile risk measures suitable for risk-transfer decisions?. Guerra, Manuel; Centeno, M L [RKN: 45648]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 50 (3) : 446-461.
Although controversial from the theoretical point of view, quantile risk measures are widely used by institutions and regulators. In
this paper, we use a unified approach to find the optimal treaties for an agent who seeks to minimize one of these measures,
assuming premium calculation principles of various types. We show that the use of measures like Value at Risk or Conditional Tail
Expectation as optimization criteria for insurance or reinsurance leads to treaties that are not enforceable and/or are clearly bad
for the cedent. We argue that this is one further argument against the use of quantile risk measures, at least for the purpose of
risk-transfer decisions.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Characterization of left-monotone risk aversion in the RDEU model. Mao, Tiantian; Hu, Taizhong [RKN: 45644]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 50 (3) : 413-422.
We extend the characterization of the left-monotone risk aversion developed by Ryan (2006) to the case of unbounded random
variables. The notion of weak convergence is insufficient for such an extension. It requires the solution of a host of delicate
convergence problems. To this end, some further intrinsic properties of the location independent risk order are investigated. The
characterization of the right-monotone risk aversion for unbounded random variables is also mentioned. Moreover, we remove the
gap in the proof of the main result in Ryan (2006).
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Claims development result in the paid-incurred chain reserving method. Happ, Sebastian; Merz, Michael; Wüthrich, Mario V [RKN:
45724]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 66-72.
We present the one-year claims development result (CDR) in the paid-incurred chain (PIC) reserving model. The PIC reserving
model presented in Merz and Wüthrich (2010) is a Bayesian stochastic claims reserving model that considers simultaneously
claims payments and incurred losses information and allows for deriving the full predictive distribution of the outstanding loss
liabilities. In this model we study the conditional mean square error of prediction (MSEP) for the one-year CDR uncertainty, which
is the crucial uncertainty view under Solvency II.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Comparison and bounds for functionals of future lifetimes consistent with life tables. Barz, Christiane; Muller, Alfred [RKN: 45596]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 50 (2) : 229-235.
We derive a new crossing criterion of hazard rates to identify a stochastic order relation between two random variables. We apply
this crossing criterion in the context of life tables to derive stochastic ordering results among three families of fractional age
assumptions: the family of linear force of mortality functions, the family of quadratic survival functions and the power family.
Further, this criterion is used to derive tight bounds for functionals of future lifetimes that exhibit an increasing force of mortality
with given one-year survival probabilities. Numerical examples illustrate our findings.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Comparison of increasing directionally convex transformations of random vectors with a common copula. Belzunce, Felix;
Suarez-Llorens, Alfonso; Sordo, Miguel A [RKN: 45641]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 50 (3) : 385-390.
Let X and Y be two random vectors in Rn sharing the same dependence structure, that is, with a common copula. As many authors
have pointed out, results of the following form are of interest: under which conditions, the stochastic comparison of the marginals
of X and Y is a sufficient condition for the comparison of the expected values for some transformations of these random vectors?
Assuming that the components are ordered in the univariate dispersive order (which can be interpreted as a multivariate
dispersion ordering between the vectors), the main purpose of this work is to show that a weak positive dependence property, such
as the positive association property, is enough for the comparison of the variance of any increasing directionally convex
transformation of the vectors. Some applications in premium principles, optimization and multivariate distortions are described.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Copula based hierarchical risk aggregation through sample reordering. Arbenz, Philipp; Hummel, Christoph; Mainik, Georg [RKN:
45729]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 122-133.
For high-dimensional risk aggregation purposes, most popular copula classes are too restrictive in terms of attainable
dependence structures. These limitations aggravate with increasing dimension. We study a hierarchical risk aggregation method
which is flexible in high dimensions. With this method it suffices to specify a low dimensional copula for each aggregation step in
the hierarchy. Copulas and margins of arbitrary kind can be combined. We give an algorithm for numerical approximation which
introduces dependence between originally independent marginal samples through reordering.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
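
A minimal sketch of the reordering idea for a single aggregation step: independently drawn marginal samples are reordered so that
their ranks match those of a sample from the chosen copula, and the reordered columns are then summed. The lognormal margins and
the Gaussian-copula driver with correlation 0.7 are illustrative assumptions, not taken from the paper.

    import numpy as np

    def reorder_to_copula(margin_samples, copula_sample):
        """Impose the rank dependence of copula_sample on independent marginal samples
        by reordering each column; both arrays are n x d."""
        out = np.empty_like(margin_samples)
        for j in range(margin_samples.shape[1]):
            sorted_margin = np.sort(margin_samples[:, j])
            ranks = copula_sample[:, j].argsort().argsort()   # 0 = smallest
            out[:, j] = sorted_margin[ranks]
        return out

    rng = np.random.default_rng(1)
    margins = rng.lognormal(mean=0.0, sigma=1.0, size=(10_000, 2))
    driver = rng.multivariate_normal([0.0, 0.0], [[1.0, 0.7], [0.7, 1.0]], size=10_000)
    aggregate = reorder_to_copula(margins, driver).sum(axis=1)
    print(np.quantile(aggregate, 0.995))   # e.g. a 99.5% aggregate quantile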

Delta-Gamma hedging of mortality and interest rate risk. Luciano, Elisa; Regis, Luca; Vigna, Elena [RKN: 45643]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 50 (3) : 402-412.
One of the major concerns of life insurers and pension funds is the increasing longevity of their beneficiaries. This paper studies
the hedging problem of annuity cash flows when mortality and interest rates are stochastic. We first propose a Delta-Gamma
hedging technique for mortality risk. The risk factor against which to hedge is the difference between the actual mortality intensity
in the future and its forecast today, the forward intensity. We specialize the hedging technique first to the case in which mortality
intensities are affine, then to Ornstein-Uhlenbeck and Feller processes, providing actuarial justifications for this selection. We
show that, without imposing no arbitrage, we can get equivalent probability measures under which the HJM condition for no
arbitrage is satisfied. Last, we extend our results to the presence of both interest rate and mortality risk. We provide a UK
calibrated example of Delta-Gamma hedging of both mortality and interest rate risk.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
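
Generic Delta-Gamma hedging amounts to matching the first- and second-order sensitivities of a liability with two hedging
instruments by solving a 2x2 linear system. The sketch below shows that generic calculation with made-up sensitivities; it is not the
paper's mortality-specific construction.

    import numpy as np

    def delta_gamma_hedge(delta_liab, gamma_liab, deltas, gammas):
        """Numbers of two hedging instruments matching the liability's delta and gamma."""
        A = np.array([deltas, gammas], dtype=float)   # 2 x 2 sensitivity matrix
        b = np.array([delta_liab, gamma_liab], dtype=float)
        return np.linalg.solve(A, b)

    # illustrative liability sensitivities and instrument sensitivities
    print(delta_gamma_hedge(-1200.0, -35.0, deltas=(80.0, 15.0), gammas=(1.2, 2.5)))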

Dependence modeling in non-life insurance using the Bernstein copula. Diers, Dorothea; Eling, Martin; Marek, Sebastian D [RKN:
45646]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 50 (3) : 430-436.
This paper illustrates the modeling of dependence structures of non-life insurance risks using the Bernstein copula. We conduct a
goodness-of-fit analysis and compare the Bernstein copula with other widely used copulas. Then, we illustrate the use of the
Bernstein copula in a value-at-risk and tail-value-at-risk simulation study. For both analyses we utilize German claims data on
storm, flood, and water damage insurance for calibration. Our results highlight the advantages of the Bernstein copula, including
its flexibility in mapping inhomogeneous dependence structures and its easy use in a simulation context due to its representation
as mixture of independent Beta densities. Practitioners and regulators working toward appropriate modeling of dependences in a
risk management and solvency context can benefit from our results.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
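
The simulation convenience mentioned in the abstract (a mixture of independent Beta densities) can be sketched as follows: pick a
grid cell with probability given by a weight matrix and then draw the two coordinates from the corresponding Beta distributions. The
weight matrix here is a made-up, independence-like example; in practice it would be derived from data as described in the paper.

    import numpy as np

    def sample_bernstein_copula(w, n, rng):
        """Sample n points from a bivariate Bernstein copula density written as a
        mixture of products of Beta densities; w is m x m, non-negative, summing to 1."""
        m = w.shape[0]
        cells = rng.choice(m * m, size=n, p=w.ravel() / w.sum())
        i, j = np.divmod(cells, m)
        u = rng.beta(i + 1, m - i)
        v = rng.beta(j + 1, m - j)
        return np.column_stack([u, v])

    rng = np.random.default_rng(0)
    w = np.full((4, 4), 1.0 / 16.0)    # illustrative weights (independence-like)
    print(sample_bernstein_copula(w, 5, rng))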

Determination of the probability distribution measures from market option prices using the method of maximum entropy in the
mean. Gzyl, Henryk; Mayoral, Silvia Routledge, [RKN: 45841]
Shelved at: Per. AMF
Applied Mathematical Finance (2012) 19 (3-4) : 299-312.
We consider the problem of recovering the risk-neutral probability distribution of the price of an asset, when the information
available consists of the market price of derivatives of European type having the asset as underlying. The information available
may or may not include the spot value of the asset as data. When we only know the true empirical law of the underlying, our
method will provide a measure that is absolutely continuous with respect to the empirical law, thus making our procedure model
independent. If we assume that the prices of the derivatives include risk premia and/or transaction prices, using this method it is
possible to estimate those values, as well as the no-arbitrage prices. This is of interest not only when the market is not complete,
but also if for some reason we do not have information about the model for the price of the underlying.

Available via Athens: Taylor & Francis Online
http://www.openathens.net

Discussion of Paper Already Published : A Bayesian Log-Normal Model for Multivariate Loss Reserving. Wüthrich, Mario V
Society of Actuaries, - 4 pages. [RKN: 70668]
Shelved at: Per: NAAJ (Oxf) Per NAAJ (Lon) Shelved at: JOU
North American Actuarial Journal (2012) 16 (3) : 398-401.
Discussion of the previously published paper "A Bayesian Log-Normal Model for Multivariate Loss Reserving" by Peng Shi, Sanjib
Basu and Glenn G. Meyers.
http://www.soa.org/news-and-publications/publications/journals/naaj/naaj-detail.aspx

Dividends and reinsurance under a penalty for ruin. Liang, Zhibin; Young, Virginia R [RKN: 45647]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 50 (3) : 437-445.
We find the optimal dividend strategy in a diffusion risk model under a penalty for ruin, as in Thonhauser and Albrecher (2007),
although we allow for both a positive and a negative penalty. Furthermore, we determine the optimal proportional reinsurance
strategy, when so-called expensive reinsurance is available; that is, the premium loading on reinsurance is greater than the
loading on the directly written insurance. One can think of our model as taking the one in Taksar (2000, Section 6) and adding a
penalty for ruin. We use the Legendre transform to obtain the optimal dividend and reinsurance strategies. Not surprisingly, the
optimal dividend strategy is a barrier strategy. Also, we investigate the effect of the penalty P on the optimal strategies. In
particular, we show that the optimal barrier increases with respect to P, while the optimal proportion retained and the value
function decrease with respect to P. In the end, we explore the time of ruin, and find that the expected time of ruin increases with
respect to P under a net profit condition.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Dynamic hedging of conditional value-at-risk. Melnikov, Alexander; Smirnov, Ivan [RKN: 45735]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 182-190.
In this paper, the problem of partial hedging is studied by constructing hedging strategies that minimize conditional value-at-risk
(CVaR) of the portfolio. Two dual versions of the problem are considered: minimization of CVaR with the initial wealth bounded
from above, and minimization of hedging costs subject to a CVaR constraint. The Neyman-Pearson lemma approach is used to
deduce semi-explicit solutions. Our results are illustrated by constructing CVaR-efficient hedging strategies for a call option in the
Black-Scholes model and also for an embedded call option in an equity-linked life insurance contract.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
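
For orientation, a plain empirical CVaR estimator (mean of the losses at or above the empirical VaR) is sketched below; the paper's
contribution, the Neyman-Pearson-type construction of CVaR-minimizing hedging strategies, is not reproduced here, and the Student-t
loss sample is an illustrative assumption.

    import numpy as np

    def empirical_cvar(losses, alpha=0.99):
        """Empirical conditional value-at-risk at level alpha."""
        losses = np.asarray(losses, dtype=float)
        var = np.quantile(losses, alpha)          # empirical VaR
        return losses[losses >= var].mean()       # average loss beyond the VaR

    rng = np.random.default_rng(3)
    print(empirical_cvar(rng.standard_t(df=4, size=100_000), alpha=0.99))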

Dynamic portfolio optimization in discrete-time with transaction costs. Atkinson, Colin; Quek, Gary Routledge, [RKN: 45840]
Shelved at: Per. AMF
Applied Mathematical Finance (2012) 19 (3-4) : 265-298.
A discrete-time model of portfolio optimization is studied under the effects of proportional transaction costs. A general class of
underlying probability distributions is assumed for the returns of the asset prices. An investor with an exponential utility function
seeks to maximize the utility of terminal wealth by determining the optimal investment strategy at the start of each time step.
Dynamic programming is used to derive an algorithm for computing the optimal value function and optimal boundaries of the
no-transaction region at each time step. In the limit of small transaction costs, perturbation analysis is applied to obtain the optimal
value function and optimal boundaries at any time step in the rebalancing of the portfolio.

Available via Athens: Taylor & Francis Online
http://www.openathens.net

Fundamentals of actuarial mathematics. Promislow, S David (2011). - 2nd ed. - Chichester: John Wiley & Sons Ltd, 2011. - 449
pages. [RKN: 45062]
Shelved at: EM/VA Shelved at: 368.01 PRO
This book provides a comprehensive introduction to actuarial mathematics, covering both deterministic and stochastic models of
life contingencies, as well as more advanced topics such as risk theory, credibility theory and multistate models. This new edition
includes additional material on credibility theory, continuous time multistate models, more complex types of contingent
insurances, flexible contracts such as universal life, the risk measures VaR and TVaR. Key Features: Covers much of the syllabus
material on the modeling examinations of the Society of Actuaries, Canadian Institute of Actuaries and the Casualty Actuarial
Society. (SOA/CIA exams MLC and C, CAS exams 3L and 4.), Extensively revised and updated with new material, Orders the
topics specifically to facilitate learning, Provides a streamlined approach to actuarial notation, Employs modern computational
methods, Contains a variety of exercises, both computational and theoretical, together with answers, enabling use for self-study.
An ideal text for students planning for a professional career as actuaries, providing a solid preparation for the modeling
examinations of the major actuarial associations. Furthermore, this book is a highly suitable reference for those wanting a sound
introduction to the subject, and for those working in insurance, annuities and pensions.

A generalized penalty function for a class of discrete renewal processes. Woo, Jae-Kyung [RKN: 45782]
Shelved at: Per: SAJ Shelved at: SCA/ACT
Scandinavian Actuarial Journal (2012) 2 : 130-152.
Analysis of a generalized Gerber-Shiu function is considered in a discrete-time (ordinary) Sparre Andersen renewal risk process
with time-dependent claim sizes. The results are then applied to obtain ruin-related quantities under some renewal risk processes
assuming specific interclaim distributions such as a discrete K_n distribution and a truncated geometric distribution (i.e. compound
binomial process). Furthermore, the discrete delayed renewal risk process is considered and results related to the ordinary
process are derived as well.
http://www.openathens.net/

Haezendonck-Goovaerts risk measures and Orlicz quantiles. Bellini, Fabio; Gianin, Emanuela Rosazza [RKN: 45727]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 107-114.
In this paper, we study the well-known Haezendonck-Goovaerts risk measures on their natural domain, that is on Orlicz spaces
and, in particular, on Orlicz hearts. We provide a dual representation as well as the optimal scenario in such a representation and
investigate the properties of the minimizer (that we call Orlicz quantile) in the definition of the Haezendonck-Goovaerts risk
measure. Since Orlicz quantiles fail to satisfy an internality property, bilateral Orlicz quantiles are also introduced and analyzed.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

The herd behavior index : A new measure for the implied degree of co-movement in stock markets. Dhaene, Jan; Linders, Daniel;
Schoutens, Wim; Vyncke, David [RKN: 45639]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 50 (3) : 357-370.
We introduce a new and easy-to-calculate measure for the expected degree of herd behaviour or co-movement between stock
prices. This forward looking measure is model-independent and based on observed option data. It is baptized the Herd Behavior
Index (HIX). The degree of co-movement in a stock market can be determined by comparing the observed market situation with
the extreme (theoretical) situation under which the whole system is driven by a single factor. The HIX is then defined as the ratio of
an option-based estimate of the risk-neutral variance of the market index and an option-based estimate of the corresponding
variance in case of the extreme single factor market situation. The HIX can be determined for any market index provided an
appropriate series of vanilla options is traded on this index as well as on its components. As an illustration, we determine historical
values of the 30-days HIX for the Dow Jones Industrial Average, covering the period January 2003 to October 2009.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

An improvement of the Berry-Esseen inequality with applications to Poisson and mixed Poisson random sums. Korolev, Victor;
Shevtsova, Irina [RKN: 45780]
Shelved at: Per: SAJ Shelved at: SCA/ACT
Scandinavian Actuarial Journal (2012) 2 : 81-105.
By a modification of the method that was applied in the study of Korolev & Shevtsova (2009), two inequalities are proved
for the uniform distance ρ(Fn, Φ) between the standard normal distribution function Φ and the distribution function Fn of the
normalized sum of an arbitrary number n ≥ 1 of independent identically distributed random variables with zero mean, unit variance,
and finite third absolute moment β3. The first of these two inequalities is a structural improvement of the classical Berry-Esseen
inequality and also sharpens the best known upper estimate of the absolute constant in the classical Berry-Esseen inequality,
since 0.33477(β3 + 0.429) ≤ 0.33477(1 + 0.429)β3 < 0.4784β3 by virtue of the condition β3 ≥ 1. The latter inequality is applied to
lower the upper estimate of the absolute constant in the analogue of the Berry-Esseen inequality for Poisson random sums to
0.3041, which is strictly less than the least possible value 0.4097 of the absolute constant in the classical Berry-Esseen
inequality. As corollaries, the estimates of the rate of convergence in limit theorems for compound mixed Poisson distributions are
refined.
http://www.openathens.net/
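
Taking the constants quoted in the abstract at face value, and assuming the usual 1/sqrt(n) Berry-Esseen scaling (an assumption of
this sketch, not stated explicitly above), the improved bound can be compared numerically with a classical-form bound as follows.

    import numpy as np

    def improved_berry_esseen_bound(beta3, n):
        """Bound of the form 0.33477 * (beta3 + 0.429) / sqrt(n) quoted in the abstract."""
        return 0.33477 * (beta3 + 0.429) / np.sqrt(n)

    def classical_form_bound(beta3, n, c=0.4784):
        """Classical-form bound c * beta3 / sqrt(n), for comparison."""
        return c * beta3 / np.sqrt(n)

    for n in (100, 1_000, 10_000):
        print(n, improved_berry_esseen_bound(1.5, n), classical_form_bound(1.5, n))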

Insurance pricing with complete information, state-dependent utility, and production costs. Ramsay, Colin M; Oguledo, Victor I
[RKN: 45649]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 50 (3) : 462-469.
We consider a group of identical risk-neutral insurers selling single-period indemnity insurance policies. The insurance market
consists of individuals with common state-dependent utility function who are identical except for their known accident probability q.
Insurers incur production costs (commonly called expenses or transaction costs by actuaries) that are proportional to the amount
of insurance purchased and to the premium charged. By introducing the concept of insurance desirability, we prove that the
existence of insurer expenses generates a pair of constants qmin and qmax that naturally partitions the applicant pool into three
mutually exclusive and exhaustive groups of individuals: those individuals with accident probability q in [0, qmin) are insurable but do
not desire insurance, those individuals with accident probability q in [qmin, qmax] are insurable and desire insurance, and those
individuals with accident probability q in (qmax, 1] desire insurance but are uninsurable. We also prove that, depending on the level of
q and the marginal rate of substitution between states, it may be optimal for individuals to buy complete (full) insurance, partial
insurance, or no insurance at all. Finally, we prove that when q is known in monopolistic markets (i.e., markets with a single
insurer), applicants may be induced to over insure whenever partial insurance is bought.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
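
The three-way partition of the applicant pool described in the abstract can be written down directly. In the sketch below the
thresholds qmin and qmax are user-supplied inputs, since their derivation from insurer expenses is the subject of the paper.

    def classify_applicant(q, q_min, q_max):
        """Partition induced by insurer expenses, with thresholds q_min <= q_max."""
        if q < q_min:
            return "insurable, does not desire insurance"
        if q <= q_max:
            return "insurable and desires insurance"
        return "desires insurance but uninsurable"

    for q in (0.01, 0.10, 0.60):
        print(q, classify_applicant(q, q_min=0.05, q_max=0.40))   # illustrative thresholds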

Jackknife empirical likelihood method for some risk measures and related quantities. Peng, Liang; Qi, Yongcheng; Wang, Ruodu;
Yang, Jingping [RKN: 45731]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 142-150.
Quantifying risks is of importance in insurance. In this paper, we employ the jackknife empirical likelihood method to construct
confidence intervals for some risk measures and related quantities studied by Jones and Zitikis (2003). A simulation study shows
the advantages of the new method over the normal approximation method and the naive bootstrap method.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
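
The jackknife ingredient of the method can be sketched generically: pseudo-values T_i = n*T(full sample) - (n-1)*T(leave-one-out)
are computed for a chosen risk-measure estimator and then fed into the empirical likelihood machinery (not shown here). The
expected-shortfall estimator and the Pareto sample below are illustrative assumptions.

    import numpy as np

    def jackknife_pseudo_values(x, estimator):
        """Jackknife pseudo-values for an arbitrary statistic."""
        x = np.asarray(x, dtype=float)
        n = len(x)
        full = estimator(x)
        loo = np.array([estimator(np.delete(x, i)) for i in range(n)])
        return n * full - (n - 1) * loo

    # pseudo-values for an empirical 95% expected shortfall (illustrative)
    es95 = lambda s: s[s >= np.quantile(s, 0.95)].mean()
    rng = np.random.default_rng(2)
    pv = jackknife_pseudo_values(rng.pareto(3.0, size=200), es95)
    print(pv.mean())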

The joint distribution of the time to ruin and the number of claims until ruin in the classical risk model. Dickson, David C M [RKN:
45636]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 50 (3) : 334-337.
We use probabilistic arguments to derive an expression for the joint density of the time to ruin and the number of claims until ruin
in the classical risk model. From this we obtain a general expression for the probability function of the number of claims until ruin.
We also consider the moments of the number of claims until ruin and illustrate our results in the case of exponentially distributed
individual claims. Finally, we briefly discuss joint distributions involving the surplus prior to ruin and deficit at ruin.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Longevity/mortality risk modeling and securities pricing. Deng, Yinglu; Brockett, Patrick L; MacMinn, Richard D - 25 pages. [RKN:
70413]
Shelved at: Per: J.Risk Ins (Oxf) Shelved at: JOU
Journal of Risk and Insurance (2012) 79 (3) : 697-721.
Securitizing longevity/mortality risk can transfer longevity/mortality risk to capital markets. Modeling and forecasting mortality rate
is key to pricing mortality-linked securities. Catastrophic mortality and longevity jumps occur in historical data and have an
important impact on security pricing. This article introduces a stochastic diffusion model with a double-exponential jump diffusion
process that captures both asymmetric rate jumps up and down and also cohort effect in mortality trends. The model exhibits
calibration advantages and mathematical tractability while better fitting the data. The model provides a closed-form pricing solution
for J.P. Morgan's q-forward contract usable as a building block for hedging.
Available online via Athens

Managing longevity and disability risks in life annuities with long term care. Levantesi, Susanna; Menzietti, Massimiliano [RKN:
45642]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 50 (3) : 391-401.
The aim of the paper is twofold. Firstly, it develops a model for risk assessment in a portfolio of life annuities with long term care
benefits. These products are usually represented by a Markovian Multi-State model and are affected by both longevity and
disability risks. Here, a stochastic projection model is proposed in order to represent the future evolution of mortality and disability
transition intensities. Data from the Italian National Institute of Social Security (INPS) and from Human Mortality Database (HMD)
are used to estimate the model parameters. Secondly, it investigates the solvency in a portfolio of enhanced pensions. To this aim
a risk model based on the portfolio risk reserve is proposed and different rules to calculate solvency capital requirements for life
underwriting risk are examined. Such rules are then compared with the standard formula proposed by the Solvency II project.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

A maximum-entropy approach to the linear credibility formula. Najafabadi, Amir T. Payandeh; Hatami, Hamid; Najafabadi, Maryam
Omidi [RKN: 45738]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 216-221.
Payandeh [Payandeh Najafabadi, A.T., 2010. A new approach to credibility formula. Insurance: Mathematics and Economics 46,
334-338] introduced a new technique to approximate a Bayes estimator with the exact credibility's form. This article employs a
well known and powerful maximum-entropy method (MEM) to extend results of Payandeh Najafabadi (2010) to a class of linear
credibility, whenever claim sizes have been distributed according to log-concave distributions. Namely, (i) it employs the
maximum-entropy method to approximate an appropriate Bayes estimator (with respect to either the square-error or the Linex
loss functions and general increasing and bounded prior distribution) by a linear combination of claim sizes; (ii) it establishes that
such an approximation coincides with the exact credibility formula whenever the required conditions for the exact credibility (see
below) are held. Some properties of such an approximation are discussed. Application to crop insurance has been given.
Available via Athens: Palgrave MacMillan
http://www.openathens.net


Measurement and modelling of dependencies in economic capital. Shaw, R A; Smith, A D; Spivak, G S - 99 pages. [RKN: 73863]
Shelved at: Per: BAJ (Oxf) Per: BAJ (Lon) Shelved at: REF
BAJ (2011) 16 (3) : 601-699.
This paper covers a number of different topics related to the measurement and modelling of dependency within economic capital
models. The scope of the paper is relatively wide. We address in some detail the different approaches to modelling dependencies
ranging from the more common variance-covariance matrix approach, to the consideration of the use of copulas and the more
sophisticated causal models that feature feedback loops and other systems design ideas.
There are many data and model uncertainties in modelling dependency and so we have also endeavoured to cover topics such as
spurious relationships and wrong-way risk to highlight some of the uncertainties involved.
With the advent of the internal model approval process under Solvency II, senior management needs to have a greater
understanding of dependency methodology. We have devoted a section of this paper to a discussion of possible different ways to
communicate the results of modelling to the board, senior management and other interested parties within an insurance company.
We have endeavoured throughout this paper to include as many numerical examples as possible to help in the understanding of
the key points, including our discussion of model parameterisation and the communication to an insurance executive of the impact
of dependency on economic capital modelling results.
The economic capital model can be seen as a combination of two key components: the marginal risk distribution of each risk and
the aggregation methodology which combines these into a single aggregate distribution or capital number. This paper is
concerned with the aggregation part, the methods and assumptions employed and the issues arising, and not the determination of
the marginal risk distributions which is equally of importance and in many cases equally as complex.
http://www.actuaries.org.uk/research-and-resources/pages/members-access-journals
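
The most common aggregation approach discussed in the paper, the variance-covariance matrix approach, reduces to a single formula,
sketched below with made-up standalone capital figures and correlation assumptions.

    import numpy as np

    def var_covar_aggregate(standalone, corr):
        """Aggregate standalone capital figures with a correlation matrix: C = sqrt(c' R c)."""
        c = np.asarray(standalone, dtype=float)
        return float(np.sqrt(c @ np.asarray(corr, dtype=float) @ c))

    print(var_covar_aggregate([100, 60, 40], [[1.0, 0.25, 0.5],
                                              [0.25, 1.0, 0.0],
                                              [0.5, 0.0, 1.0]]))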

Measurement and modelling of dependencies in economic capital : Abstract of the London discussion. Shaw, Richard - 21 pages.
[RKN: 73864]
Shelved at: Per: BAJ (Oxf) Per: BAJ (Lon) Shelved at: REF
BAJ (2011) 16 (3) : 701-721.
This abstract relates to the following paper:

Shaw, R.A., Smith, A.D. & Spivak, G.S. Measurement and modelling of dependencies in economic
capital. British Actuarial Journal, 16 (3).
http://www.actuaries.org.uk/research-and-resources/pages/members-access-journals

Modeling dependence dynamics through copulas with regime switching. Silva Filho, Osvaldo Candido da; Ziegelmann, Flavio
Augusto; Dueker, Michael J [RKN: 45638]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 50 (3) : 346-356.
Measuring dynamic dependence between international financial markets has recently attracted great interest in financial
econometrics because the observed correlations rose dramatically during the 2008-09 global financial crisis. Here, we propose a
novel approach for measuring dependence dynamics. We include a hidden Markov chain (MC) in the equation describing
dependence dynamics, allowing the unobserved time-varying dependence parameter to vary according to both a restricted ARMA
process and an unobserved two-state MC. Estimation is carried out via the inference for the margins in conjunction with
filtering/smoothing algorithms. We use block bootstrapping to estimate the covariance matrix of our estimators. Monte Carlo
simulations compare the performance of regime switching and no switching models, supporting the regime-switching
specification. Finally the proposed approach is applied to empirical data, through the study of the S&P500 (USA), FTSE100 (UK)
and BOVESPA (Brazil) stock market indexes.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Multi-period mean-variance portfolio selection with regime switching and a stochastic cash flow. Wu, Huiling; Li, Zhongfei [RKN:
45640]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 50 (3) : 371-384.
This paper investigates a non-self-financing portfolio optimization problem under the framework of multi-period mean-variance
with Markov regime switching and a stochastic cash flow. The stochastic cash flow can be explained as capital additions or
withdrawals during the investment process. Specifically, the cash flow is the surplus process or the risk process of an insurer at each
period. The returns of assets and amount of the cash flow all depend on the states of a stochastic market which are assumed to
follow a discrete-time Markov chain. We analyze the existence of optimal solutions, and derive the optimal strategy and the
efficient frontier in closed-form. Several special cases are discussed and numerical examples are given to demonstrate the effect
of cash flow.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Multivariate insurance models: an overview. Anastasiadis, Simon; Chukova, Stefanka [RKN: 45739]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 222-227.
This literature review summarizes the results from a collection of research papers that relate to modeling insurance claims and the
processes associated with them. We consider work by more than 55 authors, published or presented between 1971 and 2008.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Multivariate longitudinal modeling of insurance company expenses. Shi, Peng [RKN: 45737]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 204-215.
Insurers, investors and regulators are interested in understanding the behaviour of insurance company expenses, due to the high
operating cost of the industry. Expense models can be used for prediction, to identify unusual behavior, and to measure firm
efficiency. Current literature focuses on the study of total expenses that consist of three components: underwriting, investment and
loss adjustment. A joint study of expenses by type delivers more information and is critical to understanding their relationship.

This paper introduces a copula regression model to examine the three types of expenses in a longitudinal context. In our method,
elliptical copulas are employed to accommodate the between-subject contemporaneous and lag dependencies, as well as the
within-subject serial correlations of the three types. Flexible distributions are allowed for the marginals of each type with covariates
incorporated in distribution parameters. A model validation procedure based on a t-plot method is proposed for in-sample and
out-of-sample validation purposes. The multivariate longitudinal model effectively addresses the typical features of expenses
data: the heavy tails, the strong individual effects and the lack of balance.

The analysis is performed using property-casualty insurance company expenses data from the National Association of Insurance
Commissioners of years 2001-2006. A unique set of covariates is determined for each type of expenses. We found that
underwriting expenses and loss adjustment expenses are complements rather than substitutes. The model is shown to be
successful in efficiency classification. Also, a multivariate predictive density is derived to quantify the future values of an insurer's
expenses.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Multivariate stress scenarios and solvency. McNeil, Alexander J; Smith, Andrew D [RKN: 45634]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 50 (3) : 299-308.
We show how the probabilistic concepts of half-space trimming and depth may be used to define convex scenario sets Qα for
stress testing the risk factors that affect the solvency of an insurance company over a prescribed time period. By choosing the
scenario in Qα which minimizes net asset value at the end of the time period, we propose the idea of the least solvent likely event
(LSLE) as a solution to the forward stress testing problem. By considering the support function of the convex scenario set Qα, we
establish theoretical properties of the LSLE when financial risk factors can be assumed to have a linear effect on the net assets of
an insurer. In particular, we show that the LSLE may be interpreted as a scenario causing a loss equivalent to the Value-at-Risk
(VaR) at confidence level α, provided the α-quantile is a subadditive risk measure on linear combinations of the risk factors. In this
case, we also show that the LSLE has an interpretation as a per-unit allocation of capital to the underlying risk factors when the
overall capital is determined according to the VaR. These insights allow us to define alternative scenario sets that relate in similar
ways to coherent measures, such as expected shortfall. We also introduce the most likely ruin event (MLRE) as a solution to the
problem of reverse stress testing.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
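
When net assets are a linear function of the risk factors and the scenario set is an ellipsoid, the worst-case (least solvent)
scenario has a closed form. The sketch below uses an elliptical scenario set purely for illustration (the paper's sets are defined via
half-space trimming and depth), and all numbers are made up.

    import numpy as np

    def least_solvent_scenario(mu, Sigma, b, radius):
        """Scenario minimizing a linear net-asset value v0 + b'x over the ellipsoid
        {x : (x - mu)' Sigma^{-1} (x - mu) <= radius^2}."""
        Sb = Sigma @ b
        return mu - radius * Sb / np.sqrt(b @ Sb)

    mu = np.array([0.02, 0.00])
    Sigma = np.array([[0.04, 0.01], [0.01, 0.09]])
    b = np.array([3.0, 1.5])                 # illustrative linear sensitivities of net assets
    print(least_solvent_scenario(mu, Sigma, b, radius=2.58))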

A new class of models for heavy tailed distributions in finance and insurance risk. Ahn, Soohan; Kim, Joseph H T; Ramaswami,
Vaidyanathan [RKN: 45722]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 43-52.
Many insurance loss data are known to be heavy-tailed. In this article we study the class of Log phase-type (LogPH) distributions
as a parametric alternative in fitting heavy tailed data. Transformed from the popular phase-type distribution class, the LogPH
introduced by Ramaswami exhibits several advantages over other parametric alternatives. We analytically derive its tail related
quantities including the conditional tail moments and the mean excess function, and also discuss its tail thickness in the context of
extreme value theory. Because of its denseness proved herein, we argue that the LogPH can offer a rich class of heavy-tailed loss
distributions without separate modeling for the tail side, which is the case for the generalized Pareto distribution (GPD). As a
numerical example we use the well-known Danish fire data to calibrate the LogPH model and compare the result with that of the
GPD. We also present fitting results for a set of insurance guarantee loss data.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

On a multi-threshold compound Poisson surplus process with interest. Mitric, Ilie-Radu [RKN: 45353]
Shelved at: Per: SAJ Shelved at: SCA/ACT
Scandinavian Actuarial Journal (2011) 2 : 75-95.
We consider a multi-threshold compound Poisson surplus process. When the initial surplus is between any two consecutive
thresholds, the insurer has the option to choose the respective premium rate and interest rate. Also, the model allows for
borrowing the current amount of deficit whenever the surplus falls below zero. Starting from the integro-differential equations
satisfied by the Gerber-Shiu function that appear in Yang et al. (2008), we consider exponentially and phase-type(2) distributed
claim sizes, in which cases we are able to transform the integro-differential equations into ordinary differential equations. As a
result, we obtain explicit expressions for the Gerber-Shiu function.

On allocation of upper limits and deductibles with dependent frequencies and comonotonic severities. Li, Xiaohu; You, Yinping
[RKN: 45645]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 50 (3) : 423-429.
With the assumption of Archimedean copula for the occurrence frequencies of the risks covered by an insurance policy, this note
further investigates the allocation problem of upper limits and deductibles addressed in Hua and Cheung (2008a). Sufficient
conditions for a risk averse policyholder to well allocate the upper limits and the deductibles are built, respectively.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

On the analysis of a general class of dependent risk processes. Willmot, Gordon E; Woo, Jae-Kyung [RKN: 45730]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 134-141.
A generalized Sparre Andersen risk process is examined, whereby the joint distribution of the interclaim time and the ensuing
claim amount is assumed to have a particular mathematical structure. This structure is present in various dependency models
which have previously been proposed and analyzed. It is then shown that this structure in turn often implies particular functional
forms for joint discounted densities of ruin related variables including some or all of the deficit at ruin, the surplus immediately prior
to ruin, and the surplus after the second last claim. Then, employing a fairly general interclaim time structure which involves a
combination of Erlang type densities, a complete identification of a generalized Gerber-Shiu function is provided. An application is
given applying these results to a situation involving a mixed Erlang type of claim amount assumption. Various examples and
special cases of the model are then considered, including one involving a bivariate Erlang mixture model.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

On the interplay between distortion, mean value and Haezendonck-Goovaerts risk measures. Goovaerts, Marc; Linders, Daniel;
Van Weert, Koen; Tank, Fatih [RKN: 45719]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 10-18.
In actuarial research, distortion, mean value and Haezendonck-Goovaerts risk measures are concepts that are usually treated
separately. In this paper we indicate and characterize the relation between these different risk measures, as well as their relation
to convex risk measures. While it is known that the mean value principle can be used to generate premium calculation principles,
we will show how they can also be used to generate solvency calculation principles. Moreover, we explain the role played by the
distortion risk measures as an extension of the Tail Value-at-Risk (TVaR) and Conditional Tail Expectation (CTE).
Available via Athens: Palgrave MacMillan
http://www.openathens.net

On the wrong foot : Letter to the editor. Cox, Andrew Staple Inn Actuarial Society, - 1 pages. [RKN: 73932]
Shelved at: Per: Actuary (Oxf) Per: Actuary (Lon) Shelved at: JOU
The Actuary (2012) May : 6.
Suggesting that Verrall and England should have accepted Leong's use of the phrase 'bootstrap model' as it has changed in
meaning to become a shorthand for the Poisson model in some actuarial circles.
http://www.theactuary.com/

Optimal asset allocation for DC pension plans under inflation. Han, Nan-wei; Hung, Mao-Wei [RKN: 45734]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 172-181.
In this paper, the stochastic dynamic programming approach is used to investigate the optimal asset allocation for a
defined-contribution pension plan with downside protection under stochastic inflation. The plan participant invests the fund wealth
and the stochastic interim contribution flows into the financial market. The nominal interest rate model is described by the
Cox-Ingersoll-Ross (Cox et al., 1985) dynamics. To cope with the inflation risk, the inflation-indexed bond is included in the asset
menu. The retired individuals receive an annuity that is indexed by inflation and a downside protection on the amount of this
annuity is considered. The closed-form solution is derived under the CRRA utility function. Finally, a numerical application is
presented to characterize the dynamic behavior of the optimal investment strategy.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Optimal dividend policies for compound Poisson processes : The case of bounded dividend rates. Azcue, Pablo; Muler, Nora
[RKN: 45721]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 26-42.
We consider in this paper the optimal dividend problem for an insurance company whose uncontrolled reserve process evolves as
a classical Cramér-Lundberg model with arbitrary claim-size distribution. Our objective is to find the dividend payment policy
which maximizes the cumulative expected discounted dividend pay-outs until the time of bankruptcy imposing a ceiling on the
dividend rates. We characterize the optimal value function as the unique bounded viscosity solution of the associated
Hamilton-Jacobi-Bellman equation. We prove that there exists an optimal dividend strategy and that this strategy is stationary
with a band structure. We study the regularity of the optimal value function. We find a characterization result to check optimality
even in the case where the optimal value function is not differentiable. We construct examples where the claim-size distribution is
smooth but the optimal dividend policy is not threshold and the optimal value function is not differentiable. We study the survival
probability of the company under the optimal dividend policy. We also present examples where the optimal dividend policy has
infinitely many bands even in the case that the claim-size distribution has a bounded density.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Optimal time-consistent investment and reinsurance strategies for insurers under Heston's SV model. Li, Zhongfei; Zeng, Yan;
Lai, Yongzeng [RKN: 45736]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 191-203.
This paper considers the optimal time-consistent investment and reinsurance strategies for an insurer under Heston's stochastic
volatility (SV) model. Such an SV model applied to insurers' portfolio problems has not yet been discussed as far as we know. The
surplus process of the insurer is approximated by a Brownian motion with drift. The financial market consists of one risk-free asset
and one risky asset whose price process satisfies Heston's SV model. Firstly, a general problem is formulated and a verification
theorem is provided. Secondly, the closed-form expressions of the optimal strategies and the optimal value functions for the
mean-variance problem without precommitment are derived under two cases: one is the investment-reinsurance case and the
other is the investment-only case. Thirdly, economic implications and numerical sensitivity analysis are presented for our results.
Finally, some interesting phenomena are found and discussed.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Parametric mortality improvement rate modelling and projecting. Haberman, Steven; Renshaw, Arthur [RKN: 45635]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 50 (3) : 309-333.
We investigate the modelling of mortality improvement rates and the feasibility of projecting mortality improvement rates (as
opposed to projecting mortality rates), using parametric predictor structures that are amenable to simple time series forecasting.
This leads to our proposing a parallel dual approach to the direct parametric modelling and projecting of mortality rates.
Comparisons of simulated life expectancy predictions (by the cohort method) using the England and Wales population mortality
experiences for males and females under a variety of controlled data trimming exercises are presented in detail and comparisons
are also made between the parallel modelling approaches.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

A performance analysis of participating life insurance contracts. Faust, Roger; Schmeiser, Hato; Zemp, Alexandra [RKN: 45733]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 158-171.
Participating life insurance contracts are one of the most important products in the European life insurance market. Even though
these contract forms are very common, only very little research has been conducted in respect to their performance. Hence, we
conduct a performance analysis to provide a decision support for policyholders. We decompose a participating life insurance
contract in a term life insurance and a savings part and simulate the cash flow distribution of the latter. Simulation results are
compared with cash flows resulting from two benchmarks investing in the same portfolio of assets but without investment
guarantees and bonus distribution schemes, in order to measure the impact of these two product features. To provide a realistic
picture within the two alternatives, we take transaction costs and wealth transfers between different groups of policyholders into
account. We show that the payoff distribution strongly depends on the initial reserve situation and managerial discretion. Results
indicate that policyholders will in general profit from a better payoff distribution of the participating life insurance compared to a
mutual fund benchmark but not compared to an exchange-traded fund benchmark portfolio.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Portfolio selection through an extremality stochastic order. Laniado, Henry; Lillo, Rosa E; Pellerey, Franco; Romo, Juan [RKN:
45718]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 1-9.
In this paper, we introduce a new multivariate stochastic order that compares random vectors in a direction which is determined by
a unit vector, generalizing the previous upper and lower orthant orders. The main properties of this new order, together with its
relationships with other multivariate stochastic orders, are investigated, and we present some examples of application in the
determination of optimal allocations of wealth among risks in single period portfolio problems.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Pricing compound Poisson processes with the Farlie-Gumbel-Morgenstern dependence structure. Marri, Fouad; Furman,
Edward [RKN: 45732]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 151-157.
Convenient expressions for the Esscher pricing functional in the context of the compound Poisson processes with dependent loss
amounts and loss inter-arrival times are developed. To this end, the moment generating function of the aforementioned dependent
processes is derived and studied. Various implications of the dependence are discussed and exemplified numerically.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
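As background to the Farlie-Gumbel-Morgenstern (FGM) dependence structure referenced above, the following minimal Python sketch (illustrative only, not taken from the paper; the function and parameter names are ours) evaluates the standard FGM copula and simulates from it by conditional inversion.

    import random

    def fgm_copula(u, v, theta):
        # Standard FGM copula C(u, v) = u*v*(1 + theta*(1 - u)*(1 - v)), with |theta| <= 1.
        return u * v * (1.0 + theta * (1.0 - u) * (1.0 - v))

    def sample_fgm(theta, n, seed=0):
        # Conditional-inversion sampler: draw U uniform, then invert the conditional
        # copula C_{2|1}(v | u) = v + a*v*(1 - v), where a = theta*(1 - 2u).
        rng = random.Random(seed)
        pairs = []
        for _ in range(n):
            u, w = rng.random(), rng.random()
            a = theta * (1.0 - 2.0 * u)
            if abs(a) < 1e-12:
                v = w
            else:
                # Root in [0, 1] of a*v**2 - (1 + a)*v + w = 0.
                v = ((1.0 + a) - ((1.0 + a) ** 2 - 4.0 * a * w) ** 0.5) / (2.0 * a)
            pairs.append((u, v))
        return pairs

    pairs = sample_fgm(theta=0.7, n=1000)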

Rational thinking : Letter to the editor. Karsten, H Staple Inn Actuarial Society, - 1 pages. [RKN: 73908]
Shelved at: Per: Actuary (Oxf) Per: Actuary (Lon) Shelved at: JOU
The Actuary (2011) October : 6.
Outlining the background to computable numbers.
http://www.theactuary.com/

RPI paper ignores geometric query : Letter to the Editor. Sibbett, Trevor A Staple Inn Actuarial Society, - 1 pages. [RKN: 71128]
Shelved at: Per: Actuary (Oxf) Per: Actuary (Lon) Shelved at: JOU
The Actuary (2012) November : 6.
The National Statistician has again ignored the query on the geometric mean raised by Trevor Sibbett and Roy Colbran in letters
to The Actuary.
http://www.theactuary.com/

Ruin by dynamic contagion claims. Dassios, Angelos; Zhao, Hongbiao [RKN: 45726]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 93-106.
In this paper, we consider a risk process with the arrival of claims modelled by a dynamic contagion process, a generalisation of
the Cox process and Hawkes process introduced by Dassios and Zhao (2011). We derive results for the infinite horizon model that
are generalisations of the Cramér-Lundberg approximation, Lundberg's fundamental equation, some asymptotics as well as
bounds for the probability of ruin. Special attention is given to the case of exponential jumps and a numerical example is provided.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Solutions of the problems and riders proposed in the Senate-House examination for 1854. MacKenzie, C F; Walton, W (2012).
2012. - 225 pages. [RKN: 70002]
Shelved at: 510.29
Reprint of a digital scan


Stochastic comparisons of capital allocations with applications. Xu, Maochao; Hu, Taizhong [RKN: 45633]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 50 (3) : 293-298.
This paper studies capital allocation problems using a general loss function. Stochastic comparisons are conducted for general
loss functions in several scenarios: independent and identically distributed risks; independent but non-identically distributed risks;
comonotonic risks. Applications in optimal capital allocations and policy limits allocations are discussed as well.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Tail distortion risk and its asymptotic analysis. Zhu, Li; Li, Haijun [RKN: 45728]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 115-121.
A distortion risk measure used in finance and insurance is defined as the expected value of potential loss under a scenario
probability measure. In this paper, the tail distortion risk measure is introduced to assess tail risks of excess losses modeled by the
right tails of loss distributions. The asymptotic linear relation between tail distortion and value-at-risk is derived for heavy-tailed
losses with the linear proportionality constant depending only on the distortion function and the tail index. Various examples
involving tail distortions for location-invariant, scale-invariant, and shape-invariant loss distribution families are also presented to
illustrate the results.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

These boots are made for working : Letter to the editor. Verrall, Richard; England, Peter Staple Inn Actuarial Society, - 1 pages.
[RKN: 73929]
Shelved at: Per: Actuary (Oxf) Per: Actuary (Lon) Shelved at: JOU
The Actuary (2012) April : 6.
Corrects Jessica Leong (December edition) in her article referring to the Bootstrap model. This should be referred to as a Chain
Ladder model, as bootstrapping is a statistical procedure.
http://www.theactuary.com/

The time to ruin and the number of claims until ruin for phase-type claims. Frostig, Esther; Pitts, Susan M; Politis, Konstadinos
[RKN: 45720]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 19-25.
We consider a renewal risk model with phase-type claims, and obtain an explicit expression for the joint transform of the time to
ruin and the number of claims until ruin, with a penalty function applied to the deficit at ruin. The approach is via the duality
between a risk model with phase-type claims and a particular single server queueing model with phase-type customer interarrival
times; see Frostig (2004). This result specializes to one for the probability generating function of the number of claims until ruin.
We obtain explicit expressions for the distribution of the number of claims until ruin for exponentially distributed claims when the
inter-claim times have an Erlang-n distribution.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Using the censored gamma distribution for modelling fractional response variables with an application to loss given default.
Sigrist, Fabio; Stahel, Werner A - 38 pages. [RKN: 74901]
Shelved at: Per: Astin Bull (Oxf) Shelved at: JOU
ASTIN Bulletin (2011) 41 (2) : 673-710.
Regression models for limited continuous dependent variables having a non-negligible probability of attaining exactly their limits
are presented. The models differ in the number of parameters and in their flexibility. Fractional data being a special case of limited
dependent data, the models also apply to variables that are a fraction or a proportion. It is shown how to fit these models and they
are applied to a Loss Given Default dataset from insurance to which they provide a good fit.
http://www.actuaries.org/index.cfm?lang=EN&DSP=PUBLICATIONS&ACT=ASTIN BULLETIN

Valuing equity-linked death benefits and other contingent options : A discounted density approach. Gerber, Hans U; Shiu, Elias S
W; Yang, Hailiang [RKN: 45725]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 73-92.
Motivated by the Guaranteed Minimum Death Benefits in various deferred annuities, we investigate the calculation of the expected
discounted value of a payment at the time of death. The payment depends on the price of a stock at that time and possibly also on
the history of the stock price. If the payment turns out to be the payoff of an option, we call the contract for the payment a (life)
contingent option. Because each time-until-death distribution can be approximated by a combination of exponential distributions,
the analysis is made for the case where the time until death is exponentially distributed, i.e., under the assumption of a constant
force of mortality. The time-until-death random variable is assumed to be independent of the stock price process which is a
geometric Brownian motion. Our key tool is a discounted joint density function. A substantial series of closed-form formulas is
obtained, for the contingent call and put options, for lookback options, for barrier options, for dynamic fund protection, and for
dynamic withdrawal benefits. In a section on several stocks, the method of Esscher transforms proves to be useful for finding
among others an explicit result for valuing contingent Margrabe options or exchange options. For the case where the contracts
have a finite expiry date, closed-form formulas are found for the contingent call and put options. From these, results for De
Moivre's law are obtained as limits. We also discuss equity-linked death benefit reserves and investment strategies for maintaining
such reserves. The elasticity of the reserve with respect to the stock price plays an important role. Whereas in the most important
applications the stopping time is the time of death, it could be different in other applications, for example, the time of the next
catastrophe.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

MATHEMATICS OF FINANCE

The S&P 500 index as a Sato process travelling at the speed of the VIX. Madan, Dilip B; Yor, Marc [RKN: 45458]
Applied Mathematical Finance (2011) 18 (3-4) : 227-244.
The logarithm of the S&P 500 Index is modelled as a Sato process running at a speed proportional to the current level of the VIX.
When the VIX is itself modelled as the exponential of a compound Poisson process with drift, we show that exact expressions are
available for the prices of equity options, taken at an independent exponential maturity. The parameters for the compound Poisson
process are calibrated from VIX options whereas the parameters for the Sato process driving the stock may be inferred from
market option prices. Results confirm that both the S&P 500 index option surface and the parameters of the VIX time-changed
Sato process have volatilities, skews and term volatility spreads that are responsive to the VIX level and the VIX option surface.

Available via Athens: Taylor & Francis Online
http://www.openathens.net

MODELLING

Adaptive Importance Sampling for simulating copula-based distributions. Bee, Marco [RKN: 40018]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (2) : 237-245.
In this paper, we propose a generalization of importance sampling, called Adaptive Importance Sampling, to approximate
simulation of copula-based distributions. Unlike existing methods for copula simulation that have appeared in the literature, this
algorithm is broad enough to be used for any absolutely continuous copula. We provide details of the algorithm including rules for
stopping the iterative process and consequently assess its performance using extensive Monte Carlo experiments. To assist in its
extension to several dimensions, we discuss procedures for identifying the crucial parameters in order to achieve desirable results
especially as the size of the dimension increases. Finally, for practical illustration, we demonstrate the use of the algorithm to price
a First-to-Default credit swap, an important credit derivative instrument in the financial market. The method works exquisitely well
even for large dimensions making it a valuable tool for simulating from many different classes of copulas including those which
have been difficult to sample from using traditional techniques.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
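The paper's adaptive scheme is more elaborate, but the plain importance-sampling identity it builds on can be sketched in a few lines of Python (illustrative only; the names and the tail-probability example are ours, not the authors').

    import math
    import random

    def importance_sampling_mean(h, target_pdf, proposal_pdf, proposal_sampler, n, seed=0):
        # Plain (non-adaptive) importance sampling: E_f[h(X)] is estimated by
        # averaging h(x) * f(x) / g(x) over draws x from the proposal density g.
        rng = random.Random(seed)
        total = 0.0
        for _ in range(n):
            x = proposal_sampler(rng)
            total += h(x) * target_pdf(x) / proposal_pdf(x)
        return total / n

    # Toy example: P(X > 3) for X ~ N(0, 1), using an exponential proposal shifted to 3.
    phi = lambda x: math.exp(-0.5 * x * x) / math.sqrt(2.0 * math.pi)
    g = lambda x: math.exp(-(x - 3.0)) if x >= 3.0 else 0.0
    draw = lambda rng: 3.0 + rng.expovariate(1.0)
    indicator = lambda x: 1.0 if x > 3.0 else 0.0
    p_tail = importance_sampling_mean(indicator, phi, g, draw, n=100000)  # approx. 0.00135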

An application of comonotonicity theory in a stochastic life annuity framework. Liu, Xiaoming; Jang, Jisoo; Kim, Sun Mee [RKN:
40022]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (2) : 271-279.
A life annuity contract is an insurance instrument which pays pre-scheduled living benefits conditional on the survival of the
annuitant. In order to manage the risk borne by annuity providers, one needs to take into account all sources of uncertainty that
affect the value of future obligations under the contract. In this paper, we define the concept of annuity rate as the conditional
expected present value random variable of future payments of the annuity, given the future dynamics of its risk factors. The
annuity rate deals with the non-diversifiable systematic risk contained in the life annuity contract, and it involves mortality risk as
well as investment risk. While it is plausible to assume that there is no correlation between the two risks, each affects the annuity
rate through a combination of dependent random variables. In order to understand the probabilistic profile of the annuity rate, we
apply comonotonicity theory to approximate its quantile function. We also derive accurate upper and lower bounds for prediction
intervals for annuity rates. We use the Lee-Carter model for mortality risk and the Vasicek model for the term structure of interest
rates with an annually renewable fixed-income investment policy. Different investment strategies can be handled using this
framework.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Approximation of bivariate copulas by patched bivariate Fréchet copulas. Zheng, Yanting; Yang, Jingping; Huang, Jianhua Z [RKN:
40019]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (2) : 246-256.
Bivariate Fréchet (BF) copulas characterize dependence as a mixture of three simple structures: comonotonicity, independence
and countermonotonicity. They are easily interpretable but have limitations when used as approximations to general dependence
structures. To improve the approximation property of the BF copulas and keep the advantage of easy interpretation, we develop a
new copula approximation scheme by using BF copulas locally and patching the local pieces together. Error bounds and a
probabilistic interpretation of this approximation scheme are developed. The new approximation scheme is compared with several
existing copula approximations, including shuffle of min, checkmin, checkerboard and Bernstein approximations and exhibits
better performance, especially in characterizing the local dependence. The utility of the new approximation scheme in insurance
and finance is illustrated in the computation of the rainbow option prices and stop-loss premiums.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
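For orientation, a bivariate Fréchet copula is simply a convex mixture of the comonotonic, independence and countermonotonic copulas; a minimal Python sketch of that mixture follows (ours, not the authors' patched construction).

    def bf_copula(u, v, p_plus, p_indep):
        # Bivariate Frechet copula: convex mixture of the comonotonic copula min(u, v),
        # the independence copula u*v and the countermonotonic copula max(u + v - 1, 0),
        # with non-negative weights p_plus, p_indep and p_minus summing to one.
        p_minus = 1.0 - p_plus - p_indep
        if min(p_plus, p_indep, p_minus) < 0.0:
            raise ValueError("weights must be non-negative and sum to one")
        return (p_plus * min(u, v)
                + p_indep * u * v
                + p_minus * max(u + v - 1.0, 0.0))

    value = bf_copula(0.4, 0.8, p_plus=0.5, p_indep=0.3)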

Bayesian modelling of the time delay between diagnosis and settlement for Critical Illness Insurance using a Burr
generalised-linear-type model. Ozkok, Erengul; Streftaris, George; Waters, Howard R; Wilkie, A David [RKN: 45600]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 50 (2) : 266-279.
We discuss Bayesian modelling of the delay between dates of diagnosis and settlement of claims in Critical Illness Insurance
using a Burr distribution. The data are supplied by the UK Continuous Mortality Investigation and relate to claims settled in the
years 1999-2005. There are non-recorded dates of diagnosis and settlement and these are included in the analysis as missing
values using their posterior predictive distribution and MCMC methodology. The possible factors affecting the delay (age, sex,
smoker status, policy type, benefit amount, etc.) are investigated under a Bayesian approach. A 3-parameter Burr
generalised-linear-type model is fitted, where the covariates are linked to the mean of the distribution. Variable selection using
Bayesian methodology to obtain the best model with different prior distribution setups for the parameters is also applied. In
particular, Gibbs variable selection methods are considered, and results are confirmed using exact marginal likelihood findings
and related Laplace approximations. For comparison purposes, a lognormal model is also considered.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Bayesian multivariate Poisson models for insurance ratemaking. Bermudez, Lluis; Karlis, Dimitris [RKN: 40017]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (2) : 226-236.
When actuaries face the problem of pricing an insurance contract that contains different types of coverage, such as a motor
insurance or a homeowners insurance policy, they usually assume that types of claim are independent. However, this assumption
may not be realistic: several studies have shown that there is a positive correlation between types of claim. Here we introduce
different multivariate Poisson regression models in order to relax the independence assumption, including zero-inflated models to
account for excess of zeros and overdispersion. These models have been largely ignored to date, mainly because of their
computational difficulties. Bayesian inference based on MCMC helps to resolve this problem (and also allows us to derive, for
several quantities of interest, posterior summaries to account for uncertainty). Finally, these models are applied to an automobile
insurance claims database with three different types of claim. We analyse the consequences for pure and loaded premiums when
the independence assumption is relaxed by using different multivariate Poisson regression models together with their zero-inflated
versions.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Characterization of upper comonotonicity via tail convex order. Nam, Hee Seok; Tang, Qihe; Yang, Fan [RKN: 45130]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (3) : 368-373.
In this paper, we show a characterization of upper comonotonicity via tail convex order. For any given marginal distributions, a
maximal random vector with respect to tail convex order is proved to be upper comonotonic under suitable conditions. As an
application, we consider the computation of the Haezendonck risk measure of the sum of upper comonotonic random variables
with exponential marginal distributions.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Classical and singular stochastic control for the optimal dividend policy when there is regime switching. Sotomayor, Luz R;
Cadenillas, Abel [RKN: 45128]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (3) : 344-354.
Motivated by economic and empirical arguments, we consider a company whose cash surplus is affected by macroeconomic
conditions. Specifically, we model the cash surplus as a Brownian motion with drift and volatility modulated by an observable
continuous-time Markov chain that represents the regime of the economy. The objective of the management is to select the
dividend policy that maximizes the expected total discounted dividend payments to be received by the shareholders. We study two
different cases: bounded dividend rates and unbounded dividend rates. These cases generate, respectively, problems of classical
stochastic control with regime switching and singular stochastic control with regime switching. We solve these problems, and
obtain the first analytical solutions for the optimal dividend policy in the presence of business cycles. We prove that the optimal
dividend policy depends strongly on macroeconomic conditions.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

A comparative study of parametric mortality projection models. Haberman, Steven; Renshaw, Arthur [RKN: 17150]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (1) : 35-55.
The relative merits of different parametric models for making life expectancy and annuity value predictions at both pensioner and
adult ages are investigated. This study builds on current published research and considers recent model enhancements and the
extent to which these enhancements address the deficiencies that have been identified of some of the models. The England &
Wales male mortality experience is used to conduct detailed comparisons at pensioner ages, having first established a common
basis for comparison across all models. The model comparison is then extended to include the England & Wales female
experience and both the male and female USA mortality experiences over a wider age range, encompassing also the working
ages.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Convolutions of multivariate phase-type distributions. Berdel, Jasmin; Hipp, Christian [RKN: 45131]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (3) : 374-377.
This paper is concerned with multivariate phase-type distributions introduced by Assaf et al. (1984). We show that the sum of two
independent bivariate vectors each with a bivariate phase-type distribution is again bivariate phase-type and that this is no longer
true for higher dimensions. Further, we show that the distribution of the sum over different components of a vector with multivariate
phase-type distribution is not necessarily multivariate phase-type either, if the dimension of the components is two or larger.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Dependence modeling in non-life insurance using the Bernstein copula. Diers, Dorothea; Eling, Martin; Marek, Sebastian D [RKN:
45646]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 50 (3) : 430-436.
This paper illustrates the modeling of dependence structures of non-life insurance risks using the Bernstein copula. We conduct a
goodness-of-fit analysis and compare the Bernstein copula with other widely used copulas. Then, we illustrate the use of the
Bernstein copula in a value-at-risk and tail-value-at-risk simulation study. For both analyses we utilize German claims data on
storm, flood, and water damage insurance for calibration. Our results highlight the advantages of the Bernstein copula, including
its flexibility in mapping inhomogeneous dependence structures and its easy use in a simulation context due to its representation
as mixture of independent Beta densities. Practitioners and regulators working toward appropriate modeling of dependences in a
risk management and solvency context can benefit from our results.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
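The Bernstein copula mentioned above is a polynomial smoothing of a base copula evaluated on a grid, which is what gives the mixture-of-Beta-densities representation noted in the abstract. A minimal Python sketch of the standard degree-m construction (illustrative only; the calibration in the paper is more involved):

    from math import comb

    def bernstein_copula(u, v, base_copula, m):
        # Degree-m Bernstein copula: a polynomial mixture whose weights are the
        # base copula evaluated on the (m + 1) x (m + 1) grid {0, 1/m, ..., 1}^2.
        total = 0.0
        for i in range(m + 1):
            b_i = comb(m, i) * u ** i * (1.0 - u) ** (m - i)
            for j in range(m + 1):
                b_j = comb(m, j) * v ** j * (1.0 - v) ** (m - j)
                total += base_copula(i / m, j / m) * b_i * b_j
        return total

    # Example base copula: Clayton with parameter 2 (guarded against u or v equal to 0).
    clayton = lambda u, v: (max(u, 1e-12) ** -2.0 + max(v, 1e-12) ** -2.0 - 1.0) ** -0.5
    approx = bernstein_copula(0.3, 0.7, clayton, m=10)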

Diagonal effects in claims reserving. Jessen, Anders Hedegaard; Rietdorf, Niels [RKN: 45147]
Shelved at: Per: SAJ Shelved at: SCA/ACT
Scandinavian Actuarial Journal (2011) 1 : 21-37.
In this paper we present two different approaches to how one can include diagonal effects in non-life claims reserving based on
run-off triangles. Empirical analyses suggest that the approaches in Zehnwirth (2003) and Kuang et al. (2008a, 2008b) do not work
well with low-dimensional run-off triangles because estimation uncertainty is too large. To overcome this problem we consider
similar models with a smaller number of parameters. These are closely related to the framework considered in Verbeek (1972) and
Taylor (1977, 2000); the separation method. We explain that these models can be interpreted as extensions of the multiplicative
Poisson models introduced by Hachemeister & Stanard (1975) and Mack (1991).

Efficient algorithms for basket default swap pricing with multivariate Archimedean copulas. Choe, Geon Ho; Jang, Hyun Jin [RKN:
40014]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (2) : 205-213.
We introduce a new importance sampling method for pricing basket default swaps employing exchangeable Archimedean copulas
and nested Gumbel copulas. We establish more realistic dependence structures than existing copula models for credit risks in the
underlying portfolio, and propose an appropriate density for importance sampling by analyzing multivariate Archimedean copulas.
To justify efficiency and accuracy of the proposed algorithms, we present numerical examples and compare them with the crude
Monte Carlo simulation, and finally show that our proposed estimators produce considerably smaller variances.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Entropy, longevity and the cost of annuities. Haberman, Steven; Khalaf-Allah, Marwa; Verrall, Richard [RKN: 40013]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (2) : 197-204.
This paper presents an extension of the application of the concept of entropy to annuity costs. Keyfitz (1985) introduced the
concept of entropy, and analysed this in the context of continuous changes in life expectancy. He showed that a higher level of
entropy indicates that the life expectancy has a greater propensity to respond to a change in the force of mortality than a lower
level of entropy. In other words, a high level of entropy means that further reductions in mortality rates would have an impact on
measures like life expectancy. In this paper, we apply this to the cost of annuities and show how it allows the sensitivity of the cost
of a life annuity contract to changes in longevity to be summarized in a single figure index.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
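For readers unfamiliar with Keyfitz's entropy H, a small Python sketch (ours, not from the paper) of its standard definition, H = -integral of l(x) ln l(x) divided by the integral of l(x) for a survival function l(x); the Gompertz parameters below are purely illustrative.

    import math

    def keyfitz_entropy(survival, max_age=120.0, step=0.05):
        # H = -integral( l(x) * ln l(x) dx ) / integral( l(x) dx ),
        # approximated here by a simple Riemann sum on [0, max_age].
        num, den, x = 0.0, 0.0, 0.0
        while x <= max_age:
            l = survival(x)
            if l > 0.0:
                num -= l * math.log(l) * step
            den += l * step
            x += step
        return num / den

    # Illustrative Gompertz survival curve l(x) = exp(-(B / c) * (exp(c * x) - 1)).
    B, c = 3e-05, 0.1
    gompertz = lambda x: math.exp(-(B / c) * (math.exp(c * x) - 1.0))
    H = keyfitz_entropy(gompertz)  # a small H means life expectancy reacts weakly to mortality changes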

Erlangian approximation to finite time ruin probabilities in perturbed risk models. Stanford, David A; Yu, Kaiqi; Ren, Jiandong
[RKN: 45148]
Shelved at: Per: SAJ Shelved at: SCA/ACT
Scandinavian Actuarial Journal (2011) 1 : 38-58.
In this paper, we consider a class of perturbed risk processes that have an underlying Markov structure, including
Markov-modulated risk processes, and Sparre-Andersen risk processes when both inter-claim times and claim sizes are
phase-type. We apply the Erlangization method to the risk process in the class in order to obtain an accurate approximation of the
finite time ruin probability. In addition, we develop an efficient recursive procedure by recognizing a repeating structure in the
probability matrices we work with. We believe the present work is among the first to either compute or approximate finite time ruin
probabilities in the perturbed risk model.

Explicit ruin formulas for models with dependence among risks. Albrecher, Hansjörg; Constantinescu, Corina; Loisel, Stéphane
[RKN: 40021]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (2) : 265-270.
We show that a simple mixing idea allows one to establish a number of explicit formulas for ruin probabilities and related quantities
in collective risk models with dependence among claim sizes and among claim inter-occurrence times. Examples include
compound Poisson risk models with completely monotone marginal claim size distributions that are dependent according to
Archimedean survival copulas as well as renewal risk models with dependent inter-occurrence times.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Extreme Events: Robust Portfolio Construction in the Presence of Fat Tails. Kemp, Malcolm H D (2011). - Chichester: John Wiley
& Sons Ltd, 2011. - 312 pages. [RKN: 13141]
Shelved at: EF/JNH (Lon) Shelved at: 368.01 KEM
Markets are fat-tailed; extreme outcomes occur more often than many might hope, or indeed the statistics or normal distributions
might indicate. In this book, the author provides readers with the latest tools and techniques on how best to adapt portfolio
construction techniques to cope with extreme events. Beginning with an overview of portfolio construction and market drivers, the
book will analyze fat tails, what they are, their behavior, how they can differ and what their underlying causes are. The book will
then move on to look at portfolio construction techniques which take into account fat tailed behavior, and how to stress test your
portfolio against extreme events. Finally, the book will analyze really extreme events in the context of portfolio choice and
problems. The book will offer readers: ways of understanding and analyzing sources of extreme events; tools for analyzing the
key drivers of risk and return, their potential magnitude and how they might interact; methodologies for achieving efficient portfolio
construction and risk budgeting; approaches for catering for the time-varying nature of the world in which we live; back-stop
approaches for coping with really extreme events; and illustrations and real-life examples of extreme events across asset classes. This
will be an indispensable guide for portfolio and risk managers who will need to better protect their portfolios against extreme events
which, within the financial markets, occur more frequently than we might expect.

Folded and log-folded-t distributions as models for insurance loss data. Brazauskas, Vytaras; Kleefeld, Andreas [RKN: 45149]
Shelved at: Per: SAJ Shelved at: SCA/ACT
Scandinavian Actuarial Journal (2011) 1 : 59-74.
A rich variety of probability distributions has been proposed in the actuarial literature for fitting of insurance loss data. Examples
include: lognormal, log-t, various versions of Pareto, loglogistic, Weibull, gamma and its variants, and generalized beta of the
second kind distributions, among others. In this paper, we supplement the literature by adding the log-folded-normal and
log-folded-t families. Shapes of the density function and key distributional properties of the 'folded' distributions are presented
along with three methods for the estimation of parameters: method of maximum likelihood; method of moments; and method of
trimmed moments. Further, large and small-sample properties of these estimators are studied in detail. Finally, we fit the newly
proposed distributions to data which represent the total damage done by 827 fires in Norway for the year 1988. The fitted models
are then employed in a few quantitative risk management examples, where point and interval estimates for several value-at-risk
measures are calculated.

Future building water loss projections posed by climate change. Haug, Ola; Dimakos, Xeni K; Vårdal, Jofrid F; Aldrin, Magne;
Meze-Hausken, Elisabeth [RKN: 45146]
Shelved at: Per: SAJ Shelved at: SCA/ACT
Scandinavian Actuarial Journal (2011) 1 : 1-20.
The insurance industry, like other parts of the financial sector, is vulnerable to climate change. Life as well as non-life products are
affected and knowledge of future loss levels is valuable. Risk and premium calculations may be updated accordingly, and
dedicated loss-preventive measures can be communicated to customers and regulators. We have established statistical claims
models for the coherence between externally inflicted water damage to private buildings in Norway and selected meteorological
variables. Based on these models and downscaled climate predictions from the Hadley centre HadAM3H climate model, the
estimated loss level of a future scenario period (2071-2100) is compared to that of a control period (1961-1990). In spite of
substantial estimation uncertainty, our analyses identify an incontestable increase in the claims level along with some regional
variability. Of the uncertainties inherently involved in such predictions, only the error due to model fit is quantifiable.

A generalized penalty function in Sparre Andersen risk models with surplus-dependent premium. Cheung, Eric C K [RKN: 45133]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (3) : 384-397.
In a general Sparre Andersen risk model with surplus-dependent premium income, the generalization of the Gerber-Shiu function
proposed by Cheung et al. (2010a) is studied. A general expression for this Gerber-Shiu function is derived, and it is shown that
its determination reduces to the evaluation of a transition function which is independent of the penalty function. Properties of and
explicit expressions for such a transition function are derived when the surplus process is subject to (i) constant premium; (ii) a
threshold dividend strategy; or (iii) credit interest. Extension of the approach is discussed for an absolute ruin model with debit
interest.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Household consumption, investment and life insurance. Bruhn, Kenneth; Steffensen, Mogens [RKN: 45125]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (3) : 315-325.
This paper develops a continuous-time Markov model for utility optimization of households. The household optimizes expected
future utility from consumption by controlling consumption, investments and purchase of life insurance for each person in the
household. The optimal controls are investigated in the special case of a two-person household, and we present graphics
illustrating how differences between the two persons affect the controls.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Impact of insurance for operational risk: Is it worthwhile to insure or be insured for severe losses?. Peters, Gareth W; Byrnes,
Aaron D; Shevchenko, Pavel V [RKN: 40024]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (2) : 287-303.
Under the Basel II standards, the Operational Risk (OpRisk) advanced measurement approach allows a provision for reduction of
capital as a result of insurance mitigation of up to 20%. This paper studies different insurance policies in the context of capital
reduction for a range of extreme loss models and insurance policy scenarios in a multi-period, multiple risk setting. A Loss
Distributional Approach (LDA) for modeling of the annual loss process, involving homogeneous compound Poisson processes for
the annual losses, with heavy-tailed severity models comprised of α-stable severities is considered. There has been little analysis
of such models to date and it is believed insurance models will play more of a role in OpRisk mitigation and capital reduction in
future. The first question of interest is when would it be equitable for a bank or financial institution to purchase insurance for
heavy-tailed OpRisk losses under different insurance policy scenarios? The second question pertains to Solvency II and
addresses quantification of insurer capital for such operational risk scenarios. Considering fundamental insurance policies
available, in several two risk scenarios, we can provide both analytic results and extensive simulation studies of insurance
mitigation for important basic policies, the intention being to address questions related to VaR reduction under Basel II, SCR under
Solvency II and fair insurance premiums in OpRisk for different extreme loss scenarios. In the process we provide closed-form
solutions for the distribution of loss processes and claims processes in an LDA structure as well as closed-form analytic solutions
for the Expected Shortfall, SCR and MCR under Basel II and Solvency II. We also provide closed-form analytic solutions for the
annual loss distribution of multiple risks including insurance mitigation.
Available via Athens: Palgrave MacMillan: http://www.openathens.net

The influence of individual claims on the chain-ladder estimates: Analysis and diagnostic tool. Verdonck, T; Debruyne, M [RKN:
39932]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (1) : 85-98.
The chain-ladder method is a widely used technique to forecast the reserves that have to be kept regarding claims that are known
to exist, but for which the actual size is unknown at the time the reserves have to be set. In practice it can be easily seen that even
one outlier can lead to a huge over- or underestimation of the overall reserve when using the chain-ladder method. This indicates
that individual claims can be very influential when determining the chain-ladder estimates. In this paper the effect of contamination
is mathematically analyzed by calculating influence functions in the generalized linear model framework corresponding to the
chain-ladder method. It is proven that the influence functions are unbounded, confirming the sensitivity of the chain-ladder method
to outliers. A robust alternative is introduced to estimate the generalized linear model parameters in a more outlier resistant way.
Finally, based on the influence functions and the robust estimators, a diagnostic tool is presented highlighting the influence of
every individual claim on the classical chain-ladder estimates. With this tool it is possible to detect immediately which claims have
an abnormally positive or negative influence on the reserve estimates. Further examination of these influential points is then
advisable. A study of artificial and real run-off triangles shows the good performance of the robust chain-ladder method and the
diagnostic tool.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
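As context for the influence analysis above, the classical chain-ladder estimates can be sketched in a few lines of Python (illustrative only; the triangle shown is made up).

    def chain_ladder_factors(triangle):
        # Classical chain-ladder development factors from a cumulative run-off triangle,
        # given as a list of rows (accident years) of cumulative claims by development
        # period; later accident years have fewer observed periods.
        n_dev = len(triangle[0])
        factors = []
        for j in range(n_dev - 1):
            num = sum(row[j + 1] for row in triangle if len(row) > j + 1)
            den = sum(row[j] for row in triangle if len(row) > j + 1)
            factors.append(num / den)
        return factors

    def complete_triangle(triangle):
        # Project every row to full development using the estimated factors.
        factors = chain_ladder_factors(triangle)
        completed = []
        for row in triangle:
            row = list(row)
            for j in range(len(row) - 1, len(factors)):
                row.append(row[j] * factors[j])
            completed.append(row)
        return completed

    # Made-up cumulative triangle with three accident years.
    tri = [[100.0, 160.0, 180.0], [110.0, 170.0], [120.0]]
    print(chain_ladder_factors(tri))
    print(complete_triangle(tri))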

Log-supermodularity of weight functions, ordering weighted losses, and the loading monotonicity of weighted premiums.
Sendov, Hristo S; Wang, Ying; Zitikis, Ricardas [RKN: 40020]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (2) : 257-264.
The paper is motivated by a problem concerning the monotonicity of insurance premiums with respect to their loading parameter:
the larger the parameter, the larger the insurance premium is expected to be. This property, usually called the loading
monotonicity, is satisfied by premiums that appear in the literature. The increased interest in constructing new insurance
premiums has raised a question as to what weight functions would produce loading-monotonic premiums. In this paper, we
demonstrate a decisive role of log-supermodularity or, equivalently, of total positivity of order 2 (TP2) in answering this question.
As a consequence, we establish, at a stroke, the loading monotonicity of a number of well-known insurance premiums, and offer a
host of further weight functions, and consequently of premiums, thus illustrating the power of the herein suggested methodology
for constructing loading-monotonic insurance premiums.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Mathematical investigation of the Gerber-Shiu function in the case of dependent inter-claim time and claim size. Mihálykó, Éva
Orbán; Mihálykó, Csaba [RKN: 45132]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (3) : 378-383.
In this paper we investigate the well-known Gerber-Shiu expected discounted penalty function in the case of dependence
between the inter-claim times and the claim amounts. We set up an integral equation for it and we prove the existence and
uniqueness of its solution in the set of bounded functions. We show that if d>0, the limit property of the solution is not a regularity
condition, but the characteristic of the solution even in the case when the net profit condition is not fulfilled. It is the consequence
of the choice of the penalty function for a given density function. We present an example when the Gerber-Shiu function is not
bounded, consequently, it does not tend to zero. Using an operator technique we also prove exponential boundedness.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Model conundrum : Letter to the Editor. O'Brien, Chris Staple Inn Actuarial Society, - 1 pages. [RKN: 70911]
Shelved at: Per: Actuary (Oxf) Per: Actuary (Lon) Shelved at: JOU
The Actuary (2012) October : 6.
On the difficulties of using models to project solvency risk.
http://www.theactuary.com/

Modeling dependence dynamics through copulas with regime switching. Silva Filho, Osvaldo Candido da; Ziegelmann, Flavio
Augusto; Dueker, Michael J [RKN: 45638]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 50 (3) : 346-356.
Measuring dynamic dependence between international financial markets has recently attracted great interest in financial
econometrics because the observed correlations rose dramatically during the 2008-09 global financial crisis. Here, we propose a
novel approach for measuring dependence dynamics. We include a hidden Markov chain (MC) in the equation describing
dependence dynamics, allowing the unobserved time-varying dependence parameter to vary according to both a restricted ARMA
process and an unobserved two-state MC. Estimation is carried out via the inference for the margins in conjunction with
filtering/smoothing algorithms. We use block bootstrapping to estimate the covariance matrix of our estimators. Monte Carlo
simulations compare the performance of regime switching and no switching models, supporting the regime-switching
specification. Finally the proposed approach is applied to empirical data, through the study of the S&P500 (USA), FTSE100 (UK)
and BOVESPA (Brazil) stock market indexes.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Modeling with Weibull-Pareto models. Scollnik, David P M; Sun, Chenchen Society of Actuaries, - 13 pages. [RKN: 70138]
Shelved at: Per: NAAJ (Oxf) Per NAAJ (Lon) Shelved at: JOU
North American Actuarial Journal (2012) 16 (2) : 260-272.
In this paper we develop several composite Weibull-Pareto models and suggest their use to model loss payments and other forms
of actuarial data. These models all comprise a Weibull distribution up to a threshold point, and some form of Pareto distribution
thereafter. They are similar in spirit to some composite lognormal-Pareto models that have previously been considered in the
literature. All of these models are applied, and their performance compared, in the context of a real world fire insurance data set.
http://www.soa.org/news-and-publications/publications/journals/naaj/naaj-detail.aspx
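A generic two-piece composite density of the kind discussed above (a truncated Weibull body below a threshold and a Pareto tail above it) can be sketched as follows; this is an illustration only, and the continuity and differentiability conditions that tie the parameters together in the published composite models are omitted here.

    import math

    def composite_weibull_pareto_pdf(x, threshold, r, k, lam, alpha):
        # Generic two-piece density: a Weibull body truncated to (0, threshold] with
        # mixing weight r, and a Pareto (type I, scale = threshold) tail with weight 1 - r.
        # In the published composite models the weight and the smoothness conditions at
        # the threshold constrain these parameters; here they are left free.
        if x <= 0.0:
            return 0.0
        if x <= threshold:
            weibull_pdf = (k / lam) * (x / lam) ** (k - 1.0) * math.exp(-((x / lam) ** k))
            weibull_cdf_at_threshold = 1.0 - math.exp(-((threshold / lam) ** k))
            return r * weibull_pdf / weibull_cdf_at_threshold
        return (1.0 - r) * alpha * threshold ** alpha / x ** (alpha + 1.0)

    density_at_5 = composite_weibull_pareto_pdf(5.0, threshold=2.0, r=0.7, k=1.5, lam=1.0, alpha=2.5)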

Mortality density forecasts: An analysis of six stochastic mortality models. Cairns, Andrew J G; Blake, David; Dowd, Kevin; Epstein,
David; Khalaf-Allah, Marwa; Coughlan, Guy D [RKN: 45129]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (3) : 355-367.
This paper develops a framework for developing forecasts of future mortality rates. We discuss the suitability of six stochastic
mortality models for forecasting future mortality and estimating the density of mortality rates at different ages. In particular, the
models are assessed individually with reference to the following qualitative criteria that focus on the plausibility of their forecasts:
biological reasonableness; the plausibility of predicted levels of uncertainty in forecasts at different ages; and the robustness of the
forecasts relative to the sample period used to fit the model. An important, though unsurprising, conclusion is that a good fit to
historical data does not guarantee sensible forecasts. We also discuss the issue of model risk, common to many modelling
situations in demography and elsewhere. We find that even for those models satisfying our qualitative criteria, there are significant
differences among central forecasts of mortality rates at different ages and among the distributions surrounding those central
forecasts.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Mortality projections using generalized additive models with applications to annuity values for the Irish population. Hall, M; Friel,
N Faculty of Actuaries and Institute of Actuaries; Cambridge University Press, [RKN: 39999]
Shelved at: Per: AAS (Oxf) Per: AAS (Lon)
Annals of Actuarial Science (2011) 5(1) : 19-32.
Generalized Additive Models (GAMs) with age, period and cohort as possible covariates are used to predict future mortality
improvements for the Irish population. The GAMs considered are the 1-dimensional age + period and age + cohort models and the
2-dimensional age-period and age-cohort models. In each case thin plate regression splines are used as the smoothing functions.
The generalized additive models are compared with the P-Spline (Currie et al., 2004) and Lee-Carter (Lee & Carter, 1992) models
included in version 1.0 of the Continuous Mortality Investigation (CMI) library of mortality projections. Using the Root Mean Square
Error to assess the accuracy of future predictions, the GAMs outperform the P-Spline and Lee-Carter models over intervals of 25
and 35 years in the age range 60 to 90. The GAMs allow intuitively simple models of mortality to be specified whilst also providing
the flexibility to model complex relationships between the covariates. The majority of mortality improvements derived from the
projections of future Irish mortality yield annuity values at ages 60, 65, 70 and 80 in 2007 in the range of annuity values calculated,
assuming a 2 to 4 percent annual compound improvement in mortality rates for both males and females.
http://www.actuaries.org.uk/research-and-resources/pages/access-journals

Multivariate density estimation using dimension reducing information and tail flattening transformations. Buch-Kromann, Tine;
Guillén, Montserrat; Linton, Oliver; Nielsen, Jens Perch [RKN: 39933]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (1) : 99-110.
We propose a nonparametric multiplicative bias corrected transformation estimator designed for heavy tailed data. The
multiplicative correction is based on prior knowledge and has a dimension reducing effect at the same time as the original
dimension of the estimation problem is retained. Adding a tail flattening transformation improves the estimation
significantly, particularly in the tail, and provides significant graphical advantages by allowing the density estimation to be
visualized in a simple way. The combined method is demonstrated on a fire insurance data set and in a data-driven simulation
study.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

A new discrete distribution with actuarial applications. Gómez-Déniz, Emilio; Sarabia, José María; Calderín-Ojeda, Enrique [RKN:
45135]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (3) : 406-412.
A new discrete distribution depending on two parameters, α < 1 (with α ≠ 0) and 0 < θ < 1, is introduced in this paper. The new distribution is
unimodal with a zero vertex, and overdispersion (variance larger than the mean) and underdispersion (variance lower than the
mean) are encountered depending on the values of its parameters. Besides, an equation for the probability density function of
the compound version, when the claim severities are discrete, is derived. The particular case obtained when α tends to zero is
reduced to the geometric distribution. Thus, the geometric distribution can be considered as a limiting case of the new distribution.
After reviewing some of its properties, we investigated the problem of parameter estimation. Expected frequencies were
calculated for numerous examples, including short and long tailed count data, providing a very satisfactory fit.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

A new proof of Cheung's characterization of comonotonicity. Mao, Tiantian; Hu, Taizhong [RKN: 40015]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (2) : 214-216.
It is well known that if a random vector with given marginal distributions is comonotonic, it has the largest sum in the sense of the
convex order. Cheung (2008) proved that the converse of this assertion is also true, provided that all marginal distribution
functions are continuous and that the underlying probability space is atomless. This continuity assumption on the marginals was
removed by Cheung (2010). In this short note, we give a new and simple proof of Cheung's result without the assumption that the
underlying probability space is atomless.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
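As a reminder of the construction behind comonotonicity (background only, not part of the proof above): a comonotonic vector with given marginals arises by applying each marginal quantile function to one common uniform variable, which is what makes its sum maximal in the convex order. A minimal Python sketch:

    import math
    import random

    def comonotonic_sample(quantile_fns, n, seed=0):
        # One uniform draw per scenario, pushed through every marginal quantile
        # function: (F1^{-1}(U), ..., Fd^{-1}(U)) is a comonotonic vector.
        rng = random.Random(seed)
        samples = []
        for _ in range(n):
            u = rng.random()
            samples.append(tuple(q(u) for q in quantile_fns))
        return samples

    # Illustrative marginals: exponential(1) and uniform(0, 10).
    q_exp = lambda u: -math.log(1.0 - u)
    q_unif = lambda u: 10.0 * u
    pairs = comonotonic_sample([q_exp, q_unif], n=5)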



On 1-convexity and nucleolus of co-insurance games. Driessen, Theo S H; Fragnelli, Vito; Katsev, Ilya V; Khmelnitskaya, Anna B
[RKN: 40016]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (2) : 217-225.
The insurance situation in which an enormous risk is insured by a number of insurance companies is modeled through a
cooperative TU game, the so-called co-insurance game, first introduced in Fragnelli and Marina (2004). In this paper we present
certain conditions on the parameters of the model that guarantee the 1-convexity property of co-insurance games which in turn
ensures the nonemptiness of the core and the linearity of the nucleolus as a function of the variable premium. Further we reveal
conditions when a co-insurance game is representable in the form of a veto-removed game and present an efficient final algorithm
for computing the nucleolus of a veto-removed game.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

On absolute ruin minimization under a diffusion approximation model. Luo, Shangzhen; Taksar, Michael [RKN: 39935]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (1) : 123-133.
In this paper, we assume that the surplus process of an insurance entity is represented by a pure diffusion. The company can
invest its surplus into a Black-Scholes risky asset and a risk-free asset. We impose investment restrictions that only a limited
amount is allowed in the risky asset and that no short-selling is allowed. We further assume that when the surplus level becomes
negative, the company can borrow to continue financing. The ultimate objective is to seek an optimal investment strategy that
minimizes the probability of absolute ruin, i.e. the probability that the liminf of the surplus process is negative infinity. The
corresponding Hamilton-Jacobi-Bellman (HJB) equation is analyzed and a verification theorem is proved; applying the HJB
method we obtain explicit expressions for the S-shaped minimal absolute ruin function and its associated optimal investment
strategy. In the second part of the paper, we study the optimization problem with both investment and proportional reinsurance
control. There the minimal absolute ruin function and the feedback optimal investment-reinsurance control are found explicitly as
well.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

On modelling and pricing rainfall derivatives with seasonality. Leobacher, Gunther; Ngare, Philip [RKN: 45254]
Applied Mathematical Finance (2011) 18 (1-2) : 71-91.
We are interested in pricing rainfall options written on precipitation at specific locations. We assume the existence of a tradeable
financial instrument in the market whose price process is affected by the quantity of rainfall. We then construct a suitable
'Markovian gamma' model for the rainfall process which accounts for the seasonal change of precipitation and show how
maximum likelihood estimators can be obtained for its parameters.

Available via Athens: Taylor & Francis Online
http://www.openathens.net

On the distribution of the (un)bounded sum of random variables. Cherubini, Umberto; Mulinacci, Sabrina; Romagnoli, Silvia [RKN:
62610]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (1) : 56-63.
We propose a general treatment of random variables aggregation accounting for the dependence among variables and bounded
or unbounded support of their sum. The approach is based on the extension of the concept of convolution to dependent variables,
involving copula functions. We show that some classes of copula functions (such as Marshall-Olkin and elliptical) cannot be used
to represent the dependence structure of two variables whose sum is bounded, while Archimedean copulas can be applied only if
the generator becomes linear beyond some point. As for the application, we study the problem of capital allocation between risks
when the sum of losses is bounded.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

On the threshold dividend strategy for a generalized jump-diffusion risk model. Chi, Yichun; Lin, X. Sheldon [RKN: 45126]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (3) : 326-337.
In this paper, we generalize the Cramér-Lundberg risk model perturbed by diffusion to incorporate jumps due to surplus
fluctuation and to relax the positive loading condition. Assuming that the surplus process has exponential upward and arbitrary
downward jumps, we analyze the expected discounted penalty (EDP) function of Gerber and Shiu (1998) under the threshold
dividend strategy. An integral equation for the EDP function is derived using the Wiener-Hopf factorization. As a result, an explicit
analytical expression is obtained for the EDP function by solving the integral equation. Finally, phase-type downward jumps are
considered and a matrix representation of the EDP function is presented.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

An operator-based approach to the analysis of ruin-related quantities in jump diffusion risk models. Feng, Runhuan [RKN: 40025]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (2) : 304-313.
Recent developments in ruin theory have seen the growing popularity of jump diffusion processes in modeling an insurer's assets
and liabilities. Despite the variations of technique, the analysis of ruin-related quantities mostly relies on solutions to certain
differential equations. In this paper, we propose in the context of Lévy-type jump diffusion risk models a solution method to a
general class of ruin-related quantities. Then we present a novel operator-based approach to solving a particular type of
integro-differential equations. Explicit expressions for resolvent densities for jump diffusion processes killed on exit below zero are
obtained as by-products of this work.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Optimal control and dependence modeling of insurance portfolios with Lévy dynamics. Bäuerle, Nicole; Blatter, Anja [RKN: 45134]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (3) : 398-405.
In this paper we are interested in optimizing proportional reinsurance and investment policies in a multidimensional Lévy-driven
insurance model. The criterion is that of maximizing exponential utility. Solving the classical Hamilton-Jacobi-Bellman equation
yields that the optimal retention level keeps a constant amount of claims regardless of time and the company's wealth level. A
special feature of our construction is to allow for dependencies of the risk reserves in different business lines. Dependence is
modeled via an Archimedean Lévy copula. We derive a sufficient and necessary condition for an Archimedean Lévy generator to
create a multidimensional positive Lévy copula in arbitrary dimension. Based on these results we identify structure conditions for
the generator and the Lévy measure of an Archimedean Lévy copula under which an insurance company reinsures a larger
fraction of claims from one business line than from another.

Available via Athens: Palgrave MacMillan
http://www.openathens.net

Optimal investment and consumption decision of a family with life insurance. Kwak, Minsuk; Shin, Yong Hyun; Choi, U Jin [RKN:
40011]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (2) : 176-188.
We study an optimal portfolio and consumption choice problem of a family that combines life insurance for parents who receive
deterministic labor income until the fixed time T. We consider utility functions of parents and children separately and assume that
parents have an uncertain lifetime. If parents die before time T, children have no labor income and they choose the optimal
consumption and portfolio with remaining wealth and life insurance benefit. The object of the family is to maximize the weighted
average of utility of parents and that of children. We obtain analytic solutions for the value function and the optimal policies, and
then analyze how the changes of the weight of the parents' utility function and other factors affect the optimal policies.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Optimal non-proportional reinsurance control and stochastic differential games. Taksar, Michael; Zeng, Xudong [RKN: 38228]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (1) : 64-71.
We study stochastic differential games between two insurance companies who employ reinsurance to reduce risk exposure. We
consider competition between two companies and construct a single payoff function of the two companies' surplus processes. One
company chooses a dynamic reinsurance strategy in order to maximize the payoff function while its opponent is simultaneously
choosing a dynamic reinsurance strategy so as to minimize the same quantity. We describe the Nash equilibrium of the game and
prove a verification theorem for a general payoff function. When the payoff function is the probability that the difference between the
two surplus processes reaches an upper bound before it reaches a lower bound, the game is solved explicitly.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Optimal strategies for hedging portfolios of unit-linked life insurance contracts with minimum death guarantee. Nteukam T,
Oberlain; Planchet, Frédéric; Thérond, Pierre [RKN: 40010]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (2) : 161-175.
In this paper, we are interested in hedging strategies which allow the insurer to reduce the risk to their portfolio of unit-linked life
insurance contracts with minimum death guarantee. Hedging strategies are developed in the Black and Scholes model and in the
Merton jump-diffusion model. According to the new frameworks (IFRS, Solvency II and MCEV), risk premium is integrated into our
valuations. We will study the optimality of hedging strategies by comparing risk indicators (Expected loss, volatility, VaR and CTE)
in relation to transaction costs and costs generated by the re-hedging error. We will analyze the robustness of hedging strategies
by stress-testing the effect of a sharp rise in future mortality rates and a severe depreciation in the price of the underlying asset.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Parametric mortality improvement rate modelling and projecting. Haberman, Steven; Renshaw, Arthur [RKN: 45635]
Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 50 (3) : 309-333.
We investigate the modelling of mortality improvement rates and the feasibility of projecting mortality improvement rates (as
opposed to projecting mortality rates), using parametric predictor structures that are amenable to simple time series forecasting.
This leads to our proposing a parallel dual approach to the direct parametric modelling and projecting of mortality rates.
Comparisons of simulated life expectancy predictions (by the cohort method) using the England and Wales population mortality
experiences for males and females under a variety of controlled data trimming exercises are presented in detail and comparisons
are also made between the parallel modelling approaches.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

The philosophy of modelling. Edwards, Matthew; Hoosain, Zaid (2012). - London: Staple Inn Actuarial Society, 2012. - [6], 82 pages.
[RKN: 43539]
Shelved at: Online only
Modelling is at the heart of almost all actuarial work. But what exactly is a model? What are we really doing when we design a
model, or run one, or use its results? In this presentation the authors explore the fundamentals of modelling - not technical
fundamentals, but deeper issues which take us, in places, to the borderline of philosophy. Areas covered include:
- How has modelling developed over the last millennium?
- What is a model? Why is a model a model?
- What are the component steps in the modelling process, and what are the fundamental assumptions underlying their validity?
- Where does model risk arise, and how can we mitigate it?
- What are the lessons from all this? How can we improve our modelling, and achieve more robust validations?
- And what do Ludwig Wittgenstein, Bertrand Russell, William of Ockham and Thomas Aquinas have to say on the subject?
The presentation is aimed at young and experienced actuaries from all practice areas.
Paper presented to Staple Inn Actuarial Society, 26 June 2012
http://www.sias.org.uk/siaspapers/pastmeetings/view meeting?id=SIASMeetingJune2012

Portfolio selection and duality under mean variance preferences. Eichner, Thomas [RKN: 39937]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (1) : 146-152.
This paper uses duality to analyze an investor's behavior in an n-asset portfolio selection problem when the investor has mean
variance preferences. The indirect utility and wealth requirement functions are used to derive Roy's identity, Shephard's lemma
and the Slutsky equation. In our simple Slutsky equation the income effect is characterized by decreasing absolute risk aversion
(DARA) and the substitution effect is always positive [negative] with respect to an asset's holding if the asset's mean return [risk]
increases. The substitution effect and the income effect work in the same direction provided that mean variance preferences display
DARA.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Quantile hedging for equity-linked contracts. Klusik, Przemyslaw; Palmowski, Zbigniew [RKN: 40023]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (2) : 280-286.
We consider an equity-linked contract whose payoff depends on the lifetime of the policy holder and the stock price. We provide
the best strategy for an insurance company assuming limited capital for the hedging. The main idea of the proof consists in
reducing the construction of such strategies for a given claim to a problem of superhedging for a modified claim, which is the
solution to a static optimization problem of the Neyman-Pearson type. This modified claim is given via some sets constructed in
an iterative way. Some numerical examples are also given.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Repeat performance. Hursey, Chris Staple Inn Actuarial Society, - 2 pages. [RKN: 74930]
Shelved at: Per: Actuary (Oxf) Per: Actuary (Lon) Shelved at: JOU
The Actuary (2012) January/February : 32-33.
Chris Hursey describes a new and original theory on determining optimal calibration nodes for replicating formulae
http://www.theactuary.com/

Risk measures in ordered normed linear spaces with non-empty cone-interior. Konstantinides, Dimitrios G; Kountzakis, Christos E
[RKN: 39934]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (1) : 111-122.
In this paper, we use tools from the theory of partially ordered normed linear spaces, especially the bases of cones. This work
extends the well-known results for convex and coherent risk measures. Its linchpin consists in the replacement of the riskless bond
by some interior point in the cone of the space of risks, which stands as the alternative numeraire.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Risk models based on time series for count random variables. Cossette, Hélène; Marceau, Étienne; Toureille, Florent [RKN: 10962]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (1) : 19-28.
In this paper, we generalize the classical discrete time risk model by introducing a dependence relationship in time between the
claim frequencies. The models used are the Poisson autoregressive model and the Poisson moving average model. In particular,
the aggregate claim amount and related quantities such as the stop-loss premium, value at risk and tail value at risk are discussed
within this framework.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
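As a point of reference for the count processes involved, one standard Poisson autoregressive (INAR(1)) construction for claim counts uses binomial thinning; the notation below is generic and is only a sketch of this type of model, not necessarily the paper's exact specification:
\[
N_t = \alpha \circ N_{t-1} + \varepsilon_t, \qquad \alpha \circ N := \sum_{i=1}^{N} B_i,\ B_i \overset{iid}{\sim} \mathrm{Bernoulli}(\alpha), \qquad \varepsilon_t \sim \mathrm{Poisson}\big(\lambda(1-\alpha)\big),
\]
which yields Poisson(\(\lambda\)) marginal claim counts with autocorrelation \(\mathrm{Corr}(N_t, N_{t+k}) = \alpha^k\), so claim frequencies in neighbouring periods are positively dependent.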

Risk processes with shot noise Cox claim number process and reserve dependent premium rate. Macci, Claudio; Torrisi, Giovanni
Luca [RKN: 39936]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (1) : 134-145.
We consider a suitable scaling, called the slow Markov walk limit, for a risk process with shot noise Cox claim number process and
reserve dependent premium rate. We provide large deviation estimates for the ruin probability. Furthermore, we find an
asymptotically efficient law for the simulation of the ruin probability using importance sampling. Finally, we present asymptotic
bounds for ruin probabilities in the Bayesian setting.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Robust-efficient credibility models with heavy-tailed claims: A mixed linear models perspective. Dornheim, Harald; Brazauskas,
Vytaras [RKN: 39365]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (1) : 72-84.
In actuarial practice, regression models serve as a popular statistical tool for analyzing insurance data and tariff ratemaking. In this
paper, we consider classical credibility models that can be embedded within the framework of mixed linear models. For inference
about fixed effects and variance components, likelihood-based methods such as (restricted) maximum likelihood estimators are
commonly pursued. However, it is well-known that these standard and fully efficient estimators are extremely sensitive to small
deviations from hypothesized normality of random components as well as to the occurrence of outliers. To obtain better estimators
for premium calculation and prediction of future claims, various robust methods have been successfully adapted to credibility
theory in the actuarial literature. The objective of this work is to develop robust and efficient methods for credibility when
heavy-tailed claims are approximately log-location-scale distributed. To accomplish that, we first show how to express additive
credibility models such as Bühlmann-Straub and Hachemeister ones as mixed linear models with symmetric or asymmetric
errors. Then, we adjust adaptively truncated likelihood methods and compute highly robust credibility estimates for the ordinary
but heavy-tailed claims part. Finally, we treat the identified excess claims separately and find robust-efficient credibility premiums.
Practical performance of this approach is examined, via simulations, under several contaminating scenarios. A widely studied
real-data set from workers' compensation insurance is used to illustrate functional capabilities of the new robust credibility
estimators.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Smoothing dispersed counts with applications to mortality data. Djeundje, V A B; Currie, I D Faculty of Actuaries and Institute of
Actuaries; Cambridge University Press, [RKN: 40000]
Shelved at: Per: AAS (Oxf) Per: AAS (Lon)
Annals of Actuarial Science (2011) 5(1) : 33-52.
Mortality data are often classified by age at death and year of death. This classification results in a heterogeneous risk set and this
can cause problems for the estimation and forecasting of mortality. In the modelling of such data, we replace the classical
assumption that the numbers of claims follow the Poisson distribution with the weaker assumption that the numbers of claims have
a variance proportional to the mean. The constant of proportionality is known as the dispersion parameter and it enables us to
allow for heterogeneity; in the case of insurance data the dispersion parameter also allows for the presence of duplicates in a
portfolio. We use both the quasi-likelihood and the extended quasi-likelihood to estimate models for the smoothing and forecasting
of mortality tables jointly with smooth estimates of the dispersion parameters. We present three main applications of our method:
first, we show how taking account of dispersion reduces the volatility of a forecast of a mortality table; second, we smooth mortality
data by amounts, ie, when deaths are amounts claimed and exposed-to-risk are sums assured; third, we present a joint model for
mortality by lives and by amounts with the property that forecasts by lives and by amounts are consistent. Our methods are
illustrated with data from the Continuous Mortality Investigation.
http://www.actuaries.org.uk/research-and-resources/pages/access-journals
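For orientation, the weaker moment assumption described above can be written, in generic notation (a sketch rather than the authors' exact formulation), as
\[
\mathbb{E}(D_{x,t}) = E^c_{x,t}\,\mu_{x,t}, \qquad \mathrm{Var}(D_{x,t}) = \phi_{x,t}\,\mathbb{E}(D_{x,t}),
\]
where \(D_{x,t}\) are observed deaths (or amounts claimed), \(E^c_{x,t}\) the exposures, \(\mu_{x,t}\) the force of mortality and \(\phi_{x,t}\) the dispersion parameter; \(\phi_{x,t} = 1\) recovers the Poisson assumption, while \(\phi_{x,t} > 1\) allows for heterogeneity and duplicates.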

Stochastic comparisons for allocations of policy limits and deductibles with applications. Lu, ZhiYi; Meng, LiLi [RKN: 45127]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (3) : 338-343.
In this paper, we study the problem of comparing losses of a policyholder who has an increasing utility function when the form of
coverage is policy limit and deductible. The total retained losses of a policyholder [formula] are ordered in the usual stochastic
order sense when the Xi (i = 1, ..., n) are ordered with respect to the likelihood ratio order. The parallel results for the case of deductibles
are obtained in the same way. It is shown that the ordering of the losses is related to the characteristics (log-concavity or
log-convexity) of distributions of the risks. As an application of the comparison results, the optimal problems of allocations of policy
limits and deductibles are studied in usual stochastic order sense and the closed-form optimal solutions are obtained in some
special cases.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

The strictest common relaxation of a family of risk measures. Roorda, Berend; Schumacher, J M [RKN: 13027]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (1) : 29-34.
Operations which form new risk measures from a collection of given (often simpler) risk measures have been used extensively in
the literature. Examples include convex combination, convolution, and the worst-case operator. Here we study the risk measure
that is constructed from a family of given risk measures by the best-case operator; that is, the newly constructed risk measure is
defined as the one that is as restrictive as possible under the condition that it accepts all positions that are accepted under any of
the risk measures from the family. In fact we define this operation for conditional risk measures, to allow a multiperiod setting. We
show that the well-known VaR risk measure can be constructed from a family of conditional expectations by a combination that
involves both worst-case and best-case operations. We provide an explicit description of the acceptance set of the conditional risk
measure that is obtained as the strictest common relaxation of two given conditional risk measures.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Tails of correlation mixtures of elliptical copulas. Manner, Hans; Segers, Johan [RKN: 39938]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (1) : 153-160.
Correlation mixtures of elliptical copulas arise when the correlation parameter is driven itself by a latent random process. For such
copulas, both penultimate and asymptotic tail dependence are much larger than for ordinary elliptical copulas with the same
unconditional correlation. Furthermore, for Gaussian and Student t-copulas, tail dependence at sub-asymptotic levels is generally
larger than in the limit, which can have serious consequences for estimation and evaluation of extreme risk. Finally, although
correlation mixtures of Gaussian copulas inherit the property of asymptotic independence, at the same time they fall in the newly
defined category of near asymptotic dependence. The consequences of these findings for modeling are assessed by means of a
simulation study and a case study involving financial time series.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Three retirement decision models for defined contribution pension plan members: A simulation study. MacDonald,
Bonnie-Jeanne; Cairns, Andrew J G [RKN: 14542]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (1) : 1-18.
This paper examines the hypothetical retirement behavior of defined contribution (DC) pension plan participants. Using a Monte
Carlo simulation approach, we compare and discuss three retirement decision models: the two-thirds replacement ratio
benchmark model, the option-value of continued work model and a newly-developed one-year retirement decision model. Unlike
defined benefit (DB) pension plans where economic incentives create spikes in retirement at particular ages, all three retirement
decision models suggest that the retirement ages of DC participants are much more smoothly distributed over a wide range of
ages. We find that the one-year model possesses several advantages over the other two models when representing the
theoretical retirement choice of a DC pension plan participant. First, its underlying theory for retirement decision-making is more
feasible given the distinct features and pension drivers of a DC plan. Second, its specifications produce a more logical relationship
between an individual's decision to retire and his/her age and accumulated retirement wealth. Lastly, although the one-year model
is less complex than the option-value model as the DC participant's scope is only one year, the retirement decision is optimal over
all future projected years if projections are made using reasonable financial assumptions.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Yet more on a stochastic economic model : Part 1: updating and refitting, 1995 to 2009. Wilkie, A D; Sahin, Sule; Cairns, A J G;
Kleinow, Torsten Faculty of Actuaries and Institute of Actuaries; Cambridge University Press, [RKN: 40001]
Shelved at: Per: AAS (Oxf) Per: AAS (Lon)
Annals of Actuarial Science (2011) 5(1) : 53-99.
In this paper we review the Wilkie asset model for a variety of UK economic indices, including the Retail Prices Index, both without
and with an ARCH model, the wages index, share dividend yields, share dividends and share prices, long term bond yields, short
term bond yields and index-linked bond yields, in each case by updating the parameters to June 2009. We discuss how the model
has performed from 1994 to 2009 and estimate the values of the parameters and their confidence intervals over various
sub-periods to study their stability. Our analysis shows that the residuals of many of the series are much fatter-tailed than in a
normal distribution. We observe also that besides the stochastic uncertainty built into the model by the random innovations there
is also parameter uncertainty arising from the estimated values of the parameters.
http://www.actuaries.org.uk/research-and-resources/pages/access-journals

MODELS

A comparison of the Lee-Carter model and AR-ARCH model for forecasting mortality rates. Giacometti, Rosella; Bertocchi,
Marida; Rachev, Svetlozar T; Fabozzi, Frank J [RKN: 44993]
Insurance: Mathematics & Economics (2012) 50 (1) : 85-93.
With the decline in the mortality level of populations, national social security systems and insurance companies of most developed
countries are reconsidering their mortality tables taking into account the longevity risk. The Lee and Carter model is the first
discrete-time stochastic model to consider the increased life expectancy trends in mortality rates and is still broadly used today. In
this paper, we propose an alternative to the LeeCarter model: an AR(1)ARCH(1) model. More specifically, we compare the
performance of these two models with respect to forecasting age-specific mortality in Italy. We fit the two models, with Gaussian
and t-student innovations, for the matrix of Italian death rates from 1960 to 2003. We compare the forecast ability of the two
approaches in out-of-sample analysis for the period 20042006 and find that the AR(1)ARCH(1) model with t-student innovations
provides the best fit among the models studied in this paper.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
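As a reminder of the two time-series structures being compared, the textbook forms are (generic notation; the paper's exact specification may differ in detail)
\[
\text{Lee-Carter:}\quad \ln m_{x,t} = a_x + b_x k_t, \qquad k_t = k_{t-1} + d + e_t \ \text{(random walk with drift)},
\]
\[
\text{AR(1)-ARCH(1):}\quad y_t = c + \phi\,y_{t-1} + \varepsilon_t, \qquad \varepsilon_t = \sqrt{h_t}\,z_t, \qquad h_t = \omega + \alpha_1 \varepsilon_{t-1}^2,
\]
where \(m_{x,t}\) is the central death rate at age \(x\) in year \(t\) and \(z_t\) is an i.i.d. innovation, Gaussian or t-distributed as in the study.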

Composite Lognormal-Pareto model with random threshold. Pigeon, Mathieu; Denuit, Michel [RKN: 45488]
Shelved at: Per: SAJ Shelved at: SCA/ACT
Scandinavian Actuarial Journal (2011) 3 : 177-192.
This paper further considers the composite Lognormal-Pareto model proposed by Cooray & Ananda (2005) and suitably modified
by Scollnik (2007). This model is based on a Lognormal density up to an unknown threshold value and a Pareto density thereafter.
Instead of using a single threshold value applying uniformly to the whole data set, the model proposed in the present paper allows
for heterogeneity with respect to the threshold and lets it vary among observations. Specifically, the threshold value for a particular
observation is seen as the realization of a positive random variable and the mixed composite Lognormal-Pareto model is obtained
by averaging over the population of interest. The performance of the composite Lognormal-Pareto model and of its mixed
extension is compared using the well-known Danish fire losses data set.
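Schematically, the composite density underlying these models has the two-piece form (a sketch; the weight and the smoothness conditions at the threshold are as in Cooray & Ananda and Scollnik)
\[
f(x) = r\,\frac{f_{LN}(x)}{F_{LN}(\theta)} \ \text{for } 0 < x \le \theta, \qquad f(x) = (1-r)\,f_{Pa}(x) \ \text{for } x > \theta,
\]
where \(f_{LN}\) and \(F_{LN}\) are a Lognormal density and distribution function, \(f_{Pa}\) is a Pareto density with threshold \(\theta\), and \(r \in (0,1)\); in the mixed extension the threshold \(\theta\) is itself the realization of a positive random variable and the density above is averaged over its distribution.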

Dividends and reinsurance under a penalty for ruin. Liang, Zhibin; Young, Virginia R [RKN: 45647]
Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 50 (3) : 437-445.
We find the optimal dividend strategy in a diffusion risk model under a penalty for ruin, as in Thonhauser and Albrecher (2007),
although we allow for both a positive and a negative penalty. Furthermore, we determine the optimal proportional reinsurance
strategy, when so-called expensive reinsurance is available; that is, the premium loading on reinsurance is greater than the
loading on the directly written insurance. One can think of our model as taking the one in Taksar (2000, Section 6) and adding a
penalty for ruin. We use the Legendre transform to obtain the optimal dividend and reinsurance strategies. Not surprisingly, the
optimal dividend strategy is a barrier strategy. Also, we investigate the effect of the penalty P on the optimal strategies. In
particular, we show that the optimal barrier increases with respect to P, while the optimal proportion retained and the value
function decrease with respect to P. In the end, we explore the time of ruin, and find that the expected time of ruin increases with
respect to P under a net profit condition.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

A general formula for option prices in a stochastic volatility model. Chin, Stephen; Dufresne, Daniel Routledge, [RKN: 45842]
Shelved at: Per. AMF
Applied Mathematical Finance (2012) 19 (3-4) : 313-340.
We consider the pricing of European derivatives in a Black-Scholes model with stochastic volatility. We show how Parseval's
theorem may be used to express those prices as Fourier integrals. This is a significant improvement over Monte Carlo simulation.
The main ingredient in our method is the Laplace transform of the ordinary (constant volatility) price of a put or call in the
Black-Scholes model, where the transform is taken with respect to maturity (T); this does not appear to have been used before in
pricing options under stochastic volatility. We derive these formulas and then apply them to the case where volatility is modelled as
a continuous-time Markov chain, the so-called Markov regime-switching model. This model has been used previously in stochastic
volatility modelling, but mostly with only a few states. We show how such states can be used without difficulty, and how a larger number of states
can be handled. Numerical illustrations are given, including the implied volatility curve in two- and three-state models. The curves
have the smile shape observed in practice.

Available via Athens: Taylor & Francis Online
http://www.openathens.net

The herd behavior index : A new measure for the implied degree of co-movement in stock markets. Dhaene, Jan; Linders, Daniel;
Schoutens, Wim; Vyncke, David [RKN: 45639]
Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 50 (3) : 357-370.
We introduce a new and easy-to-calculate measure for the expected degree of herd behaviour or co-movement between stock
prices. This forward looking measure is model-independent and based on observed option data. It is baptized the Herd Behavior
Index (HIX). The degree of co-movement in a stock market can be determined by comparing the observed market situation with
the extreme (theoretical) situation under which the whole system is driven by a single factor. The HIX is then defined as the ratio of
an option-based estimate of the risk-neutral variance of the market index and an option-based estimate of the corresponding
variance in case of the extreme single factor market situation. The HIX can be determined for any market index provided an
appropriate series of vanilla options is traded on this index as well as on its components. As an illustration, we determine historical
values of the 30-days HIX for the Dow Jones Industrial Average, covering the period January 2003 to October 2009.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
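In symbols, the index described above is a ratio of option-implied variances (shown schematically; the precise option-based estimators are constructed in the paper):
\[
\mathrm{HIX}_T = \frac{\widehat{\sigma}^2_{I}(T)}{\widehat{\sigma}^2_{c}(T)},
\]
where \(\widehat{\sigma}^2_{I}(T)\) is the risk-neutral variance of the market index over horizon \(T\) estimated from index options and \(\widehat{\sigma}^2_{c}(T)\) is the corresponding variance in the extreme single-factor (comonotonic) market estimated from options on the components; values close to 1 indicate a high implied degree of co-movement.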

The joint distribution of the time to ruin and the number of claims until ruin in the classical risk model. Dickson, David C M [RKN:
45636]
Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 50 (3) : 334-337.
We use probabilistic arguments to derive an expression for the joint density of the time to ruin and the number of claims until ruin
in the classical risk model. From this we obtain a general expression for the probability function of the number of claims until ruin.
We also consider the moments of the number of claims until ruin and illustrate our results in the case of exponentially distributed
individual claims. Finally, we briefly discuss joint distributions involving the surplus prior to ruin and deficit at ruin.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Lévy risk model with two-sided jumps and a barrier dividend strategy. Bo, Lijun; Song, Renming; Tang, Dan; Wang, Tongjin; Yang,
Xuewei [RKN: 45601]
Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 50 (2) : 280-291.
In this paper, we consider a general Lévy risk model with two-sided jumps and a constant dividend barrier. We connect the ruin
problem of the ex-dividend risk process with the first passage problem of the Lévy process reflected at its running maximum. We
prove that if the positive jumps of the risk model form a compound Poisson process and the remaining part is a spectrally negative
Lévy process with unbounded variation, the Laplace transform (as a function of the initial surplus) of the upward entrance time of
the reflected (at the running infimum) Lévy process exhibits the smooth pasting property at the reflecting barrier. When the surplus
process is described by a double exponential jump diffusion in the absence of dividend payment, we derive some explicit
expressions for the Laplace transform of the ruin time, the distribution of the deficit at ruin, and the total expected discounted
dividends. Numerical experiments concerning the optimal barrier strategy are performed and new empirical findings are
presented.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Managing longevity and disability risks in life annuities with long term care. Levantesi, Susanna; Menzietti, Massimiliano [RKN:
45642]
Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 50 (3) : 391-401.
The aim of the paper is twofold. Firstly, it develops a model for risk assessment in a portfolio of life annuities with long term care
benefits. These products are usually represented by a Markovian Multi-State model and are affected by both longevity and
disability risks. Here, a stochastic projection model is proposed in order to represent the future evolution of mortality and disability
transition intensities. Data from the Italian National Institute of Social Security (INPS) and from Human Mortality Database (HMD)
are used to estimate the model parameters. Secondly, it investigates the solvency in a portfolio of enhanced pensions. To this aim
a risk model based on the portfolio risk reserve is proposed and different rules to calculate solvency capital requirements for life
underwriting risk are examined. Such rules are then compared with the standard formula proposed by the Solvency II project.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Minimising expected discounted capital injections by reinsurance in a classical risk model. Eisenberg, Julia; Schmidli, Hanspeter
[RKN: 45487]
Shelved at: Per: SAJ Shelved at: SCA/ACT
Scandinavian Actuarial Journal (2011) 3 : 155-176.
In this paper we consider a classical continuous time risk model, where the claims are reinsured by some reinsurance with
retention level b, where the maximal value of b means no reinsurance and b = 0 means full reinsurance. The insurer
can change the retention level continuously. To prevent negative surplus the insurer has to inject additional capital. The problem is
to minimise the expected discounted cost over all admissible reinsurance strategies. We show that an optimal reinsurance
strategy exists. For some special cases we will be able to give the optimal strategy explicitly. In other cases the method will be
illustrated only numerically.

Multidimensional Lee-Carter model with switching mortality processes. Hainaut, Donatien [RKN: 45597]
Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 50 (2) : 236-246.
This paper proposes a multidimensional Lee-Carter model, in which the time dependent components are ruled by switching
regime processes. The main feature of this model is its ability to replicate the changes of regimes observed in the mortality
evolution. Changes of measure, preserving the dynamics of the mortality process under a pricing measure, are also studied. After
a review of the calibration method, a 2D, 2-regimes model is fitted to the male and female French population, for the period
1946-2007. Our analysis reveals that one regime corresponds to longevity conditions observed during the decade following the
second world war, while the second regime is related to longevity improvements observed during the last 30 years. To conclude,
we analyze, in a numerical application, the influence of changes of measure affecting transition probabilities, on prices of life and
death insurances.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Multivariate insurance models: an overview. Anastasiadis, Simon; Chukova, Stefanka [RKN: 45739]
Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 222-227.
This literature review summarizes the results from a collection of research papers that relate to modeling insurance claims and the
processes associated with them. We consider work by more than 55 authors, published or presented between 1971 and 2008.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

On cross-currency models with stochastic volatility and correlated interest rates. Grzelak, Lech A; Oosterlee, Cornelis W
Routledge, [RKN: 45793]
Shelved at: Per. AMF
Applied Mathematical Finance (2012) 19 (1-2) : 1-35.
We construct multi-currency models with stochastic volatility (SV) and correlated stochastic interest rates with a full matrix of
correlations. We first deal with a foreign exchange (FX) model of Heston-type, in which the domestic and foreign interest rates are
generated by the short-rate process of Hull-White (Hull, J. and White, A. [1990] Pricing interest-rate derivative securities, Review
of Financial Studies, 3, pp. 573-592). We then extend the framework by modelling the interest rate by an SV displaced-diffusion
(DD) Libor Market Model (Andersen, L. B. G. and Andreasen, J. [2000] Volatility skews and extensions of the libor market model,
Applied Mathematical Finance, 1[7], pp. 1-32), which can model an interest rate smile. We provide semi-closed form
approximations which lead to efficient calibration of the multi-currency models. Finally, we add a correlated stock to the framework
and discuss the construction, model calibration and pricing of equity-FX-interest rate hybrid pay-offs.

Available via Athens: Taylor & Francis Online
http://www.openathens.net

On the spurious correlation between sample betas and mean returns. Levy, Moshe Routledge, [RKN: 45843]
Shelved at: Per. AMF
Applied Mathematical Finance (2012) 19 (3-4) : 341-360.
Cornerstone asset pricing models, such as capital asset pricing model (CAPM) and arbitrage pricing theory (APT), yield
theoretical predictions about the relationship between expected returns and exposure to systematic risk, as measured by beta(s).
Numerous studies have investigated the empirical validity of these models. We show that even if no relationship holds between
true expected returns and betas in the population, the existence of low-probability extreme outcomes induces a spurious
correlation between the sample means and the sample betas. Moreover, the magnitude of this purely spurious correlation is
similar to the empirically documented correlation, and the regression slopes and intercepts are very similar as well. This result
does not necessarily constitute evidence against the theoretical asset pricing models, but it does shed new light on previous
empirical results, and it points to an issue that should be carefully considered in the empirical testing of these models. The analysis
points to the dangers of relying on simple least squares regression for drawing conclusions about the validity of equilibrium pricing
models.

Available via Athens: Taylor & Francis Online
http://www.openathens.net

Pricing fixed-income securities in an information-based framework. Hughston, Lane P; Macrina, Andrea Routledge, [RKN: 45844]
Shelved at: Per. AMF
Applied Mathematical Finance (2012) 19 (3-4) : 361-379.
The purpose of this article is to introduce a class of information-based models for the pricing of fixed-income securities. We
consider a set of continuous-time processes that describe the flow of information concerning market factors in a monetary
economy. The nominal pricing kernel is assumed to be given at any specified time by a function of the values of information
processes at that time. Using a change-of-measure technique, we derive explicit expressions for the prices of nominal discount
bonds and deduce the associated dynamics of the short rate of interest and the market price of risk. The interest rate positivity
condition is expressed as a differential inequality. An example that shows how the model can be calibrated to an arbitrary initial
yield curve is presented. We proceed to model the price level, which is also taken at any specified time to be given by a function of
the values of the information processes at that time. A simple model for a stochastic monetary economy is introduced in which the
prices of the nominal discount bonds and inflation-linked notes can be expressed in terms of aggregate consumption and the
liquidity benefit generated by the money supply.
Available via Athens: Taylor & Francis Online
http://www.openathens.net

Recursive methods for a multi-dimensional risk process with common shocks. Gong, Lan; Badescu, Andrei L; Cheung, Eric C K
[RKN: 44996]
Insurance: Mathematics & Economics (2012) 50 (1) : 109-120.
In this paper, a multi-dimensional risk model with common shocks is studied. Using a simple probabilistic approach via observing
the risk processes at claim instants, recursive integral formulas are developed for the survival probabilities as well as for a class of
Gerber-Shiu expected discounted penalty functions that include the surplus levels at ruin. Under the assumption of exponential or
mixed Erlang claims, the recursive integrals can be simplified to give recursive sums which are computationally more tractable.
Numerical examples including an optimal capital allocation problem are also given towards the end.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

The Solvency II square-root formula for systematic biometric risk. Christiansen, Marcus C; Denuit, Michel M; Lazar, Dorina [RKN:
45599]
Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 50 (2) : 257-265.
In this paper, we develop a model supporting the so-called square-root formula used in Solvency II to aggregate the modular life
SCR. Describing the insurance policy by a Markov jump process, we can obtain expressions similar to the square-root formula in
Solvency II by means of limited expansions around the best estimate. Numerical illustrations are given, based on German
population data. Even if the square-root formula can be supported by theoretical considerations, it is shown that the QIS
correlation matrix is highly questionable.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
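The 'square-root formula' referred to above is the standard-formula aggregation rule for the life underwriting module, which in generic notation reads
\[
\mathrm{SCR}_{\mathrm{life}} = \sqrt{\sum_{i,j} \mathrm{Corr}_{i,j}\,\mathrm{SCR}_i\,\mathrm{SCR}_j},
\]
where the \(\mathrm{SCR}_i\) are the capital charges for the individual life sub-risks (mortality, longevity, disability and so on) and \(\mathrm{Corr}\) is the prescribed correlation matrix whose calibration the paper calls into question.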

MONTE CARLO

A Bayesian approach to parameter estimation for kernel density estimation via transformations. Liu, Qing; Pitt, David; Zhang,
Xibin; Wu, Xueyuan Institute and Faculty of Actuaries; Cambridge University Press, - 13 pages. [RKN: 74950]
Shelved at: Per: AAS (Oxf) Per: AAS (Lon)
Annals of Actuarial Science (2011) 5(2) : 181-193.
In this paper, we present a Markov chain Monte Carlo (MCMC) simulation algorithm for estimating parameters in the kernel
density estimation of bivariate insurance claim data via transformations. Our data set consists of two types of auto insurance claim
costs and exhibits a high level of skewness in the marginal empirical distributions. Therefore, the kernel density estimator based
on original data does not perform well. However, the density of the original data can be estimated through estimating the density of
the transformed data using kernels. It is well known that the performance of a kernel density estimator is mainly determined by the
bandwidth, and only in a minor way by the kernel. In the current literature, there have been some developments in the area of
estimating densities based on transformed data, where bandwidth selection usually depends on pre-determined transformation
parameters. Moreover, in the bivariate situation, the transformation parameters were estimated for each dimension individually.
We use a Bayesian sampling algorithm and present a Metropolis-Hastings sampling procedure to sample the bandwidth and
transformation parameters from their posterior density. Our contribution is to estimate the bandwidths and transformation
parameters simultaneously within a Metropolis-Hastings sampling procedure. Moreover, we demonstrate that the correlation
between the two dimensions is better captured through the bivariate density estimator based on transformed data.
http://www.actuaries.org.uk/research-and-resources/pages/access-journals
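As a purely illustrative sketch of the sampling idea (not the authors' algorithm, which treats bandwidths and transformation parameters jointly for bivariate data), the following Python fragment runs a random-walk Metropolis sampler for a single kernel bandwidth on log-transformed data, using a leave-one-out Gaussian-kernel likelihood and a vague prior; the simulated data, prior and step size are assumptions made for the example.

    # Hypothetical illustration: random-walk Metropolis sampling of a kernel
    # bandwidth h for univariate data, with a leave-one-out Gaussian-kernel
    # likelihood and a vague prior on log(h).
    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.gamma(shape=2.0, scale=1.0, size=200)    # stand-in for skewed claim costs
    y = np.log(x)                                    # a simple log transformation

    def log_posterior(log_h, data):
        h = np.exp(log_h)
        n = len(data)
        d = data[:, None] - data[None, :]            # pairwise differences
        k = np.exp(-0.5 * (d / h) ** 2) / (h * np.sqrt(2.0 * np.pi))
        np.fill_diagonal(k, 0.0)                     # leave-one-out: drop own kernel
        dens = k.sum(axis=1) / (n - 1)
        log_lik = np.sum(np.log(dens + 1e-300))
        log_prior = -0.5 * log_h ** 2 / 4.0          # vague Gaussian prior on log(h)
        return log_lik + log_prior

    log_h = np.log(0.3)                              # starting value
    current = log_posterior(log_h, y)
    draws = []
    for _ in range(5000):
        proposal = log_h + 0.1 * rng.standard_normal()
        candidate = log_posterior(proposal, y)
        if np.log(rng.uniform()) < candidate - current:   # Metropolis accept/reject
            log_h, current = proposal, candidate
        draws.append(np.exp(log_h))

    print("posterior mean bandwidth:", np.mean(draws[1000:]))

In the bivariate transformed-data setting of the paper, the state of the chain would also include the transformation parameters, sampled within the same Metropolis-Hastings step.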

MONTE CARLO TECHNIQUES

An affine two-factor heteroskedastic macro-finance term structure model. Spreij, Peter; Veerman, Enno; Vlaar, Peter [RKN: 45462]
Applied Mathematical Finance (2011) 18 (3-4) : 331-352.
We propose an affine macro-finance term structure model for interest rates that allows for both constant volatilities
(homoskedastic model) and state-dependent volatilities (heteroskedastic model). In a homoskedastic model, interest rates are
symmetric, which means that either very low interest rates are predicted too often or very high interest rates not often enough. This
undesirable symmetry for constant volatility models motivates the use of heteroskedastic models where the volatility depends on
the driving factors.

For a truly heteroskedastic model in continuous time, which involves a multivariate square root process, the so-called Feller
conditions are usually imposed to ensure that the roots have non-negative arguments. For a discrete time approximate model, the
Feller conditions do not give this guarantee. Moreover, in a macro-finance context, the restrictions imposed might be economically
unappealing. It has also been observed that even without the Feller conditions imposed, for a practically relevant term structure
model, negative arguments rarely occur.

Using models estimated on German data, we compare the yields implied by (approximate) analytic exponentially affine
expressions to those obtained through Monte Carlo simulations of very high numbers of sample paths. It turns out that the
differences are rarely statistically significant, whether the Feller conditions are imposed or not. Moreover, economically, the
differences are negligible, as they are always below one basis point.

Available via Athens: Taylor & Francis Online
http://www.openathens.net
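For reference, the Feller conditions mentioned above are the usual positivity restrictions on square-root (CIR-type) factors: for a single factor with dynamics
\[
\mathrm{d}X_t = \kappa(\theta - X_t)\,\mathrm{d}t + \sigma\sqrt{X_t}\,\mathrm{d}W_t,
\]
the condition \(2\kappa\theta \ge \sigma^2\) guarantees that \(X_t\) never reaches zero, so the square root always has a non-negative argument; as the abstract notes, this guarantee is lost in the discrete-time approximation and the restriction may be economically unappealing in a macro-finance setting.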

Assessing the performance of different volatility estimators : A Monte Carlo analysis. Cartea, Alvaro; Karyampas, Dimitrios [RKN:
45882]
Shelved at: Per. AMF
Applied Mathematical Finance (2012) 19 (5-6) : 535-552.
We test the performance of different volatility estimators that have recently been proposed in the literature and have been
designed to deal with problems arising when ultra high-frequency data are employed: microstructure noise and price
discontinuities. Our goal is to provide an extensive simulation analysis for different levels of noise and frequency of jumps to
compare the performance of the proposed volatility estimators. We conclude that the maximum likelihood estimator filter (MLE-F),
a two-step parametric volatility estimator proposed by Cartea and Karyampas (2011a, The relationship between the volatility of
returns and the number of jumps in financial markets, SSRN eLibrary, Working Paper Series, SSRN), outperforms most of the
well-known high-frequency volatility estimators when different
assumptions about the path properties of stock dynamics are used.
Available via Athens: Taylor & Francis Online
http://www.openathens.net

Efficient algorithms for basket default swap pricing with multivariate Archimedean copulas. Choe, Geon Ho; Jang, Hyun Jin [RKN:
40014]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (2) : 205-213.
We introduce a new importance sampling method for pricing basket default swaps employing exchangeable Archimedean copulas
and nested Gumbel copulas. We establish more realistic dependence structures than existing copula models for credit risks in the
underlying portfolio, and propose an appropriate density for importance sampling by analyzing multivariate Archimedean copulas.
To justify efficiency and accuracy of the proposed algorithms, we present numerical examples and compare them with the crude
Monte Carlo simulation, and finally show that our proposed estimators produce considerably smaller variances.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Modelling claims run-off with reversible jump Markov chain Monte Carlo methods. Verrall, Richard; Hössjer, Ola; Björkwall,
Susanna - 24 pages. [RKN: 70742]
Shelved at: Per: Astin Bull (Oxf) Shelved at: JOU
ASTIN Bulletin (2012) 42 (1) : 35-58.
In this paper we describe a new approach to modelling the development of claims run-off triangles. This method replaces the usual
ad hoc practical process of extrapolating a development pattern to obtain tail factors with an objective procedure. An example is
given, illustrating the results in a practical context, and the WinBUGS code is supplied.
http://www.actuaries.org/index.cfm?lang=EN&DSP=PUBLICATIONS&ACT=ASTIN BULLETIN

On modelling and pricing rainfall derivatives with seasonality. Leobacher, Gunther; Ngare, Philip [RKN: 45254]
Applied Mathematical Finance (2011) 18 (1-2) : 71-91.
We are interested in pricing rainfall options written on precipitation at specific locations. We assume the existence of a tradeable
financial instrument in the market whose price process is affected by the quantity of rainfall. We then construct a suitable
'Markovian gamma' model for the rainfall process which accounts for the seasonal change of precipitation and show how
maximum likelihood estimators can be obtained for its parameters.

Available via Athens: Taylor & Francis Online
http://www.openathens.net

MORTALITY

Automated graduation using Bayesian trans-dimensional models. Verrall, Richard J; Haberman, S Institute and Faculty of
Actuaries; Cambridge University Press, - 21 pages. [RKN: 74953]
Shelved at: Per: AAS (Oxf) Per: AAS (Lon)
Annals of Actuarial Science (2011) 5(2) : 231-251.
This paper presents a new method of graduation which uses parametric formulae together with Bayesian reversible jump Markov
chain Monte Carlo methods. The aim is to provide a method which can be applied to a wide range of data, and which does not
require a lot of adjustment or modification. The method also does not require one particular parametric formula to be selected:
instead, the graduated values are a weighted average of the values from a range of formulae. In this way, the new method can be
seen as an automatic graduation method which we believe can be applied in many cases without any adjustments and provide
satisfactory graduated values. An advantage of a Bayesian approach is that it allows for model uncertainty unlike standard
methods of graduation.
http://www.actuaries.org.uk/research-and-resources/pages/access-journals

Bayesian stochastic mortality modelling for two populations. Cairns, Andrew J G; Blake, David; Dowd, Kevin; Coughlan, Guy D;
Khalaf-Allah, Marwa [RKN: 45299]
Shelved at: Per: Astin Bull (Oxf) Shelved at: JOU/AST
ASTIN Bulletin (2011) 41 (1) : 29-59.
This paper introduces a new framework for modelling the joint development over time of mortality rates in a pair of related
populations with the primary aim of producing consistent mortality forecasts for the two populations. The primary aim is achieved
by combining a number of recent and novel developments in stochastic mortality modelling, but these, additionally, provide us with
a number of side benefits and insights for stochastic mortality modelling. By way of example, we propose an Age-Period-Cohort
model which incorporates a mean-reverting stochastic spread that allows for different trends in mortality improvement rates in the
short-run, but parallel improvements in the long run. Second, we fit the model using a Bayesian framework that allows us to
combine estimation of the unobservable state variables and the parameters of the stochastic processes driving them into a single
procedure. Key benefits of this include dampening down of the impact of Poisson variation in death counts, full allowance for
parameter uncertainty, and the flexibility to deal with missing data. The framework is designed for large populations coupled with
a small sub-population and is applied to the England & Wales national and Continuous Mortality Investigation assured lives males
populations. We compare and contrast results based on the two-population approach with single-population results.
online access via International Actuarial Association:
http://www.actuaries.org/index.cfm?lang=EN&DSP=PUBLICATIONS&ACT=ASTIN BULLETIN
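One simple way to picture the mean-reverting stochastic spread described above is as a first-order autoregression (a generic sketch, not the paper's full two-population specification):
\[
S_t - \mu_S = \phi\,(S_{t-1} - \mu_S) + \varepsilon_t, \qquad |\phi| < 1,
\]
where \(S_t\) is the gap between the period (or improvement-rate) effects of the two populations; mean reversion in \(S_t\) permits different short-run improvement trends while forcing parallel improvements in the long run.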

Calibrating affine stochastic mortality models using term assurance premiums. Russo, Vincenzo; Giacometti, Rosella; Ortobelli,
Sergio; Rachev, Svetlozar; Fabozzi, Frank J [RKN: 44975]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (1) : 53-60.
In this paper, we focus on the calibration of affine stochastic mortality models using term assurance premiums. We view term
assurance contracts as a swap in which policyholders exchange cash flows (premiums vs. benefits) with an insurer analogous to
a generic interest rate swap or credit default swap. Using a simple bootstrapping procedure, we derive the term structure of
mortality rates from a stream of contract quotes with different maturities. This term structure is used to calibrate the parameters of
affine stochastic mortality models where the survival probability is expressed in closed form. The Vasicek, Cox-Ingersoll-Ross,
and jump-extended Vasicek models are considered for fitting the survival probabilities term structure. An evaluation of the
performance of these models is provided with respect to premiums of three Italian insurance companies.

Available via Athens: Palgrave MacMillan
http://www.openathens.net
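The closed-form survival probabilities referred to above take the usual affine form (generic notation):
\[
S(t,T) = \mathbb{E}\!\left[\exp\!\Big(-\int_t^T \mu_s\,\mathrm{d}s\Big)\,\Big|\,\mu_t\right] = \exp\big(A(t,T) - B(t,T)\,\mu_t\big),
\]
where \(\mu\) is the mortality intensity and the coefficient functions \(A\) and \(B\) solve ordinary differential equations of Riccati type determined by the chosen dynamics (Vasicek, Cox-Ingersoll-Ross or jump-extended Vasicek); the bootstrapped term structure of mortality then serves to calibrate the parameters entering \(A\) and \(B\).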

A comparative study of parametric mortality projection models. Haberman, Steven; Renshaw, Arthur [RKN: 17150]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (1) : 35-55.
The relative merits of different parametric models for making life expectancy and annuity value predictions at both pensioner and
adult ages are investigated. This study builds on current published research and considers recent model enhancements and the
extent to which these enhancements address the deficiencies that have been identified of some of the models. The England &
Wales male mortality experience is used to conduct detailed comparisons at pensioner ages, having first established a common
basis for comparison across all models. The model comparison is then extended to include the England & Wales female
experience and both the male and female USA mortality experiences over a wider age range, encompassing also the working
ages.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Entropy, longevity and the cost of annuities. Haberman, Steven; Khalaf-Allah, Marwa; Verrall, Richard [RKN: 40013]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (2) : 197-204.
This paper presents an extension of the application of the concept of entropy to annuity costs. Keyfitz (1985) introduced the
concept of entropy, and analysed this in the context of continuous changes in life expectancy. He showed that a higher level of
entropy indicates that the life expectancy has a greater propensity to respond to a change in the force of mortality than a lower
level of entropy. In other words, a high level of entropy means that further reductions in mortality rates would have an impact on
measures like life expectancy. In this paper, we apply this to the cost of annuities and show how it allows the sensitivity of the cost
of a life annuity contract to changes in longevity to be summarized in a single figure index.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
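Keyfitz's entropy, the starting point of the paper, is usually written as
\[
H = \frac{-\int_0^{\omega} \ell(x)\,\ln \ell(x)\,\mathrm{d}x}{\int_0^{\omega} \ell(x)\,\mathrm{d}x},
\]
where \(\ell(x)\) is the survivorship function and \(\omega\) the limiting age: if all forces of mortality are reduced by a small proportion \(\delta\), life expectancy rises by approximately \(H\delta\) in relative terms. The paper builds the analogous single-figure index with life expectancy replaced by the expected present value of a life annuity.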

Extending the Lee-Carter model: a three-way decomposition. Russolillo, Maria; Giordano, Giuseppe; Haberman, Steven [RKN:
45354]
Shelved at: Per: SAJ Shelved at: SCA/ACT
Scandinavian Actuarial Journal (2011) 2 : 96-117.
In this paper, we focus on a Multi-dimensional Data Analysis approach to the Lee-Carter (LC) model of mortality trends. In
particular, we extend the bilinear LC model and specify a new model based on a three-way structure, which incorporates a further
component in the decomposition of the log-mortality rates. A multi-way component analysis is performed using the Tucker3 model.
The suggested methodology allows us to obtain combined estimates for the three modes: (1) time, (2) age groups and (3) different
populations. From the results obtained by the Tucker3 decomposition, we can jointly compare, in both a numerical and graphical
way, the relationships among all three modes and obtain a time-series component as a leading indicator of the mortality trend for
a group of populations. Further, we carry out a correlation analysis of the estimated trends in order to assess the reliability of the
results of the three-way decomposition. The model's goodness of fit is assessed using an analysis of the residuals. Finally, we
discuss how the synthesised mortality index can be used to build concise projected life tables for a group of populations. An
application which compares 10 European countries is used to illustrate the approach and provide a deeper insight into the model
and its implementation.

Longevity/mortality risk modeling and securities pricing. Deng, Yinglu; Brockett, Patrick L; MacMinn, Richard D - 25 pages. [RKN:
70413]
Shelved at: Per: J.Risk Ins (Oxf) Shelved at: JOU
Journal of Risk and Insurance (2012) 79 (3) : 697-721.
Securitizing longevity/mortality risk can transfer longevity/mortality risk to capital markets. Modeling and forecasting mortality rate
is key to pricing mortality-linked securities. Catastrophic mortality and longevity jumps occur in historical data and have an
important impact on security pricing. This article introduces a stochastic diffusion model with a double-exponential jump diffusion
process that captures both asymmetric rate jumps up and down and also cohort effect in mortality trends. The model exhibits
calibration advantages and mathematical tractability while better fitting the data. The model provides a closed-form pricing solution
for J.P. Morgan's q-forward contract usable as a building block for hedging.
Available online via Athens



Mortality projections using generalized additive models with applications to annuity values for the Irish population. Hall, M; Friel,
N Faculty of Actuaries and Institute of Actuaries; Cambridge University Press, [RKN: 39999]
Shelved at: Per: AAS (Oxf) Per: AAS (Lon)
Annals of Actuarial Science (2011) 5(1) : 19-32.
Generalized Additive Models (GAMs) with age, period and cohort as possible covariates are used to predict future mortality
improvements for the Irish population. The GAMs considered are the 1-dimensional age + period and age + cohort models and the
2-dimensional age-period and age-cohort models. In each case thin plate regression splines are used as the smoothing functions.
The generalized additive models are compared with the P-Spline (Currie et al., 2004) and Lee-Carter (Lee & Carter, 1992) models
included in version 1.0 of the Continuous Mortality Investigation (CMI) library of mortality projections. Using the Root Mean Square
Error to assess the accuracy of future predictions, the GAMs outperform the P-Spline and Lee-Carter models over intervals of 25
and 35 years in the age range 60 to 90. The GAMs allow intuitively simple models of mortality to be specified whilst also providing
the flexibility to model complex relationships between the covariates. The majority of the mortality improvements derived from the
projections of future Irish mortality yield annuity values at ages 60, 65, 70 and 80 in 2007 within the range of annuity values calculated
assuming a 2 to 4 percent annual compound improvement in mortality rates, for both males and females.
http://www.actuaries.org.uk/research-and-resources/pages/access-journals
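Schematically, the 1-dimensional age + period GAM described above models the death counts as (generic notation)
\[
D_{x,t} \sim \mathrm{Poisson}\big(E_{x,t}\,\mu_{x,t}\big), \qquad \log \mu_{x,t} = s_1(x) + s_2(t),
\]
with \(s_1\) and \(s_2\) thin plate regression splines of age and calendar year; the age + cohort variant replaces \(s_2(t)\) by a smooth function of year of birth \(t - x\), and the 2-dimensional variants use a bivariate smooth \(s(x, t)\) or \(s(x, t - x)\).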

Multidimensional Lee-Carter model with switching mortality processes. Hainaut, Donatien [RKN: 45597]
Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 50 (2) : 236-246.
This paper proposes a multidimensional Lee-Carter model, in which the time dependent components are ruled by switching
regime processes. The main feature of this model is its ability to replicate the changes of regimes observed in the mortality
evolution. Changes of measure, preserving the dynamics of the mortality process under a pricing measure, are also studied. After
a review of the calibration method, a 2D, 2-regimes model is fitted to the male and female French population, for the period
1946-2007. Our analysis reveals that one regime corresponds to longevity conditions observed during the decade following the
second world war, while the second regime is related to longevity improvements observed during the last 30 years. To conclude,
we analyze, in a numerical application, the influence of changes of measure affecting transition probabilities, on prices of life and
death insurances.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

One-year Value-at-Risk for longevity and mortality. Plat, Richard [RKN: 44948]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (3) : 462-470.
Upcoming new regulation on regulatory required solvency capital for insurers will be predominantly based on a one-year
Value-at-Risk measure. This measure aims at covering the risk of the variation in the projection year as well as the risk of changes
in the best estimate projection for future years. This paper addresses the issue how to determine this Value-at-Risk for longevity
and mortality risk. Naturally, this requires stochastic mortality rates. In the past decennium, a vast literature on stochastic mortality
models has been developed. However, very few of them are suitable for determining the one-year Value-at-Risk. This requires a
model for mortality trends instead of mortality rates. Therefore, we will introduce a stochastic mortality trend model that fits this
purpose. The model is transparent, easy to interpret and based on well known concepts in stochastic mortality modeling.
Additionally, we introduce an approximation method based on duration and convexity concepts to apply the stochastic mortality
rates to specific insurance portfolios.
Available via Athens: Palgrave MacMillan: http://www.openathens.net

A quantitative comparison of the Lee-Carter Model under different types of non-Gaussian innovations. Wang, Chou-Wen; Huang,
Hong-Chih; Liu, I-Chien Palgrave Macmillan, [RKN: 44909]
Shelved at: Per: Geneva (Oxf)
Geneva Papers on Risk and Insurance (2011) 36(4) : 675-696.
In the classical Lee-Carter model, the mortality indices that are assumed to be a random walk model with drift are normally
distributed. However, for the long-term mortality data, the error terms of the Lee-Carter model and the mortality indices have tails
thicker than those of a normal distribution and appear to be skewed. This study therefore adopts five non-Gaussian
distributions - Student's t-distribution and its skew extension (i.e., generalised hyperbolic skew Student's t-distribution), one
finite-activity Lévy model (jump diffusion distribution), and two infinite-activity or pure jump models (variance gamma and normal
inverse Gaussian) - to model the error terms of the Lee-Carter model. With mortality data from six countries over the period
1900-2007, both in-sample model selection criteria (e.g., Bayesian information criterion, Kolmogorov-Smirnov test,
Anderson-Darling test, Cramér-von Mises test) and out-of-sample projection errors indicate a preference for modelling the
Lee-Carter model with non-Gaussian innovations.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
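Several abstracts in this section refer to the classical Lee-Carter structure; for reference (standard notation, not taken from any one paper), the central death rate m_{x,t} is modelled as

    \log m_{x,t} = a_x + b_x k_t + \varepsilon_{x,t}, \qquad k_t = k_{t-1} + d + e_t,

with the period index k_t a random walk with drift d and, in the classical model, Gaussian innovations e_t; the paper above replaces these innovations with heavier-tailed and skewed alternatives.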

Smoothing dispersed counts with applications to mortality data. Djeundje, V A B; Currie, I D Faculty of Actuaries and Institute of
Actuaries; Cambridge University Press, [RKN: 40000]
Shelved at: Per: AAS (Oxf) Per: AAS (Lon)
Annals of Actuarial Science (2011) 5(1) : 33-52.
Mortality data are often classified by age at death and year of death. This classification results in a heterogeneous risk set and this
can cause problems for the estimation and forecasting of mortality. In the modelling of such data, we replace the classical
assumption that the numbers of claims follow the Poisson distribution with the weaker assumption that the numbers of claims have
a variance proportional to the mean. The constant of proportionality is known as the dispersion parameter and it enables us to
allow for heterogeneity; in the case of insurance data the dispersion parameter also allows for the presence of duplicates in a
portfolio. We use both the quasi-likelihood and the extended quasi-likelihood to estimate models for the smoothing and forecasting
of mortality tables jointly with smooth estimates of the dispersion parameters. We present three main applications of our method:
first, we show how taking account of dispersion reduces the volatility of a forecast of a mortality table; second, we smooth mortality
data by amounts, i.e., when deaths are amounts claimed and exposed-to-risk are sums assured; third, we present a joint model for
mortality by lives and by amounts with the property that forecasts by lives and by amounts are consistent. Our methods are
illustrated with data from the Continuous Mortality Investigation.
http://www.actuaries.org.uk/research-and-resources/pages/access-journals
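The dispersion assumption described above can be summarised as follows (generic sketch): with D_{x,t} the number of claims and \mu_{x,t} its mean,

    \mathrm{E}[D_{x,t}] = \mu_{x,t}, \qquad \mathrm{Var}[D_{x,t}] = \phi_{x,t}\,\mu_{x,t},

so \phi_{x,t} = 1 recovers the Poisson assumption while \phi_{x,t} > 1 allows for heterogeneity and duplicates; in the paper the dispersion parameters are estimated and smoothed jointly with the mortality surface.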


MORTALITY PROJECTIONS

A comparative study of parametric mortality projection models. Haberman, Steven; Renshaw, Arthur [RKN: 17150]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (1) : 35-55.
The relative merits of different parametric models for making life expectancy and annuity value predictions at both pensioner and
adult ages are investigated. This study builds on current published research and considers recent model enhancements and the
extent to which these enhancements address the deficiencies that have been identified in some of the models. The England &
Wales male mortality experience is used to conduct detailed comparisons at pensioner ages, having first established a common
basis for comparison across all models. The model comparison is then extended to include the England & Wales female
experience and both the male and female USA mortality experiences over a wider age range, encompassing also the working
ages.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Mortality density forecasts: An analysis of six stochastic mortality models. Cairns, Andrew J G; Blake, David; Dowd, Kevin; Epstein,
David; Khalaf-Allah, Marwa; Coughlan, Guy D [RKN: 45129]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (3) : 355-367.
This paper develops a framework for developing forecasts of future mortality rates. We discuss the suitability of six stochastic
mortality models for forecasting future mortality and estimating the density of mortality rates at different ages. In particular, the
models are assessed individually with reference to the following qualitative criteria that focus on the plausibility of their forecasts:
biological reasonableness; the plausibility of predicted levels of uncertainty in forecasts at different ages; and the robustness of the
forecasts relative to the sample period used to fit the model. An important, though unsurprising, conclusion is that a good fit to
historical data does not guarantee sensible forecasts. We also discuss the issue of model risk, common to many modelling
situations in demography and elsewhere. We find that even for those models satisfying our qualitative criteria, there are significant
differences among central forecasts of mortality rates at different ages and among the distributions surrounding those central
forecasts.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Parametric mortality improvement rate modelling and projecting. Haberman, Steven; Renshaw, Arthur [RKN: 45635]
Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 50 (3) : 309-333.
We investigate the modelling of mortality improvement rates and the feasibility of projecting mortality improvement rates (as
opposed to projecting mortality rates), using parametric predictor structures that are amenable to simple time series forecasting.
This leads to our proposing a parallel dual approach to the direct parametric modelling and projecting of mortality rates.
Comparisons of simulated life expectancy predictions (by the cohort method) using the England and Wales population mortality
experiences for males and females under a variety of controlled data trimming exercises are presented in detail and comparisons
are also made between the parallel modelling approaches.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Time-simultaneous prediction bands: a new look at the uncertainty involved in forecasting mortality. Li, Johnny Siu-Hang; Chan,
Wai-Sum [RKN: 44978]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (1) : 81-88.
Conventionally, isolated (point-wise) prediction intervals are used to quantify the uncertainty in future mortality rates and other
demographic quantities such as life expectancy. A pointwise interval reflects uncertainty in a variable at a single time point, but it
does not account for any dynamic property of the time-series. As a result, in situations when the path or trajectory of future
mortality rates is important, a band of pointwise intervals might lead to an invalid inference. To improve the communication of
uncertainty, a simultaneous prediction band can be used. The primary objective of this paper is to demonstrate how simultaneous
prediction bands can be created for prevalent stochastic models, including the Cairns-Blake-Dowd and Lee-Carter models. The
illustrations in this paper are based on mortality data from the general population of England and Wales.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
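The difference between pointwise intervals and a simultaneous band can be seen in a few lines of simulation. The sketch below is illustrative only and is not taken from the paper: it uses an invented random-walk-with-drift index simply to show that year-by-year 95% quantiles cover far fewer complete trajectories than a band calibrated on whole paths.

    import numpy as np

    rng = np.random.default_rng(1)
    n_paths, horizon = 5000, 25
    # invented mortality index: random walk with drift, purely for illustration
    paths = np.cumsum(rng.normal(-1.0, 1.5, size=(n_paths, horizon)), axis=1)

    # pointwise 95% interval: quantiles taken separately at each forecast year
    lo_pw, hi_pw = np.quantile(paths, [0.025, 0.975], axis=0)
    pw_cover = np.mean(np.all((paths >= lo_pw) & (paths <= hi_pw), axis=1))

    # simultaneous 95% band: widen the pointwise band until 95% of whole paths lie inside
    med = np.median(paths, axis=0)
    for k in np.linspace(1.0, 3.0, 201):
        lo, hi = med + k * (lo_pw - med), med + k * (hi_pw - med)
        sim_cover = np.mean(np.all((paths >= lo) & (paths <= hi), axis=1))
        if sim_cover >= 0.95:
            break

    print(f"whole-path coverage of pointwise band: {pw_cover:.2f}")   # well below 0.95
    print(f"whole-path coverage of widened band:   {sim_cover:.2f} (scale factor {k:.2f})")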
MORTALITY RATES

Comparison and bounds for functionals of future lifetimes consistent with life tables. Barz, Christiane; Müller, Alfred [RKN: 45596]
Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 50 (2) : 229-235.
We derive a new crossing criterion of hazard rates to identify a stochastic order relation between two random variables. We apply
this crossing criterion in the context of life tables to derive stochastic ordering results among three families of fractional age
assumptions: the family of linear force of mortality functions, the family of quadratic survival functions and the power family.
Further, this criterion is used to derive tight bounds for functionals of future lifetimes that exhibit an increasing force of mortality
with given one-year survival probabilities. Numerical examples illustrate our findings.
Available via Athens: Palgrave MacMillan
http://www.openathens.net


A comparison of the Lee-Carter model and AR-ARCH model for forecasting mortality rates. Giacometti, Rosella; Bertocchi,
Marida; Rachev, Svetlozar T; Fabozzi, Frank J [RKN: 44993]
Insurance: Mathematics & Economics (2012) 50 (1) : 85-93.
With the decline in the mortality level of populations, national social security systems and insurance companies of most developed
countries are reconsidering their mortality tables taking into account the longevity risk. The Lee and Carter model is the first
discrete-time stochastic model to consider the increased life expectancy trends in mortality rates and is still broadly used today. In
this paper, we propose an alternative to the Lee-Carter model: an AR(1)-ARCH(1) model. More specifically, we compare the
performance of these two models with respect to forecasting age-specific mortality in Italy. We fit the two models, with Gaussian
and Student's t innovations, for the matrix of Italian death rates from 1960 to 2003. We compare the forecast ability of the two
approaches in out-of-sample analysis for the period 2004-2006 and find that the AR(1)-ARCH(1) model with Student's t innovations
provides the best fit among the models studied in this paper.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
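For reference (generic notation, not the authors' exact specification), an AR(1)-ARCH(1) model for a mortality series y_t takes the form

    y_t = c + \phi\, y_{t-1} + \varepsilon_t, \qquad \varepsilon_t = \sigma_t z_t, \qquad \sigma_t^2 = \omega + \alpha\, \varepsilon_{t-1}^2,

with z_t i.i.d. standard Gaussian or, as preferred in the paper, Student's t innovations.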

Delta-Gamma hedging of mortality and interest rate risk. Luciano, Elisa; Regis, Luca; Vigna, Elena [RKN: 45643]
Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 50 (3) : 402-412.
One of the major concerns of life insurers and pension funds is the increasing longevity of their beneficiaries. This paper studies
the hedging problem of annuity cash flows when mortality and interest rates are stochastic. We first propose a Delta-Gamma
hedging technique for mortality risk. The risk factor against which to hedge is the difference between the actual mortality intensity
in the future and its forecast today, the forward intensity. We specialize the hedging technique first to the case in which mortality
intensities are affine, then to Ornstein-Uhlenbeck and Feller processes, providing actuarial justifications for this selection. We
show that, without imposing no arbitrage, we can get equivalent probability measures under which the HJM condition for no
arbitrage is satisfied. Last, we extend our results to the presence of both interest rate and mortality risk. We provide a UK
calibrated example of Delta-Gamma hedging of both mortality and interest rate risk.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
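The hedging idea summarised above follows the usual Delta-Gamma pattern (a generic sketch, not the paper's notation): if P(y) is the value of the annuity portfolio as a function of a risk factor y (here the gap between realised and forward mortality intensity), one holds quantities n_1 and n_2 of two hedging instruments H_1 and H_2 such that

    P_y + n_1 H_{1,y} + n_2 H_{2,y} = 0, \qquad P_{yy} + n_1 H_{1,yy} + n_2 H_{2,yy} = 0,

where subscripts denote first and second derivatives with respect to y; the paper applies this to mortality risk and then jointly to mortality and interest rate risk.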

A dynamic parameterization modeling for the age-period-cohort mortality. Hatzopoulos, P; Haberman, Steven [RKN: 44959]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (2) : 155-174.
An extended version of a dynamic parametric model is proposed for analyzing mortality structures, incorporating the cohort effect. A
one-factor parameterized exponential polynomial in age effects within the generalized linear models (GLM) framework is used.
Sparse principal component analysis (SPCA) is then applied to time-dependent GLM parameter estimates and provides
(marginal) estimates for a two-factor principal component (PC) approach structure. Modeling the two-factor residuals in the same
way, in age-cohort effects, provides estimates for the (conditional) three-factor age-period-cohort model. The age-time and
cohort-related components are extrapolated using dynamic linear regression (DLR) models. An application is presented for
England & Wales males (1841-2006).
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Optimal retirement consumption with a stochastic force of mortality. Huang, Huaxiong; Milevsky, Moshe A; Salisbury, Thomas S
[RKN: 44787]
Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51(2) : 282-291.
We extend the lifecycle model (LCM) of consumption over a random horizon (also known as the Yaari model) to a world in which
(i) the force of mortality obeys a diffusion process as opposed to being deterministic, and (ii) consumers can adapt their
consumption strategy to new information about their mortality rate (also known as health status) as it becomes available. In
particular, we derive the optimal consumption rate and focus on the impact of mortality rate uncertainty versus simple lifetime
uncertainty (assuming that the actuarial survival curves are initially identical) in the retirement phase where this risk plays a
greater role. In addition to deriving and numerically solving the partial differential equation (PDE) for the optimal consumption rate,
our main general result is that when the utility preferences are logarithmic the initial consumption rates are identical. But, in a
constant relative risk aversion (CRRA) framework in which the coefficient of relative risk aversion is greater (smaller) than one, the
consumption rate is higher (lower) and a stochastic force of mortality does make a difference. That said, numerical experiments
indicate that, even for non-logarithmic preferences, the stochastic mortality effect is relatively minor from the individual's
perspective. Our results should be relevant to researchers interested in calibrating the lifecycle model as well as those who
provide normative guidance (also known as financial advice) to retirees.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Parametric mortality improvement rate modelling and projecting. Haberman, Steven; Renshaw, Arthur [RKN: 45635]
Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 50 (3) : 309-333.
We investigate the modelling of mortality improvement rates and the feasibility of projecting mortality improvement rates (as
opposed to projecting mortality rates), using parametric predictor structures that are amenable to simple time series forecasting.
This leads to our proposing a parallel dual approach to the direct parametric modelling and projecting of mortality rates.
Comparisons of simulated life expectancy predictions (by the cohort method) using the England and Wales population mortality
experiences for males and females under a variety of controlled data trimming exercises are presented in detail and comparisons
are also made between the parallel modelling approaches.
Available via Athens: Palgrave MacMillan
http://www.openathens.net




Time-simultaneous prediction bands: a new look at the uncertainty involved in forecasting mortality. Li, Johnny Siu-Hang; Chan,
Wai-Sum [RKN: 44978]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (1) : 81-88.
Conventionally, isolated (point-wise) prediction intervals are used to quantify the uncertainty in future mortality rates and other
demographic quantities such as life expectancy. A pointwise interval reflects uncertainty in a variable at a single time point, but it
does not account for any dynamic property of the time-series. As a result, in situations when the path or trajectory of future
mortality rates is important, a band of pointwise intervals might lead to an invalid inference. To improve the communication of
uncertainty, a simultaneous prediction band can be used. The primary objective of this paper is to demonstrate how simultaneous
prediction bands can be created for prevalent stochastic models, including the Cairns-Blake-Dowd and Lee-Carter models. The
illustrations in this paper are based on mortality data from the general population of England and Wales.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
MOTOR INSURANCE

A copula approach to test asymmetric information with applications to predictive modeling. Shi, Peng; Valdez, Emiliano A [RKN:
44965]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (2) : 226-239.
In this article, we present a copula regression model for testing asymmetric information as well as for predictive modeling
applications in the automobile insurance market. We use the Frank copula to jointly model the type of coverage and the number of
accidents, with the dependence parameter providing for evidence of the relationship between the choice of coverage and the
frequency of accidents. This dependence therefore provides an indication of the presence (or absence) of asymmetric information.
The type of coverage is in some sense ordered so that coverage with higher ordinals indicates the most comprehensive coverage.
Hence, a positive relationship would indicate that more coverage is chosen by high-risk policyholders, and vice versa. This
presence of asymmetric information could be due to either adverse selection or moral hazard, a distinction often made in the
economics or insurance literature, or both. We calibrated our copula model using a one-year cross-sectional observation of claims
arising from a major automobile insurer in Singapore. Our estimation results indicate a significant positive coverage-risk
relationship. However, when we correct for the bias resulting from possible underreporting of accidents, we find that the positive
association vanishes. We further used our estimated model for other possible actuarial applications. In particular, we are able to
demonstrate the effect of coverage choice on the incidence of accidents, and based on which, the pure premium is derived. In
general, a positive margin is observed when compared with the gross premium available in our empirical database.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
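For reference, the Frank copula used here is the one-parameter family

    C_\theta(u, v) = -\frac{1}{\theta}\,\ln\!\left( 1 + \frac{(e^{-\theta u} - 1)(e^{-\theta v} - 1)}{e^{-\theta} - 1} \right), \qquad \theta \neq 0,

with \theta > 0 indicating positive dependence between coverage choice and claim count and \theta \to 0 recovering independence.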
MULTIVARIATE ANALYSIS

Bias-reduced estimators for bivariate tail modelling. Beirlant, J; Dierckx, G; Guillou, A [RKN: 44971]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (1) : 18-26.
Ledford and Tawn (1997) introduced a flexible bivariate tail model based on the coefficient of tail dependence and on the
dependence of the extreme values of the random variables. In this paper, we extend the concept by specifying the slowly varying
part of the model as done by Hall (1982) in the univariate case. Based on Beirlant et al. (2009), we propose a bias-reduced
estimator for the coefficient of tail dependence and for the estimation of small tail probabilities. We discuss the properties of these
estimators via simulations and a real-life example. Furthermore, we discuss some theoretical asymptotic aspects of this approach.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

A characterization of the multivariate excess wealth ordering. Fernández-Ponce, J M; Pellerey, Franco; Rodríguez-Griñolo, M R
[RKN: 44943]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (3) : 410-417.
In this paper, some new properties of the upper-corrected orthant of a random vector are proved. The univariate right-spread or
excess wealth function, introduced by Fernández-Ponce et al. (1996), is extended to multivariate random vectors, and some
properties of this multivariate function are studied. Later, this function was used to define the excess wealth ordering by Shaked
and Shanthikumar (1998) and Fernández-Ponce et al. (1998). The multivariate excess wealth function enables us to define a new
stochastic comparison which is weaker than the multivariate dispersion orderings. Also, some properties relating the multivariate
excess wealth order with stochastic dependence are described.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Convolutions of multivariate phase-type distributions. Berdel, Jasmin; Hipp, Christian [RKN: 45131]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (3) : 374-377.
This paper is concerned with multivariate phase-type distributions introduced by Assaf et al. (1984). We show that the sum of two
independent bivariate vectors each with a bivariate phase-type distribution is again bivariate phase-type and that this is no longer
true for higher dimensions. Further, we show that the distribution of the sum over different components of a vector with multivariate
phase-type distribution is not necessarily multivariate phase-type either, if the dimension of the components is two or larger.
Available via Athens: Palgrave MacMillan: http://www.openathens.net


Detection and correction of outliers in the bivariate chain-ladder method. Verdonck, T; Van Wouwe, M [RKN: 44961]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (2) : 188-193.
The expected profit or loss of a non-life insurance company is determined for the whole of its multiple business lines. This implies
the study of the claims reserving problem for a portfolio consisting of several correlated run-off triangles. A popular technique to
deal with such a portfolio is the multivariate chain-ladder method. However, it is well known that the chain-ladder method is
very sensitive to outlying data. For the univariate case, we have already developed a robust version of the chain-ladder method.
In this article we propose two techniques to detect and correct outlying values in a bivariate situation. The methodologies are
illustrated and compared on real examples from practice.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

A generalized beta copula with applications in modeling multivariate long-tailed data. Yang, Xipei; Frees, Edward W; Zhang,
Zhengjun [RKN: 44968]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (2) : 265-284.
This work proposes a new copula class that the authors call the MGB2 copula. The new copula originates from extracting the
dependence function of the multivariate GB2 distribution (MGB2) whose marginals follow the univariate generalized beta
distribution of the second kind (GB2). The MGB2 copula can capture non-elliptical and asymmetric dependencies among marginal
coordinates and provides a simple formulation for multi-dimensional applications. This new class features positive tail dependence
in the upper tail and tail independence in the lower tail. Furthermore, it includes some well-known copula classes, such as the
Gaussian copula, as special or limiting cases.
To illustrate the usefulness of the MGB2 copula, the authors build a trivariate MGB2 copula model of bodily injury liability closed
claims. Extended GB2 distributions are chosen to accommodate the right-skewness and the long-tailedness of the outcome
variables. For the regression component, location parameters with continuous predictors are introduced using a nonlinear additive
function. For comparison purposes, we also consider the Gumbel and t copulas, alternatives that capture the upper tail
dependence. The paper introduces a conditional plot graphical tool for assessing the validation of the MGB2 copula. Quantitative
and graphical assessment of the goodness of fit demonstrate the advantages of the MGB2 copula over the other copulas.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

A multivariate aggregate loss model. Ren, Jiandong [RKN: 44799]
Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51(2) : 402-408.
In this paper, we introduce a multivariate aggregate loss model, where multiple categories of losses are considered. The model
assumes that different types of claims arrive according to a Marked Markovian arrival process (MMAP) introduced by He and
Neuts (1998) [Q M He, M F Neuts (1998), Markov chains with marked transitions, Stochastic Processes and their Applications, 74:
37-52] in the queuing literature. This approach enables us to allow dependencies among the claim frequencies, and among the
claim sizes, as well as between claim frequencies and claim sizes. This model extends the (univariate) Markov modulated risk
processes (sometimes referred to as regime switching models) widely used in insurance and financial analysis. For the proposed
model, we provide formulas for calculating the joint moments of the present value of aggregate claims occurring in any time
interval (0,t]. Numerical examples are provided to show possible applications of the model.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Valuing variable annuity guarantees with the multivariate Esscher transform. Ng, Andrew Cheuk-Yin; Li, Johnny Siu-Hang [RKN:
44941]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (3) : 393-400.
Variable annuities are usually sold with a range of guarantees that protect annuity holders from some downside market risk.
Although it is common to see variable annuity guarantees written on multiple funds, existing pricing methods are, by and large,
based on stochastic processes for one single asset only. In this article, we fill this gap by developing a multivariate valuation
framework. First, we consider a multivariate regime-switching model for modeling returns on various assets at the same time. We
then identify a risk-neutral probability measure for use with the model under consideration. This is accomplished by a multivariate
extension of the regime-switching conditional Esscher transform. We further extend our results to the situation when the
guarantee being valued is linked to equity indexes measured in foreign currencies. In particular, we derive a probability measure
that is risk-neutral from the perspective of domestic investors. Finally, we illustrate our results with a hypothetical variable annuity
guarantee.
Available via Athens: Palgrave MacMillan: http://www.openathens.net
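The Esscher change of measure referred to above has the generic form (standard notation, a sketch rather than the paper's regime-switching conditional version): for a vector X of log-returns and a parameter vector \theta,

    \frac{\mathrm{d}Q}{\mathrm{d}P} = \frac{e^{\theta^{\top} X}}{\mathrm{E}_P\!\left[ e^{\theta^{\top} X} \right]},

with \theta chosen so that discounted asset prices are martingales under Q.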
MULTIVARIATE TAIL

Asymptotic analysis of multivariate tail conditional expectations. Zhu, Li; Li, Haijun Society of Actuaries, - 14 pages. [RKN: 70665]
Shelved at: Per: NAAJ (Oxf) Per: NAAJ (Lon) Shelved at: JOU
North American Actuarial Journal (2012) 16 (3) : 350-363.
Tail conditional expectations refer to the expected values of random variables conditioning on some tail events and are closely
related to various coherent risk measures. In the univariate case, the tail conditional expectation is asymptotically proportional to
Value-at-Risk, a popular risk measure. The focus of this paper is on asymptotic relations between the multivariate tail conditional
expectation and Value-at-Risk for heavy-tailed scale mixtures of multivariate distributions. Explicit tail estimates of multivariate tail
conditional expectations are obtained using the method of regular variation. Examples involving multivariate Pareto and elliptical
distributions, as well as application to risk allocation, are also discussed.
http://www.soa.org/news-and-publications/publications/journals/naaj/naaj-detail.aspx
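In the univariate case the quantities compared in this paper are (standard definitions)

    \mathrm{VaR}_p(X) = \inf\{ x : \Pr(X \le x) \ge p \}, \qquad \mathrm{TCE}_p(X) = \mathrm{E}\big[ X \mid X > \mathrm{VaR}_p(X) \big],

and for a regularly varying (Pareto-type) tail with index \alpha > 1 one has \mathrm{TCE}_p(X)/\mathrm{VaR}_p(X) \to \alpha/(\alpha - 1) as p \to 1; the paper extends this kind of asymptotic proportionality to the multivariate setting.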


NORWAY

Future building water loss projections posed by climate change. Haug, Ola; Dimakos, Xeni K; Vardal, Jofrid F; Aldrin, Magne;
Meze-Hausken, Elisabeth [RKN: 45146]
Shelved at: Per: SAJ Shelved at: SCA/ACT
Scandinavian Actuarial Journal (2011) 1 : 1-20.
The insurance industry, like other parts of the financial sector, is vulnerable to climate change. Life as well as non-life products are
affected and knowledge of future loss levels is valuable. Risk and premium calculations may be updated accordingly, and
dedicated loss-preventive measures can be communicated to customers and regulators. We have established statistical claims
models for the coherence between externally inflicted water damage to private buildings in Norway and selected meteorological
variables. Based on these models and downscaled climate predictions from the Hadley centre HadAM3H climate model, the
estimated loss level of a future scenario period (2071-2100) is compared to that of a control period (1961-1990). In spite of
substantial estimation uncertainty, our analyses identify an incontestable increase in the claims level along with some regional
variability. Of the uncertainties inherently involved in such predictions, only the error due to model fit is quantifiable.

OPERATIONAL RISK

Analytic loss distributional approach models for operational risk from the α-stable doubly stochastic compound processes and
implications for capital allocation. Peters, Gareth W; Shevchenko, Pavel V; Young, Mark; Yip, Wendy [RKN: 44957]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (3) : 565-579.
Under the Basel II standards, the Operational Risk (OpRisk) advanced measurement approach is not prescriptive regarding the
class of statistical model utilized to undertake capital estimation. It has however become well accepted to utilize a Loss
Distributional Approach (LDA) paradigm to model the individual OpRisk loss processes corresponding to the Basel II Business
line/event type. In this paper we derive a novel class of doubly stochastic α-stable family LDA models. These models provide the
ability to capture the heavy tailed loss processes typical of OpRisk, whilst also providing analytic expressions for the compound
processes' annual loss density and distributions, as well as the aggregated compound processes' annual loss models. In particular
we develop models of the annual loss processes in two scenarios. The first scenario considers the loss processes with a
stochastic intensity parameter, resulting in inhomogeneous compound Poisson processes annually. The resulting arrival
processes of losses under such a model will have independent counts over increments within the year. The second scenario
considers discretization of the annual loss processes into monthly increments with dependent time increments as captured by a
binomial process with a stochastic probability of success changing annually. Each of these models will be coupled under an
LDA framework with heavy-tailed severity models comprised of α-stable severities for the loss amounts per loss event. In this paper
we will derive analytic results for the annual loss distribution density and distribution under each of these models and study their
properties.

Available via Athens: Palgrave MacMillan
http://www.openathens.net
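The Loss Distributional Approach referred to in this and the following entry has the generic compound form (a sketch, not the authors' full specification)

    Z = \sum_{i=1}^{N} X_i,

with N an annual loss count (Poisson, doubly stochastic Poisson or, in the second scenario above, a binomial count with stochastic success probability) and X_i heavy-tailed severities, here α-stable; capital is then read off a high quantile of the annual loss Z.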

Impact of insurance for operational risk: Is it worthwhile to insure or be insured for severe losses?. Peters, Gareth W; Byrnes,
Aaron D; Shevchenko, Pavel V [RKN: 40024]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (2) : 287-303.
Under the Basel II standards, the Operational Risk (OpRisk) advanced measurement approach allows a provision for reduction of
capital as a result of insurance mitigation of up to 20%. This paper studies different insurance policies in the context of capital
reduction for a range of extreme loss models and insurance policy scenarios in a multi-period, multiple risk setting. A Loss
Distributional Approach (LDA) for modeling of the annual loss process, involving homogeneous compound Poisson processes for
the annual losses, with heavy-tailed severity models comprised of α-stable severities is considered. There has been little analysis
of such models to date and it is believed insurance models will play more of a role in OpRisk mitigation and capital reduction in
future. The first question of interest is when would it be equitable for a bank or financial institution to purchase insurance for
heavy-tailed OpRisk losses under different insurance policy scenarios? The second question pertains to Solvency II and
addresses quantification of insurer capital for such operational risk scenarios. Considering fundamental insurance policies
available, in several two risk scenarios, we can provide both analytic results and extensive simulation studies of insurance
mitigation for important basic policies, the intention being to address questions related to VaR reduction under Basel II, SCR under
Solvency II and fair insurance premiums in OpRisk for different extreme loss scenarios. In the process we provide closed-form
solutions for the distribution of loss processes and claims processes in an LDA structure as well as closed-form analytic solutions
for the Expected Shortfall, SCR and MCR under Basel II and Solvency II. We also provide closed-form analytic solutions for the
annual loss distribution of multiple risks including insurance mitigation.
Available via Athens: Palgrave MacMillan
http://www.openathens.net





OPTIMAL REINSURANCE

Optimal reinsurance revisited - point of view of cedent and reinsurer. Hürlimann, Werner - 28 pages. [RKN: 74746]
Shelved at: Per: Astin Bull (Oxf) Shelved at: JOU
ASTIN Bulletin (2011) 41 (2) : 547-574.
It is known that the partial stop-loss contract is an optimal reinsurance form under the VaR risk measure. Assuming that market
premiums are set according to the expected value principle with varying loading factors, the optimal reinsurance parameters of
this contract are obtained under three alternative single and joint party reinsurance criteria: (i) strong minimum of the total retained
loss VaR measure; (ii) weak minimum of the total retained loss VaR measure and maximum of the reinsurer's expected profit; (iii)
weak minimum of the total retained loss VaR measure and minimum of the total variance risk measure. New conditions for
financing in the mean simultaneously the cedent's and the reinsurer's required VaR economic capital are revealed for situations of
pure risk transfer (classical reinsurance) or risk and profit transfer (design of internal reinsurance or reinsurance captive owned by
the captive of a corporate firm).
http://www.actuaries.org/index.cfm?lang=EN&DSP=PUBLICATIONS&ACT=ASTIN BULLETIN

Optimal reinsurance with positively dependent risks. Cai, Jun; Wei, Wei [RKN: 44990]
Insurance: Mathematics & Economics (2012) 50 (1) : 57-63.
In the individual risk model, one is often concerned about positively dependent risks. Several notions of positive dependence have
been proposed to describe such dependent risks. In this paper, we assume that the risks in the individual risk model are positively
dependent through the stochastic ordering (PDS). The PDS risks include independent, comonotonic, conditionally stochastically
increasing (CI) risks, and other interesting dependent risks. By proving the convolution preservation of the convex order for PDS
random vectors, we show that in individualized reinsurance treaties, to minimize certain risk measures of the retained loss of an
insurer, the excess-of-loss treaty is the optimal reinsurance form for an insurer with PDS dependent risks among a general class
of individualized reinsurance contracts. This extends the study in Denuit and Vermandele (1998) on individualized reinsurance
treaties to dependent risks. We also derive the explicit expressions for the retentions in the optimal excess-of-loss treaty in a
two-line insurance business model.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
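For reference (generic notation), the excess-of-loss form found to be optimal here splits each individual risk X_i at a retention d_i:

    \text{retained}_i = \min(X_i, d_i), \qquad \text{ceded}_i = (X_i - d_i)_+,

so the insurer keeps \sum_i \min(X_i, d_i); the paper shows this choice minimises the risk measures considered when the risks are positively dependent through the stochastic ordering.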

OPTIMISATION

Behavioral optimal insurance. Sung, K C; Yam, S C P; Yung, S P; Zhou, J H [RKN: 44944]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (3) : 418-428.
The present work studies the optimal insurance policy offered by an insurer adopting a proportional premium principle to an
insured whose decision-making behavior is modeled by Kahneman and Tversky's Cumulative Prospect Theory with convex
probability distortions. We show that, under a fixed premium rate, the optimal insurance policy is a generalized insurance layer
(that is, either an insurance layer or a stop-loss insurance). This optimal insurance decision problem is resolved by first converting
it into three different sub-problems similar to those considered in earlier work; however, as we now demand a more regular optimal
solution, a completely different approach has been developed to tackle them. When the premium is regarded as a decision variable
and there is no risk loading, the optimal indemnity schedule in this form has no deductibles but a cap; further results also suggest that the deductible
amount will be reduced if the risk loading is decreased. As a whole, our paper provides a theoretical explanation for the popularity
of limited coverage insurance policies in the market as observed by many socio-economists, which serves as a mathematical
bridge between behavioral finance and actuarial science.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

OPTION PRICING

Arithmetic Asian options under stochastic delay models. McWilliams, Nairn; Sabanis, Sotirios Routledge, [RKN: 45522]
Applied Mathematical Finance (2011) 18 (5-6) : 423-446.
Motivated by the increasing interest in past-dependent asset pricing models, shown in recent years by market practitioners and
prominent authors such as Hobson and Rogers (1998, Complete models with stochastic volatility, Mathematical Finance, 8(1):
27-48), we explore option pricing techniques for arithmetic Asian options under a stochastic delay differential equation
approach. We obtain explicit closed-form expressions for a number of lower and upper bounds and compare their accuracy
numerically.


Available via Athens: Taylor & Francis Online
http://www.openathens.net

Bias reduction for pricing American options by least-squares Monte Carlo. Kan, Kin Hung Felix; Reesor, Mark R Routledge,
[RKN: 45837]
Shelved at: Per. AMF
Applied Mathematical Finance (2012) 19 (3-4) : 195-217.
We derive an approximation to the bias in regression-based Monte Carlo estimators of American option values. This derivation
holds for general asset-price processes of any dimensionality and for general pay-off structures. It uses the large sample
properties of least-squares regression estimators. Bias-corrected estimators result by subtracting the bias approximation from the
uncorrected estimator at each exercise opportunity. Numerical results show that the bias-corrected estimator outperforms its
uncorrected counterpart across all combinations of number of exercise opportunities, option moneyness and sample size. Finally,
the results suggest significant computational efficiency increases can be realized through trivial parallel implementations using the
corrected estimator.
Available via Athens: Taylor & Francis Online
http://www.openathens.net

Bonds and options in exponentially affine bond models. Bermin, Hans-Peter [RKN: 45881]
Shelved at: Per. AMF
Applied Mathematical Finance (2012) 19 (5-6) : 513-534.
In this article we apply the Flesaker-Hughston approach to invert the yield curve and to price various options by letting the
randomness in the economy be driven by a process closely related to the short rate, called the abstract short rate. This process is
a pure deterministic translation of the short rate itself, and we use the deterministic shift to calibrate the models to the initial yield
curve. We show that we can solve for the shift needed in closed form by transforming the problem to a new probability measure.
Furthermore, when the abstract short rate follows a Cox-Ingersoll-Ross (CIR) process we compute bond option and swaption
prices in closed form. We also propose a short-rate specification under the risk-neutral measure that allows the yield curve to be
inverted and is consistent with the CIR dynamics for the abstract short rate, thus giving rise to closed form bond option and
swaption prices.
Available via Athens: Taylor & Francis Online
http://www.openathens.net

Computing bounds on the expected payoff of Alternative Risk Transfer products. Villegas, Andrés M; Medaglia, Andrés L; Zuluaga,
Luis F [RKN: 44786]
Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51(2) : 271-281.
The demand for integrated risk management solutions and the need for new sources of capital have led to the development of
innovative risk management products that mix the characteristics of traditional insurance and financial products. Such products,
usually referred as Alternative Risk Transfer (ART) products include: (re)insurance contracts that bundle several risks under a
single policy; multi-trigger products where the payment of benefits depends upon the occurrence of several events; and insurance
linked securities that place insurance risks in the capital market. Pricing of these complex products usually requires tailor-made
complex valuation methods that combine derivative pricing and actuarial science techniques for each product, as well as strong
distributional assumptions on the ART's underlying risk factors. We present here an alternative methodology to compute bounds
on the price of ART products when there is limited information on the distribution of the underlying risk factors. In particular, we
develop a general optimization-based method that computes upper and lower price bounds for different ART products using
market data and possibly expert information about the underlying risk factors. These bounds are useful when the structure of the
product is too complex to develop analytical or simulation valuation methods, or when the scarcity of data makes it difficult to make
strong distributional assumptions on the risk factors. We illustrate our results by computing bounds on the price of a floating
retention insurance contract, and a catastrophe equity put (CatEPut) option.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Early exercise boundary for American type of floating strike Asian option and its numerical approximation. Bokes, Tomáš;
Ševčovič, Daniel Routledge, [RKN: 45520]
Applied Mathematical Finance (2011) 18 (5-6) : 367-394.
In this article, we generalize and analyse the model for pricing American-style Asian options proposed by Hansen and Jørgensen
(2000) by including a continuous dividend rate q and a general method of averaging the floating strike. We focus on the qualitative
and quantitative analysis of the early exercise boundary. A first-order expansion of the early exercise boundary
close to expiry is constructed. We furthermore propose an efficient numerical algorithm for determining the early exercise
boundary position based on the front-fixing method. Construction of the algorithm is based on a solution to a non-local parabolic
partial differential equation for the transformed variable representing the synthesized portfolio. Various numerical results and
comparisons of our numerical method and the method developed by Dai and Kwok (2006) are presented.


Available via Athens: Taylor & Francis Online
http://www.openathens.net

The effect of correlation and transaction costs on the pricing of basket options. Atkinson, C; Ingpochai, P Routledge, [RKN:
45797]
Shelved at: Per. AMF
Applied Mathematical Finance (2012) 19 (1-2) : 131-179.
In this article, we examine the problem of evaluating the option price of a European call option written on N underlying assets when
there are proportional transaction costs in the market. Since the portfolio under consideration consists of multiple risky assets,
which makes numerical methods formidable, we use perturbation analyses. The article extends the model for option pricing on
uncorrelated assets, which was proposed by Atkinson and Alexandropoulos (2006, Pricing a European basket option in the
presence of proportional transaction cost, Applied Mathematical Finance, 13(3): 191-214). We determine optimal hedging
strategies as well as option prices on both correlated and uncorrelated assets. The option valuation problem is obtained by
comparing the maximized utility of wealth with and without option liability. The two stochastic control problems, which arise from
the transaction costs, are transformed to free boundary and partial differential equation problems. Once the problems have been
formulated, we establish optimal trading strategies for each of the portfolios. In addition, the optimal hedging strategies can be
found by comparing the trading strategies of the two portfolios. We provide a general procedure for solving N risky assets, which
shows that for small correlations the N-asset problem can be replaced by N(N-1)/2 two-dimensional problems and give numerical
examples for the two risky assets portfolios.
Available via Athens: Taylor & Francis Online
http://www.openathens.net



Fourier transforms, option pricing and controls. Joshi, Mark; Yang, Chao (2011). - Victoria: University of Melbourne, 2011. - 20
pages. [RKN: 74773]
We incorporate a simple and effective control-variate into Fourier inversion formulas for vanilla option prices. The control-variate
used in this paper is the Black-Scholes formula whose volatility parameter is determined in a generic non-arbitrary fashion. We
analyze contour dependence both in terms of value and speed of convergence. We use Gaussian quadrature rules to invert
Fourier integrals, and numerical results suggest that performing the contour integration along the real axis leads to the best pricing
performance.
No.1 (1993) onwards available online. Download as PDF.
http://www.economics.unimelb.edu.au/ACT/papers.shtml

A general formula for option prices in a stochastic volatility model. Chin, Stephen; Dufresne, Daniel Routledge, [RKN: 45842]
Shelved at: Per. AMF
Applied Mathematical Finance (2012) 19 (3-4) : 313-340.
We consider the pricing of European derivatives in a Black-Scholes model with stochastic volatility. We show how Parseval's
theorem may be used to express those prices as Fourier integrals. This is a significant improvement over Monte Carlo simulation.
The main ingredient in our method is the Laplace transform of the ordinary (constant volatility) price of a put or call in the
Black-Scholes model, where the transform is taken with respect to maturity (T); this does not appear to have been used before in
pricing options under stochastic volatility. We derive these formulas and then apply them to the case where volatility is modelled as
a continuous-time Markov chain, the so-called Markov regime-switching model. This model has been used previously in stochastic
volatility modelling, but mostly with only a few states. We show how more states can be used without difficulty, and how a larger number of states
can be handled. Numerical illustrations are given, including the implied volatility curve in two- and three-state models. The curves
have the smile shape observed in practice.
Available via Athens: Taylor & Francis Online
http://www.openathens.net

Good-deal bounds in a regime-switching diffusion market. Donnelly, Catherine Routledge, [RKN: 45525]
Applied Mathematical Finance (2011) 18 (5-6) : 491-515.
We consider option pricing in a regime-switching diffusion market. As the market is incomplete, there is no unique price for a
derivative. We apply the good-deal pricing bounds idea to obtain ranges for the price of a derivative. As an illustration, we calculate
the good-deal pricing bounds for a European call option and we also examine the stability of these bounds when we change the
generator of the Markov chain which drives the regime-switching. We find that the pricing bounds depend strongly on the choice of
the generator.
Available via Athens: Taylor & Francis Online
http://www.openathens.net

One-dimensional pricing of CPPI. Paulot, Louis; Lacroze, Xavier [RKN: 45457]
Applied Mathematical Finance (2011) 18 (3-4) : 207-225.
Constant Proportion Portfolio Insurance (CPPI) is an investment strategy designed to give participation in the performance of a
risky asset while protecting the invested capital. This protection is, however, not perfect and the gap risk must be quantified. CPPI
strategies are path dependent and may have American exercise which makes their valuation complex. A naive description of the
state of the portfolio would involve three or even four variables. In this article we prove that the system can be described as a
discrete-time Markov process in one single variable if the underlying asset follows a process with independent increments. This
yields an efficient pricing scheme using transition probabilities. Our framework is flexible enough to handle most features of traded
CPPIs including profit lock-in and other kinds of strategies with discrete-time reallocation.
Available via Athens: Taylor & Francis Online
http://www.openathens.net
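As a reminder of the strategy being priced, a minimal discrete-time CPPI simulation is sketched below. This is an illustration with invented parameters; it does not reproduce the paper's one-variable Markov reduction or its gap-risk pricing, but it shows the cushion/multiplier mechanics and how a terminal value below the guarantee (the gap risk) can arise between reallocation dates.

    import numpy as np

    def cppi_final_value(risky_returns, guarantee=90.0, multiplier=5.0, v0=100.0, r=0.0):
        """Discrete-time CPPI: at each reallocation date invest multiplier * cushion
        in the risky asset and hold the remainder at the safe per-period rate r."""
        floor = guarantee / (1.0 + r) ** len(risky_returns)   # present value of the guarantee
        v = v0
        for ret in risky_returns:
            cushion = max(v - floor, 0.0)
            exposure = min(multiplier * cushion, v)           # no leverage in this sketch
            v = exposure * (1.0 + ret) + (v - exposure) * (1.0 + r)
            floor *= 1.0 + r
        return v

    rng = np.random.default_rng(0)
    monthly = rng.normal(0.005, 0.06, size=(20000, 60))       # 5 years of monthly risky returns
    finals = np.array([cppi_final_value(row) for row in monthly])
    print("mean terminal value:", round(finals.mean(), 2))
    print("gap-risk frequency, P(V_T < guarantee):", (finals < 90.0).mean())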

Option valuation with a discrete-time double Markovian regime-switching model. Siu, Tak Kuen; Fung, Eric S; Ng, Michael K
Routledge, [RKN: 45524]
Applied Mathematical Finance (2011) 18 (5-6) : 473-490.
This article develops an option valuation model in the context of a discrete-time double Markovian regime-switching (DMRS)
model with innovations having a generic distribution. The DMRS model is more flexible than the traditional Markovian
regime-switching model in the sense that the drift and the volatility of the price dynamics of the underlying risky asset are
modulated by two observable, discrete-time and finite-state Markov chains, so that they are not perfectly correlated. The states of
each of the chains represent states of proxies of (macro)economic factors. Here we consider the situation that one
(macro)economic factor is caused by the other (macro)economic factor. The market model is incomplete, and so there is more
than one equivalent martingale measure. We employ a discrete-time version of the regime-switching Esscher transform to
determine an equivalent martingale measure for valuation. Different parametric distributions for the innovations of the price
dynamics of the underlying risky asset are considered. Simulation experiments are conducted to illustrate the implementation of
the model and to document the impacts of the macroeconomic factors described by the chains on the option prices under various
different parametric models for the innovations.
Available via Athens: Taylor & Francis Online
http://www.openathens.net

Options on realized variance in log-OU models. Drimus, Gabriel G [RKN: 45879]
Shelved at: Per. AMF
Applied Mathematical Finance (2012) 19 (5-6) : 477-494.
We study the pricing of options on realized variance in a general class of Log-OU (Ornstein-Uhlenbeck) stochastic volatility
models. The class includes several important models proposed in the literature. Having as common feature the log-normal law of
instantaneous variance, the application of standard Fourier-Laplace transform methods is not feasible. We derive extensions of
Asian pricing methods, to obtain bounds, in particular, a very tight lower bound for options on realized variance.
Available via Athens: Taylor & Francis Online
http://www.openathens.net


Small-time asymptotics for an uncorrelated local-stochastic volatility model. Forde, Martin; Jacquier, Antoine Routledge, [RKN:
45526]
Applied Mathematical Finance (2011) 18 (5-6) : 517-535.
We add some rigour to the work of Henry-Labordère (Henry-Labordère, P. 2009. "Analysis, Geometry, and Modeling in Finance:
Advanced Methods in Option Pricing", New York, London: Chapman & Hall) on the small-time behaviour of a local-stochastic
volatility model with zero correlation at leading order. We do this using the Freidlin-Wentzell (FW) theory of large deviations for
stochastic differential equations (SDEs), and then converting to a differential geometry problem of computing the shortest
geodesic from a point to a vertical line on a Riemannian manifold, whose metric is induced by the inverse of the diffusion
coefficient. The solution to this variable endpoint problem is obtained using a transversality condition, where the geodesic is
perpendicular to the vertical line under the aforementioned metric. We then establish the corresponding small-time asymptotic
behaviour for call options using Hölder's inequality, and the implied volatility (using a general result in Roper and Rutkowski
(Roper, M. and Rutkowski, M. forthcoming. "A note on the behaviour of the Black-Scholes implied volatility close to expiry".
International Journal of Theoretical and Applied Finance)).
We also derive a series expansion for the implied volatility in the small-maturity limit, in powers of the log-moneyness, and we
show how to calibrate such a model to the observed implied volatility smile in the small-maturity limit.
Available via Athens: Taylor & Francis Online
http://www.openathens.net

Stochastic expansion for the pricing of call options with discrete dividends. Étoré, Pierre; Gobet, Emmanuel Routledge, [RKN:
45839]
Shelved at: Per. AMF
Applied Mathematical Finance (2012) 19 (3-4) : 233-264.
In the context of an asset paying affine-type discrete dividends, we present closed analytical approximations for the pricing of
European vanilla options in the Black-Scholes model with time-dependent parameters. They are obtained using a stochastic
Taylor expansion around a shifted lognormal proxy model. The final formulae are, respectively, first-, second- and third-order
approximations w.r.t. the fixed part of the dividends. Using Cameron-Martin transformations, we provide explicit representations
of the correction terms as Greeks in the Black-Scholes model. The use of Malliavin calculus enables us to provide tight error
estimates for our approximations. Numerical experiments show that this approach yields very accurate results, in particular
compared with known approximations of Bos, Gairat and Shepeleva (2003, Dealing with discrete dividends, Risk Magazine, 16:
109-112) and Veiga and Wystup (2009, Closed formula for options with discrete dividends and its derivatives, Applied
Mathematical Finance, 16(6): 517-531), and quicker than the iterated integration procedure of Haug, Haug and Lewis (2003, Back
to basics: a new approach to the discrete dividend problem, Wilmott Magazine, pp. 37-47) or than the binomial tree method of
Vellekoop and Nieuwenhuis (2006, Efficient pricing of derivatives on assets with discrete dividends, Applied Mathematical
Finance, 13(3), pp. 265-284).
Available via Athens: Taylor & Francis Online: http://www.openathens.net

The stochastic intrinsic currency volatility model : A consistent framework for multiple FX rates and their volatilities. Doust, Paul
Routledge, [RKN: 45876]
Shelved at: Per. AMF
Applied Mathematical Finance (2012) 19 (5-6) : 381-445.
The SABR and the Heston stochastic volatility models are widely used for foreign exchange (FX) option pricing. Although they are
able to reproduce the market's volatility smiles and skews for single FX rates, they cannot be extended to model multiple FX rates
in a consistent way. This is because when two FX rates with a common currency are described by either SABR or Heston
processes, the stochastic process for the third FX rate associated with the three currencies will not be a SABR or a Heston
process. A consistent description of the FX market should be symmetric so that all FX rates are described by the same type of
stochastic process. This article presents a way of doing that using the concept of intrinsic currency values. To model FX volatility
curves, the intrinsic currency framework is extended by allowing the volatility of each intrinsic currency value to be stochastic. This
makes the framework more realistic, while preserving all FX market symmetries. The result is a new SABR-style option pricing
formula and a model that can simultaneously be calibrated to the volatility smiles of all possible currency pairs under
consideration. Consequently, it is more powerful than modelling the volatility curves of each currency pair individually and offers a
methodology for comparing the volatility curves of different currency pairs against each other. One potential application of this is in
the pricing of FX options, such as vanilla options on less liquid currency pairs. Given the volatilities of less liquid currencies
against, for example, USD and EUR, the model could then be used to calculate the volatility smiles of those less liquid currencies
against all other currencies.
Available via Athens: Taylor & Francis Online: http://www.openathens.net

Valuing equity-linked death benefits and other contingent options : A discounted density approach. Gerber, Hans U; Shiu, Elias S
W; Yang, Hailiang [RKN: 45725]
Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 73-92.
Motivated by the Guaranteed Minimum Death Benefits in various deferred annuities, we investigate the calculation of the expected
discounted value of a payment at the time of death. The payment depends on the price of a stock at that time and possibly also on
the history of the stock price. If the payment turns out to be the payoff of an option, we call the contract for the payment a (life)
contingent option. Because each time-until-death distribution can be approximated by a combination of exponential distributions,
the analysis is made for the case where the time until death is exponentially distributed, i.e., under the assumption of a constant
force of mortality. The time-until-death random variable is assumed to be independent of the stock price process which is a
geometric Brownian motion. Our key tool is a discounted joint density function. A substantial series of closed-form formulas is
obtained, for the contingent call and put options, for lookback options, for barrier options, for dynamic fund protection, and for
dynamic withdrawal benefits. In a section on several stocks, the method of Esscher transforms proves to be useful for finding
among others an explicit result for valuing contingent Margrabe options or exchange options. For the case where the contracts
have a finite expiry date, closed-form formulas are found for the contingent call and put options. From these, results for De
Moivre's law are obtained as limits. We also discuss equity-linked death benefit reserves and investment strategies for maintaining
such reserves. The elasticity of the reserve with respect to the stock price plays an important role. Whereas in the most important
applications the stopping time is the time of death, it could be different in other applications, for example, the time of the next
catastrophe.
Available via Athens: Palgrave MacMillan: http://www.openathens.net
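A minimal Python sketch, for orientation only: it estimates the basic quantity E[e^(-delta*tau) (S_tau - K)^+] by Monte Carlo for an exponentially distributed time of death tau independent of a geometric Brownian motion. All parameter values are illustrative and the sketch does not reproduce the closed-form formulas of the paper.

import numpy as np

# Monte Carlo check of a life-contingent call value under a constant force of
# mortality (tau ~ Exp(lam)) independent of a GBM stock; illustrative values only.
rng = np.random.default_rng(0)
S0, K = 100.0, 100.0      # initial stock price and strike
mu, sigma = 0.04, 0.20    # assumed GBM drift and volatility
delta = 0.03              # valuation force of interest
lam = 0.05                # constant force of mortality
n = 1_000_000

tau = rng.exponential(1.0 / lam, size=n)                      # time of death
z = rng.standard_normal(n)
S_tau = S0 * np.exp((mu - 0.5 * sigma**2) * tau + sigma * np.sqrt(tau) * z)
payoff = np.exp(-delta * tau) * np.maximum(S_tau - K, 0.0)    # discounted call payoff at death

print("estimated contingent call value:", payoff.mean())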


OPTIONS

Bonds and options in exponentially affine bond models. Bermin, Hans-Peter [RKN: 45881]
Shelved at: Per. AMF
Applied Mathematical Finance (2012) 19 (5-6) : 513-534.
In this article we apply the Flesaker–Hughston approach to invert the yield curve and to price various options by letting the
randomness in the economy be driven by a process closely related to the short rate, called the abstract short rate. This process is
a pure deterministic translation of the short rate itself, and we use the deterministic shift to calibrate the models to the initial yield
curve. We show that we can solve for the shift needed in closed form by transforming the problem to a new probability measure.
Furthermore, when the abstract short rate follows a Cox–Ingersoll–Ross (CIR) process we compute bond option and swaption
prices in closed form. We also propose a short-rate specification under the risk-neutral measure that allows the yield curve to be
inverted and is consistent with the CIR dynamics for the abstract short rate, thus giving rise to closed form bond option and
swaption prices.
Available via Athens: Taylor & Francis Online
http://www.openathens.net

The British put option. Peskir, Goran; Samee, Farman Routledge, [RKN: 45527]
Applied Mathematical Finance (2011) 18 (5-6) : 537-563.
We present a new put option where the holder enjoys the early exercise feature of American options whereupon his payoff
(deliverable immediately) is the best prediction of the European payoff under the hypothesis that the true drift of the stock price
equals a contract drift. Inherent in this is a protection feature which is key to the British put option. Should the option holder believe
the true drift of the stock price to be unfavourable (based upon the observed price movements) he can substitute the true drift with
the contract drift and minimize his losses. The practical implications of this protection feature are most remarkable as not only can
the option holder exercise at or above the strike price to a substantial reimbursement of the original option price (covering the
ability to sell in a liquid option market completely endogenously) but also when the stock price movements are favourable he will
generally receive higher returns at a lesser price. We derive a closed form expression for the arbitrage-free price in terms of the
rational exercise boundary and show that the rational exercise boundary itself can be characterized as the unique solution to a
nonlinear integral equation. Using these results we perform a financial analysis of the British put option that leads to the
conclusions above and shows that with the contract drift properly selected the British put option becomes a very attractive
alternative to the classic American put.


Available via Athens: Taylor & Francis Online
http://www.openathens.net

Characterization of the American put option using convexity. Xie, Dejun; Edwards, David A; Schleiniger, Gilberto; Zhu, Qinghua
[RKN: 45463]
Applied Mathematical Finance (2011) 18 (3-4) : 353-365.
Understanding the behaviour of the American put option is one of the classic problems in mathematical finance. Considerable
efforts have been made to understand the asymptotic expansion of the optimal early exercise boundary for small time near expiry.
Here we focus on the large-time expansion of the boundary. Based on a recent development of the convexity property, we are able
to establish two integral identities pertaining to the boundary, from which the upper bound of its large-time expansion is derived.
The bound includes parameter dependence in the exponential decay to its limiting value. In addition, these time explicit identities
provide very efficient numerical approximations to the true solution to the problem.

Available via Athens: Taylor & Francis Online
http://www.openathens.net

Closed form approximations for spread options. Venkatramanan, Aanand; Alexander, Carol Routledge, [RKN: 45523]
Applied Mathematical Finance (2011) 18 (5-6) : 447-472.
This article expresses the price of a spread option as the sum of the prices of two compound options. One compound option is to
exchange vanilla call options on the two underlying assets and the other is to exchange the corresponding put options. This way
we derive a new closed form approximation for the price of a European spread option and a corresponding approximation for each
of its price, volatility and correlation hedge ratios. Our approach has many advantages over existing analytical approximations,
which have limited validity and an indeterminacy that renders them of little practical use. The compound exchange option
approximation for European spread options is then extended to American spread options on assets that pay dividends or incur
costs. Simulations quantify the accuracy of our approach; we also present an empirical application to the American crack spread
options that are traded on NYMEX. For illustration, we compare our results with those obtained using the approximation attributed
to Kirk (1996, Correlation in energy markets, in "Managing Energy Price Risk", edited by Kaminski, V., pp. 71–78 (London: Risk
Publications)).
Available via Athens: Taylor & Francis Online
http://www.openathens.net

Comparison of two methods for superreplication. Ekstrom, Erik; Tysk, Johan Routledge, [RKN: 45798]
Shelved at: Per. AMF
Applied Mathematical Finance (2012) 19 (1-2) : 181-193.
We compare two methods for superreplication of options with convex pay-off functions. One method entails the overestimation of
an unknown covariance matrix in the sense of quadratic forms. With this method the value of the superreplicating portfolio is given
as the solution of a linear Black–Scholes (BS)-type equation. In the second method, the choice of quadratic form is made pointwise. This leads to a fully non-linear equation, the so-called Black–Scholes–Barenblatt (BSB) equation, for the value of the
superreplicating portfolio. In general, this value is smaller for the second method than for the first method. We derive estimates for
the difference between the initial values of the superreplicating strategies obtained using the two methods.
Available via Athens: Taylor & Francis Online
http://www.openathens.net

Determination of the probability distribution measures from market option prices using the method of maximum entropy in the
mean. Gzyl, Henryk; Mayoral, Silvia Routledge, [RKN: 45841]
Shelved at: Per. AMF
Applied Mathematical Finance (2012) 19 (3-4) : 299-312.
We consider the problem of recovering the risk-neutral probability distribution of the price of an asset, when the information
available consists of the market price of derivatives of European type having the asset as underlying. The information available
may or may not include the spot value of the asset as data. When we only know the true empirical law of the underlying, our
method will provide a measure that is absolutely continuous with respect to the empirical law, thus making our procedure model
independent. If we assume that the prices of the derivatives include risk premia and/or transaction prices, using this method it is
possible to estimate those values, as well as the no-arbitrage prices. This is of interest not only when the market is not complete,
but also if for some reason we do not have information about the model for the price of the underlying.

Available via Athens: Taylor & Francis Online
http://www.openathens.net

Exchange options under jump-diffusion dynamics. Cheang, Gerald H; Chiarella, Carl [RKN: 45459]
Applied Mathematical Finance (2011) 18 (3-4) : 245-276.
This article extends the exchange option model of Margrabe, where the distributions of both stock prices are log-normal with
correlated Wiener components, to allow the underlying assets to be driven by jump-diffusion processes of the type originally
introduced by Merton. We introduce the Radon–Nikodým derivative process that induces the change of measure from the market measure to an equivalent martingale measure. The choice of parameters in the Radon–Nikodým derivative allows us to price the
option under different financial-economic scenarios. We also consider American style exchange options and provide a
probabilistic interpretation of the early exercise premium.

Available via Athens: Taylor & Francis Online
http://www.openathens.net
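As background for this entry, the classical Margrabe exchange option value that the jump-diffusion model extends can be sketched in a few lines of Python; the formula below is the standard pure-diffusion result (no jumps, no dividends), not the pricing formula derived in the article.

from math import log, sqrt
from scipy.stats import norm

def margrabe(s1, s2, sigma1, sigma2, rho, t):
    """Classical Margrabe value of the option to exchange asset 2 for asset 1
    under correlated lognormal dynamics with no dividends. Baseline only."""
    sigma = sqrt(sigma1**2 - 2.0 * rho * sigma1 * sigma2 + sigma2**2)
    d1 = (log(s1 / s2) + 0.5 * sigma**2 * t) / (sigma * sqrt(t))
    d2 = d1 - sigma * sqrt(t)
    return s1 * norm.cdf(d1) - s2 * norm.cdf(d2)

print(margrabe(100.0, 95.0, 0.25, 0.20, 0.3, 1.0))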

OUTSTANDING CLAIMS

Prediction of outstanding payments in a Poisson cluster model. Jessen, Anders Hedegaard; Mikosch, Thomas; Samorodnitsky,
Gennady [RKN: 45490]
Shelved at: Per: SAJ Shelved at: SCA/ACT
Scandinavian Actuarial Journal (2011) 3 : 214-237.
We consider a simple Poisson cluster model for the payment numbers and the corresponding total payments for insurance claims
arriving in a given year. Due to the Poisson structure one can give reasonably explicit expressions for the prediction of the
payment numbers and total payments in future periods given the past observations of the payment numbers. One can also derive
reasonably explicit expressions for the corresponding prediction errors. In the (a, b) class of Panjer's claim size distributions, these
expressions can be evaluated by simple recursive algorithms. We study the conditions under which the predictions are
asymptotically linear as the number of past payments becomes large. We also demonstrate that, in other regimes, the prediction
may be far from linear. For example, a staircase-like pattern may arise as well. We illustrate how the theory works on real-life data,
also in comparison with the chain ladder method.

PANJER'S CLASS OF COUNTING DISTRIBUTIONS

Prediction of outstanding payments in a Poisson cluster model. Jessen, Anders Hedegaard; Mikosch, Thomas; Samorodnitsky,
Gennady [RKN: 45490]
Shelved at: Per: SAJ Shelved at: SCA/ACT
Scandinavian Actuarial Journal (2011) 3 : 214-237.
We consider a simple Poisson cluster model for the payment numbers and the corresponding total payments for insurance claims
arriving in a given year. Due to the Poisson structure one can give reasonably explicit expressions for the prediction of the
payment numbers and total payments in future periods given the past observations of the payment numbers. One can also derive
reasonably explicit expressions for the corresponding prediction errors. In the (a, b) class of Panjer's claim size distributions, these
expressions can be evaluated by simple recursive algorithms. We study the conditions under which the predictions are
asymptotically linear as the number of past payments becomes large. We also demonstrate that, in other regimes, the prediction
may be far from linear. For example, a staircase-like pattern may arise as well. We illustrate how the theory works on real-life data,
also in comparison with the chain ladder method.
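The "simple recursive algorithms" referred to for the (a, b) class are of the Panjer type. The Python sketch below shows the textbook Panjer recursion for a compound Poisson total (claim counts Poisson, claim sizes on a discrete grid) as orientation only; it is not the prediction formulas of the paper, and the claim-size pmf is a toy example.

import numpy as np

def panjer_compound_poisson(lam, f, smax):
    """Standard Panjer recursion for a compound Poisson distribution:
    N ~ Poisson(lam) (an (a, b, 0) member with a = 0, b = lam),
    claim sizes on {0, 1, 2, ...} with pmf f. Returns P(S = s) for s = 0..smax."""
    f = np.asarray(f, dtype=float)
    g = np.zeros(smax + 1)
    g[0] = np.exp(-lam * (1.0 - f[0]))
    for s in range(1, smax + 1):
        jmax = min(s, len(f) - 1)
        j = np.arange(1, jmax + 1)
        g[s] = (lam / s) * np.sum(j * f[j] * g[s - j])
    return g

# toy claim-size pmf on {0, 1, 2, 3}
g = panjer_compound_poisson(lam=2.0, f=[0.0, 0.5, 0.3, 0.2], smax=20)
print(g.sum(), g[:5])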






PARETO DISTRIBUTION

A Bayesian approach for estimating extreme quantiles under a semiparametric mixture model. Cabras, Stefano; Castellanos,
Maria Eugenia [RKN: 45301]
Shelved at: Per: Astin Bull (Oxf) Shelved at: JOU/AST
ASTIN Bulletin (2011) 41 (1) : 87-106.
In this paper we propose an additive mixture model, where one component is the Generalized Pareto distribution (GPD) that
allows us to estimate extreme quantiles. GPD plays an important role in modeling extreme quantiles for the wide class of
distributions belonging to the maximum domain of attraction of an extreme value model. One of the main difficulties with this modelling approach is the choice of the threshold u, such that all observations greater than u enter into the likelihood function of the GPD model. Difficulties are due to the fact that GPD parameter estimators are sensitive to the choice of u. In this work we estimate
u, and other parameters, using suitable priors in a Bayesian approach. In particular, we propose to model all data, extremes and
non-extremes, using a semiparametric model for data below u, and the GPD for the exceedances over u. In contrast to the usual
estimation techniques for u, in this setup we account for uncertainty on all GPD parameters, including u, via their posterior
distributions. A Monte Carlo study shows that posterior credible intervals also have frequentist coverages. We further illustrate the
advantages of our approach on two applications from insurance.
online access via International Actuarial Association:
http://www.actuaries.org/index.cfm?lang=EN&DSP=PUBLICATIONS&ACT=ASTIN BULLETIN
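For orientation, the classical peaks-over-threshold quantile estimate with a threshold u fixed by hand (the frequentist construction that the paper replaces with a Bayesian treatment of u) can be sketched as follows in Python on toy data; parameter values are illustrative.

import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(1)
losses = rng.pareto(2.5, size=5000) + 1.0      # toy heavy-tailed losses

u = np.quantile(losses, 0.95)                  # threshold chosen by hand
exc = losses[losses > u] - u                   # exceedances over u
xi, _, beta = genpareto.fit(exc, floc=0)       # GPD shape and scale

p = 0.999                                      # target quantile level
zeta_u = (losses > u).mean()                   # empirical P(X > u)
q_hat = u + (beta / xi) * (((1.0 - p) / zeta_u) ** (-xi) - 1.0)
print("estimated 99.9% quantile:", q_hat)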

Modeling with Weibull-Pareto models. Scollnik, David P M; Sun, Chenchen Society of Actuaries, - 13 pages. [RKN: 70138]
Shelved at: Per: NAAJ (Oxf) Per NAAJ (Lon) Shelved at: JOU
North American Actuarial Journal (2012) 16 (2) : 260-272.
In this paper we develop several composite Weibull-Pareto models and suggest their use to model loss payments and other forms
of actuarial data. These models all comprise a Weibull distribution up to a threshold point, and some form of Pareto distribution
thereafter. They are similar in spirit to some composite lognormal-Pareto models that have previously been considered in the
literature. All of these models are applied, and their performance compared, in the context of a real world fire insurance data set.
http://www.soa.org/news-and-publications/publications/journals/naaj/naaj-detail.aspx
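A generic two-piece density of the kind described (a Weibull body below a threshold and a Pareto tail above it) can be sketched as follows in Python. The mixing weight r is left free here, whereas the models in the paper pin it down through continuity and smoothness conditions at the threshold, so this is an illustrative skeleton only.

import numpy as np
from scipy.stats import weibull_min

def composite_weibull_pareto_pdf(x, r, k, lam, theta, alpha):
    """Generic composite density: a truncated Weibull(k, scale=lam) body on
    (0, theta] carrying mass r, and a Pareto(alpha, scale=theta) tail on
    (theta, inf) carrying mass 1 - r. Illustrative skeleton only."""
    x = np.asarray(x, dtype=float)
    body = r * weibull_min.pdf(x, k, scale=lam) / weibull_min.cdf(theta, k, scale=lam)
    tail = (1.0 - r) * alpha * theta**alpha / x**(alpha + 1.0)
    return np.where(x <= theta, body, tail)

xs = np.linspace(0.1, 10.0, 5)
print(composite_weibull_pareto_pdf(xs, r=0.8, k=1.5, lam=2.0, theta=4.0, alpha=2.5))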

Modelling losses and locating the tail with the Pareto Positive Stable distribution. Guillén, Montserrat; Prieto, Faustino; Sarabia,
José María [RKN: 44947]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (3) : 454-461.
This paper focuses on modelling the severity distribution. We directly model the small, moderate and large losses with the Pareto
Positive Stable (PPS) distribution and thus it is not necessary to fix a threshold for the tail behaviour. Estimation with the method of
moments is straightforward. Properties, graphical tests and expressions for value-at-risk and tail value-at-risk are presented.
Furthermore, we show that the PPS distribution can be used to construct a statistical test for the Pareto distribution and to
determine the threshold for the Pareto shape if required. An application to loss data is presented. We conclude that the PPS
distribution can perform better than commonly used distributions when modelling a single loss distribution for moderate and large
losses. This approach avoids the pitfalls of cut-off selection and it is very simple to implement for quantitative risk analysis.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

PARTICIPATING POLICIES

A performance analysis of participating life insurance contracts. Faust, Roger; Schmeiser, Hato; Zemp, Alexandra [RKN: 45733]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 158-171.
Participating life insurance contracts are one of the most important products in the European life insurance market. Even though
these contract forms are very common, very little research has been conducted with respect to their performance. Hence, we conduct a performance analysis to provide decision support for policyholders. We decompose a participating life insurance contract into a term life insurance and a savings part and simulate the cash flow distribution of the latter. Simulation results are
compared with cash flows resulting from two benchmarks investing in the same portfolio of assets but without investment
guarantees and bonus distribution schemes, in order to measure the impact of these two product features. To provide a realistic
picture within the two alternatives, we take transaction costs and wealth transfers between different groups of policyholders into
account. We show that the payoff distribution strongly depends on the initial reserve situation and managerial discretion. Results
indicate that policyholders will in general profit from a better payoff distribution of the participating life insurance compared to a
mutual fund benchmark but not compared to an exchange-traded fund benchmark portfolio.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

PENSION FUND ADMINISTRATION

Pension fund management and conditional indexation. Kleinow, Torsten [RKN: 45300]
Shelved at: Per: Astin Bull (Oxf) Shelved at: JOU/AST
ASTIN Bulletin (2011) 41 (1) : 61-86.
Conditional indexation offers a middle way between defined benefit and defined contribution pension schemes. In this paper, we
consider a fully-funded pension scheme with conditional indexation. We show how the pension fund can be managed to reduce
the risks associated with promised pension benefits when declared benefits are adjusted regularly during the working life. In
particular, we derive an investment strategy that provides protection against underfunding at retirement and which is self-financing
on average. Our results are illustrated in an extensive simulation study.
online access via International Actuarial Association:
http://www.actuaries.org/index.cfm?lang=EN&DSP=PUBLICATIONS&ACT=ASTIN BULLETIN
PENSION FUNDS

On optimal pension management in a stochastic framework with exponential utility. Ma, Qing-Ping [RKN: 44976]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (1) : 61-69.
This paper reconsiders the optimal asset allocation problem in a stochastic framework for defined-contribution pension plans with
exponential utility, which has been investigated by Battocchio and Menoncin [Battocchio, P., Menoncin, F., 2004. Optimal pension
management in a stochastic framework. Insurance: Mathematics and Economics, 34, 79–95]. When there are three types of
asset, cash, bond and stock, and a non-hedgeable wage risk, the optimal pension portfolio composition is horizon dependent for
pension plan members whose terminal utility is an exponential function of real wealth (nominal wealth-to-price index ratio). With
market parameters usually assumed, wealth invested in bond and stock increases as retirement approaches, and wealth invested
in cash asset decreases. The present study also shows that there are errors in the formulation of the wealth process and control
variables in solving the optimization problem in the study of Battocchio and Menoncin, which render their solution erroneous and
lead to wrong results in their numerical simulation.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
PENSION PLANS

Modelling and management of longevity risk: approximations to survivor functions and dynamic hedging. Cairns, Andrew J G
[RKN: 44946]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (3) : 438-453.
This paper looks at the development of dynamic hedging strategies for typical pension plan liabilities using longevity-linked
hedging instruments. Progress in this area has been hindered by the lack of closed-form formulae for the valuation of
mortality-linked liabilities and assets, and the consequent requirement for simulations within simulations. We propose the use of
the probit function along with a Taylor expansion to approximate longevity-contingent values. This makes it possible to develop
and implement computationally efficient, discrete-time delta hedging strategies using q-forwards as hedging instruments. The methods are tested using the model proposed by Cairns, Blake and Dowd (CBD). We find that the probit approximations are generally very accurate, and
that the discrete-time hedging strategy is very effective at reducing risk.

Available via Athens: Palgrave MacMillan
http://www.openathens.net

Optimal asset allocation for DC pension plans under inflation. Han, Nan-wei; Hung, Mao-Wei [RKN: 45734]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 172-181.
In this paper, the stochastic dynamic programming approach is used to investigate the optimal asset allocation for a
defined-contribution pension plan with downside protection under stochastic inflation. The plan participant invests the fund wealth
and the stochastic interim contribution flows into the financial market. The nominal interest rate model is described by the
Cox–Ingersoll–Ross (Cox et al., 1985) dynamics. To cope with the inflation risk, the inflation indexed bond is included in the asset
menu. The retired individuals receive an annuity that is indexed by inflation and a downside protection on the amount of this
annuity is considered. The closed-form solution is derived under the CRRA utility function. Finally, a numerical application is
presented to characterize the dynamic behavior of the optimal investment strategy.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
PENSION SCHEMES

Equity-linked pension schemes with guarantees. Nielsen, J Aase; Sandmann, Klaus; Schlögl, Erik [RKN: 44956]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (3) : 547-564.
This paper analyses the relationship between the level of a return guarantee in an equity-linked pension scheme and the
proportion of an investor's contribution needed to finance this guarantee. Three types of schemes are considered: investment
guarantee, contribution guarantee and surplus participation. The evaluation of each scheme involves pricing an Asian option, for
which relatively tight upper and lower bounds can be calculated in a numerically efficient manner. The authors find a negative (and
for two contract specifications also concave) relationship between the participation in the surplus return of the investment strategy
and the guarantee level in terms of a minimum rate of return. Furthermore, the introduction of the possibility of early termination of
the contract (e.g. due to the death of the investor) has no qualitative and very little quantitative impact on this relationship.
Available via Athens: Palgrave MacMillan
http://www.openathens.net


PENSIONS

A computationally efficient algorithm for estimating the distribution of future annuity values under interest-rate and longevity
risks. Dowd, Kevin; Blake, David; Cairns, Andrew J G Society of Actuaries, - 11 pages. [RKN: 74836]
Shelved at: Per: NAAJ (Oxf) Per NAAJ (Lon) Shelved at: JOU
North American Actuarial Journal (2011) 15 (2) : 237-247.
This paper proposes a computationally efficient algorithm for quantifying the impact of interest-rate risk and longevity risk on the
distribution of annuity values in the distant future. The algorithm simulates the state variables out to the end of the horizon period
and then uses a Taylor series approximation to compute approximate annuity values at the end of that period, thereby avoiding a
computationally expensive simulation-within-simulation problem. Illustrative results suggest that annuity values are likely to rise
considerably but are also quite uncertain. These findings have some unpleasant implications both for defined contribution pension
plans and for defined benefit plan sponsors considering using annuities to hedge their exposure to these risks at some point in the
future.
http://www.soa.org/news-and-publications/publications/journals/naaj/naaj-detail.aspx
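The idea of replacing a full revaluation with a Taylor approximation in the simulated state variables can be illustrated very simply. The Python sketch below expands a level annuity value to first order in a flat yield, with made-up survival probabilities; it is an illustration of the general idea, not the authors' algorithm.

import numpy as np

# First-order Taylor approximation of a(y) = sum_k p_k * exp(-k*y) around y0.
# Survival probabilities p_k below are toy values, not taken from the paper.
ages = np.arange(1, 41)
p = 0.98 ** ages                         # toy k-year survival probabilities

def annuity(y):
    return np.sum(p * np.exp(-ages * y))

y0 = 0.03
a0 = annuity(y0)
da0 = -np.sum(ages * p * np.exp(-ages * y0))    # derivative of a(y) at y0

y_new = 0.035                                   # a simulated yield at the horizon
approx = a0 + da0 * (y_new - y0)                # cheap Taylor revaluation
print(approx, annuity(y_new))                   # compare with exact revaluation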

A risk-based model for the valuation of pension insurance. Chen, An [RKN: 44942]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (3) : 401-409.
In the US, defined benefit plans are insured by the Pension Benefit Guaranty Corporation (PBGC). Taking account of the fact that
the PBGC covers only the residual deficits of the pension fund that the sponsoring company is unable to cover and that the plans can be prematurely terminated, we consider a model that accounts for the joint dynamics of the pension fund's and sponsoring firm's
assets in order to effectively determine the risk-based pension premium for the insurance provided by the PBGC. We obtain a
closed-form pricing formula for this risk-based premium. Its magnitude depends highly on the investment portfolio of the pension
fund and of the sponsoring company as well as the correlation between these two portfolios.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Three retirement decision models for defined contribution pension plan members: A simulation study. MacDonald,
Bonnie-Jeanne; Cairns, Andrew J G [RKN: 14542]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (1) : 1-18.
This paper examines the hypothetical retirement behavior of defined contribution (DC) pension plan participants. Using a Monte
Carlo simulation approach, we compare and discuss three retirement decision models: the two-thirds replacement ratio
benchmark model, the option-value of continued work model and a newly-developed one-year retirement decision model. Unlike
defined benefit (DB) pension plans where economic incentives create spikes in retirement at particular ages, all three retirement
decision models suggest that the retirement ages of DC participants are much more smoothly distributed over a wide range of
ages. We find that the one-year model possesses several advantages over the other two models when representing the
theoretical retirement choice of a DC pension plan participant. First, its underlying theory for retirement decision-making is more
feasible given the distinct features and pension drivers of a DC plan. Second, its specifications produce a more logical relationship
between an individual's decision to retire and his/her age and accumulated retirement wealth. Lastly, although the one-year model is less complex than the option-value model as the DC participant's scope is only one year, the retirement decision is optimal over
all future projected years if projections are made using reasonable financial assumptions.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

PERIODICALS

Editorial: European Actuarial Journal. Hipp, Christian [RKN: 44804]
Shelved at: online only
European Actuarial Journal (2011) 1(1) July : 1-2.
The editors welcome readers to this new international scientific journal which is published and edited by a cooperation of 13
actuarial associations of the following 11 countries: Austria, Belgium, France, Germany, Greece, Hungary, Italy, Poland, Portugal,
Slovenia, and Switzerland. EAJ is the successor of the following six actuarial journals: 1. Belgian Actuarial Bulletin, 2. Blätter der Deutschen Gesellschaft für Versicherungs- und Finanzmathematik, 3. Boletim do Instituto dos Actuários Portugueses, 4. Giornale dell'Istituto Italiano degli Attuari, 5. Mitteilungen der Schweizerischen Aktuarvereinigung/Bulletin de l'Association Suisse des Actuaires, 6. Mitteilungen der Aktuarvereinigung Österreichs (Austria)
Available via Athens: Springer
PERSONAL FINANCIAL PLANNING

Household consumption, investment and life insurance. Bruhn, Kenneth; Steffensen, Mogens [RKN: 45125]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (3) : 315-325.
This paper develops a continuous-time Markov model for utility optimization of households. The household optimizes expected
future utility from consumption by controlling consumption, investments and purchase of life insurance for each person in the
household. The optimal controls are investigated in the special case of a two-person household, and we present graphics
illustrating how differences between the two persons affect the controls.
Available via Athens: Palgrave MacMillan
http://www.openathens.net


PHILOSOPHY

The philosophy of modelling. Edwards, Matthew; Hoosain, Zaid (2012). - London: Staple Inn Actuarial Society, 2012. - [6], 82 pages.
[RKN: 43539]
Shelved at: Online only Shelved at: Online only
Modelling is at the heart of almost all actuarial work. But what exactly is a model? What are we really doing when we design a
model, or run one, or use its results? In this presentation the authors explore the fundamentals of modelling not technical
fundamentals but deeper issues which take us in places to the borderline of philosophy. Areas covered include:
- How has modelling developed over the last millennium?
- What is a model? Why is a model a model?
- What are the component steps in the modelling process, and what are the fundamental assumptions underlying their validity?
- Where does model risk arise, and how can we mitigate it?
- What are the lessons from all this? How can we improve our modelling, and achieve more robust validations?
- And what do Ludwig Wittgenstein, Bertrand Russell, William of Ockham and Thomas Aquinas have to say on the subject?
The presentation is aimed at young and experienced actuaries from all practice areas.
Paper presented to Staple Inn Actuarial Society, 26 June 2012
http://www.sias.org.uk/siaspapers/pastmeetings/view meeting?id=SIASMeetingJune2012

POISSON DISTRIBUTION

Smoothing dispersed counts with applications to mortality data. Djeundje, V A B; Currie, I D Faculty of Actuaries and Institute of
Actuaries; Cambridge University Press, [RKN: 40000]
Shelved at: Per: AAS (Oxf) Per: AAS (Lon)
Annals of Actuarial Science (2011) 5(1) : 33-52.
Mortality data are often classified by age at death and year of death. This classification results in a heterogeneous risk set and this
can cause problems for the estimation and forecasting of mortality. In the modelling of such data, we replace the classical
assumption that the numbers of claims follow the Poisson distribution with the weaker assumption that the numbers of claims have
a variance proportional to the mean. The constant of proportionality is known as the dispersion parameter and it enables us to
allow for heterogeneity; in the case of insurance data the dispersion parameter also allows for the presence of duplicates in a
portfolio. We use both the quasi-likelihood and the extended quasi-likelihood to estimate models for the smoothing and forecasting
of mortality tables jointly with smooth estimates of the dispersion parameters. We present three main applications of our method:
first, we show how taking account of dispersion reduces the volatility of a forecast of a mortality table; second, we smooth mortality
data by amounts, i.e. when deaths are amounts claimed and exposed-to-risk are sums assured; third, we present a joint model for
mortality by lives and by amounts with the property that forecasts by lives and by amounts are consistent. Our methods are
illustrated with data from the Continuous Mortality Investigation.
http://www.actuaries.org.uk/research-and-resources/pages/access-journals
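For orientation, a minimal Python sketch of fitting a Poisson GLM with an exposure offset and estimating a dispersion parameter from the Pearson residuals is given below. The data are toy overdispersed counts and the model is a plain GLM, not the penalized-spline models of the paper.

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 500
age = rng.uniform(40, 90, n)
exposure = rng.uniform(50, 500, n)
true_rate = np.exp(-9.0 + 0.09 * age)
deaths = rng.negative_binomial(5, 5 / (5 + exposure * true_rate))  # overdispersed counts

X = sm.add_constant(age)
model = sm.GLM(deaths, X, family=sm.families.Poisson(), exposure=exposure)
res = model.fit()

phi = np.sum(res.resid_pearson ** 2) / res.df_resid   # Pearson dispersion estimate
print("dispersion parameter:", phi)
print("quasi-Poisson standard errors:", res.bse * np.sqrt(phi))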

POISSON HIDDEN MARKOV

A nonhomogeneous Poisson hidden Markov model for claim counts. Lu, Yi; Zeng, Leilei - 22 pages. [RKN: 70748]
Shelved at: Per: Astin Bull (Oxf) Shelved at: JOU
ASTIN Bulletin (2012) 42 (1) : 181-202.
We propose a nonhomogeneous Poisson hidden Markov model for a time series of claim counts that accounts for both seasonal
variations and random fluctuations in the claims intensity. It assumes that the parameters of the intensity function for the
nonhomogeneous Poisson distribution vary according to an (unobserved) underlying Markov chain. This can apply to natural
phenomena that evolve in a seasonal environment. For example, hurricanes that are subject to random fluctuations (El Niño–La Niña cycles) affect insurance claims. The Expectation-Maximization (EM) algorithm is used to calculate the maximum likelihood
estimators for the parameters of this dynamic Poisson hidden Markov model. Statistical applications of this model to Atlantic
hurricanes and tropical storms data are discussed.
http://www.actuaries.org/index.cfm?lang=EN&DSP=PUBLICATIONS&ACT=ASTIN BULLETIN
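A minimal scaled forward-algorithm sketch for evaluating the likelihood of a two-state Poisson hidden Markov model is given below in Python. The EM fitting step and the nonhomogeneous (seasonal) intensity of the paper are not reproduced, and all parameter values are illustrative.

import numpy as np
from scipy.stats import poisson

def poisson_hmm_loglik(counts, pi, A, lambdas):
    """Scaled forward algorithm for a homogeneous Poisson HMM.
    counts: observed claim counts; pi: initial state probabilities;
    A: transition matrix; lambdas: Poisson mean per state."""
    counts = np.asarray(counts)
    alpha = pi * poisson.pmf(counts[0], lambdas)
    loglik = np.log(alpha.sum())
    alpha /= alpha.sum()
    for y in counts[1:]:
        alpha = (alpha @ A) * poisson.pmf(y, lambdas)
        loglik += np.log(alpha.sum())
        alpha /= alpha.sum()
    return loglik

pi = np.array([0.5, 0.5])
A = np.array([[0.9, 0.1], [0.2, 0.8]])
lambdas = np.array([2.0, 8.0])          # calm vs. active state means
print(poisson_hmm_loglik([1, 2, 9, 7, 3, 0, 10], pi, A, lambdas))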

POISSON PROCESS

Bayesian multivariate Poisson models for insurance ratemaking. Bermudez, Lluis; Karlis, Dimitris [RKN: 40017]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (2) : 226-236.
When actuaries face the problem of pricing an insurance contract that contains different types of coverage, such as a motor
insurance or a homeowners insurance policy, they usually assume that types of claim are independent. However, this assumption
may not be realistic: several studies have shown that there is a positive correlation between types of claim. Here we introduce
different multivariate Poisson regression models in order to relax the independence assumption, including zero-inflated models to
account for excess of zeros and overdispersion. These models have been largely ignored to date, mainly because of their
computational difficulties. Bayesian inference based on MCMC helps to resolve this problem (and also allows us to derive, for
several quantities of interest, posterior summaries to account for uncertainty). Finally, these models are applied to an automobile
insurance claims database with three different types of claim. We analyse the consequences for pure and loaded premiums when
the independence assumption is relaxed by using different multivariate Poisson regression models together with their zero-inflated
versions.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
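The multivariate Poisson structure underlying such models can be illustrated with the usual common-shock construction; a minimal Python simulation sketch follows, with illustrative intensities and without the regression or Bayesian layer of the paper.

import numpy as np

# Common-shock construction of a bivariate Poisson pair:
# X1 = Y1 + Y0, X2 = Y2 + Y0 with independent Poisson components,
# so that Cov(X1, X2) = lam0 > 0. Intensities below are illustrative.
rng = np.random.default_rng(3)
lam1, lam2, lam0 = 0.10, 0.05, 0.02
n = 200_000

y0 = rng.poisson(lam0, n)
x1 = rng.poisson(lam1, n) + y0    # e.g. third-party liability claim counts
x2 = rng.poisson(lam2, n) + y0    # e.g. own-damage claim counts

print("empirical covariance:", np.cov(x1, x2)[0, 1], " target:", lam0)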

Exchange options under jump-diffusion dynamics. Cheang, Gerald H; Chiarella, Carl [RKN: 45459]
Applied Mathematical Finance (2011) 18 (3-4) : 245-276.
This article extends the exchange option model of Margrabe, where the distributions of both stock prices are log-normal with
correlated Wiener components, to allow the underlying assets to be driven by jump-diffusion processes of the type originally
introduced by Merton. We introduce the Radon–Nikodým derivative process that induces the change of measure from the market measure to an equivalent martingale measure. The choice of parameters in the Radon–Nikodým derivative allows us to price the
option under different financial-economic scenarios. We also consider American style exchange options and provide a
probabilistic interpretation of the early exercise premium.

Available via Athens: Taylor & Francis Online
http://www.openathens.net

An improvement of the Berry–Esseen inequality with applications to Poisson and mixed Poisson random sums. Korolev, Victor;
Shevtsova, Irina [RKN: 45780]
Shelved at: Per: SAJ Shelved at: SCA/ACT
Scandinavian Actuarial Journal (2012) 2 : 81-105.
By a modification of the method applied in the study of Korolev & Shevtsova (2009), two inequalities are proved for the uniform distance between the standard normal distribution function Φ and the distribution function F_n of the normalized sum of an arbitrary number n ≥ 1 of independent identically distributed random variables with zero mean, unit variance, and finite third absolute moment β₃. The first of these two inequalities is a structural improvement of the classical Berry–Esseen inequality and also sharpens the best known upper estimate of the absolute constant in the classical Berry–Esseen inequality, since 0.33477(β₃ + 0.429) ≤ 0.33477(1 + 0.429)β₃ < 0.4784β₃ by virtue of the condition β₃ ≥ 1. The latter inequality is applied to lowering the upper estimate of the absolute constant in the analogue of the Berry–Esseen inequality for Poisson random sums to 0.3041, which is strictly less than the least possible value 0.4097 of the absolute constant in the classical Berry–Esseen inequality. As corollaries, the estimates of the rate of convergence in limit theorems for compound mixed Poisson distributions are
refined.
http://www.openathens.net/
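For reference, the classical Berry–Esseen bound that the paper sharpens has the form (in LaTeX notation)

\sup_{x \in \mathbb{R}} \left| F_n(x) - \Phi(x) \right| \le \frac{C \, \beta_3}{\sqrt{n}}, \qquad \beta_3 = \mathbb{E}\,|X_1|^3 ,

so the constants 0.4784, 0.4097 and 0.3041 quoted in the abstract are values of C, or of its analogue for Poisson random sums.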

Modelling dependence in insurance claims process with Lévy copulas. Avanzi, Benjamin; Cassar, Luke C; Wong, Bernard - 35
pages. [RKN: 74747]
Shelved at: Per: Astin Bull (Oxf) Shelved at: JOU
ASTIN Bulletin (2011) 41 (2) : 575-609.
In this paper we investigate the potential of Lévy copulas as a tool for modelling dependence between compound Poisson processes and their applications in insurance. We analyse characteristics regarding the dependence in frequency and dependence in severity allowed by various Lévy copula models. Through the introduction of new Lévy copulas and comparison with the Clayton Lévy copula, we show that Lévy copulas allow for a great range of dependence structures. Procedures for analysing the fit of Lévy copula models are illustrated by fitting a number of Lévy copulas to a set of real data from Swiss workers' compensation insurance. How to assess the fit of these models with respect to the dependence structure exhibited by the dataset is also discussed. Finally, we provide a decomposition of the trivariate compound Poisson process and discuss how trivariate Lévy copulas model dependence in this multivariate setting.
http://www.actuaries.org/index.cfm?lang=EN&DSP=PUBLICATIONS&ACT=ASTIN BULLETIN
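As background, the Clayton Lévy copula used as the benchmark in the comparison is usually written (for positive jump parts, in LaTeX notation) as

F_\theta(u, v) = \left( u^{-\theta} + v^{-\theta} \right)^{-1/\theta}, \qquad \theta > 0 ,

with small θ approaching independence of the jump components and large θ approaching complete positive dependence; the new Lévy copulas introduced in the paper are not reproduced here.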

On a multi-threshold compound Poisson surplus process with interest. Mitric, Ilie-Radu [RKN: 45353]
Shelved at: Per: SAJ Shelved at: SCA/ACT
Scandinavian Actuarial Journal (2011) 2 : 75-95.
We consider a multi-threshold compound Poisson surplus process. When the initial surplus is between any two consecutive
thresholds, the insurer has the option to choose the respective premium rate and interest rate. Also, the model allows for
borrowing the current amount of deficit whenever the surplus falls below zero. Starting from the integro-differential equations
satisfied by the Gerber-Shiu function that appear in Yang et al. (2008), we consider exponentially and phase-type(2) distributed
claim sizes, in which cases we are able to transform the integro-differential equations into ordinary differential equations. As a
result, we obtain explicit expressions for the Gerber-Shiu function.

On maximum likelihood and pseudo-maximum likelihood estimation in compound insurance models with deductibles. Paulsen,
Jostein; Stubo, Knut [RKN: 45298]
Shelved at: Per: Astin Bull (Oxf) Shelved at: JOU/AST
ASTIN Bulletin (2011) 41 (1) : 1-28.
Non-life insurance payouts consist of two factors: claimsizes and claim frequency. When calculating e.g. next year's premium, it is vital to model these factors correctly and to estimate the unknown parameters. A standard way is to estimate the claimsize and the claim frequency models separately.
Often there is a deductible with each single claim, and this deductible can be quite large, particularly in inhomogeneous cases
such as industrial fire insurance or marine insurance. Not taking the deductibles into account can lead to serious bias in the
estimates and consequent implications when applying the model.
When the deductibles are nonidentical, in a full maximum likelihood estimation all unknown parameters have to be estimated
simultaneously. An alternative is to use pseudo-maximum likelihood, i.e. first estimate the claimsize model, taking the deductibles
into account, and then use the estimated probability that a claim exceeds the deductible as an offset in the claim frequency
estimation. This latter method is less efficient, but due to complexity or time considerations, it may be the preferred option.
In this paper we will provide rather general formulas for the relative efficiency of the pseudo maximum likelihood estimators in the
i.i.d. case. Two special cases will be studied in detail, and we conclude the paper by comparing the methods on some marine
insurance data.
online access via International Actuarial Association:
http://www.actuaries.org/index.cfm?lang=EN&DSP=PUBLICATIONS&ACT=ASTIN BULLETIN

On the moments of aggregate discounted claims with dependence introduced by a FGM copula. Bargès, Mathieu; Cossette,
Hélène; Loisel, Stéphane; Marceau, Étienne [RKN: 45306]
Shelved at: Per: Astin Bull (Oxf) Shelved at: JOU/AST
ASTIN Bulletin (2011) 41 (1) : 215-238.
In this paper, we investigate the computation of the moments of the compound Poisson sums with discounted claims when
introducing dependence between the interclaim time and the subsequent claim size. The dependence structure between the two
random variables is defined by a Farlie-Gumbel-Morgenstern copula. Assuming that the claim distribution has finite moments, we
give expressions for the first and the second moments and then we obtain a general formula for any mth order moment. The
results are illustrated with applications to premium calculation and approximations based on moment matching methods.
online access via International Actuarial Association:
http://www.actuaries.org/index.cfm?lang=EN&DSP=PUBLICATIONS&ACT=ASTIN BULLETIN
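A minimal Python simulation sketch of the dependence structure in question (interclaim time and subsequent claim size coupled through an FGM copula) is given below. The margins, parameter values and the Monte Carlo estimate of the first moment are illustrative and do not use the paper's closed-form expressions.

import numpy as np

rng = np.random.default_rng(4)

def fgm_pair(n, theta):
    """Sample (U, V) from the Farlie-Gumbel-Morgenstern copula
    C(u, v) = u*v*(1 + theta*(1-u)*(1-v)) by conditional inversion."""
    u = rng.uniform(size=n)
    w = rng.uniform(size=n)
    a = theta * (1.0 - 2.0 * u)
    a_safe = np.where(np.abs(a) < 1e-12, 1.0, a)
    v_formula = ((1.0 + a_safe) - np.sqrt((1.0 + a_safe) ** 2 - 4.0 * a_safe * w)) / (2.0 * a_safe)
    v = np.where(np.abs(a) < 1e-12, w, v_formula)
    return u, v

# Aggregate discounted claims sum_i X_i * exp(-delta * T_i) over a horizon T,
# with exponential interclaim times and exponential claim sizes tied together
# by the FGM copula. All numbers are illustrative.
theta, lam, beta, delta, T, nsim = 0.8, 1.0, 10.0, 0.04, 10.0, 20_000
totals = np.zeros(nsim)
for i in range(nsim):
    t = 0.0
    while True:
        u, v = fgm_pair(1, theta)
        t += -np.log(1.0 - u[0]) / lam          # next interclaim time
        if t > T:
            break
        x = -beta * np.log(1.0 - v[0])          # dependent claim size
        totals[i] += x * np.exp(-delta * t)

print("estimated first moment of aggregate discounted claims:", totals.mean())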

Optimal dividend policies for compound Poisson processes : The case of bounded dividend rates. Azcue, Pablo; Muler, Nora
[RKN: 45721]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 26-42.
We consider in this paper the optimal dividend problem for an insurance company whose uncontrolled reserve process evolves as
a classical Cramér–Lundberg model with arbitrary claim-size distribution. Our objective is to find the dividend payment policy
which maximizes the cumulative expected discounted dividend pay-outs until the time of bankruptcy imposing a ceiling on the
dividend rates. We characterize the optimal value function as the unique bounded viscosity solution of the associated
Hamilton–Jacobi–Bellman equation. We prove that there exists an optimal dividend strategy and that this strategy is stationary
with a band structure. We study the regularity of the optimal value function. We find a characterization result to check optimality
even in the case where the optimal value function is not differentiable. We construct examples where the claim-size distribution is
smooth but the optimal dividend policy is not threshold and the optimal value function is not differentiable. We study the survival
probability of the company under the optimal dividend policy. We also present examples where the optimal dividend policy has
infinitely many bands even in the case that the claim-size distribution has a bounded density.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Prediction of outstanding payments in a Poisson cluster model. Jessen, Anders Hedegaard; Mikosch, Thomas; Samorodnitsky,
Gennady [RKN: 45490]
Shelved at: Per: SAJ Shelved at: SCA/ACT
Scandinavian Actuarial Journal (2011) 3 : 214-237.
We consider a simple Poisson cluster model for the payment numbers and the corresponding total payments for insurance claims
arriving in a given year. Due to the Poisson structure one can give reasonably explicit expressions for the prediction of the
payment numbers and total payments in future periods given the past observations of the payment numbers. One can also derive
reasonably explicit expressions for the corresponding prediction errors. In the (a, b) class of Panjer's claim size distributions, these
expressions can be evaluated by simple recursive algorithms. We study the conditions under which the predictions are
asymptotically linear as the number of past payments becomes large. We also demonstrate that, in other regimes, the prediction
may be far from linear. For example, a staircase-like pattern may arise as well. We illustrate how the theory works on real-life data,
also in comparison with the chain ladder method.

Prediction uncertainty in the Bornhuetter-Ferguson claims reserving method: revisited. Alai, D H; Merz, M; Wüthrich, Mario V
Faculty of Actuaries and Institute of Actuaries; Cambridge University Press, [RKN: 39998]
Shelved at: Per: AAS (Oxf) Per: AAS (Lon)
Annals of Actuarial Science (2011) 5(1) : 7-17.
We revisit the stochastic model of Alai et al. (2009) for the Bornhuetter-Ferguson claims reserving method, Bornhuetter &
Ferguson (1972). We derive an estimator of its conditional mean square error of prediction (MSEP) using an approach that is
based on generalized linear models and maximum likelihood estimators for the model parameters. This approach leads to simple
formulas, which can easily be implemented in a spreadsheet.
http://www.actuaries.org.uk/research-and-resources/pages/access-journals

Pricing compound Poisson processes with the Farlie–Gumbel–Morgenstern dependence structure. Marri, Fouad; Furman,
Edward [RKN: 45732]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 151-157.
Convenient expressions for the Esscher pricing functional in the context of the compound Poisson processes with dependent loss
amounts and loss inter-arrival times are developed. To this end, the moment generating function of the aforementioned dependent
processes is derived and studied. Various implications of the dependence are discussed and exemplified numerically.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Randomized observation periods for the compound Poisson risk model: dividends. Albrecher, Hansjörg; Cheung, Eric C K;
Thonhauser, Stefan - 28 pages. [RKN: 74900]
Shelved at: Per: Astin Bull (Oxf) Shelved at: JOU
ASTIN Bulletin (2011) 41 (2) : 645-672.
In the framework of the classical compound Poisson process in collective risk theory, we study a modification of the horizontal
dividend barrier strategy by introducing random observation times at which dividends can be paid and ruin can be observed. This
model contains both the continuous-time and the discrete-time risk model as a limit and represents a certain type of bridge
between them which still enables the explicit calculation of moments of total discounted dividend payments until ruin. Numerical
illustrations for several sets of parameters are given and the effect of random observation times on the performance of the
dividend strategy is studied.
http://www.actuaries.org/index.cfm?lang=EN&DSP=PUBLICATIONS&ACT=ASTIN BULLETIN

Refinements of two-sided bounds for renewal equations. Woo, Jae-Kyung [RKN: 40012]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (2) : 189-196.
Many quantities of interest in the study of renewal processes may be expressed as the solution to a special type of integral
equation known as a renewal equation. The main purpose of this paper is to provide bounds for the solution of renewal equations
based on various reliability classifications. Exponential and nonexponential types of inequalities are derived. In particular,
two-sided bounds with specific reliability conditions become sharp. Finally, some examples including ultimate ruin for the classical
Poisson model with time-dependent claim sizes, the joint distribution of the surplus prior to and at ruin, and the excess life time, are
provided.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
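For readers unfamiliar with the term, a renewal equation is the standard integral equation (in LaTeX notation)

Z(t) = z(t) + \int_0^t Z(t - x) \, \mathrm{d}F(x), \qquad t \ge 0 ,

where z is a known function, F is the inter-renewal (e.g. claim inter-arrival) distribution and Z is the unknown function for which the paper derives bounds.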

Ruin probabilities for a regenerative Poisson gap generated risk process. Asmussen, Søren; Biard, Romain [RKN: 44802]
Shelved at: online only
European Actuarial Journal (2011) 1(1) July : 3-22.
A risk process with constant premium rate c and Poisson arrivals of claims is considered. A threshold r is defined for claim
interarrival times, such that if k consecutive interarrival times are larger than r, then the next claim has distribution G. Otherwise,
the claim size distribution is F. Asymptotic expressions for the infinite horizon ruin probabilities are given for both the light- and the heavy-tailed cases. A basic observation is that the process regenerates at each G-claim. Also an approach via Markov additive
processes is outlined, and heuristics are given for the distribution of the time to ruin.
Available via Athens: Springer

POLICYHOLDERS

Stochastic comparisons for allocations of policy limits and deductibles with applications. Lu, ZhiYi; Meng, LiLi [RKN: 45127]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (3) : 338-343.
In this paper, we study the problem of comparing losses of a policyholder who has an increasing utility function when the form of
coverage is policy limit and deductible. The total retained losses of a policyholder [formula] are ordered in the usual stochastic
order sense when X_i (i = 1, …, n) are ordered with respect to the likelihood ratio order. The parallel results for the case of deductibles are obtained in the same way. It is shown that the ordering of the losses is related to the characteristics (log-concavity or
log-convexity) of distributions of the risks. As an application of the comparison results, the optimal problems of allocations of policy
limits and deductibles are studied in usual stochastic order sense and the closed-form optimal solutions are obtained in some
special cases.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

POLLUTION

The endogenous price dynamics of emission allowances and an application to CO2 option pricing. Chesney, Marc; Taschini, Luca
[RKN: 45878]
Shelved at: Per. AMF
Applied Mathematical Finance (2012) 19 (5-6) : 447-475.
Market mechanisms are increasingly being used as a tool for allocating somewhat scarce but unpriced rights and resources, and
the European Emission Trading Scheme is an example. By means of dynamic optimization in the context of firms covered by such
environmental regulations, this article generates endogenously the price dynamics of emission permits under asymmetric
information, allowing inter-temporal banking and borrowing. In the market, there are a finite number of firms and each firm's
pollution emission follows an exogenously given stochastic process. We prove the discounted permit price is a martingale with
respect to the relevant filtration. The model is solved numerically. Finally, a closed-form pricing formula for European-style options
is derived.

Available via Athens: Taylor & Francis Online
http://www.openathens.net

PORTFOLIO INSURANCE

One-dimensional pricing of CPPI. Paulot, Louis; Lacroze, Xavier [RKN: 45457]
Applied Mathematical Finance (2011) 18 (3-4) : 207-225.
Constant Proportion Portfolio Insurance (CPPI) is an investment strategy designed to give participation in the performance of a
risky asset while protecting the invested capital. This protection is, however, not perfect and the gap risk must be quantified. CPPI
strategies are path dependent and may have American exercise which makes their valuation complex. A naive description of the
state of the portfolio would involve three or even four variables. In this article we prove that the system can be described as a
discrete-time Markov process in one single variable if the underlying asset follows a process with independent increments. This
yields an efficient pricing scheme using transition probabilities. Our framework is flexible enough to handle most features of traded
CPPIs including profit lock-in and other kinds of strategies with discrete-time reallocation.

Available via Athens: Taylor & Francis Online
http://www.openathens.net
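The path-dependent mechanics that make CPPI valuation awkward can be seen in a few lines of Python. The sketch below simulates a plain discrete-time CPPI strategy (multiplier m, floor growing at the riskless rate) and records gap events; it illustrates the product's standard mechanics with illustrative parameters only, not the one-dimensional Markov reduction of the article.

import numpy as np

# Plain discrete-time CPPI: the exposure to the risky asset is m times the cushion
# (portfolio value minus a floor accruing at the riskless rate), rebalanced each
# step; a "gap" occurs if the value falls below the floor between rebalances.
rng = np.random.default_rng(5)
V0, floor0, m = 100.0, 90.0, 4.0
r, mu, sigma = 0.02, 0.05, 0.25
T, steps, nsim = 5.0, 5 * 52, 10_000      # weekly rebalancing
dt = T / steps

V = np.full(nsim, V0)
F = np.full(nsim, floor0)
gap = np.zeros(nsim, dtype=bool)
for _ in range(steps):
    cushion = np.maximum(V - F, 0.0)
    expo = np.minimum(m * cushion, V)      # risky exposure, no leverage
    ret = np.exp((mu - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * rng.standard_normal(nsim)) - 1.0
    V = V + expo * ret + (V - expo) * (np.exp(r * dt) - 1.0)
    F = F * np.exp(r * dt)
    gap |= V < F

print("mean terminal value:", V.mean(), " P(gap):", gap.mean())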

Optimal control and dependence modeling of insurance portfolios with Lévy dynamics. Bäuerle, Nicole; Blatter, Anja [RKN: 45134]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (3) : 398-405.
In this paper we are interested in optimizing proportional reinsurance and investment policies in a multidimensional Lévy-driven insurance model. The criterion is that of maximizing exponential utility. Solving the classical Hamilton–Jacobi–Bellman equation yields that the optimal retention level keeps a constant amount of claims regardless of time and the company's wealth level. A special feature of our construction is to allow for dependencies of the risk reserves in different business lines. Dependence is modeled via an Archimedean Lévy copula. We derive a sufficient and necessary condition for an Archimedean Lévy generator to create a multidimensional positive Lévy copula in arbitrary dimension. Based on these results we identify structure conditions for the generator and the Lévy measure of an Archimedean Lévy copula under which an insurance company reinsures a larger
fraction of claims from one business line than from another.

Available via Athens: Palgrave MacMillan
http://www.openathens.net

Portfolio insurance under a risk-measure constraint. De Franco, Carmine; Tankov, Peter [RKN: 44938]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (3) : 361-370.
The authors study the problem of portfolio insurance from the point of view of a fund manager, who guarantees to the investor that
the portfolio value at maturity will be above a fixed threshold. If, at maturity, the portfolio value is below the guaranteed level, a third
party will refund the investor up to the guarantee. In exchange for this protection, the third party imposes a limit on the risk
exposure of the fund manager, in the form of a convex monetary risk measure. The fund manager therefore tries to maximize the
investor's utility function subject to the risk-measure constraint. The authors give a full solution to this non-convex optimization
problem in the complete market setting and show in particular that the choice of the risk measure is crucial for the optimal portfolio
to exist. Explicit results are provided for the entropic risk measure (for which the optimal portfolio always exists) and for the class
of spectral risk measures (for which the optimal portfolio may fail to exist in some cases).
Available via Athens: Palgrave MacMillan
http://www.openathens.net

PORTFOLIO INVESTMENT

Extreme Events: Robust Portfolio Construction in the Presence of Fat Tails. Kemp, Malcolm H D (2011). - Chichester: John Wiley
& Sons Ltd, 2011. - 312 pages. [RKN: 13141]
Shelved at: EF/JNH (Lon) Shelved at: 368.01 KEM
Markets are fat-tailed; extreme outcomes occur more often than many might hope, or indeed the statistics or normal distributions
might indicate. In this book, the author provides readers with the latest tools and techniques on how best to adapt portfolio
construction techniques to cope with extreme events. Beginning with an overview of portfolio construction and market drivers, the
book will analyze fat tails, what they are, their behavior, how they can differ and what their underlying causes are. The book will
then move on to look at portfolio construction techniques which take into account fat tailed behavior, and how to stress test your
portfolio against extreme events. Finally, the book will analyze really extreme events in the context of portfolio choice and
problems. The book will offer readers: Ways of understanding and analyzing sources of extreme events, Tools for analyzing the
key drivers of risk and return, their potential magnitude and how they might interact, Methodologies for achieving efficient portfolio
construction and risk budgeting, Approaches for catering for the time-varying nature of the world in which we live, Back-stop
approaches for coping with really extreme events, Illustrations and real life examples of extreme events across asset classes. This
will be an indispensable guide for portfolio and risk managers who will need to better protect their portfolios against extreme events
which, within the financial markets, occur more frequently than we might expect.

Portfolio adjusting optimization with added assets and transaction costs based on credibility measures. Zhang, Wei-Guo; Zhang,
Xi-Li; Chen, Yunxia [RKN: 44937]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (3) : 353-360.
In response to changeful financial markets and investors capital, we discuss a portfolio adjusting problem with additional risk
assets and a riskless asset based on credibility theory. We propose two credibilistic meanvariance portfolio adjusting models with
general fuzzy returns, which take lending, borrowing, transaction cost, additional risk assets and capital into consideration in
portfolio adjusting process. We present crisp forms of the models when the returns of risk assets are some deterministic fuzzy
variables such as trapezoidal, triangular and interval types. We also employ a quadratic programming solution algorithm for
obtaining the optimal adjusting strategy. Comparisons of numerical results from different models illustrate the efficiency of the
proposed models and the algorithm.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

PORTFOLIO MANAGEMENT

Dynamic portfolio optimization in discrete-time with transaction costs. Atkinson, Colin; Quek, Gary Routledge, [RKN: 45840]
Shelved at: Per. AMF
Applied Mathematical Finance (2012) 19 (3-4) : 265-298.
A discrete-time model of portfolio optimization is studied under the effects of proportional transaction costs. A general class of
underlying probability distributions is assumed for the returns of the asset prices. An investor with an exponential utility function
seeks to maximize the utility of terminal wealth by determining the optimal investment strategy at the start of each time step.
Dynamic programming is used to derive an algorithm for computing the optimal value function and optimal boundaries of the
no-transaction region at each time step. In the limit of small transaction costs, perturbation analysis is applied to obtain the optimal
value function and optimal boundaries at any time step in the rebalancing of the portfolio.

Available via Athens: Taylor & Francis Online
http://www.openathens.net
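
Schematically, and in illustrative one-asset notation that is an assumption of this note rather than the authors' exact set-up, the dynamic programme described in this abstract is a backward recursion for an exponential-utility value function with a proportional transaction cost \(\kappa\):
\[
V_{T}(x,y) \;=\; -\exp\{-\gamma\,(x+y)\},
\qquad
V_{t}(x,y) \;=\; \max_{\Delta}\; \mathbb{E}\Big[\,V_{t+1}\big(x-\Delta-\kappa\,|\Delta|,\;(y+\Delta)\,R_{t+1}\big)\Big],
\]
where \(x\) is the cash holding, \(y\) the value held in the risky asset, \(\Delta\) the amount traded, \(R_{t+1}\) the gross asset return and \(\gamma\) the risk-aversion coefficient. The no-transaction region at time \(t\) is the set of states for which the maximum is attained at \(\Delta = 0\); the perturbation analysis mentioned above studies its boundaries for small \(\kappa\).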

Extreme Events: Robust Portfolio Construction in the Presence of Fat Tails. Kemp, Malcolm H D (2011). - Chichester: John Wiley
& Sons Ltd, 2011. - 312 pages. [RKN: 13141]
Shelved at: EF/JNH (Lon) Shelved at: 368.01 KEM
Markets are fat-tailed; extreme outcomes occur more often than many might hope, or indeed the statistics or normal distributions
might indicate. In this book, the author provides readers with the latest tools and techniques on how best to adapt portfolio
construction techniques to cope with extreme events. Beginning with an overview of portfolio construction and market drivers, the
book will analyze fat tails, what they are, their behavior, how they can differ and what their underlying causes are. The book will
then move on to look at portfolio construction techniques which take into account fat tailed behavior, and how to stress test your
portfolio against extreme events. Finally, the book will analyze really extreme events in the context of portfolio choice and
problems. The book will offer readers: Ways of understanding and analyzing sources of extreme events, Tools for analyzing the
key drivers of risk and return, their potential magnitude and how they might interact, Methodologies for achieving efficient portfolio
construction and risk budgeting, Approaches for catering for the time-varying nature of the world in which we live, Back-stop
approaches for coping with really extreme events, Illustrations and real life examples of extreme events across asset classes. This
will be an indispensable guide for portfolio and risk managers who will need to better protect their portfolios against extreme events
which, within the financial markets, occur more frequently than we might expect.

Multi-period mean-variance portfolio selection with regime switching and a stochastic cash flow. Wu, Huiling; Li, Zhongfei [RKN:
45640]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 50 (3) : 371-384.
This paper investigates a non-self-financing portfolio optimization problem under the framework of multi-period mean–variance
with Markov regime switching and a stochastic cash flow. The stochastic cash flow can be explained as capital additions or
withdrawals during the investment process. Specially, the cash flow is the surplus process or the risk process of an insurer at each
period. The returns of assets and amount of the cash flow all depend on the states of a stochastic market which are assumed to
follow a discrete-time Markov chain. We analyze the existence of optimal solutions, and derive the optimal strategy and the
efficient frontier in closed-form. Several special cases are discussed and numerical examples are given to demonstrate the effect
of cash flow.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Portfolio selection and duality under mean variance preferences. Eichner, Thomas [RKN: 39937]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (1) : 146-152.
This paper uses duality to analyze an investor's behavior in an n-asset portfolio selection problem when the investor has mean
variance preferences. The indirect utility and wealth requirement functions are used to derive Roy's identity, Shephard's lemma
and the Slutsky equation. In our simple Slutsky equation the income effect is characterized by decreasing absolute risk aversion
(DARA) and the substitution effect is always positive [negative] with respect to an asset's holding if the asset's mean return [risk]
increases. The substitution effect and income effect work in the same direction provided mean variance preferences display
DARA.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Portfolio selection through an extremality stochastic order. Laniado, Henry; Lillo, Rosa E; Pellerey, Franco; Romo, Juan [RKN:
45718]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 1-9.
In this paper, we introduce a new multivariate stochastic order that compares random vectors in a direction which is determined by
a unit vector, generalizing the previous upper and lower orthant orders. The main properties of this new order, together with its
relationships with other multivariate stochastic orders, are investigated, and we present some examples of application in the
determination of optimal allocations of wealth among risks in single period portfolio problems.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Translation-invariant and positive-homogeneous risk measures and optimal portfolio management in the presence of a riskless
component. Landsman, Zinoviy; Makov, Udi E [RKN: 44994]
Insurance: Mathematics & Economics (2012) 50 (1) : 94-98.
Risk portfolio optimization, with translation-invariant and positive-homogeneous risk measures, leads to the problem of minimizing
a combination of a linear functional and a square root of a quadratic functional for the case of elliptical multivariate underlying
distributions. This problem was recently treated by the authors for the case when the portfolio does not contain a riskless component. When it
does, however, the initial covariance matrix S becomes singular and the problem becomes more complicated. In the paper we
focus on this case and provide an explicit closed-form solution of the minimization problem, and the condition under which this
solution exists. The results are illustrated using data of 10 stocks from the NASDAQ Computer Index.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
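
In illustrative notation, which is an assumption of this sketch rather than necessarily the authors' exact formulation, the optimisation described above takes the form
\[
\min_{x \in \mathbb{R}^{n}} \;\; a^{\top}x \;+\; \lambda\,\sqrt{x^{\top}\Sigma\,x}
\qquad \text{subject to} \qquad \mathbf{1}^{\top}x = 1,
\]
where \(a\) collects the expected losses (or negative expected returns) of the portfolio components, \(\Sigma\) is their covariance matrix and \(\lambda > 0\) is the loading implied by the translation-invariant, positive-homogeneous risk measure. When a riskless component is included, its row and column of \(\Sigma\) are zero, so \(\Sigma\) is singular and the standard first-order treatment no longer applies; the paper gives the closed-form solution for exactly this case.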

TVaR-based capital allocation for multivariate compound distributions with positive continuous claim amounts. Cossette,
Helene; Mailhot, Melina; Marceau, Etienne [RKN: 45598]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 50 (2) : 247-256.
In this paper, we consider a portfolio of n dependent risks X1,…,Xn and we study the stochastic behavior of the aggregate claim
amount S = X1+⋯+Xn. Our objective is to determine the amount of economic capital needed for the whole portfolio and to
compute the amount of capital to be allocated to each risk X1,…,Xn. To do so, we use a top-down approach. For (X1,…,Xn), we
consider risk models based on multivariate compound distributions defined with a multivariate counting distribution. We use the
TVaR to evaluate the total capital requirement of the portfolio based on the distribution of S, and we use the TVaR-based capital
allocation method to quantify the contribution of each risk. To simplify the presentation, the claim amounts are assumed to be
continuously distributed. For multivariate compound distributions with continuous claim amounts, we provide general formulas for
the cumulative distribution function of S, for the TVaR of S and the contribution to each risk. We obtain closed-form expressions for
those quantities for multivariate compound distributions with gamma and mixed Erlang claim amounts. Finally, we treat in detail
the multivariate compound Poisson distribution case. Numerical examples are provided in order to examine the impact of the
dependence relation on the TVaR of S, the contribution to each risk of the portfolio, and the benefit of the aggregation of several
risks.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
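
For a continuously distributed aggregate claim S = X1+⋯+Xn, the TVaR and the TVaR-based allocation referred to above are conventionally written as
\[
\mathrm{TVaR}_\kappa(S) \;=\; \frac{1}{1-\kappa}\int_\kappa^{1} \mathrm{VaR}_u(S)\,du \;=\; \mathbb{E}\big[\,S \mid S > \mathrm{VaR}_\kappa(S)\,\big],
\qquad
\mathrm{TVaR}_\kappa(X_i;S) \;=\; \frac{\mathbb{E}\big[\,X_i\,\mathbf{1}\{S > \mathrm{VaR}_\kappa(S)\}\,\big]}{1-\kappa},
\]
so that the individual contributions sum to the total capital requirement \(\mathrm{TVaR}_\kappa(S)\). This is the standard formulation; the paper's contribution is to derive these quantities in closed form for multivariate compound distributions.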

PREDICTION

Claims development result in the paid-incurred chain reserving method. Happ, Sebastian; Merz, Michael; Wüthrich, Mario V [RKN:
45724]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 66-72.
We present the one-year claims development result (CDR) in the paid-incurred chain (PIC) reserving model. The PIC reserving
model presented in Merz and Wüthrich (2010) is a Bayesian stochastic claims reserving model that considers simultaneously
claims payments and incurred losses information and allows for deriving the full predictive distribution of the outstanding loss
liabilities. In this model we study the conditional mean square error of prediction (MSEP) for the one-year CDR uncertainty, which
is the crucial uncertainty view under Solvency II.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Prediction of outstanding payments in a Poisson cluster model. Jessen, Anders Hedegaard; Mikosch, Thomas; Samorodnitsky,
Gennady [RKN: 45490]
Shelved at: Per: SAJ Shelved at: SCA/ACT
Scandinavian Actuarial Journal (2011) 3 : 214-237.
We consider a simple Poisson cluster model for the payment numbers and the corresponding total payments for insurance claims
arriving in a given year. Due to the Poisson structure one can give reasonably explicit expressions for the prediction of the
payment numbers and total payments in future periods given the past observations of the payment numbers. One can also derive
reasonably explicit expressions for the corresponding prediction errors. In the (a, b) class of Panjer's claim size distributions, these
expressions can be evaluated by simple recursive algorithms. We study the conditions under which the predictions are
asymptotically linear as the number of past payments becomes large. We also demonstrate that, in other regimes, the prediction
may be far from linear. For example, a staircase-like pattern may arise as well. We illustrate how the theory works on real-life data,
also in comparison with the chain ladder method.

PREMIUM CALCULATION

Mean-Value Principle under Cumulative Prospect Theory. Kaluszka, Marek; Krzeszowiec, Michal - 20 pages. [RKN: 70745]
Shelved at: Per: Astin Bull (Oxf) Shelved at: JOU
ASTIN Bulletin (2012) 42 (1) : 103-122.
In the paper we introduce a generalization of the mean-value principle under Cumulative Prospect Theory. This new method
involves some well-known ways of pricing insurance contracts described in the actuarial literature. Properties of this premium
principle, such as translation and scale invariance, additivity for independent risks, risk loading and others are studied.
http://www.actuaries.org/index.cfm?lang=EN&DSP=PUBLICATIONS&ACT=ASTIN BULLETIN

On the interplay between distortion, mean value and Haezendonck-Goovaerts risk measures. Goovaerts, Marc; Linders, Daniel;
Van Weert, Koen; Tank, Fatih [RKN: 45719]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 10-18.
In actuarial research, distortion, mean value and Haezendonck–Goovaerts risk measures are concepts that are usually treated
separately. In this paper we indicate and characterize the relation between these different risk measures, as well as their relation
to convex risk measures. While it is known that the mean value principle can be used to generate premium calculation principles,
we show how it can also be used to generate solvency calculation principles. Moreover, we explain the role of distortion risk
measures as an extension of the Tail Value-at-Risk (TVaR) and Conditional Tail Expectation (CTE).
Available via Athens: Palgrave MacMillan
http://www.openathens.net

PREMIUM PRINCIPLES

Conditional tail expectation and premium calculation. Heras, Antonio; Balbás, Beatriz; Vilar, José Luis - 18 pages. [RKN: 70753]
Shelved at: Per: Astin Bull (Oxf) Shelved at: JOU
ASTIN Bulletin (2012) 42 (1) : 325-342.
In this paper we calculate premiums which are based on the minimization of the Expected Tail Loss or Conditional Tail Expectation
(CTE) of absolute loss functions. The methodology generalizes well known premium calculation procedures and gives sensible
results in practical applications. The choice of the absolute loss becomes advisable in this context since its CTE is easy to
calculate and to understand in intuitive terms. The methodology also can be applied to the calculation of the VaR and CTE of the
loss associated with a given premium.
http://www.actuaries.org/index.cfm?lang=EN&DSP=PUBLICATIONS&ACT=ASTIN BULLETIN
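
Reading only from the abstract, so the notation here is an illustrative assumption rather than the authors' exact statement, the premium studied is the minimiser of the Conditional Tail Expectation of the absolute loss,
\[
P^{\ast} \;=\; \operatorname*{arg\,min}_{P}\; \mathrm{CTE}_{\alpha}\big(\,|X-P|\,\big),
\qquad
\mathrm{CTE}_{\alpha}(Y) \;=\; \mathbb{E}\big[\,Y \mid Y > \mathrm{VaR}_{\alpha}(Y)\,\big],
\]
where X is the insured loss and P the premium; the same machinery then yields the VaR and CTE of the loss associated with any given premium P.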

PREMIUMS

Calibrating affine stochastic mortality models using term assurance premiums. Russo, Vincenzo; Giacometti, Rosella; Ortobelli,
Sergio; Rachev, Svetlozar; Fabozzi, Frank J [RKN: 44975]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (1) : 53-60.
In this paper, we focus on the calibration of affine stochastic mortality models using term assurance premiums. We view term
assurance contracts as a swap in which policyholders exchange cash flows (premiums vs. benefits) with an insurer, analogous to
a generic interest rate swap or credit default swap. Using a simple bootstrapping procedure, we derive the term structure of
mortality rates from a stream of contract quotes with different maturities. This term structure is used to calibrate the parameters of
affine stochastic mortality models where the survival probability is expressed in closed form. The Vasicek, Cox–Ingersoll–Ross,
and jump-extended Vasicek models are considered for fitting the survival probabilities term structure. An evaluation of the
performance of these models is provided with respect to premiums of three Italian insurance companies.

Available via Athens: Palgrave MacMillan
http://www.openathens.net

Estimating the distortion parameter of the proportional-hazard premium for heavy-tailed losses. Brahimi, Brahim; Meraghni,
Djamel; Necir, Abdelhakim; Zitikis, Ricardas [RKN: 44934]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (3) : 325-334.
The distortion parameter reflects the amount of loading in insurance premiums. A specific value of a given premium determines a
value of the distortion parameter, which depends on the underlying loss distribution. Estimating the parameter, therefore,
becomes a statistical inferential problem, which has been initiated by Jones and Zitikis [Jones, B.L., Zitikis, R., 2007. Risk
measures, distortion parameters, and their empirical estimation. Insurance: Mathematics and Economics, 41, 279–297] in the
case of the distortion premium and tackled within the framework of the central limit theorem. Heavy-tailed losses do not fall into
this framework as they rely on the extreme-value theory. In this paper, we concentrate on a special but important distortion
premium, called the proportional-hazard premium, and propose an estimator for its distortion parameter in the case of heavy-tailed
losses. We derive an asymptotic distribution of the estimator, construct a practically implementable confidence interval for the
distortion parameter, and illustrate the performance of the interval in a simulation study.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
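
For reference, the proportional-hazard (PH) premium whose distortion parameter is estimated here can be written, for a non-negative loss X with survival function \(\bar{F}\), as
\[
\Pi_{r}(X) \;=\; \int_{0}^{\infty} \big(\bar{F}(x)\big)^{1/r}\,dx, \qquad r \ge 1,
\]
where larger values of the distortion parameter r correspond to a heavier loading. Given an observed premium level, r is recovered by inverting this relation, which is what turns its estimation into the statistical inference problem discussed above when \(\bar{F}\) is heavy-tailed.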

A joint valuation of premium payment and surrender options in participating life insurance contracts. Schmeiser, H; Wagner, J
[RKN: 44958]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (3) : 580-596.
In addition to an interest rate guarantee and annual surplus participation, life insurance contracts typically embed the right to stop
premium payments during the term of the contract (paid-up option), to resume payments later (resumption option), or to terminate
the contract early (surrender option). Terminal guarantees are on benefits payable upon death, survival and surrender. The latter
are adapted after exercising the options. A model framework including these features and an algorithm to jointly value the
premium payment and surrender options is presented. In a first step, the standard principles of risk-neutral evaluation are applied
and the policyholder is assumed to use an economically rational exercise strategy. In a second step, the sensitivity of the option value to
different contract parameters, benefit adaptation mechanisms, and exercise behavior is analyzed numerically. The latter two are
the main drivers of the option value.

Available via Athens: Palgrave MacMillan
http://www.openathens.net

Log-supermodularity of weight functions, ordering weighted losses, and the loading monotonicity of weighted premiums.
Sendov, Hristo S; Wang, Ying; Zitikis, Ricardas [RKN: 40020]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (2) : 257-264.
The paper is motivated by a problem concerning the monotonicity of insurance premiums with respect to their loading parameter:
the larger the parameter, the larger the insurance premium is expected to be. This property, usually called the loading
monotonicity, is satisfied by premiums that appear in the literature. The increased interest in constructing new insurance
premiums has raised a question as to what weight functions would produce loading-monotonic premiums. In this paper, we
demonstrate a decisive role of log-supermodularity or, equivalently, of total positivity of order 2 (TP2) in answering this question.
As a consequence, we establish, at a stroke, the loading monotonicity of a number of well-known insurance premiums, and offer a
host of further weight functions, and consequently of premiums, thus illustrating the power of the herein suggested methodology
for constructing loading-monotonic insurance premiums.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

A note on subadditivity of zero-utility premiums. Denuit, Michel; Eeckhoudt, Louis; Menegatti, Mario [RKN: 45307]
Shelved at: Per: Astin Bull (Oxf) Shelved at: JOU/AST
ASTIN Bulletin (2011) 41 (1) : 239-250.
Many papers in the literature have adopted the expected utility paradigm to analyze insurance decisions. Insurance companies
manage policies by growing, by adding independent risks. Even if adding risks generally ultimately decreases the probability of
insolvency, the impact on the insurer's expected utility is less clear. Indeed, it is not true that the risk aversion toward the additional
loss generated by a new policy included in an insurance portfolio always decreases with the number of contracts already
underwritten. The present paper derives conditions under which zero-utility premium principles are subadditive for independent
risks. It is shown that subadditivity is the exception rather than the rule: the zero-utility premium principle generates a
superadditive risk premium for most common utility functions. For instance, all completely monotonic utility functions generate
superadditive zero-utility premiums. The main message of the present paper is thus that the zero-utility premium for a marginal
policy is generally not sufficient to guarantee the formation of insurance portfolios without additional capital.
online access via International Actuarial Association:
http://www.actuaries.org/index.cfm?lang=EN&DSP=PUBLICATIONS&ACT=ASTIN BULLETIN

Pricing insurance contracts under Cumulative Prospect Theory. Kaluszka, Marek; Krzeszowiec, Michal [RKN: 45532]
Insurance: Mathematics & Economics (2012) 50 (1) : 159-166.
The aim of this paper is to introduce a premium principle which relies on Cumulative Prospect Theory by Kahneman and Tversky.
Some special cases of this premium principle have already been studied in the actuarial literature. In the paper, properties of this
premium principle are examined.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

PRICE MECHANISM

Multidimensional Lee–Carter model with switching mortality processes. Hainaut, Donatien [RKN: 45597]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 50 (2) : 236-246.
This paper proposes a multidimensional Lee–Carter model, in which the time-dependent components are ruled by switching
regime processes. The main feature of this model is its ability to replicate the changes of regimes observed in the mortality
evolution. Changes of measure, preserving the dynamics of the mortality process under a pricing measure, are also studied. After
a review of the calibration method, a 2D, two-regime model is fitted to the male and female French population for the period
1946–2007. Our analysis reveals that one regime corresponds to longevity conditions observed during the decade following the
second world war, while the second regime is related to longevity improvements observed during the last 30 years. To conclude,
we analyze, in a numerical application, the influence of changes of measure affecting transition probabilities on the prices of life and
death insurance.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

PRICES

The herd behavior index : A new measure for the implied degree of co-movement in stock markets. Dhaene, Jan; Linders, Daniel;
Schoutens, Wim; Vyncke, David [RKN: 45639]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 50 (3) : 357-370.
We introduce a new and easy-to-calculate measure for the expected degree of herd behaviour or co-movement between stock
prices. This forward looking measure is model-independent and based on observed option data. It is baptized the Herd Behavior
Index (HIX). The degree of co-movement in a stock market can be determined by comparing the observed market situation with
the extreme (theoretical) situation under which the whole system is driven by a single factor. The HIX is then defined as the ratio of
an option-based estimate of the risk-neutral variance of the market index and an option-based estimate of the corresponding
variance in case of the extreme single factor market situation. The HIX can be determined for any market index provided an
appropriate series of vanilla options is traded on this index as well as on its components. As an illustration, we determine historical
values of the 30-day HIX for the Dow Jones Industrial Average, covering the period January 2003 to October 2009.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
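
In compact form, the index described in this abstract is the ratio of two option-implied variances,
\[
\mathrm{HIX} \;=\; \frac{\widehat{\operatorname{Var}}^{\,\mathbb{Q}}\big[\text{index}\big]_{\text{observed option prices}}}{\widehat{\operatorname{Var}}^{\,\mathbb{Q}}\big[\text{index}\big]_{\text{comonotonic single-factor scenario}}},
\]
so that, in theory, the HIX lies in (0, 1]: the comonotonic scenario maximises the variance of a sum with given marginals, and values close to 1 indicate that the market prices the components as if they were driven by a single common factor (maximal herd behaviour). The labels in the ratio are descriptive shorthand for the two option-based estimates defined in the paper.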

The tick-by-tick dynamical consistency of price impact in limit order books. Challet, Damien [RKN: 45456]
Applied Mathematical Finance (2011) 18 (3-4) : 189-205.
Constant price impact functions, much used in financial literature, are shown to give rise to paradoxical outcomes as they do not
allow for proper predictability removal: for instance, the exploitation of a single large trade whose size and time of execution are
known in advance to some insider leaves the arbitrage opportunity unchanged, which allows arbitrage exploitation multiple times.
We argue that chain arbitrage exploitation should not exist, which provides an a contrario consistency criterion. Remarkably, all
the stocks investigated in the Paris Stock Exchange have dynamically consistent price impact functions. Both the bid-ask spread
and the feedback of sequential same-side market orders onto both sides of the order book are essential to ensure consistency at
the smallest time scale.

Available via Athens: Taylor & Francis Online
http://www.openathens.net

PRICING

Ambiguity aversion : a new perspective on insurance pricing. Zhao, Lin; Zhu, Wei [RKN: 45304]
Shelved at: Per: Astin Bull (Oxf) Shelved at: JOU/AST
ASTIN Bulletin (2011) 41 (1) : 157-189.
This paper intends to develop a feasible framework which incorporates ambiguity aversion into the pricing of insurance products
and investigate the implications of ambiguity aversion on the pricing by comparing it with risk aversion. As applications of the
framework, we present the closed-form pricing formulae for some insurance products appearing in life insurance and property
insurance. Our model confirms that the effects of ambiguity aversion on the pricing of insurance do differ from those of risk
aversion. Implications of our model are consistent with some empirical evidence documented in the literature. Our results
suggest that taking advantage of a natural hedge mechanism can help us control the effects of model uncertainty.
online access via International Actuarial Association:
http://www.actuaries.org/index.cfm?lang=EN&DSP=PUBLICATIONS&ACT=ASTIN BULLETIN

Contingent claim pricing using a normal inverse Gaussian probability distortion operator. Godin, Frédéric; Mayoral, Silvia;
Morales, Manuel - 26 pages. [RKN: 70433]
Shelved at: Per: J.Risk Ins (Oxf) Shelved at: JOU
Journal of Risk and Insurance (2012) 79 (3) : 841-866.
We consider the problem of pricing contingent claims using distortion operators. This approach was first developed in Wang
(2000), where the original distortion function was defined in terms of the normal distribution. Here, we introduce a new distortion
based on the Normal Inverse Gaussian (NIG) distribution. The NIG is a generalization of the normal distribution that allows for
heavier, skewed tails. The resulting operator asymmetrically distorts the underlying distribution. Moreover, we show how we can
recuperate non-Gaussian Black–Scholes formulas using distortion operators and we provide illustrations of their performance. We
conclude with a brief discussion on risk management applications.
Available online via Athens
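
A minimal sketch of a Wang-style distortion operator built from the Normal Inverse Gaussian distribution, in the spirit of the abstract above. The exact parameterisation and shift used in the paper may differ; the NIG parameters, the shift lam and the toy loss sample below are purely hypothetical illustrations.

import numpy as np
from scipy.stats import norminvgauss
from scipy.integrate import trapezoid

# Hypothetical NIG shape parameters and shift (illustrative only, not the paper's calibration)
a, b, lam = 2.0, -0.5, 0.3
nig = norminvgauss(a, b)

def nig_distortion(u, shift=lam):
    """Wang-style distortion g(u) = F(F^{-1}(u) + shift), with F the NIG cdf."""
    u = np.clip(u, 1e-12, 1 - 1e-12)          # keep the quantile function finite at the endpoints
    return nig.cdf(nig.ppf(u) + shift)

def distorted_expectation(losses, shift=lam):
    """Distorted expectation of a non-negative loss sample,
    computed as the integral of the distorted empirical survival function."""
    losses = np.asarray(losses)
    grid = np.linspace(0.0, losses.max(), 2000)
    survival = np.array([(losses > x).mean() for x in grid])
    return trapezoid(nig_distortion(survival, shift), grid)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    sample = rng.lognormal(mean=0.0, sigma=1.0, size=10_000)   # toy loss data
    print("empirical mean:        ", round(sample.mean(), 4))
    print("distorted expectation: ", round(distorted_expectation(sample), 4))

With a positive shift the distortion satisfies g(u) >= u, so the distorted expectation exceeds the empirical mean, which is the loading effect the operator is meant to produce.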

Corrections to the prices of derivatives due to market incompleteness. German, David [RKN: 45258]
Applied Mathematical Finance (2011) 18 (1-2) : 155-187.
We compute the first-order corrections to marginal utility-based prices with respect to a 'small' number of random endowments in
the framework of three incomplete financial models. They are a stochastic volatility model, a basis risk and market portfolio model
and a credit-risk model with jumps and stochastic recovery rate.

Available via Athens: Taylor & Francis Online
http://www.openathens.net

Efficient algorithms for basket default swap pricing with multivariate Archimedean copulas. Choe, Geon Ho; Jang, Hyun Jin [RKN:
40014]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (2) : 205-213.
We introduce a new importance sampling method for pricing basket default swaps employing exchangeable Archimedean copulas
and nested Gumbel copulas. We establish more realistic dependence structures than existing copula models for credit risks in the
underlying portfolio, and propose an appropriate density for importance sampling by analyzing multivariate Archimedean copulas.
To justify efficiency and accuracy of the proposed algorithms, we present numerical examples and compare them with the crude
Monte Carlo simulation, and finally show that our proposed estimators produce considerably smaller variances.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

The impact of stochastic volatility on pricing, hedging, and hedge efficiency of withdrawal benefit guarantees in variable
annuities. Kling, Alexander; Ruez, Frederik; Russ, Jochen - 35 pages. [RKN: 74745]
Shelved at: Per: Astin Bull (Oxf) Shelved at: JOU
ASTIN Bulletin (2011) 41 (2) : 511-545.
We analyze different types of guaranteed withdrawal benefits for life, the latest guarantee feature within variable annuities.
Besides an analysis of the impact of different product features on the clients' payoff profile, we focus on pricing and hedging of the
guarantees. In particular, we investigate the impact of stochastic equity volatility on pricing and hedging. We consider different
dynamic hedging strategies for delta and vega risks and compare their performance. We also examine the effects if the hedging
model (with deterministic volatility) differs from the data-generating model (with stochastic volatility). This is an indication for the
model risk an insurer takes by assuming constant equity volatilities for risk management purposes, whereas in the real world
volatilities are stochastic.
http://www.actuaries.org/index.cfm?lang=EN&DSP=PUBLICATIONS&ACT=ASTIN BULLETIN

Insurance pricing with complete information, state-dependent utility, and production costs. Ramsay, Colin M; Oguledo, Victor I
[RKN: 45649]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 50 (3) : 462-469.
We consider a group of identical risk-neutral insurers selling single-period indemnity insurance policies. The insurance market
consists of individuals with common state-dependent utility function who are identical except for their known accident probability q.
Insurers incur production costs (commonly called expenses or transaction costs by actuaries) that are proportional to the amount
of insurance purchased and to the premium charged. By introducing the concept of insurance desirability, we prove that the
existence of insurer expenses generates a pair of constants qmin and qmax that naturally partitions the applicant pool into three
mutually exclusive and exhaustive groups of individuals: those individuals with accident probability q ∈ [0,qmin) are insurable but do
not desire insurance, those individuals with accident probability q ∈ [qmin,qmax] are insurable and desire insurance, and those
individuals with accident probability q ∈ (qmax,1] desire insurance but are uninsurable. We also prove that, depending on the level of
q and the marginal rate of substitution between states, it may be optimal for individuals to buy complete (full) insurance, partial
insurance, or no insurance at all. Finally, we prove that when q is known in monopolistic markets (i.e., markets with a single
insurer), applicants may be induced to over-insure whenever partial insurance is bought.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Mean-Value Principle under Cumulative Prospect Theory. Kaluszka, Marek; Krzeszowiec, Michal - 20 pages. [RKN: 70745]
Shelved at: Per: Astin Bull (Oxf) Shelved at: JOU
ASTIN Bulletin (2012) 42 (1) : 103-122.
In the paper we introduce a generalization of the mean-value principle under Cumulative Prospect Theory. This new method
involves some well-known ways of pricing insurance contracts described in the actuarial literature. Properties of this premium
principle, such as translation and scale invariance, additivity for independent risks, risk loading and others are studied.
http://www.actuaries.org/index.cfm?lang=EN&DSP=PUBLICATIONS&ACT=ASTIN BULLETIN

New analytic approach to address put–call parity violation due to discrete dividends. Buryaka, Alexander; Guo, Ivan Routledge,
[RKN: 45794]
Shelved at: Per. AMF
Applied Mathematical Finance (2012) 19 (1-2) : 37-58.
The issue of developing simple Black–Scholes (BS)-type approximations for pricing European options with large discrete
dividends has been popular since the early 2000s, with a few different approaches reported during the last 10 years. Moreover, it has
been claimed that at least some of the resulting expressions represent high-quality approximations which closely match the results
obtained by the use of numerics. In this article we review, on the one hand, these previously suggested BS-type approximations
and, on the other hand, different versions of the corresponding Crank–Nicolson (CN) numerical schemes with a primary focus on
their boundary condition variations. Unexpectedly we often observe substantial deviations between the analytical and numerical
results which may be especially pronounced for European puts. Moreover, our analysis demonstrates that any BS-type
approximation which adjusts put parameters identically to call parameters has an inherent problem of failing to detect a little-known
put–call parity violation phenomenon. To address this issue, we derive a new analytic pricing approximation which is in better
agreement with the corresponding numerical results in comparison with any of the previously known analytic approaches for
European calls and puts with large discrete dividends.
Available via Athens: Taylor & Francis Online
http://www.openathens.net

No-good-deal, local mean-variance and ambiguity risk pricing and hedging for an insurance payment process. Delong, Lukasz -
30 pages. [RKN: 70749]
Shelved at: Per: Astin Bull (Oxf) Shelved at: JOU
ASTIN Bulletin (2012) 42 (1) : 203-232.
We study pricing and hedging for an insurance payment process. We investigate a Black-Scholes financial model with stochastic
coefficients and a payment process with death, survival and annuity claims driven by a point process with a stochastic intensity.
The dependence of the claims and the intensity on the financial market and on an additional background noise (correlated index,
longevity risk) is allowed. We establish a general modeling framework for no-good-deal, local mean-variance and ambiguity risk
pricing and hedging. We show that these three valuation approaches are equivalent under appropriate formulations. We
characterize the price and the hedging strategy as a solution to a backward stochastic differential equation. The results could be
applied to pricing and hedging of variable annuities, surrender options under an irrational lapse behavior and mortality derivatives.
http://www.actuaries.org/index.cfm?lang=EN&DSP=PUBLICATIONS&ACT=ASTIN BULLETIN

On modelling and pricing rainfall derivatives with seasonality. Leobacher, Gunther; Ngare, Philip [RKN: 45254]
Applied Mathematical Finance (2011) 18 (1-2) : 71-91.
We are interested in pricing rainfall options written on precipitation at specific locations. We assume the existence of a tradeable
financial instrument in the market whose price process is affected by the quantity of rainfall. We then construct a suitable
'Markovian gamma' model for the rainfall process which accounts for the seasonal change of precipitation and show how
maximum likelihood estimators can be obtained for its parameters.

Available via Athens: Taylor & Francis Online
http://www.openathens.net

On the spurious correlation between sample betas and mean returns. Levy, Moshe Routledge, [RKN: 45843]
Shelved at: Per. AMF
Applied Mathematical Finance (2012) 19 (3-4) : 341-360.
Cornerstone asset pricing models, such as capital asset pricing model (CAPM) and arbitrage pricing theory (APT), yield
theoretical predictions about the relationship between expected returns and exposure to systematic risk, as measured by beta(s).
Numerous studies have investigated the empirical validity of these models. We show that even if no relationship holds between
true expected returns and betas in the population, the existence of low-probability extreme outcomes induces a spurious
correlation between the sample means and the sample betas. Moreover, the magnitude of this purely spurious correlation is
similar to the empirically documented correlation, and the regression slopes and intercepts are very similar as well. This result
does not necessarily constitute evidence against the theoretical asset pricing models, but it does shed new light on previous
empirical results, and it points to an issue that should be carefully considered in the empirical testing of these models. The analysis
points to the dangers of relying on simple least squares regression for drawing conclusions about the validity of equilibrium pricing
models.

Available via Athens: Taylor & Francis Online
http://www.openathens.net

Pricing compound Poisson processes with the Farlie–Gumbel–Morgenstern dependence structure. Marri, Fouad; Furman,
Edward [RKN: 45732]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 151-157.
Convenient expressions for the Esscher pricing functional in the context of the compound Poisson processes with dependent loss
amounts and loss inter-arrival times are developed. To this end, the moment generating function of the aforementioned dependent
processes is derived and studied. Various implications of the dependence are discussed and exemplified numerically.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Pricing fixed-income securities in an information-based framework. Hughston, Lane P; Macrina, Andrea Routledge, [RKN: 45844]
Shelved at: Per. AMF
Applied Mathematical Finance (2012) 19 (3-4) : 361-379.
The purpose of this article is to introduce a class of information-based models for the pricing of fixed-income securities. We
consider a set of continuous-time processes that describe the flow of information concerning market factors in a monetary
economy. The nominal pricing kernel is assumed to be given at any specified time by a function of the values of information
processes at that time. Using a change-of-measure technique, we derive explicit expressions for the prices of nominal discount
bonds and deduce the associated dynamics of the short rate of interest and the market price of risk. The interest rate positivity
condition is expressed as a differential inequality. An example that shows how the model can be calibrated to an arbitrary initial
yield curve is presented. We proceed to model the price level, which is also taken at any specified time to be given by a function of
the values of the information processes at that time. A simple model for a stochastic monetary economy is introduced in which the
prices of the nominal discount bonds and inflation-linked notes can be expressed in terms of aggregate consumption and the
liquidity benefit generated by the money supply.
Available via Athens: Taylor & Francis Online
http://www.openathens.net

Pricing of Parisian options for a jump-diffusion model with two-sided jumps. Albrecher, Hansjorg; Kortschak, Dominik; Zhou,
Xiaowen Routledge, [RKN: 45796]
Shelved at: Per. AMF
Applied Mathematical Finance (2012) 19 (1-2) : 97-129.
Using the solution of one-sided exit problem, a procedure to price Parisian barrier options in a jump-diffusion model with two-sided
exponential jumps is developed. By extending the method developed in Chesney, Jeanblanc-Picqué and Yor (1997; Brownian
excursions and Parisian barrier options, Advances in Applied Probability, 29(1), pp. 165–184) for the diffusion case to the more
general set-up, we arrive at a numerical pricing algorithm that significantly outperforms Monte Carlo simulation for the prices of
such products.
Available via Athens: Taylor & Francis Online
http://www.openathens.net

Statistical analysis of model risk concerning temperature residuals and its impact on pricing weather derivatives. Ahcan, Ales
[RKN: 44998]
Insurance: Mathematics & Economics (2012) 50 (1) : 131-138.
In this paper we model the daily average temperature via an extended version of the standard Ornstein–Uhlenbeck process driven
by Lévy noise with a seasonally adjusted asymmetric ARCH process for volatility. More precisely, we model the disturbances with
the Normal inverse Gaussian (NIG) and Variance gamma (VG) distributions. Besides modelling the residuals, we also compare the
prices of January 2010 out-of-the-money call and put options for two of the largest Slovenian cities, Ljubljana and Maribor, under
normally distributed disturbances and NIG and VG distributed disturbances. The results of our numerical analysis demonstrate
that the normal model fails to adequately capture tail risk, and consequently significantly misprices out-of-the-money options. On
the other hand, prices obtained using NIG and VG distributed disturbances fit well to the results obtained by bootstrapping the
residuals. Thus one should take extreme care in choosing the appropriate statistical model.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

A time-dependent variance model for pricing variance and volatility swaps. Goard, Joanna [RKN: 45253]
Applied Mathematical Finance (2011) 18 (1-2) : 51-70.
Analytic solutions are found for prices of variance and volatility swaps under a new time-dependent stochastic model for the
dynamics of variance. The main features of the new stochastic differential equation are (1) an empirically validated diffusion term
and (2) a free function of time as a moving target in a reversion term, allowing additional flexibility for model calibration against
market data.

Available via Athens: Taylor & Francis Online
http://www.openathens.net

The uncertain mortality intensity framework: pricing and hedging unit-linked life insurance contracts. Li, Jing; Szimayer,
Alexander [RKN: 44949]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (3) : 471-486.
We study the valuation and hedging of unit-linked life insurance contracts in a setting where mortality intensity is governed by a
stochastic process. We focus on model risk arising from different specifications for the mortality intensity. To do so we assume that
the mortality intensity is almost surely bounded under the statistical measure. Further, we restrict the equivalent martingale
measures and apply the same bounds to the mortality intensity under these measures. For this setting we derive upper and lower
price bounds for unit-linked life insurance contracts using stochastic control techniques. We also show that the induced hedging
strategies indeed produce a dynamic superhedge and subhedge under the statistical measure in the limit when the number of
contracts increases. This justifies the bounds for the mortality intensity under the pricing measures. We provide numerical
examples investigating fixed-term, endowment insurance contracts and their combinations including various guarantee features.
The pricing partial differential equation for the upper and lower price bounds is solved by finite difference methods. For our
contracts and choice of parameters the pricing and hedging is fairly robust with respect to misspecification of the mortality
intensity. The model risk resulting from the uncertain mortality intensity is of minor importance.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

PRIVATE MEDICAL INSURANCE

Modeling dependent yearly claim totals including zero claims in private health insurance. Erhardt, Vinzenz; Czado, Claudia [RKN:
45781]
Shelved at: Per: SAJ Shelved at: SCA/ACT
Scandinavian Actuarial Journal (2012) 2 : 106-129.
In insurance applications yearly claim totals of different coverage fields are often dependent. In many cases there are numerous
claim totals which are zero. A marginal claim distribution will have an additional point mass at zero, hence this probability function
(pf) will not be continuous at zero and the cumulative distribution functions will not be uniform. Therefore using a copula approach
to model dependency is not straightforward. We will illustrate how to express the joint pf by copulas with discrete and continuous
margins. A pair-copula construction will be used for the fit of the continuous copula allowing to choose appropriate copulas for
each pair of margins.
http://www.openathens.net/

PROBABILITY

Bayesian theory and methods with applications. Savchuk, Vladimir; Tsokos, Chris P (2011). Atlantis Press, 2011. - 317 pages.
[RKN: 74707]
Shelved at: 519.5
Bayesian methods are growing more and more popular, finding new practical applications in the fields of health sciences,
engineering, environmental sciences, business and economics and social sciences, among others. This book explores the use of
Bayesian analysis in the statistical estimation of the unknown phenomenon of interest. The contents demonstrate that where such
methods are applicable, they offer the best possible estimate of the unknown. Beyond presenting Bayesian theory and methods of
analysis, the text is illustrated with a variety of applications to real world problems.

A coherent aggregation framework for stress testing and scenario analysis. Kwiatkowski, Jan; Rebonato, Riccardo [RKN: 45257]
Applied Mathematical Finance (2011) 18 (1-2) : 139-154.
We present a methodology to aggregate in a coherent manner conditional stress losses in a trading or banking book. The
approach bypasses the specification of unconditional probabilities of the individual stress events and ensures, by a linear
programming approach, that the (subjective or frequentist) conditional probabilities chosen by the risk manager are internally
consistent. The admissibility requirement greatly reduces the degree of arbitrariness in the conditional probability matrix if this is
assigned subjectively. The approach can be used to address the requirements of the regulators on the Instantaneous Risk
Charge.

Available via Athens: Taylor & Francis Online
http://www.openathens.net

Markov decision processes with applications to finance. Bäuerle, Nicole; Rieder, Ulrich (2011). - London: Springer, 2011. - 388
pages. [RKN: 73684]
Shelved at: 368.01
The theory of Markov decision processes focuses on controlled Markov chains in discrete time. The authors establish the theory
for general state and action spaces and at the same time show its application by means of numerous examples, mostly taken from
the fields of finance and operations research. By using a structural approach many technicalities (concerning measure theory) are
avoided. They cover problems with finite and infinite horizons, as well as partially observable Markov decision processes,
piecewise deterministic Markov decision processes and stopping problems. The book presents Markov decision processes in
action and includes various state-of-the-art applications with a particular view towards finance. It is useful for upper-level
undergraduates, Master's students and researchers in both applied probability and finance, and provides exercises (without
solutions).

Tail distortion risk and its asymptotic analysis. Zhu, Li; Li, Haijun [RKN: 45728]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 115-121.
A distortion risk measure used in finance and insurance is defined as the expected value of potential loss under a scenario
probability measure. In this paper, the tail distortion risk measure is introduced to assess tail risks of excess losses modeled by the
right tails of loss distributions. The asymptotic linear relation between tail distortion and value-at-risk is derived for heavy-tailed
losses with the linear proportionality constant depending only on the distortion function and the tail index. Various examples
involving tail distortions for location-invariant, scale-invariant, and shape-invariant loss distribution families are also presented to
illustrate the results.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
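
For a non-negative loss X with survival function \(\bar{F}\), the distortion risk measure referred to above is conventionally written as
\[
\rho_{g}(X) \;=\; \int_{0}^{\infty} g\big(\bar{F}(x)\big)\,dx,
\]
where \(g : [0,1] \to [0,1]\) is increasing with \(g(0)=0\) and \(g(1)=1\). The tail distortion risk measure of the paper applies a measure of this type to the right tail of the loss distribution, which is what produces the asymptotic proportionality to value-at-risk for heavy-tailed \(\bar{F}\).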

PROBABILITY DISTRIBUTION

Determination of the probability distribution measures from market option prices using the method of maximum entropy in the
mean. Gzyl, Henryk; Mayoral, Silvia Routledge, [RKN: 45841]
Shelved at: Per. AMF
Applied Mathematical Finance (2012) 19 (3-4) : 299-312.
We consider the problem of recovering the risk-neutral probability distribution of the price of an asset, when the information
available consists of the market price of derivatives of European type having the asset as underlying. The information available
may or may not include the spot value of the asset as data. When we only know the true empirical law of the underlying, our
method will provide a measure that is absolutely continuous with respect to the empirical law, thus making our procedure model
independent. If we assume that the prices of the derivatives include risk premia and/or transaction prices, using this method it is
possible to estimate those values, as well as the no-arbitrage prices. This is of interest not only when the market is not complete,
but also if for some reason we do not have information about the model for the price of the underlying.

Available via Athens: Taylor & Francis Online
http://www.openathens.net

On the Lp-metric between a probability distribution and its distortion. López-Díaz, Miguel; Sordo, Miguel A; Suárez-Llorens, Alfonso
[RKN: 44784]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51(2) : 257-264.
In actuarial theory, the Lp-metric is used to evaluate how well a probability distribution approximates another one. In the context
of the distorted expectation hypothesis, the actuary replaces the original probability distribution by a distorted probability, so it
makes sense to interpret the Lp-metric between them as a characteristic of the underlying random variable. We show in this
paper that this is a characteristic of the variability of the random variable, study its properties and give some applications.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

PROBABILITY THEORY

Laws of small numbers : Extremes and rare events. Falk, Michael; Hüsler, Jürg; Reiss, Rolf-Dieter (2011). - 3rd, revised and extended
ed. Birkhäuser, 2011. - 509 pages. [RKN: 73659]
Shelved at: 519.24 22 LAW
"The intention of the book is to give a mathematically oriented development of the theory of rare events underlying various
applications. This characteristic of the book was strengthened in the second edition by incorporating various new results. In this
third edition, the dramatic change of focus of extreme value theory has been taken into account: from concentrating on maxima of
observations it has shifted to large observations, defined as exceedances over high thresholds. One emphasis of the present third
edition lies on multivariate generalized Pareto distributions, their representations, properties such as their peaks-over-threshold
stability, simulation, testing and estimation."

PRODUCTION LAG

Cobweb Theorems with production lags and price forecasting. Dufresne, Daniel; Vazquez-Abad, Felisa J (2012). - Victoria:
University of Melbourne, 2012. - 27 pages. [RKN: 73800]
The classical cobweb theorem is extended to include production lags and price forecasts. Price forecasting based on a longer
period has a stabilizing effect on prices. Longer production lags do not necessarily lead to unstable prices; very long lags lead to
cycles of constant amplitude. The classical cobweb requires elasticity of demand to be greater than that of supply; this is not
necessarily the case in a more general setting, where price forecasting has a stabilizing effect. Random shocks are also considered.
No.1 (1993) onwards available online. Download as PDF.
http://www.economics.unimelb.edu.au/ACT/papers.shtml
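
A toy simulation of the mechanism described above: linear demand and supply, a one-period production lag, and a price forecast equal to the average of the last k observed prices. All functional forms and parameter values here are hypothetical illustrations, not the authors' model.

import numpy as np

def cobweb(k=1, periods=60, a=10.0, b=1.0, c=1.0, d=1.2, p0=4.0):
    """Cobweb dynamics: supply is decided one period ahead from a forecast price,
    and the market-clearing price solves a - b*p_t = c + d*forecast_t.
    k = number of past prices averaged to form the forecast (k = 1 is the classical cobweb)."""
    prices = [p0]
    for _ in range(periods):
        forecast = np.mean(prices[-k:])          # moving-average price forecast
        p_next = (a - c - d * forecast) / b      # market-clearing price
        prices.append(p_next)
    return np.array(prices)

if __name__ == "__main__":
    # With supply slope d exceeding demand slope b, the classical cobweb (k = 1) oscillates with growing amplitude ...
    naive = cobweb(k=1)
    # ... while forecasting over a longer window damps the oscillations towards the equilibrium (a - c) / (b + d).
    smoothed = cobweb(k=4)
    print("last 5 prices, naive forecast:   ", np.round(naive[-5:], 3))
    print("last 5 prices, 4-period forecast:", np.round(smoothed[-5:], 3))

Running the sketch shows the stabilizing effect of a longer forecasting window that the abstract describes: the one-period forecast diverges for these parameters, while the four-period average converges.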

PROSPECT THEORY

Pricing insurance contracts under Cumulative Prospect Theory. Kaluszka, Marek; Krzeszowiec, Michal [RKN: 45532]
Insurance: Mathematics & Economics (2012) 50 (1) : 159-166.
The aim of this paper is to introduce a premium principle which relies on Cumulative Prospect Theory by Kahneman and Tversky.
Some special cases of this premium principle have already been studied in the actuarial literature. In the paper, properties of this
premium principle are examined.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

QUANTITATIVE ANALYSIS

Remarks on quantiles and distortion risk measures. Dhaene, Jan; Kukush, Alexander; Linders, Daniël; Tang, Qihe [RKN: 44853]
Shelved at: online only
European Actuarial Journal (2012) 2(2) December : 319-328.
Distorted expectations can be expressed as weighted averages of quantiles. In this note, we show that this statement is
essentially true, but that one has to be careful with the correct formulation of it. Furthermore, the proofs of the additivity property for
distorted expectations of a comonotonic sum that appear in the literature often do not cover the case of a general distortion
function. We present a straightforward proof for the general case, making use of the appropriate expressions for distorted
expectations in terms of quantiles
Available via Athens: Springer -- Published online, December 2012
http://www.openathens.net

QUANTITATIVE METHODS

Alarm system for insurance companies : A strategy for capital allocation. Das, S; Kratz, M [RKN: 45723]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 53-65.
One possible way of risk management for an insurance company is to develop an early and appropriate alarm system before the
possible ruin. The ruin is defined through the status of the aggregate risk process, which in turn is determined by premium
accumulation as well as claim settlement outgo for the insurance company. The main purpose of this work is to design an effective
alarm system, i.e. to define alarm times and to recommend augmentation of capital of suitable magnitude at those points to reduce
the chance of ruin. To draw a fair measure of the effectiveness of the alarm system, a comparison is drawn between an alarm system, with
capital being added at the sound of every alarm, and the corresponding system without any alarm but an equivalently higher initial
capital. Analytical results are obtained in a general setup and this is backed up by simulated performances with various types of loss
severity distributions. This provides a strategy for suitably spreading out the capital and yet addressing survivability concerns at
factory level.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

RATEMAKING

Bayesian multivariate Poisson models for insurance ratemaking. Bermudez, Lluis; Karlis, Dimitris [RKN: 40017]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (2) : 226-236.
When actuaries face the problem of pricing an insurance contract that contains different types of coverage, such as a motor
insurance or a homeowner's insurance policy, they usually assume that types of claim are independent. However, this assumption
may not be realistic: several studies have shown that there is a positive correlation between types of claim. Here we introduce
different multivariate Poisson regression models in order to relax the independence assumption, including zero-inflated models to
account for excess of zeros and overdispersion. These models have been largely ignored to date, mainly because of their
computational difficulties. Bayesian inference based on MCMC helps to resolve this problem (and also allows us to derive, for
several quantities of interest, posterior summaries to account for uncertainty). Finally, these models are applied to an automobile
insurance claims database with three different types of claim. We analyse the consequences for pure and loaded premiums when
the independence assumption is relaxed by using different multivariate Poisson regression models together with their zero-inflated
versions.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
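
One simple way to see the positive correlation between claim types that motivates this paper is the common-shock construction of a bivariate Poisson distribution. This is a standard textbook device, not the authors' full Bayesian regression model, and the claim rates below are hypothetical.

import numpy as np

rng = np.random.default_rng(42)

# Hypothetical claim frequencies per policy-year (illustrative only)
lam1, lam2, lam12 = 0.08, 0.05, 0.02   # type-specific rates and a shared-shock rate
n_policies = 100_000

# Common-shock bivariate Poisson: N1 = A + C, N2 = B + C with A, B, C independent Poisson
a = rng.poisson(lam1, n_policies)
b = rng.poisson(lam2, n_policies)
c = rng.poisson(lam12, n_policies)
n1, n2 = a + c, b + c

# Marginals are Poisson(lam1 + lam12) and Poisson(lam2 + lam12),
# and Cov(N1, N2) = lam12 > 0, so the two claim counts are positively correlated.
print("mean counts:", n1.mean(), n2.mean())
print("sample covariance:", np.cov(n1, n2)[0, 1], "(theoretical:", lam12, ")")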

RATES AND RATING

Comparison and bounds for functionals of future lifetimes consistent with life tables. Barz, Christiane; Müller, Alfred [RKN: 45596]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 50 (2) : 229-235.
We derive a new crossing criterion of hazard rates to identify a stochastic order relation between two random variables. We apply
this crossing criterion in the context of life tables to derive stochastic ordering results among three families of fractional age
assumptions: the family of linear force of mortality functions, the family of quadratic survival functions and the power family.
Further, this criterion is used to derive tight bounds for functionals of future lifetimes that exhibit an increasing force of mortality
with given one-year survival probabilities. Numerical examples illustrate our findings.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

RATIONAL NUMBERS

Rational thinking : Letter to the editor. Karsten, H Staple Inn Actuarial Society, - 1 pages. [RKN: 73908]
Shelved at: Per: Actuary (Oxf) Per: Actuary (Lon) Shelved at: JOU
The Actuary (2011) October : 6.
Outlining the background to computable numbers.
http://www.theactuary.com/

REGRESSION

Robust–efficient credibility models with heavy-tailed claims: A mixed linear models perspective. Dornheim, Harald; Brazauskas,
Vytaras [RKN: 39365]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (1) : 72-84.
In actuarial practice, regression models serve as a popular statistical tool for analyzing insurance data and tariff ratemaking. In this
paper, we consider classical credibility models that can be embedded within the framework of mixed linear models. For inference
about fixed effects and variance components, likelihood-based methods such as (restricted) maximum likelihood estimators are
commonly pursued. However, it is well-known that these standard and fully efficient estimators are extremely sensitive to small
deviations from hypothesized normality of random components as well as to the occurrence of outliers. To obtain better estimators
for premium calculation and prediction of future claims, various robust methods have been successfully adapted to credibility
theory in the actuarial literature. The objective of this work is to develop robust and efficient methods for credibility when
heavy-tailed claims are approximately log-location-scale distributed. To accomplish that, we first show how to express additive
credibility models such as Bühlmann–Straub and Hachemeister ones as mixed linear models with symmetric or asymmetric
errors. Then, we adjust adaptively truncated likelihood methods and compute highly robust credibility estimates for the ordinary
but heavy-tailed claims part. Finally, we treat the identified excess claims separately and find robust-efficient credibility premiums.
Practical performance of this approach is examined, via simulations, under several contaminating scenarios. A widely studied
real-data set from workers' compensation insurance is used to illustrate functional capabilities of the new robust credibility
estimators.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Statistical analysis of the spreads of catastrophe bonds at the time of issue. Papachristou, Dimitris [RKN: 45308]
Shelved at: Per: Astin Bull (Oxf) Shelved at: JOU/AST
ASTIN Bulletin (2011) 41 (1) : 251-273.
In this paper the catastrophe bond prices, as determined by the market, are analysed. The limited published work in this area has
been carried out mainly by cat bond investors and is based either on intuition, or on simple linear regression on one factor or on
comparisons of the prices of cat bonds with similar features. In this paper a Generalised Additive Model is fitted to the market data.
The statistical significance of different factors which may affect the cat bond prices is examined and the effect of these factors on
the prices is measured. A statistical framework and analysis could provide insight into the cat bond pricing and could have
applications among other things in the construction of a cat bond portfolio, cat bond price indices and in understanding changes of
the price of risk over time.
online access via International Actuarial Association:
http://www.actuaries.org/index.cfm?lang=EN&DSP=PUBLICATIONS&ACT=ASTIN BULLETIN

REGULATION

A utility-based comparison of pension funds and life insurance companies under regulatory constraints. Broeders, Dirk; Chen,
An; Koos, Birgit [RKN: 44969]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (1) : 1-10.
This paper compares two different types of annuity providers, i.e. defined benefit pension funds and life insurance companies.
One of the key differences is that the residual risk in pension funds is collectively borne by the beneficiaries and the sponsor's
shareholders while in the case of life insurers it is borne by the external shareholders. First, this paper employs a contingent claim
approach to evaluate the risk-return trade-off for annuitants. For that, we take into account the differences in contract specifications
and in regulatory regimes. Second, a welfare analysis is conducted to examine whether a consumer with power utility experiences
utility gains if she chooses a defined benefit plan or a life annuity contract over a defined contribution plan. We demonstrate that
regulation can be designed to support a level playing field amongst different financial institutions.

Available via Athens: Palgrave MacMillan
http://www.openathens.net

REINSURANCE

Computing bounds on the expected payoff of Alternative Risk Transfer products. Villegas, Andrés M; Medaglia, Andrés L; Zuluaga,
Luis F [RKN: 44786]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51(2) : 271-281.
The demand for integrated risk management solutions and the need for new sources of capital have led to the development of
innovative risk management products that mix the characteristics of traditional insurance and financial products. Such products,
usually referred to as Alternative Risk Transfer (ART) products, include: (re)insurance contracts that bundle several risks under a
single policy; multi-trigger products where the payment of benefits depends upon the occurrence of several events; and insurance
linked securities that place insurance risks in the capital market. Pricing of these complex products usually requires tailor-made
complex valuation methods that combine derivative pricing and actuarial science techniques for each product, as well as strong
distributional assumptions on the ART's underlying risk factors. We present here an alternative methodology to compute bounds
on the price of ART products when there is limited information on the distribution of the underlying risk factors. In particular, we
develop a general optimization-based method that computes upper and lower price bounds for different ART products using
market data and possibly expert information about the underlying risk factors. These bounds are useful when the structure of the
product is too complex to develop analytical or simulation valuation methods, or when the scarcity of data makes it difficult to make
strong distributional assumptions on the risk factors. We illustrate our results by computing bounds on the price of a floating
retention insurance contract, and a catastrophe equity put (CatEPut) option.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Dividends and reinsurance under a penalty for ruin. Liang, Zhibin; Young, Virginia R [RKN: 45647]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 50 (3) : 437-445.
We find the optimal dividend strategy in a diffusion risk model under a penalty for ruin, as in Thonhauser and Albrecher (2007),
although we allow for both a positive and a negative penalty. Furthermore, we determine the optimal proportional reinsurance
strategy, when so-called expensive reinsurance is available; that is, the premium loading on reinsurance is greater than the
loading on the directly written insurance. One can think of our model as taking the one in Taksar (2000, Section 6) and adding a
penalty for ruin. We use the Legendre transform to obtain the optimal dividend and reinsurance strategies. Not surprisingly, the
optimal dividend strategy is a barrier strategy. Also, we investigate the effect of the penalty P on the optimal strategies. In
particular, we show that the optimal barrier increases with respect to P, while the optimal proportion retained and the value
function decrease with respect to P. In the end, we explore the time of ruin, and find that the expected time of ruin increases with
respect to P under a net profit condition.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Effects of risk management on cost efficiency and cost function of the U.S. Property and liability insurers. Lin, Hong-Jen; Wen,
Min-Ming; Yang, Charles C Society of Actuaries, - 12 pages. [RKN: 74918]
Shelved at: Per: NAAJ (Oxf) Per NAAJ (Lon) Shelved at: JOU
North American Actuarial Journal (2011) 15 (4) : 487-498.
This paper adopts the one-step stochastic frontier approach to investigate the impact of risk management tools of derivatives and
reinsurance on cost efficiency of U.S. property-liability insurance companies. The stochastic frontier approach considers both the
mean and variance of cost efficiency. The sample includes both stock and mutual insurers. Among the findings, the cost function
of the entire sample carries the concavity feature, and insurers tend to use financial derivatives for firm value creation. The results
also show that for the entire sample the use of derivatives enhances the mean of cost efficiency but is accompanied by larger
efficiency volatility. Nevertheless, the utilization of financial derivatives mitigates efficiency volatility for mutual insurers. This
research provides important insights for the practice of risk management in the property-liability insurance industry.
http://www.soa.org/news-and-publications/publications/journals/naaj/naaj-detail.aspx

Exponential change of measure applied to term structures of interest rates and exchange rates. Bo, Lijun [RKN: 44964]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (2) : 216-225.
In this paper, we study the term structures of interest rates and foreign exchange rates through establishing a state-price deflator.
The state-price deflator considered here can be viewed as an extension to the potential representation of the state-price density in
[Rogers, L.C.G., 1997. The potential approach to the term structure of interest rates and foreign exchange rates. Mathematical
Finance 7(2), 157164]. We identify a risk-neutral probability measure from the state-price deflator by a technique of exponential
change of measure for Markov processes proposed by [Palmowski, Z., Rolski, T., 2002. A technique for exponential change of
measure for Markov processes. Bernoulli 8(6), 767785] and present examples of several classes of diffusion processes
(jumpdiffusions and diffusions with regime-switching) to illustrate the proposed theory. A comparison between the exponential
change of measure and the Esscher transform for identifying risk-neutral measures is also presented. Finally, we consider the
exchange rate dynamics by virtue of the ratio of the current state-price deflators between two economies and then discuss the
pricing of currency options.

Available via Athens: Palgrave MacMillan
http://www.openathens.net

Minimising expected discounted capital injections by reinsurance in a classical risk model. Eisenberg, Julia; Schmidli, Hanspeter
[RKN: 45487]
Shelved at: Per: SAJ Shelved at: SCA/ACT
Scandinavian Actuarial Journal (2011) 3 : 155-176.
In this paper we consider a classical continuous time risk model, where the claims are reinsured by some reinsurance with
retention level b, where the maximal value of b means no reinsurance and b=0 means full reinsurance. The insurer
can change the retention level continuously. To prevent negative surplus the insurer has to inject additional capital. The problem is
to minimise the expected discounted cost over all admissible reinsurance strategies. We show that an optimal reinsurance
strategy exists. For some special cases we will be able to give the optimal strategy explicitly. In other cases the method will be
illustrated only numerically.

Optimal dividend and investing control of an insurance company with higher solvency constraints. Liang, Zongxia; Huang,
Jianping [RKN: 44952]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (3) : 501-511.
This paper considers the optimal control problem of a large insurance company under a fixed insolvency probability. The company
controls proportional reinsurance rate, dividend pay-outs and investing process to maximize the expected present value of the
dividend pay-outs until the time of bankruptcy. This paper aims at describing the optimal return function as well as the optimal
policy. As a by-product, the paper theoretically sets a risk-based capital standard to ensure the capital requirement that can cover
the total risk.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Optimal non-proportional reinsurance control and stochastic differential games. Taksar, Michael; Zeng, Xudong [RKN: 38228]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (1) : 64-71.
We study stochastic differential games between two insurance companies who employ reinsurance to reduce risk exposure. We
consider competition between two companies and construct a single payoff function of the two companies' surplus processes. One
company chooses a dynamic reinsurance strategy in order to maximize the payoff function while its opponent is simultaneously
choosing a dynamic reinsurance strategy so as to minimize the same quantity. We describe the Nash equilibrium of the game and
prove a verification theorem for a general payoff function. For the payoff function being the probability that the difference between
the two surplus processes reaches an upper bound before it reaches a lower bound, the game is solved explicitly.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Optimal proportional reinsurance and investment in a stock market with Ornstein–Uhlenbeck process. Liang, Zhibin; Yuen, Kam
Chuen; Guo, Junyi [RKN: 44963]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (2) : 207-215.
In this paper, the authors study the optimal investment and proportional reinsurance strategy when an insurance company wishes
to maximize the expected exponential utility of the terminal wealth. It is assumed that the instantaneous rate of investment return
follows an Ornstein–Uhlenbeck process. Using stochastic control theory and Hamilton–Jacobi–Bellman equations, explicit
expressions for the optimal strategy and value function are derived not only for the compound Poisson risk model but also for the
Brownian motion risk model. Further, we investigate the partially observable optimization problem, and also obtain explicit
expressions for the optimal results.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Optimal reinsurance under VaR and CVaR risk measures. Chi, Yichun; Tan, Ken Seng - 23 pages. [RKN: 74744]
Shelved at: Per: Astin Bull (Oxf) Shelved at: JOU
ASTIN Bulletin (2011) 41 (2) : 487-509.
In this paper, we study two classes of optimal reinsurance models by minimizing the total risk exposure of an insurer under the
criteria of value at risk (VaR) and conditional value at risk (CVaR). We assume that the reinsurance premium is calculated
according to the expected value principle. Explicit solutions for the optimal reinsurance policies are derived over ceded loss
functions with increasing degrees of generality. More precisely, we establish formally that under the VaR minimization model, (i)
the stop-loss reinsurance is optimal among the class of increasing convex ceded loss functions; (ii) when the constraints on both
ceded and retained loss functions are relaxed to increasing functions, the stop-loss reinsurance with an upper limit is shown to be
optimal; (iii) and finally under the set of general increasing and left-continuous retained loss functions, the truncated stop-loss
reinsurance is shown to be optimal. In contrast, under CVaR risk measure, the stop-loss reinsurance is shown to be always
optimal. These results suggest that the VaR-based reinsurance models are sensitive with respect to the constraints imposed on
both ceded and retained loss functions while the corresponding CVaR-based reinsurance models are quite robust.
http://www.actuaries.org/index.cfm?lang=EN&DSP=PUBLICATIONS&ACT=ASTIN BULLETIN
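For reference, the ceded loss functions mentioned above can be written in their standard forms (general reinsurance terminology rather than notation taken from the paper itself):

\[ f_{\text{stop-loss}}(x) = (x-d)_+ , \qquad f_{\text{stop-loss with upper limit}}(x) = \min\{(x-d)_+,\, m\}, \]

where d >= 0 is the retention (deductible) and m > 0 the upper limit on the ceded loss.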

Optimal time-consistent investment and reinsurance policies for mean-variance insurers. Zeng, Yan; Li, Zhongfei [RKN: 44984]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (1) : 145-154.
This paper investigates the optimal time-consistent policies of an investment-reinsurance problem and an investment-only
problem under the mean-variance criterion for an insurer whose surplus process is approximated by a Brownian motion with drift.
The financial market considered by the insurer consists of one risk-free asset and multiple risky assets whose price processes
follow geometric Brownian motions. A general verification theorem is developed, and explicit closed-form expressions of the
optimal polices and the optimal value functions are derived for the two problems. Economic implications and numerical sensitivity
analysis are presented for our results. Our main findings are: (i) the optimal time-consistent policies of both problems are
independent of their corresponding wealth processes; (ii) the two problems have the same optimal investment policies; (iii) the
parameters of the risky assets (the insurance market) have no impact on the optimal reinsurance (investment) policy; (iv) the
premium return rate of the insurer does not affect the optimal policies but affects the optimal value functions; (v) reinsurance can
increase the mean-variance utility.


Available via Athens: Palgrave MacMillan
http://www.openathens.net

Optimal time-consistent investment and reinsurance strategies for insurers under Heston's SV model. Li, Zhongfei; Zeng, Yan;
Lai, Yongzeng [RKN: 45736]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 191-203.
This paper considers the optimal time-consistent investment and reinsurance strategies for an insurer under Heston's stochastic
volatility (SV) model. Such an SV model applied to insurers' portfolio problems has not yet been discussed as far as we know. The
surplus process of the insurer is approximated by a Brownian motion with drift. The financial market consists of one risk-free asset
and one risky asset whose price process satisfies Heston's SV model. Firstly, a general problem is formulated and a verification
theorem is provided. Secondly, the closed-form expressions of the optimal strategies and the optimal value functions for the
mean–variance problem without precommitment are derived under two cases: one is the investment–reinsurance case and the
other is the investment-only case. Thirdly, economic implications and numerical sensitivity analysis are presented for our results.
Finally, some interesting phenomena are found and discussed.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Optimality of general reinsurance contracts under CTE risk measure. Tan, Ken Seng; Weng, Chengguo; Zhang, Yi [RKN: 44960]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (2) : 175-187.
By formulating a constrained optimization model, we address the problem of optimal reinsurance design using the criterion of
minimizing the conditional tail expectation (CTE) risk measure of the insurer's total risk. For completeness, we analyze the optimal
reinsurance model under both binding and unbinding reinsurance premium constraints. By resorting to the Lagrangian approach
based on the concept of directional derivative, explicit and analytical optimal solutions are obtained in each case under some mild
conditions. We show that the pure stop-loss ceded loss function is always optimal. More interestingly, we demonstrate that ceded loss
functions that are not always non-decreasing could be optimal. We also show that, in some cases, it is optimal to exhaust the
entire reinsurance premium budget to determine the optimal reinsurance, while in other cases, it is rational to spend less than the
prescribed reinsurance premium budget.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Parameter uncertainty in exponential family tail estimation. Landsman, Z; Tsanakas, A - 30 pages. [RKN: 70746]
Shelved at: Per: Astin Bull (Oxf) Shelved at: JOU
ASTIN Bulletin (2012) 42 (1) : 123-152.
Actuaries are often faced with the task of estimating tails of loss distributions from just a few observations. Thus estimates of tail
probabilities (reinsurance prices) and percentiles (solvency capital requirements) are typically subject to substantial parameter
uncertainty. We study the bias and MSE of estimators of tail probabilities and percentiles, with focus on 1-parameter exponential
families. Using asymptotic arguments it is shown that tail estimates are subject to significant positive bias. Moreover, the use of
bootstrap predictive distributions, which has been proposed in the actuarial literature as a way of addressing parameter
uncertainty, is seen to double the estimation bias. A bias corrected estimator is thus proposed. It is then shown that the MSE of the
MLE, the parametric bootstrap and the bias corrected estimators only differ in terms of order O(n⁻²), which provides
decision-makers with some flexibility as to which estimator to use. The accuracy of asymptotic methods, even for small samples,
is demonstrated exactly for the exponential and related distributions, while other 1-parameter distributions are considered in a
simulation study. We argue that the presence of positive bias may be desirable in solvency capital calculations, though not
necessarily in pricing problems.
http://www.actuaries.org/index.cfm?lang=EN&DSP=PUBLICATIONS&ACT=ASTIN BULLETIN
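The bias discussed above can be examined by simulation. The minimal Monte Carlo sketch below (illustrative only, not the authors' code, with hypothetical parameter values) estimates the bias of the plug-in maximum-likelihood estimator of a tail probability for exponentially distributed losses and a small sample size.

import numpy as np

rng = np.random.default_rng(1)
lam, n, t, reps = 1.0, 20, 5.0, 200_000      # hypothetical rate, sample size, threshold, replications
true_tail = np.exp(-lam * t)                 # true P(X > t) for an exponential loss

samples = rng.exponential(1.0 / lam, size=(reps, n))
lam_hat = 1.0 / samples.mean(axis=1)         # MLE of the rate from each small sample
tail_hat = np.exp(-lam_hat * t)              # plug-in estimate of P(X > t)

print("true:", true_tail,
      "mean estimate:", tail_hat.mean(),
      "relative bias:", tail_hat.mean() / true_tail - 1)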

Stochastic projection for large individual losses. Drieskens, Damien; Henry, Marc; Walhin, Jean-François; Wielandts, Jürgen [RKN:
44929]
Shelved at: Per: SAJ Shelved at: SCA/ACT
Scandinavian Actuarial Journal (2012) 1 : 1-39.
In this paper we investigate how to estimate ultimate values of large losses. The method is based on the development of individual
losses and therefore allows us to compute the netting impact of excess of loss reinsurance. In particular the index clause is properly
accounted for. A numerical example based on real-life data is provided.

RENEWAL PROCESS

Covariance of discounted compound renewal sums with a stochastic interest rate. Léveillé, Ghislain; Adékambi, Franck [RKN:
45356]
Shelved at: Per: SAJ Shelved at: SCA/ACT
Scandinavian Actuarial Journal (2011) 2 : 138-153.
Formulas have been obtained for the moments of the discounted aggregate claims process, for a constant instantaneous interest
rate, and for a claims number process that is an ordinary or a delayed renewal process. In this paper, we present explicit formulas
on the first two moments and the joint moment of this risk process, for a non-trivial extension to a stochastic instantaneous interest
rate. Examples are given for Erlang claims number processes, and for the Ho-Lee-Merton and the Vasicek interest rate models.


A generalized penalty function for a class of discrete renewal processes. Woo, Jae-Kyung [RKN: 45782]
Shelved at: Per: SAJ Shelved at: SCA/ACT
Scandinavian Actuarial Journal (2012) 2 : 130-152.
Analysis of a generalized Gerber–Shiu function is considered in a discrete-time (ordinary) Sparre Andersen renewal risk process
with time-dependent claim sizes. The results are then applied to obtain ruin-related quantities under some renewal risk processes
assuming specific interclaim distributions such as a discrete K_n distribution and a truncated geometric distribution (i.e. compound
binomial process). Furthermore, the discrete delayed renewal risk process is considered and results related to the ordinary
process are derived as well.
http://www.openathens.net/

Refinements of two-sided bounds for renewal equations. Woo, Jae-Kyung [RKN: 40012]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (2) : 189-196.
Many quantities of interest in the study of renewal processes may be expressed as the solution to a special type of integral
equation known as a renewal equation. The main purpose of this paper is to provide bounds for the solution of renewal equations
based on various reliability classifications. Exponential and nonexponential types of inequalities are derived. In particular,
two-sided bounds with specific reliability conditions become sharp. Finally, some examples including ultimate ruin for the classical
Poisson model with time-dependent claim sizes, the joint distribution of the surplus prior to and at ruin, and the excess life time, are
provided.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
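For orientation (standard background rather than a result of the paper), a (possibly defective) renewal equation for an unknown function Z is an integral equation of the form

\[ Z(t) = z(t) + \phi \int_0^t Z(t-x)\, dF(x), \qquad t \ge 0,\; 0 < \phi \le 1, \]

where z is a known function and F a distribution function; the bounds discussed above are expressed in terms of reliability properties of F.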

RESERVE RISK

Double chain ladder. Martínez Miranda, María Dolores; Nielsen, Jens Perch; Verrall, Richard - 18 pages. [RKN: 70743]
Shelved at: Per: Astin Bull (Oxf) Shelved at: JOU
ASTIN Bulletin (2012) 42 (1) : 59-76.
By adding the information of reported count data to a classical triangle of reserving data, we derive a surprisingly simple method for
forecasting IBNR and RBNS claims. A simple relationship between development factors allows us to involve and then estimate the
reporting and payment delay. Bootstrap methods provide prediction errors and make possible the inference about IBNR and
RBNS claims, separately.
http://www.actuaries.org/index.cfm?lang=EN&DSP=PUBLICATIONS&ACT=ASTIN BULLETIN

RESERVING

Double chain ladder. Martínez Miranda, María Dolores; Nielsen, Jens Perch; Verrall, Richard - 18 pages. [RKN: 70743]
Shelved at: Per: Astin Bull (Oxf) Shelved at: JOU
ASTIN Bulletin (2012) 42 (1) : 59-76.
By adding the information of reported count data to a classical triangle of reserving data, we derive a surprisingly simple method for
forecasting IBNR and RBNS claims. A simple relationship between development factors allows us to involve and then estimate the
reporting and payment delay. Bootstrap methods provide prediction errors and make possible the inference about IBNR and
RBNS claims, separately.
http://www.actuaries.org/index.cfm?lang=EN&DSP=PUBLICATIONS&ACT=ASTIN BULLETIN

Linear stochastic reserving methods. Dahms, Rene - 34 pages. [RKN: 70741]
Shelved at: Per: Astin Bull (Oxf) Shelved at: JOU
ASTIN Bulletin (2012) 42 (1) : 1-34.
In this article we want to motivate and analyse a wide family of reserving models, called linear stochastic reserving methods
(LSRMs). The main idea behind them is the assumption that the (conditionally) expected changes of claim properties during a
development period are proportional to exposures which depend linearly on the past. This means the discussion about the choice
of reserving methods can be based on heuristic reasons about exposures driving the claims development, which in our opinion is
much better than a pure philosophic approach. Moreover, the assumptions of LSRMs do not include the independence of accident
periods. We will see that many common reserving methods, like the Chain-Ladder-Method, the Bornhuetter-Ferguson-Method
and the Complementary-Loss-Ratio-Method, can be interpreted in this way. But using the LSRM framework you can do more. For
instance you can couple different triangles via exposures. This leads to reserving methods which look at a whole bundle of
triangles at once and use the information of all triangles in order to estimate the future development of each of them. We will
present unbiased estimators for the expected ultimate and estimators for the mean squared error of prediction, which may become
an integral part of IFRS 4. Moreover, we will look at the one period solvency reserving risk, which already is an important part of
Solvency II, and present a corresponding estimator. Finally we will present two examples that illustrate some features of LSRMs.
http://www.actuaries.org/index.cfm?lang=EN&DSP=PUBLICATIONS&ACT=ASTIN BULLETIN

RETIREMENT

Optimal commutable annuities to minimize the probability of lifetime ruin. Wang, Ting; Young, Virginia R [RKN: 45536]
Insurance: Mathematics & Economics (2012) 50 (1) : 200-216.
We find the minimum probability of lifetime ruin of an investor who can invest in a market with a risky and a riskless asset and who
can purchase a commutable life annuity. The surrender charge of a life annuity is a proportion of its value. Ruin occurs when the
total of the value of the risky and riskless assets and the surrender value of the life annuity reaches zero. We find the optimal
investment strategy and optimal annuity purchase and surrender strategies in two situations: (i) the value of the risky and riskless
assets is allowed to be negative, with the imputed surrender value of the life annuity keeping the total positive; (ii) the value of the
risky and riskless assets is required to be non-negative. In the first case, although the individual has the flexibility to buy or sell at
any time, we find that the individual will not buy a life annuity unless she can cover all her consumption via the annuity and she will
never sell her annuity. In the second case, the individual surrenders just enough annuity income to keep her total assets positive.
However, in this second case, the individual's annuity purchasing strategy depends on the size of the proportional surrender
charge. When the charge is large enough, the individual will not buy a life annuity unless she can cover all her consumption, the
so-called safe level. When the charge is small enough, the individual will buy a life annuity at a wealth lower than this safe level.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Optimal retirement consumption with a stochastic force of mortality. Huang, Huaxiong; Milevsky, Moshe A; Salisbury, Thomas S
[RKN: 44787]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51(2) : 282-291.
We extend the lifecycle model (LCM) of consumption over a random horizon (also known as the Yaari model) to a world in which
(i) the force of mortality obeys a diffusion process as opposed to being deterministic, and (ii) consumers can adapt their
consumption strategy to new information about their mortality rate (also known as health status) as it becomes available. In
particular, we derive the optimal consumption rate and focus on the impact of mortality rate uncertainty versus simple lifetime
uncertainty assuming that the actuarial survival curves are initially identical in the retirement phase where this risk plays a
greater role. In addition to deriving and numerically solving the partial differential equation (PDE) for the optimal consumption rate,
our main general result is that when the utility preferences are logarithmic the initial consumption rates are identical. But, in a
constant relative risk aversion (CRRA) framework in which the coefficient of relative risk aversion is greater (smaller) than one, the
consumption rate is higher (lower) and a stochastic force of mortality does make a difference. That said, numerical experiments
indicate that, even for non-logarithmic preferences, the stochastic mortality effect is relatively minor from the individual's
perspective. Our results should be relevant to researchers interested in calibrating the lifecycle model as well as those who
provide normative guidance (also known as financial advice) to retirees.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Three retirement decision models for defined contribution pension plan members: A simulation study. MacDonald,
Bonnie-Jeanne; Cairns, Andrew J G [RKN: 14542]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (1) : 1-18.
This paper examines the hypothetical retirement behavior of defined contribution (DC) pension plan participants. Using a Monte
Carlo simulation approach, we compare and discuss three retirement decision models: the two-thirds replacement ratio
benchmark model, the option-value of continued work model and a newly-developed one-year retirement decision model. Unlike
defined benefit (DB) pension plans where economic incentives create spikes in retirement at particular ages, all three retirement
decision models suggest that the retirement ages of DC participants are much more smoothly distributed over a wide range of
ages. We find that the one-year model possesses several advantages over the other two models when representing the
theoretical retirement choice of a DC pension plan participant. First, its underlying theory for retirement decision-making is more
feasible given the distinct features and pension drivers of a DC plan. Second, its specifications produce a more logical relationship
between an individual's decision to retire and his/her age and accumulated retirement wealth. Lastly, although the one-year model
is less complex than the option-value model as the DC participant's scope is only one year, the retirement decision is optimal over
all future projected years if projections are made using reasonable financial assumptions.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

RETURNS

Arbitrage in skew Brownian motion models. Rossello, Damiano [RKN: 44989]
Insurance: Mathematics & Economics (2012) 50 (1) : 50-56.
Empirical skewness of asset returns can be reproduced by stochastic processes other than the Brownian motion with drift. Some
authors have proposed the skew Brownian motion for pricing as well as interest rate modelling. Although the asymmetric feature of
random return involved in the stock price process is driven by a parsimonious one-dimensional model, we will show how this is
intrinsically incompatible with a modern theory of arbitrage in continuous time. Application to investment performance and to the
Black–Scholes pricing model clearly emphasizes how this process can provide some kind of arbitrage.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

REVERSIONARY ANNUITY

Evolution of coupled lives' dependency across generations and pricing impact. Luciano, Elisa; Spreeuw, Jaap; Vigna, Elena
(2012). - London: Cass Business School, 2012. - 19 pages. [RKN: 73995]
This paper studies the dependence between coupled lives - both within and across generations - and its effects on prices of
reversionary annuities in the presence of longevity risk. Longevity risk is represented via a stochastic mortality intensity.
Dependence is modelled through copula functions. We consider Archimedean single and multi-parameter copulas. We find that
dependence decreases when passing from older generations to younger generations. Not only the level of dependence but also
its features - as measured by the copula - change across generations: the best-fit Archimedean copula is not the same across
generations. Moreover, for all the generations under examination the single-parameter copula is dominated by the two-parameter one.
The independence assumption produces quantifiable mispricing of reversionary annuities. The misspecification of the copula
produces different mispricing effects on different generations. The research is conducted using a well-known dataset of double life
contracts.
1995 onwards available online. Download as PDF.
http://www.cass.city.ac.uk/research-and-faculty/faculties/faculty-of-actuarial-science-and-insurance/publications/actuarial-resear
ch-reports

REVIEWS

Multivariate insurance models: an overview. Anastasiadis, Simon; Chukova, Stefanka [RKN: 45739]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 222-227.
This literature review summarizes the results from a collection of research papers that relate to modeling insurance claims and the
processes associated with them. We consider work by more than 55 authors, published or presented between 1971 and 2008.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

RISK

Ambiguity aversion, higher-order risk attitude and optimal effort. Huang, Rachel J [RKN: 45637]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 50 (3) : 338-345.
In this paper, we examine whether a more ambiguity-averse individual will invest in more effort to shift her initial starting wealth
distribution toward a better target distribution. We assume that the individual has ambiguous beliefs regarding two target (starting)
distributions and that one distribution is preferred to the other. We find that an increase in ambiguity aversion will decrease
(increase) the optimal effort when the cost of effort is non-monetary. When the cost of effort is monetary, the effect depends on
whether the individual would make more effort when the target (starting) distribution is the preferred distribution than when it is
the inferior one. We further characterize the individual's higher-order risk preferences to examine the
sufficient conditions.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Copula based hierarchical risk aggregation through sample reordering. Arbenz, Philipp; Hummel, Christoph; Mainik, Georg [RKN:
45729]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 122-133.
For high-dimensional risk aggregation purposes, most popular copula classes are too restrictive in terms of attainable
dependence structures. These limitations aggravate with increasing dimension. We study a hierarchical risk aggregation method
which is flexible in high dimensions. With this method it suffices to specify a low dimensional copula for each aggregation step in
the hierarchy. Copulas and margins of arbitrary kind can be combined. We give an algorithm for numerical approximation which
introduces dependence between originally independent marginal samples through reordering.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
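The reordering idea can be illustrated with a minimal sketch (not the authors' algorithm; the sample size, the margins and the Gaussian copula with correlation rho are assumed purely for illustration): independently simulated marginal samples are re-paired according to the ranks of a sample from the chosen copula, so the reordered margins inherit its dependence before aggregation.

import numpy as np

rng = np.random.default_rng(2)
n, rho = 100_000, 0.6                        # hypothetical sample size and correlation

# Independently simulated margins (e.g. two lines of business).
x1 = rng.lognormal(mean=0.0, sigma=1.0, size=n)
x2 = rng.gamma(shape=2.0, scale=1.5, size=n)

# Sample from the target copula (here a bivariate Gaussian copula).
z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=n)

# Reorder each margin so that its ranks match the ranks of the copula sample.
y1 = np.sort(x1)[np.argsort(np.argsort(z[:, 0]))]
y2 = np.sort(x2)[np.argsort(np.argsort(z[:, 1]))]

total = y1 + y2                              # aggregate at this node of the hierarchy
print(np.corrcoef(np.argsort(np.argsort(y1)), np.argsort(np.argsort(y2)))[0, 1])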

Corrections to the prices of derivatives due to market incompleteness. German, David [RKN: 45258]
Applied Mathematical Finance (2011) 18 (1-2) : 155-187.
We compute the first-order corrections to marginal utility-based prices with respect to a 'small' number of random endowments in
the framework of three incomplete financial models. They are a stochastic volatility model, a basis risk and market portfolio model
and a credit-risk model with jumps and stochastic recovery rate.

Available via Athens: Taylor & Francis Online
http://www.openathens.net

Delta–Gamma hedging of mortality and interest rate risk. Luciano, Elisa; Regis, Luca; Vigna, Elena [RKN: 45643]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 50 (3) : 402-412.
One of the major concerns of life insurers and pension funds is the increasing longevity of their beneficiaries. This paper studies
the hedging problem of annuity cash flows when mortality and interest rates are stochastic. We first propose a Delta–Gamma
hedging technique for mortality risk. The risk factor against which to hedge is the difference between the actual mortality intensity
in the future and its forecast today, the forward intensity. We specialize the hedging technique first to the case in which mortality
intensities are affine, then to Ornstein–Uhlenbeck and Feller processes, providing actuarial justifications for this selection. We
show that, without imposing no arbitrage, we can get equivalent probability measures under which the HJM condition for no
arbitrage is satisfied. Last, we extend our results to the presence of both interest rate and mortality risk. We provide a UK
calibrated example of Delta–Gamma hedging of both mortality and interest rate risk.
Available via Athens: Palgrave MacMillan
http://www.openathens.net


Dependence modeling in non-life insurance using the Bernstein copula. Diers, Dorothea; Eling, Martin; Marek, Sebastian D [RKN:
45646]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 50 (3) : 430-436.
This paper illustrates the modeling of dependence structures of non-life insurance risks using the Bernstein copula. We conduct a
goodness-of-fit analysis and compare the Bernstein copula with other widely used copulas. Then, we illustrate the use of the
Bernstein copula in a value-at-risk and tail-value-at-risk simulation study. For both analyses we utilize German claims data on
storm, flood, and water damage insurance for calibration. Our results highlight the advantages of the Bernstein copula, including
its flexibility in mapping inhomogeneous dependence structures and its easy use in a simulation context due to its representation
as mixture of independent Beta densities. Practitioners and regulators working toward appropriate modeling of dependences in a
risk management and solvency context can benefit from our results.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Dividends and reinsurance under a penalty for ruin. Liang, Zhibin; Young, Virginia R [RKN: 45647]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 50 (3) : 437-445.
We find the optimal dividend strategy in a diffusion risk model under a penalty for ruin, as in Thonhauser and Albrecher (2007),
although we allow for both a positive and a negative penalty. Furthermore, we determine the optimal proportional reinsurance
strategy, when so-called expensive reinsurance is available; that is, the premium loading on reinsurance is greater than the
loading on the directly written insurance. One can think of our model as taking the one in Taksar (2000, Section 6) and adding a
penalty for ruin. We use the Legendre transform to obtain the optimal dividend and reinsurance strategies. Not surprisingly, the
optimal dividend strategy is a barrier strategy. Also, we investigate the effect of the penalty P on the optimal strategies. In
particular, we show that the optimal barrier increases with respect to P, while the optimal proportion retained and the value
function decrease with respect to P. In the end, we explore the time of ruin, and find that the expected time of ruin increases with
respect to P under a net profit condition.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Erlangian approximation to finite time ruin probabilities in perturbed risk models. Stanford, David A; Yu, Kaiqi; Ren, Jiandong
[RKN: 45148]
Shelved at: Per: SAJ Shelved at: SCA/ACT
Scandinavian Actuarial Journal (2011) 1 : 38-58.
In this paper, we consider a class of perturbed risk processes that have an underlying Markov structure, including
Markov-modulated risk processes, and Sparre-Andersen risk processes when both inter-claim times and claim sizes are
phase-type. We apply the Erlangization method to the risk process in the class in order to obtain an accurate approximation of the
finite time ruin probability. In addition, we develop an efficient recursive procedure by recognizing a repeating structure in the
probability matrices we work with. We believe the present work is among the first to either compute or approximate finite time ruin
probabilities in the perturbed risk model.

A generalized penalty function for a class of discrete renewal processes. Woo, Jae-Kyung [RKN: 45782]
Shelved at: Per: SAJ Shelved at: SCA/ACT
Scandinavian Actuarial Journal (2012) 2 : 130-152.
Analysis of a generalized Gerber–Shiu function is considered in a discrete-time (ordinary) Sparre Andersen renewal risk process
with time-dependent claim sizes. The results are then applied to obtain ruin-related quantities under some renewal risk processes
assuming specific interclaim distributions such as a discrete K_n distribution and a truncated geometric distribution (i.e. compound
binomial process). Furthermore, the discrete delayed renewal risk process is considered and results related to the ordinary
process are derived as well.
http://www.openathens.net/

Hierarchical structures in the aggregation of premium risk for insurance underwriting. Savelli, Nino; Clemente, Gian Paolo [RKN:
45489]
Shelved at: Per: SAJ Shelved at: SCA/ACT
Scandinavian Actuarial Journal (2011) 3 : 193-213.
In the valuation of the Solvency II capital requirement, the correct appraisal of risk dependencies acquires particular relevance.
These dependencies refer to the recognition of risk diversification in the aggregation process and there are different levels of
aggregation and hence different types of diversification. For instance, for a non-life company at the first level the risk components
of each single line of business (e.g. premium, reserve, and CAT risks) need to be combined in the overall portfolio, the second
level regards the aggregation of different kinds of risks such as, for example, market and underwriting risk, and finally various solo legal
entities could be joined together in a group.

Solvency II allows companies to capture these diversification effects in capital requirement assessment, but the identification of a
proper methodology can represent a delicate issue. Indeed, while internal models based on simulation approaches usually permit
one to obtain the portfolio multivariate distribution only in the independence case, the use of copula functions generally makes it
possible to obtain the multivariate distribution under dependence assumptions too.

However, the choice of the copula and the parameter estimation could be very problematic when only few data are available. So it
could be useful to find a closed formula, based on internal models' independence results, with the aim of obtaining the capital
requirement under a dependence assumption.

A simple technique to measure the diversification effect in capital requirement assessment is the formula proposed by the Solvency
II quantitative impact studies, focused on the aggregation of capital charges, the latter being equal to the percentile minus the
average of the total claims amount distribution of a single line of business (LoB), using a linear correlation matrix.

On the other hand, this formula produces the correct result only for a restricted class of distributions, while it may underestimate
the diversification effect.

In this paper we present an alternative method, based on the idea of adjusting that formula with proper calibration factors (proposed
by Sandström (2007)) and appropriately extended with the aim of considering very skewed distributions too.

In the last part, considering different non-life multi-line insurers, we compare the capital requirements obtained for premium risk
only by applying the aggregation formula with the results derived from elliptical copulas and hierarchical Archimedean copulas.
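The aggregation formula referred to above is the square-root formula of the Solvency II quantitative impact studies, quoted here for reference:

\[ \mathrm{SCR}_{\text{agg}} = \sqrt{\sum_{i}\sum_{j} \rho_{ij}\, \mathrm{SCR}_i\, \mathrm{SCR}_j }, \]

where SCR_i is the stand-alone capital charge of line of business i (percentile minus mean of its total claims amount distribution, as described above) and rho_ij are the entries of the linear correlation matrix.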

The implied market price of weather risk. Härdle, Wolfgang Karl; Cabrera, Brenda López Routledge, [RKN: 45795]
Shelved at: Per. AMF
Applied Mathematical Finance (2012) 19 (1-2) : 59-95.
Weather derivatives (WD) are end-products of a process known as securitization that transforms non-tradable risk factors
(weather) into tradable financial assets. For pricing and hedging non-tradable assets, one essentially needs to incorporate the
market price of risk (MPR), which is an important parameter of the associated equivalent martingale measure (EMM). The majority
of papers so far has priced non-tradable assets assuming zero or constant MPR, but this assumption yields biased prices and has
never been quantified earlier under the EMM framework. Given that liquid-derivative contracts based on daily temperature are
traded on the Chicago Mercantile Exchange (CME), we infer the MPR from traded futures-type contracts (CAT, CDD, HDD and
AAT). The results show how the MPR significantly differs from 0, how it varies in time and changes in sign. It can be
parameterized, given its dependencies on time and temperature seasonal variation. We establish connections between the
market risk premium (RP) and the MPR.
Available via Athens: Taylor & Francis Online
http://www.openathens.net

The joint distribution of the time to ruin and the number of claims until ruin in the classical risk model. Dickson, David C M [RKN:
45636]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 50 (3) : 334-337.
We use probabilistic arguments to derive an expression for the joint density of the time to ruin and the number of claims until ruin
in the classical risk model. From this we obtain a general expression for the probability function of the number of claims until ruin.
We also consider the moments of the number of claims until ruin and illustrate our results in the case of exponentially distributed
individual claims. Finally, we briefly discuss joint distributions involving the surplus prior to ruin and deficit at ruin.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
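The two quantities studied above can also be examined by simulation. The sketch below is illustrative only (the paper itself derives analytic expressions, and all parameter values here are hypothetical): it records the time to ruin and the number of claims until ruin in the classical compound Poisson risk model with exponential claims.

import numpy as np

rng = np.random.default_rng(3)
u, c, lam, mu, horizon = 5.0, 1.2, 1.0, 1.0, 1_000.0   # initial surplus, premium rate, claim rate, mean claim size, horizon

def ruin_path():
    t, total_claims, n_claims = 0.0, 0.0, 0
    while t < horizon:
        t += rng.exponential(1.0 / lam)            # next claim arrival time
        total_claims += rng.exponential(mu)        # exponential claim size
        n_claims += 1
        if u + c * t - total_claims < 0:           # ruin can only occur at claim instants
            return t, n_claims
    return None                                    # no ruin before the horizon

results = [r for r in (ruin_path() for _ in range(20_000)) if r is not None]
times, counts = zip(*results)
print("finite-horizon ruin prob.:", len(results) / 20_000,
      "mean time to ruin:", np.mean(times),
      "mean number of claims until ruin:", np.mean(counts))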

Lévy risk model with two-sided jumps and a barrier dividend strategy. Bo, Lijun; Song, Renming; Tang, Dan; Wang, Tongjin; Yang,
Xuewei [RKN: 45601]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 50 (2) : 280-291.
In this paper, we consider a general Lévy risk model with two-sided jumps and a constant dividend barrier. We connect the ruin
problem of the ex-dividend risk process with the first passage problem of the Lévy process reflected at its running maximum. We
prove that if the positive jumps of the risk model form a compound Poisson process and the remaining part is a spectrally negative
Lévy process with unbounded variation, the Laplace transform (as a function of the initial surplus) of the upward entrance time of
the reflected (at the running infimum) Lévy process exhibits the smooth pasting property at the reflecting barrier. When the surplus
process is described by a double exponential jump diffusion in the absence of dividend payment, we derive some explicit
expressions for the Laplace transform of the ruin time, the distribution of the deficit at ruin, and the total expected discounted
dividends. Numerical experiments concerning the optimal barrier strategy are performed and new empirical findings are
presented.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Minimising expected discounted capital injections by reinsurance in a classical risk model. Eisenberg, Julia; Schmidli, Hanspeter
[RKN: 45487]
Shelved at: Per: SAJ Shelved at: SCA/ACT
Scandinavian Actuarial Journal (2011) 3 : 155-176.
In this paper we consider a classical continuous time risk model, where the claims are reinsured by some reinsurance with
retention level b, where the maximal value of b means no reinsurance and b=0 means full reinsurance. The insurer
can change the retention level continuously. To prevent negative surplus the insurer has to inject additional capital. The problem is
to minimise the expected discounted cost over all admissible reinsurance strategies. We show that an optimal reinsurance
strategy exists. For some special cases we will be able to give the optimal strategy explicitly. In other cases the method will be
illustrated only numerically.

On allocation of upper limits and deductibles with dependent frequencies and comonotonic severities. Li, Xiaohu; You, Yinping
[RKN: 45645]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 50 (3) : 423-429.
With the assumption of Archimedean copula for the occurrence frequencies of the risks covered by an insurance policy, this note
further investigates the allocation problem of upper limits and deductibles addressed in Hua and Cheung (2008a). Sufficient
conditions for a risk-averse policyholder to allocate the upper limits and the deductibles well are established, respectively.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

On the absolute ruin problem in a Sparre Andersen risk model with constant interest. Mitric, Ilie-Radu; Badescu, Andrei L; Stanford,
David A [RKN: 45533]
Insurance: Mathematics & Economics (2012) 50 (1) : 167-178.
In this paper, we extend the work of Mitric and Sendova (2010), which considered the absolute ruin problem in a risk model with
debit and credit interest, to renewal and non-renewal structures. Our first results apply to MAP processes, which we later restrict to
the Sparre Andersen renewal risk model with interclaim times that are generalized Erlang (n) distributed and claim amounts
following a Matrix-Exponential (ME) distribution (see, e.g., Asmussen and O'Cinneide (1997)). Under this scenario, we present
a general methodology to analyze the Gerber–Shiu discounted penalty function defined at absolute ruin, as a solution of
high-order linear differential equations with non-constant coefficients. Closed-form solutions for some absolute ruin related
quantities in the generalized Erlang (2) case complement the results obtained under the classical risk model by Mitric and
Sendova (2010).
Available via Athens: Palgrave MacMillan
http://www.openathens.net

On the analysis of a general class of dependent risk processes. Willmot, Gordon E; Woo, Jae-Kyung [RKN: 45730]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 134-141.
A generalized Sparre Andersen risk process is examined, whereby the joint distribution of the interclaim time and the ensuing
claim amount is assumed to have a particular mathematical structure. This structure is present in various dependency models
which have previously been proposed and analyzed. It is then shown that this structure in turn often implies particular functional
forms for joint discounted densities of ruin related variables including some or all of the deficit at ruin, the surplus immediately prior
to ruin, and the surplus after the second last claim. Then, employing a fairly general interclaim time structure which involves a
combination of Erlang type densities, a complete identification of a generalized Gerber–Shiu function is provided. An application
of these results is given for a situation involving a mixed Erlang type of claim amount assumption. Various examples and
special cases of the model are then considered, including one involving a bivariate Erlang mixture model.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

On the Haezendonck–Goovaerts risk measure for extreme risks. Tang, Qihe; Yang, Fan [RKN: 45537]
Insurance: Mathematics & Economics (2012) 50 (1) : 217-227.
In this paper, we are interested in the calculation of the Haezendonck–Goovaerts risk measure, which is defined via a convex
Young function and a parameter q ∈ (0,1) representing the confidence level. We mainly focus on the case in which the risk variable
follows a distribution function from a max-domain of attraction. For this case, we restrict the Young function to be a power function
and we derive exact asymptotics for the Haezendonck–Goovaerts risk measure as q → 1. As a subsidiary, we also consider the case
with an exponentially distributed risk variable and a general Young function, and we obtain an analytical expression for the
Haezendonck–Goovaerts risk measure.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

On the invariant properties of notions of positive dependence and copulas under increasing transformations. Cai, Jun; Wei, Wei
[RKN: 44988]
Insurance: Mathematics & Economics (2012) 50 (1) : 43-49.
Notions of positive dependence and copulas play important roles in modeling dependent risks. The invariant properties of notions
of positive dependence and copulas under increasing transformations are often used in the studies of economics, finance,
insurance and many other fields. In this paper, we examine the notions of the conditionally increasing (CI), the conditionally
increasing in sequence (CIS), the positive dependence through the stochastic ordering (PDS), and the positive dependence
through the upper orthant ordering (PDUO). We first use counterexamples to show that the statements in Theorem 3.10.19 of
Müller and Stoyan (2002) about the invariant properties of CIS and CI under increasing transformations are not true. We then
prove that the invariant properties of CIS and CI hold under strictly increasing transformations. Furthermore, we give rigorous
proofs for the invariant properties of PDS and PDUO under increasing transformations. These invariant properties enable us to
show that a continuous random vector is PDS (PDUO) if and only if its copula is PDS (PDUO). In addition, using the properties of
generalized left-continuous and right-continuous inverse functions, we give a rigorous proof for the invariant property of copulas
under increasing transformations on the components of any random vector. This result generalizes Proposition 4.7.4 of Denuit et
al. (2005) and Proposition 5.6. of McNeil et al. (2005).
Available via Athens: Palgrave MacMillan
http://www.openathens.net

On the threshold dividend strategy for a generalized jumpdiffusion risk model. Chi, Yichun; Lin, X. Sheldon [RKN: 45126]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (3) : 326-337.
In this paper, we generalize the Cramér–Lundberg risk model perturbed by diffusion to incorporate jumps due to surplus
fluctuation and to relax the positive loading condition. Assuming that the surplus process has exponential upward and arbitrary
downward jumps, we analyze the expected discounted penalty (EDP) function of Gerber and Shiu (1998) under the threshold
dividend strategy. An integral equation for the EDP function is derived using the Wiener–Hopf factorization. As a result, an explicit
analytical expression is obtained for the EDP function by solving the integral equation. Finally, phase-type downward jumps are
considered and a matrix representation of the EDP function is presented.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Optimal asset allocation for DC pension plans under inflation. Han, Nan-wei; Hung, Mao-Wei [RKN: 45734]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 172-181.
In this paper, the stochastic dynamic programming approach is used to investigate the optimal asset allocation for a
defined-contribution pension plan with downside protection under stochastic inflation. The plan participant invests the fund wealth
and the stochastic interim contribution flows into the financial market. The nominal interest rate model is described by the
Cox-Ingersoll-Ross (Cox et al., 1985) dynamics. To cope with the inflation risk, the inflation indexed bond is included in the asset
menu. The retired individuals receive an annuity that is indexed by inflation and a downside protection on the amount of this
annuity is considered. The closed-form solution is derived under the CRRA utility function. Finally, a numerical application is
presented to characterize the dynamic behavior of the optimal investment strategy.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Optimal non-proportional reinsurance control and stochastic differential games. Taksar, Michael; Zeng, Xudong [RKN: 38228]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (1) : 64-71.
We study stochastic differential games between two insurance companies who employ reinsurance to reduce risk exposure. We
consider competition between two companies and construct a single payoff function of the two companies' surplus processes. One
company chooses a dynamic reinsurance strategy in order to maximize the payoff function while its opponent is simultaneously
choosing a dynamic reinsurance strategy so as to minimize the same quantity. We describe the Nash equilibrium of the game and
prove a verification theorem for a general payoff function. For the payoff function being the probability that the difference between
the two surplus processes reaches an upper bound before it reaches a lower bound, the game is solved explicitly.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Optimal reinsurance with positively dependent risks. Cai, Jun; Wei, Wei [RKN: 44990]
Insurance: Mathematics & Economics (2012) 50 (1) : 57-63.
In the individual risk model, one is often concerned about positively dependent risks. Several notions of positive dependence have
been proposed to describe such dependent risks. In this paper, we assume that the risks in the individual risk model are positively
dependent through the stochastic ordering (PDS). The PDS risks include independent, comonotonic, conditionally stochastically
increasing (CI) risks, and other interesting dependent risks. By proving the convolution preservation of the convex order for PDS
random vectors, we show that in individualized reinsurance treaties, to minimize certain risk measures of the retained loss of an
insurer, the excess-of-loss treaty is the optimal reinsurance form for an insurer with PDS dependent risks among a general class
of individualized reinsurance contracts. This extends the study in Denuit and Vermandele (1998) on individualized reinsurance
treaties to dependent risks. We also derive the explicit expressions for the retentions in the optimal excess-of-loss treaty in a
two-line insurance business model.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
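As a hedged illustration of the excess-of-loss form discussed above (not the paper's optimal retentions or risk measures), the following Python sketch compares the tail value-at-risk of the retained loss for a few retention levels d, using positively dependent risks simulated through a common shock; all distributions and parameter values are assumptions.

import numpy as np

rng = np.random.default_rng(1)
n = 100_000
# Two positively dependent risks via a common shock (assumption for illustration).
common = rng.exponential(scale=1.0, size=n)
x1 = rng.exponential(scale=1.0, size=n) + common
x2 = rng.exponential(scale=2.0, size=n) + common

def retained(x, d):
    """Retained loss under an excess-of-loss treaty with retention d."""
    return np.minimum(x, d)

def tvar(sample, level=0.99):
    """Empirical tail value-at-risk: average of losses at or beyond the VaR."""
    q = np.quantile(sample, level)
    return sample[sample >= q].mean()

for d in (1.0, 2.0, 5.0):
    s = retained(x1, d) + retained(x2, d)
    print(f"retention d={d}: TVaR_99% of retained loss = {tvar(s):.3f}")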

Portfolio selection problem with multiple risky assets under the constant elasticity of variance model. Zhao, Hui; Rong, Ximin
[RKN: 45534]
Insurance: Mathematics & Economics (2012) 50 (1) : 179-190.
This paper focuses on the constant elasticity of variance (CEV) model for studying the utility maximization portfolio selection
problem with multiple risky assets and a risk-free asset. The Hamilton-Jacobi-Bellman (HJB) equation associated with the
portfolio optimization problem is established. By applying a power transform and a variable change technique, we derive the
explicit solution for the constant absolute risk aversion (CARA) utility function when the elasticity coefficient is -1 or 0. In order to
obtain a general optimal strategy for all values of the elasticity coefficient, we propose a model with two risky assets and one
risk-free asset and solve it under a given assumption. Furthermore, we analyze the properties of the optimal strategies and
discuss the effects of market parameters on the optimal strategies. Finally, a numerical simulation is presented to illustrate the
similarities and differences between the results of the two models proposed in this paper.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

The proper distribution function of the deficit in the delayed renewal risk model. Kim, So-Yeun; Willmot, Gordon E [RKN: 45355]
Shelved at: Per: SAJ Shelved at: SCA/ACT
Scandinavian Actuarial Journal (2011) 2 : 118-137.
The main focus of this paper is to extend the analysis of ruin-related quantities to the delayed renewal risk models. First, the
background for the delayed renewal risk model is introduced and a general equation that is used as a framework is derived. The
equation is obtained by conditioning on the first drop below the initial surplus level. Then, we consider the deficit at ruin among
many random variables associated with ruin. The properties of the distribution function (DF) of the proper deficit are examined in
particular.

Recursive methods for a multi-dimensional risk process with common shocks. Gong, Lan; Badescu, Andrei L; Cheung, Eric C K
[RKN: 44996]
Insurance: Mathematics & Economics (2012) 50 (1) : 109-120.
In this paper, a multi-dimensional risk model with common shocks is studied. Using a simple probabilistic approach via observing
the risk processes at claim instants, recursive integral formulas are developed for the survival probabilities as well as for a class of
Gerber-Shiu expected discounted penalty functions that include the surplus levels at ruin. Under the assumption of exponential or
mixed Erlang claims, the recursive integrals can be simplified to give recursive sums which are computationally more tractable.
Numerical examples including an optimal capital allocation problem are also given towards the end.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Risk concentration of aggregated dependent risks: The second-order properties. Tong, Bin; Wu, Chongfeng; Xu, Weidong [RKN:
44999]
Insurance: Mathematics & Economics (2012) 50 (1) : 139-149.
Under the current regulatory guidelines for banks and insurance companies, the quantification of diversification benefits due to risk
aggregation plays a prominent role. In this paper we establish second-order approximation of risk concentration associated with a
random vector in terms of Value at Risk (VaR) within the methodological framework of second-order regular variation and the
theory of Archimedean copulas. Moreover, we find that the rate of convergence of the first-order approximation of risk concentration
depends on the interplay between the tail behavior of the marginal loss random variables and their dependence structure.
Specifically, we find that the rate of convergence is determined by either the second-order parameter of the Archimedean copula
generator or the second-order parameter of the tail margins, leading to either the so-called dependence dominated case or
margin dominated case.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Risk models based on time series for count random variables. Cossette, Hélène; Marceau, Étienne; Toureille, Florent [RKN: 10962]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (1) : 19-28.
In this paper, we generalize the classical discrete time risk model by introducing a dependence relationship in time between the
claim frequencies. The models used are the Poisson autoregressive model and the Poisson moving average model. In particular,
the aggregate claim amount and related quantities such as the stop-loss premium, value at risk and tail value at risk are discussed
within this framework.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
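A minimal sketch of one standard way to introduce time dependence in claim counts, the binomial-thinning INAR(1) recursion with Poisson innovations; the exact Poisson autoregressive and moving average specifications used in the paper may differ, and all parameter values and the exponential severity below are assumptions.

import numpy as np

rng = np.random.default_rng(2)
alpha, lam, T = 0.4, 5.0, 1000          # thinning probability, innovation mean, horizon

counts = np.empty(T, dtype=int)
counts[0] = rng.poisson(lam / (1 - alpha))          # start near the stationary mean
for t in range(1, T):
    survived = rng.binomial(counts[t - 1], alpha)   # binomial thinning of last period's counts
    counts[t] = survived + rng.poisson(lam)         # plus new Poisson arrivals

# Aggregate claims: each claim gets an exponential severity (assumption).
severity_mean = 10.0
aggregate = np.array([rng.exponential(severity_mean, size=k).sum() for k in counts])
print("mean count:", counts.mean(),
      "lag-1 autocorrelation:", np.corrcoef(counts[:-1], counts[1:])[0, 1])
print("empirical 99% VaR of periodic aggregate claims:", np.quantile(aggregate, 0.99))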

Risk processes with shot noise Cox claim number process and reserve dependent premium rate. Macci, Claudio; Torrisi, Giovanni
Luca [RKN: 39936]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (1) : 134-145.
We consider a suitable scaling, called the slow Markov walk limit, for a risk process with shot noise Cox claim number process and
reserve dependent premium rate. We provide large deviation estimates for the ruin probability. Furthermore, we find an
asymptotically efficient law for the simulation of the ruin probability using importance sampling. Finally, we present asymptotic
bounds for ruin probabilities in the Bayesian setting.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Ruin by dynamic contagion claims. Dassios, Angelos; Zhao, Hongbiao [RKN: 45726]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 93-106.
In this paper, we consider a risk process with the arrival of claims modelled by a dynamic contagion process, a generalisation of
the Cox process and Hawkes process introduced by Dassios and Zhao (2011). We derive results for the infinite horizon model that
are generalisations of the Cramér-Lundberg approximation, Lundberg's fundamental equation, some asymptotics as well as
bounds for the probability of ruin. Special attention is given to the case of exponential jumps and a numerical example is provided.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Statistical analysis of model risk concerning temperature residuals and its impact on pricing weather derivatives. Ahcan, Ales
[RKN: 44998]
Insurance: Mathematics & Economics (2012) 50 (1) : 131-138.
In this paper we model the daily average temperature via an extended version of the standard Ornstein-Uhlenbeck process driven
by a Lévy noise with seasonally adjusted asymmetric ARCH process for volatility. More precisely, we model the disturbances with
the Normal inverse Gaussian (NIG) and Variance gamma (VG) distribution. Besides modelling the residuals we also compare the
prices of January 2010 out-of-the-money call and put options for two of Slovenia's largest cities, Ljubljana and Maribor, under
normally distributed disturbances and NIG and VG distributed disturbances. The results of our numerical analysis demonstrate
that the normal model fails to adequately capture tail risk, and consequently significantly misprices out-of-the-money options. On
the other hand prices obtained using NIG and VG distributed disturbances fit well to the results obtained by bootstrapping the
residuals. Thus one should take extreme care in choosing the appropriate statistical model.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

The time to ruin and the number of claims until ruin for phase-type claims. Frostig, Esther; Pitts, Susan M; Politis, Konstadinos
[RKN: 45720]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 19-25.
We consider a renewal risk model with phase-type claims, and obtain an explicit expression for the joint transform of the time to
ruin and the number of claims until ruin, with a penalty function applied to the deficit at ruin. The approach is via the duality
between a risk model with phase-type claims and a particular single server queueing model with phase-type customer interarrival
times; see Frostig (2004). This result specializes to one for the probability generating function of the number of claims until ruin.
We obtain explicit expressions for the distribution of the number of claims until ruin for exponentially distributed claims when the
inter-claim times have an Erlang-n distribution.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Translation-invariant and positive-homogeneous risk measures and optimal portfolio management in the presence of a riskless
component. Landsman, Zinoviy; Makov, Udi E [RKN: 44994]
Insurance: Mathematics & Economics (2012) 50 (1) : 94-98.
Risk portfolio optimization, with translation-invariant and positive-homogeneous risk measures, leads to the problem of minimizing
a combination of a linear functional and a square root of a quadratic functional for the case of elliptical multivariate underlying
distributions.

This problem was recently treated by the authors for the case when the portfolio does not contain a riskless component. When it
does, however, the initial covariance matrix S becomes singular and the problem becomes more complicated. In the paper we
focus on this case and provide an explicit closed-form solution of the minimization problem, and the condition under which this
solution exists. The results are illustrated using data of 10 stocks from the NASDAQ Computer Index.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

TVaR-based capital allocation for multivariate compound distributions with positive continuous claim amounts. Cossette,
Hélène; Mailhot, Mélina; Marceau, Étienne [RKN: 45598]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 50 (2) : 247-256.
In this paper, we consider a portfolio of n dependent risks X1, ..., Xn and we study the stochastic behavior of the aggregate claim
amount S = X1 + ... + Xn. Our objective is to determine the amount of economic capital needed for the whole portfolio and to
compute the amount of capital to be allocated to each risk X1, ..., Xn. To do so, we use a top-down approach. For (X1, ..., Xn), we
consider risk models based on multivariate compound distributions defined with a multivariate counting distribution. We use the
TVaR to evaluate the total capital requirement of the portfolio based on the distribution of S, and we use the TVaR-based capital
allocation method to quantify the contribution of each risk. To simplify the presentation, the claim amounts are assumed to be
continuously distributed. For multivariate compound distributions with continuous claim amounts, we provide general formulas for
the cumulative distribution function of S, for the TVaR of S and the contribution to each risk. We obtain closed-form expressions for
those quantities for multivariate compound distributions with gamma and mixed Erlang claim amounts. Finally, we treat in detail
the multivariate compound Poisson distribution case. Numerical examples are provided in order to examine the impact of the
dependence relation on the TVaR of S, the contribution to each risk of the portfolio, and the benefit of the aggregation of several
risks.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
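A hedged Monte Carlo sketch of the TVaR-based allocation rule described above: the contribution of risk Xi is estimated as the average of Xi over the scenarios where the aggregate S exceeds its VaR, so the contributions sum to the TVaR of S. The common-shock counting model and gamma claim amounts are assumptions, not the paper's closed-form compound setting.

import numpy as np

rng = np.random.default_rng(3)
n_sim, kappa = 100_000, 0.99

# Two lines of business whose claim counts share a common Poisson shock (assumption).
shock = rng.poisson(2.0, n_sim)
counts = [shock + rng.poisson(1.0, n_sim), shock + rng.poisson(3.0, n_sim)]
# Gamma-distributed claim amounts (assumption); losses[i] is the aggregate loss of line i.
losses = [np.array([rng.gamma(2.0, 5.0, size=k).sum() for k in c]) for c in counts]

S = losses[0] + losses[1]
var_S = np.quantile(S, kappa)
tail = S > var_S                                   # tail scenarios beyond the VaR
tvar_S = S[tail].mean()
contributions = [x[tail].mean() for x in losses]   # TVaR-based allocation E[Xi | S > VaR]
print(f"VaR = {var_S:.1f}, TVaR = {tvar_S:.1f}, "
      f"contributions = {[round(c, 1) for c in contributions]} (sum = {sum(contributions):.1f})")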

RISK ANALYSIS

Analysis of risk models using a level crossing technique. Brill, Percy H; Yu, Kaiqi [RKN: 44932]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (3) : 298-309.
This paper analyzes ruin-like risk models in insurance, which are variants of the Cramér-Lundberg (CL) model with a barrier or a
threshold. We consider three model variants, which have different portfolio strategies when the risk reserve reaches the barrier or
exceeds the threshold. In these models we construct a time-extended risk process defined on cycles of a specific renewal
process. The time until ruin is equal to one cycle of the specific renewal process. We also consider a fourth model, which is a
variant of a previously proposed model. The analysis of each model employs a level crossing method (LC) to derive the steady-state
probability distribution of the time-extended risk process. From the derived distribution we compute the expected time until ruin,
the probability distribution of the deficit at ruin, and related quantities of interest.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Modelling losses and locating the tail with the Pareto Positive Stable distribution. Guillén, Montserrat; Prieto, Faustino; Sarabia,
José María [RKN: 44947]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (3) : 454-461.
This paper focuses on modelling the severity distribution. We directly model the small, moderate and large losses with the Pareto
Positive Stable (PPS) distribution and thus it is not necessary to fix a threshold for the tail behaviour. Estimation with the method of
moments is straightforward. Properties, graphical tests and expressions for value-at-risk and tail value-at-risk are presented.
Furthermore, we show that the PPS distribution can be used to construct a statistical test for the Pareto distribution and to
determine the threshold for the Pareto shape if required. An application to loss data is presented. We conclude that the PPS
distribution can perform better than commonly used distributions when modelling a single loss distribution for moderate and large
losses. This approach avoids the pitfalls of cut-off selection and it is very simple to implement for quantitative risk analysis.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Risk analysis and valuation of life insurance contracts: combining actuarial and financial approaches. Graf, Stefan; Kling,
Alexander; Ruß, Jochen [RKN: 44981]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (1) : 115-125.
In this paper, we analyze traditional (i.e. not unit-linked) participating life insurance contracts with a guaranteed interest rate and
surplus participation. We consider three different surplus distribution models and an asset allocation that consists of money
market, bonds with different maturities, and stocks. In this setting, we combine actuarial and financial approaches by selecting a
risk minimizing asset allocation (under the real world measure P) and distributing terminal surplus such that the contract value
(under the pricing measure Q) is fair. We prove that this strategy is always possible unless the insurance contracts introduce
arbitrage opportunities in the market. We then analyze differences between the different surplus distribution models and
investigate the impact of the selected risk measure on the risk minimizing portfolio.


Available via Athens: Palgrave MacMillan
http://www.openathens.net

RISK ASSESSMENT

Managing longevity and disability risks in life annuities with long term care. Levantesi, Susanna; Menzietti, Massimiliano [RKN:
45642]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 50 (3) : 391-401.
The aim of the paper is twofold. Firstly, it develops a model for risk assessment in a portfolio of life annuities with long term care
benefits. These products are usually represented by a Markovian Multi-State model and are affected by both longevity and
disability risks. Here, a stochastic projection model is proposed in order to represent the future evolution of mortality and disability
transition intensities. Data from the Italian National Institute of Social Security (INPS) and from Human Mortality Database (HMD)
are used to estimate the model parameters. Secondly, it investigates the solvency in a portfolio of enhanced pensions. To this aim
a risk model based on the portfolio risk reserve is proposed and different rules to calculate solvency capital requirements for life
underwriting risk are examined. Such rules are then compared with the standard formula proposed by the Solvency II project.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Risk comparison of different bonus distribution approaches in participating life insurance. Zemp, Alexandra [RKN: 44967]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (2) : 249-264.
The fair pricing of explicit and implicit options in life insurance products has received broad attention in the academic literature over
the past years. Participating life insurance (PLI) contracts have been the focus especially. These policies are typically
characterized by a term life insurance, a minimum interest rate guarantee, and bonus participation rules with regard to the
insurer's asset returns or reserve situation. Researchers replicate these bonus policies quite differently. We categorize and
formally present the most common PLI bonus distribution mechanisms. These bonus models closely mirror the Danish, German,
British, and Italian regulatory framework. Subsequently, we perform a comparative analysis of the different bonus models with
regard to risk valuation. We calibrate contract parameters so that the compared contracts have a net present value of zero and the
same safety level as the initial position, using risk-neutral valuation. Subsequently, we analyze the effect of changes in the asset
volatility and in the initial reserve amount (per contract) on the value of the default put option (DPO), while keeping all other
parameters constant. Our results show that DPO values obtained with the PLI bonus distribution model of Bacinello (2001), which
replicates the Italian regulatory framework, are most sensitive to changes in volatility and initial reserves.

Available via Athens: Palgrave MacMillan
http://www.openathens.net

RISK AVERSION

Archimedean copulas derived from Morgenstern utility functions. Spreeuw, Jaap (2012). - London: Cass Business School, 2012.
- 21 pages. [RKN: 70094]
The (additive) generator of an Archimedean copula - as well as the inverse of the generator - is a strictly decreasing and convex
function, while Morgenstern utility functions (applying to risk-averse decision makers) are non-decreasing and concave. This
provides a basis for deriving either a generator of Archimedean copulas, or its inverse, from a Morgenstern utility function. If we
derive the generator in this way, dependence properties of an Archimedean copula that are often taken to be desirable, match with
generally sought after properties of the corresponding utility function. It is shown how well known copula families are derived from
established utility functions. Also, some new copula families are derived, and their properties are discussed. If, on the other hand,
we instead derive the inverse of the generator from the utility function, there is a link between the magnitude of measures of risk
attitude (like the very common Arrow-Pratt coefficient of absolute risk aversion) and the strength of dependence featured by the
corresponding Archimedean copula.
1995 onwards available online. Download as PDF.
http://www.cass.city.ac.uk/research-and-faculty/faculties/faculty-of-actuarial-science-and-insurance/publications/actuarial-resear
ch-reports

Characterization of left-monotone risk aversion in the RDEU model. Mao, Tiantian; Hu, Taizhong [RKN: 45644]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 50 (3) : 413-422.
We extend the characterization of the left-monotone risk aversion developed by Ryan (2006) to the case of unbounded random
variables. The notion of weak convergence is insufficient for such an extension. It requires the solution of a host of delicate
convergence problems. To this end, some further intrinsic properties of the location independent risk order are investigated. The
characterization of the right-monotone risk aversion for unbounded random variables is also mentioned. Moreover, we remove the
gap in the proof of the main result in Ryan (2006).
Available via Athens: Palgrave MacMillan
http://www.openathens.net

RISK-BASED CAPITAL

Asymptotics for risk capital allocations based on Conditional Tail Expectation. Asimit, Alexandru V; Furman, Edward; Tang, Qihe;
Vernic, Raluca [RKN: 44933]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (3) : 310-324.
An investigation of the limiting behavior of a risk capital allocation rule based on the Conditional Tail Expectation (CTE) risk
measure is carried out. More specifically, with the help of general notions of Extreme Value Theory (EVT), the aforementioned risk
capital allocation is shown to be asymptotically proportional to the corresponding Value-at-Risk (VaR) risk measure. The existing
methodology acquired for VaR can therefore be applied to a somewhat less well-studied CTE. In the context of interest, the EVT
approach is seemingly well-motivated by modern regulations, which openly strive for the excessive prudence in determining risk
capitals.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Optimal dividend and investing control of an insurance company with higher solvency constraints. Liang, Zongxia; Huang,
Jianping [RKN: 44952]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (3) : 501-511.
This paper considers the optimal control problem of a large insurance company under a fixed insolvency probability. The company
controls proportional reinsurance rate, dividend pay-outs and investing process to maximize the expected present value of the
dividend pay-outs until the time of bankruptcy. This paper aims at describing the optimal return function as well as the optimal
policy. As a by-product, the paper theoretically sets a risk-based capital standard to ensure the capital requirement that can cover
the total risk.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

RISK MANAGEMENT

Alarm system for insurance companies : A strategy for capital allocation. Das, S; Kratz, M [RKN: 45723]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 53-65.
One possible way of risk management for an insurance company is to develop an early and appropriate alarm system before the
possible ruin. The ruin is defined through the status of the aggregate risk process, which in turn is determined by premium
accumulation as well as claim settlement outgo for the insurance company. The main purpose of this work is to design an effective
alarm system, i.e. to define alarm times and to recommend augmentation of capital of suitable magnitude at those points to reduce
the chance of ruin. To draw a fair measure of the effectiveness of the alarm system, a comparison is drawn between an alarm system, with
capital being added at the sound of every alarm, and the corresponding system without any alarm, but an equivalently higher initial
capital. Analytical results are obtained in general setup and this is backed up by simulated performances with various types of loss
severity distributions. This provides a strategy for suitably spreading out the capital and yet addressing survivability concerns at
factory level.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Are quantile risk measures suitable for risk-transfer decisions?. Guerra, Manuel; Centeno, M L [RKN: 45648]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 50 (3) : 446-461.
Although controversial from the theoretical point of view, quantile risk measures are widely used by institutions and regulators. In
this paper, we use a unified approach to find the optimal treaties for an agent who seeks to minimize one of these measures,
assuming premium calculation principles of various types. We show that the use of measures like Value at Risk or Conditional Tail
Expectation as optimization criteria for insurance or reinsurance leads to treaties that are not enforceable and/or are clearly bad
for the cedent. We argue that this is one further argument against the use of quantile risk measures, at least for the purpose of
risk-transfer decisions.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Effects of risk management on cost efficiency and cost function of the U.S. Property and liability insurers. Lin, Hong-Jen; Wen,
Min-Ming; Yang, Charles C Society of Actuaries, - 12 pages. [RKN: 74918]
Shelved at: Per: NAAJ (Oxf) Per NAAJ (Lon) Shelved at: JOU
North American Actuarial Journal (2011) 15 (4) : 487-498.
This paper adopts the one-step stochastic frontier approach to investigate the impact of risk management tools of derivatives and
reinsurance on cost efficiency of U.S. property-liability insurance companies. The stochastic frontier approach considers both the
mean and variance of cost efficiency. The sample includes both stock and mutual insurers. Among the findings, the cost function
of the entire sample carries the concavity feature, and insurers tend to use financial derivatives for firm value creation. The results
also show that for the entire sample the use of derivatives enhances the mean of cost efficiency but accompanied with larger
efficiency volatility. Nevertheless, the utilization of financial derivatives mitigates efficiency volatility for mutual insurers. This
research provides important insights for the practice of risk management in the property-liability insurance industry.
http://www.soa.org/news-and-publications/publications/journals/naaj/naaj-detail.aspx

Estimating copulas for insurance from scarce observations, expert opinion and prior information : A Bayesian approach.
Arbenz, Philipp; Canestraro, Davide - 20 pages. [RKN: 70751]
Shelved at: Per: Astin Bull (Oxf) Shelved at: JOU
ASTIN Bulletin (2012) 42 (1) : 271-290.
A prudent assessment of dependence is crucial in many stochastic models for insurance risks. Copulas have become popular to
model such dependencies. However, estimation procedures for copulas often lead to large parameter uncertainty when
observations are scarce. In this paper, we propose a Bayesian method which combines prior information (e.g. from regulators),
observations and expert opinion in order to estimate copula parameters and determine the estimation uncertainty. The
combination of different sources of information can significantly reduce the parameter uncertainty compared to the use of only one
source. The model can also account for uncertainty in the marginal distributions. Furthermore, we describe the methodology for
obtaining expert opinion and explain involved psychological effects and popular fallacies. We exemplify the approach in a case
study.
http://www.actuaries.org/index.cfm?lang=EN&DSP=PUBLICATIONS&ACT=ASTIN BULLETIN

Extreme Events: Robust Portfolio Construction in the Presence of Fat Tails. Kemp, Malcolm H D (2011). - Chichester: John Wiley
& Sons Ltd, 2011. - 312 pages. [RKN: 13141]
Shelved at: EF/JNH (Lon) Shelved at: 368.01 KEM
Markets are fat-tailed; extreme outcomes occur more often than many might hope, or indeed the statistics or normal distributions
might indicate. In this book, the author provides readers with the latest tools and techniques on how best to adapt portfolio
construction techniques to cope with extreme events. Beginning with an overview of portfolio construction and market drivers, the
book will analyze fat tails, what they are, their behavior, how they can differ and what their underlying causes are. The book will
then move on to look at portfolio construction techniques which take into account fat tailed behavior, and how to stress test your
portfolio against extreme events. Finally, the book will analyze really extreme events in the context of portfolio choice and
problems. The book will offer readers: Ways of understanding and analyzing sources of extreme events, Tools for analyzing the
key drivers of risk and return, their potential magnitude and how they might interact, Methodologies for achieving efficient portfolio
construction and risk budgeting, Approaches for catering for the time-varying nature of the world in which we live, Back-stop
approaches for coping with really extreme events, Illustrations and real life examples of extreme events across asset classes. This
will be an indispensable guide for portfolio and risk managers who will need to better protect their portfolios against extreme events
which, within the financial markets, occur more frequently than we might expect.

A numerical method for the expected penalty-reward function in a Markov-modulated jump-diffusion process. Diko, Peter;
Usábel, Miguel [RKN: 44982]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (1) : 126-131.
A generalization of the Cramér-Lundberg risk model perturbed by a diffusion is proposed. Aggregate claims of an insurer follow a
compound Poisson process and premiums are collected at a constant rate with additional random fluctuation. The insurer is
allowed to invest the surplus into a risky asset with volatility dependent on the level of the investment, which permits the
incorporation of rational investment strategies as proposed by Berk and Green (2004). The return on investment is modulated by
a Markov process which generalizes previously studied settings for the evolution of the interest rate in time. The Gerber-Shiu
expected penalty-reward function is studied in this context, including ruin probabilities (a first-passage problem) as a special case.
The second order integro-differential system of equations that characterizes the function of interest is obtained. As a closed-form
solution does not exist, a numerical procedure based on the Chebyshev polynomial approximation through a collocation method is
proposed. Finally, some examples illustrating the procedure are presented.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

RISK MEASUREMENT

Are quantile risk measures suitable for risk-transfer decisions?. Guerra, Manuel; Centeno, M L [RKN: 45648]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 50 (3) : 446-461.
Although controversial from the theoretical point of view, quantile risk measures are widely used by institutions and regulators. In
this paper, we use a unified approach to find the optimal treaties for an agent who seeks to minimize one of these measures,
assuming premium calculation principles of various types. We show that the use of measures like Value at Risk or Conditional Tail
Expectation as optimization criteria for insurance or reinsurance leads to treaties that are not enforceable and/or are clearly bad
for the cedent. We argue that this is one further argument against the use of quantile risk measures, at least for the purpose of
risk-transfer decisions.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Haezendonck-Goovaerts risk measures and Orlicz quantiles. Bellini, Fabio; Gianin, Emanuela Rosazza [RKN: 45727]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 107-114.
In this paper, we study the well-known Haezendonck-Goovaerts risk measures on their natural domain, that is on Orlicz spaces
and, in particular, on Orlicz hearts. We provide a dual representation as well as the optimal scenario in such a representation and
investigate the properties of the minimizer (that we call Orlicz quantile) in the definition of the Haezendonck-Goovaerts risk
measure. Since Orlicz quantiles fail to satisfy an internality property, bilateral Orlicz quantiles are also introduced and analyzed.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Jackknife empirical likelihood intervals for Spearman's rho. Wang, Ruodu; Peng, Liang Society of Actuaries, - 12 pages. [RKN:
74917]
Shelved at: Per: NAAJ (Oxf) Per NAAJ (Lon) Shelved at: JOU
North American Actuarial Journal (2011) 15 (4) : 475-486.
In connection with copulas, rank correlation such as Kendall's tau and Spearman's rho has been employed in risk management for
summarizing dependence between two variables and estimating parameters in bivariate copulas and elliptical models. In this
paper a jackknife empirical likelihood method is proposed to construct confidence intervals for Spearman's rho without estimating
the asymptotic variance. A simulation study confirms the advantages of the proposed method.
http://www.soa.org/news-and-publications/publications/journals/naaj/naaj-detail.aspx
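For readers who want the point estimator behind the interval, here is a short sketch; the jackknife empirical likelihood construction itself is not reproduced, and the data-generating mechanism below is an assumption for illustration only.

import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
# Bivariate sample with dependence induced by a common factor (assumption).
z = rng.normal(size=500)
x = z + rng.normal(scale=0.8, size=500)
y = z + rng.normal(scale=0.8, size=500)

rho_hat, p_value = stats.spearmanr(x, y)
# Equivalent "by hand": Pearson correlation of the ranks.
rho_by_ranks = np.corrcoef(stats.rankdata(x), stats.rankdata(y))[0, 1]
print(f"Spearman's rho: {rho_hat:.3f} (via ranks: {rho_by_ranks:.3f})")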

Jackknife empirical likelihood method for some risk measures and related quantities. Peng, Liang; Qi, Yongcheng; Wang, Ruodu;
Yang, Jingping [RKN: 45731]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 142-150.
Quantifying risks is of importance in insurance. In this paper, we employ the jackknife empirical likelihood method to construct
confidence intervals for some risk measures and related quantities studied by Jones and Zitikis (2003). A simulation study shows
the advantages of the new method over the normal approximation method and the naive bootstrap method.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Large sample behavior of the CTE and VaR estimators under importance sampling. Ahn, Jae Youn; Shyamalkumar, Nariankadu D
Society of Actuaries, - 24 pages. [RKN: 74826]
Shelved at: Per: NAAJ (Oxf) Per NAAJ (Lon) Shelved at: JOU
North American Actuarial Journal (2011) 15 (3) : 393-416.
The α-level value at risk (VaR) and the α-level conditional tail expectation (CTE) of a continuous random variable X are defined as
its α-level quantile (denoted by q_α) and its conditional expectation given the event {X ≥ q_α}, respectively. VaR is a popular risk
measure in the banking sector, for both external and internal reporting purposes, while the CTE has recently become the risk
measure of choice for insurance regulation in North America. Estimation of the CTE for company assets and liabilities is becoming
an important actuarial exercise, and the size and complexity of these liabilities make inference procedures with good small sample
performance very desirable. A common situation is one in which the CTE of the portfolio loss is estimated using simulated values,
and in such situations use of variance reduction techniques such as importance sampling have proved to be fruitful. Construction
of confidence intervals for the CTE relies on the availability of the asymptotic distribution of the normalized CTE estimator, and
although such a result has been available to actuaries, it has so far been supported only by heuristics. The main goal of this paper
is to provide an honest theorem establishing the convergence of the normalized CTE estimator under importance sampling to a
normal distribution. In the process, we also provide a similar result for the VaR estimator under importance sampling, which
improves upon an earlier result. Also, through examples we motivate the practical need for such theoretical results and include
simulation studies to lend insight into the sample sizes at which these asymptotic results become meaningful.
http://www.soa.org/news-and-publications/publications/journals/naaj/naaj-detail.aspx
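For orientation, a minimal sketch of the plain empirical VaR and CTE estimators whose importance-sampling counterparts the paper analyses; the lognormal loss model and the confidence level are assumptions, and no variance reduction is applied here.

import numpy as np

rng = np.random.default_rng(5)
alpha = 0.99
losses = rng.lognormal(mean=0.0, sigma=1.0, size=500_000)   # assumed loss model

var_hat = np.quantile(losses, alpha)                 # empirical alpha-level quantile
cte_hat = losses[losses >= var_hat].mean()           # average loss beyond the VaR
print(f"VaR_{alpha:.0%} = {var_hat:.3f}, CTE_{alpha:.0%} = {cte_hat:.3f}")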

Multivariate stress scenarios and solvency. McNeil, Alexander J; Smith, Andrew D [RKN: 45634]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 50 (3) : 299-308.
We show how the probabilistic concepts of half-space trimming and depth may be used to define convex scenario sets Q_α for
stress testing the risk factors that affect the solvency of an insurance company over a prescribed time period. By choosing the
scenario in Q_α which minimizes net asset value at the end of the time period, we propose the idea of the least solvent likely event
(LSLE) as a solution to the forward stress testing problem. By considering the support function of the convex scenario set Q_α, we
establish theoretical properties of the LSLE when financial risk factors can be assumed to have a linear effect on the net assets of
an insurer. In particular, we show that the LSLE may be interpreted as a scenario causing a loss equivalent to the Value-at-Risk
(VaR) at confidence level α, provided the α-quantile is a subadditive risk measure on linear combinations of the risk factors. In this
case, we also show that the LSLE has an interpretation as a per-unit allocation of capital to the underlying risk factors when the
overall capital is determined according to the VaR. These insights allow us to define alternative scenario sets that relate in similar
ways to coherent measures, such as expected shortfall. We also introduce the most likely ruin event (MLRE) as a solution to the
problem of reverse stress testing.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

On approximating law-invariant comonotonic coherent risk measures. Nakano, Yumiharu - 11 pages. [RKN: 70754]
Shelved at: Per: Astin Bull (Oxf) Shelved at: JOU
ASTIN Bulletin (2012) 42 (1) : 343-353.
The optimal quantization theory is applied for approximating law-invariant comonotonic coherent risk measures. Simple Lp-norm
estimates for the risk measures provide the rate of convergence of that approximation as the number of quantization points goes
to infinity.
http://www.actuaries.org/index.cfm?lang=EN&DSP=PUBLICATIONS&ACT=ASTIN BULLETIN

On the Haezendonck-Goovaerts risk measure for extreme risks. Tang, Qihe; Yang, Fan [RKN: 45537]
Insurance: Mathematics & Economics (2012) 50 (1) : 217-227.
In this paper, we are interested in the calculation of the Haezendonck-Goovaerts risk measure, which is defined via a convex
Young function and a parameter q ∈ (0,1) representing the confidence level. We mainly focus on the case in which the risk variable
follows a distribution function from a max-domain of attraction. For this case, we restrict the Young function to be a power function
and we derive exact asymptotics for the Haezendonck-Goovaerts risk measure as q → 1. As a subsidiary, we also consider the case
with an exponentially distributed risk variable and a general Young function, and we obtain an analytical expression for the
Haezendonck-Goovaerts risk measure.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
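A hedged numerical aside: with the simplest admissible Young function, phi(t) = t, the Haezendonck-Goovaerts risk measure reduces to inf over x of x + E[(X - x)_+]/(1 - q), the familiar minimization representation of the tail value-at-risk. The sketch below checks this numerically on an assumed Pareto risk; the paper's power Young functions and extreme-value asymptotics are not reproduced.

import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(6)
q = 0.95
x_sample = rng.pareto(3.0, size=200_000) + 1.0     # assumed heavy-tailed risk (Pareto, shape 3)

def hg_objective(x):
    # Empirical version of x + E[(X - x)_+] / (1 - q), i.e. phi(t) = t.
    return x + np.maximum(x_sample - x, 0.0).mean() / (1.0 - q)

res = minimize_scalar(hg_objective, bounds=(0.0, x_sample.max()), method="bounded")
var_q = np.quantile(x_sample, q)
tvar_q = x_sample[x_sample >= var_q].mean()
print(f"HG measure with phi(t)=t: {res.fun:.3f}  vs empirical TVaR_95%: {tvar_q:.3f}")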

On the interplay between distortion, mean value and Haezendonck-Goovaerts risk measures. Goovaerts, Marc; Linders, Daniel;
Van Weert, Koen; Tank, Fatih [RKN: 45719]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 10-18.
In the actuarial research, distortion, mean value and Haezendonck-Goovaerts risk measures are concepts that are usually treated
separately. In this paper we indicate and characterize the relation between these different risk measures, as well as their relation
to convex risk measures. While it is known that the mean value principle can be used to generate premium calculation principles,
we will show how they also allow one to generate solvency calculation principles. Moreover, we explain the role provided for the
distortion risk measures as an extension of the Tail Value-at-Risk (TVaR) and Conditional Tail Expectation (CTE).
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Portfolio adjusting optimization with added assets and transaction costs based on credibility measures. Zhang, Wei-Guo; Zhang,
Xi-Li; Chen, Yunxia [RKN: 44937]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (3) : 353-360.
In response to changeful financial markets and investors' capital, we discuss a portfolio adjusting problem with additional risk
assets and a riskless asset based on credibility theory. We propose two credibilistic mean-variance portfolio adjusting models with
general fuzzy returns, which take lending, borrowing, transaction cost, additional risk assets and capital into consideration in
portfolio adjusting process. We present crisp forms of the models when the returns of risk assets are some deterministic fuzzy
variables such as trapezoidal, triangular and interval types. We also employ a quadratic programming solution algorithm for
obtaining the optimal adjusting strategy. The comparisons of numerical results from different models illustrate the efficiency of the
proposed models and the algorithm.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Portfolio insurance under a risk-measure constraint. De Franco, Carmine; Tankov, Peter [RKN: 44938]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (3) : 361-370.
The authors study the problem of portfolio insurance from the point of view of a fund manager, who guarantees to the investor that
the portfolio value at maturity will be above a fixed threshold. If, at maturity, the portfolio value is below the guaranteed level, a third
party will refund the investor up to the guarantee. In exchange for this protection, the third party imposes a limit on the risk
exposure of the fund manager, in the form of a convex monetary risk measure. The fund manager therefore tries to maximize the
investor's utility function subject to the risk-measure constraint. The authors give a full solution to this non-convex optimization
problem in the complete market setting and show in particular that the choice of the risk measure is crucial for the optimal portfolio
to exist. Explicit results are provided for the entropic risk measure (for which the optimal portfolio always exists) and for the class
of spectral risk measures (for which the optimal portfolio may fail to exist in some cases).
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Remarks on quantiles and distortion risk measures. Dhaene, Jan; Kukush, Alexander; Linders, Daniël; Tang, Qihe [RKN: 44853]
Shelved at: online only
European Actuarial Journal (2012) 2(2) December : 319-328.
Distorted expectations can be expressed as weighted averages of quantiles. In this note, we show that this statement is
essentially true, but that one has to be careful with the correct formulation of it. Furthermore, the proofs of the additivity property for
distorted expectations of a comonotonic sum that appear in the literature often do not cover the case of a general distortion
function. We present a straightforward proof for the general case, making use of the appropriate expressions for distorted
expectations in terms of quantiles.
Available via Athens: Springer -- Published online, December 2012
http://www.openathens.net
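To make the "weighted average of quantiles" statement concrete, here is a hedged empirical sketch: for an ordered sample, the distorted expectation becomes a weighted sum of the order statistics, with weights given by increments of the distortion function g applied to the empirical survival probabilities. The loss distribution and the distortion functions used below are assumptions.

import numpy as np

rng = np.random.default_rng(7)
x = np.sort(rng.lognormal(0.0, 1.0, size=100_000))     # assumed loss sample, ascending order
n = len(x)
i = np.arange(1, n + 1)

def distorted_expectation(g):
    # Weight of the i-th order statistic: g((n-i+1)/n) - g((n-i)/n).
    weights = g((n - i + 1) / n) - g((n - i) / n)
    return np.sum(weights * x)

identity = lambda u: u                                   # recovers the plain mean
ph = lambda u: np.sqrt(u)                                # proportional-hazard distortion
tvar99 = lambda u: np.minimum(u / 0.01, 1.0)             # distortion that yields TVaR_99%
print("mean:", distorted_expectation(identity),
      " PH(0.5):", distorted_expectation(ph),
      " TVaR_99%:", distorted_expectation(tvar99))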

Risk measures in ordered normed linear spaces with non-empty cone-interior. Konstantinides, Dimitrios G; Kountzakis, Christos E
[RKN: 39934]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (1) : 111-122.
In this paper, we use tools from the theory of partially ordered normed linear spaces, especially the bases of cones. This work
extends the well-known results for convex and coherent risk measures. Its linchpin consists in the replacement of the riskless bond
by some interior point in the cone of the space of risks, which stands as the alternative numeraire.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Second order regular variation and conditional tail expectation of multiple risks. Hua, Lei; Joe, Harry [RKN: 44955]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (3) : 537-546.
For the purpose of risk management, the study of tail behavior of multiple risks is more relevant than the study of their overall
distributions. Asymptotic study assuming that each marginal risk goes to infinity is more mathematically tractable and has also
uncovered some interesting performance of risk measures and relationships between risk measures by their first order
approximations. However, the first order approximation is only a crude way to understand tail behavior of multiple risks, and
especially for sub-extremal risks. In this paper, we conduct asymptotic analysis on conditional tail expectation (CTE) under the
condition of second order regular variation (2RV). First, the closed-form second order approximation of CTE is obtained for the
univariate case. Then the CTE of an aggregate position, defined through a loss aggregating function applied to risks whose
survivor functions satisfy the condition of 2RV, is studied. Closed-form second order approximations of CTE for this multivariate form
have been derived in terms of the corresponding value at risk. For both the univariate and multivariate cases, we find that the first
order approximation is affected by only the regular variation index of marginal survivor functions, while the second order
approximation is influenced by both the parameters for first and second order regular variation, and the rate of convergence to the
first order approximation is dominated by the second order parameter only. We have also shown that the 2RV condition and the
assumptions for the multivariate form are satisfied by many parametric distribution families, and thus the closed-form
approximations would be useful for applications. Those closed-form results extend earlier studies.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Sensitivity of risk measures with respect to the normal approximation of total claim distributions. Krätschmer, Volker; Zähle,
Henryk [RKN: 44935]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (3) : 335-344.
A simple and commonly used method to approximate the total claim distribution of a (possibly weakly dependent) insurance
collective is the normal approximation. In this article, we investigate the error made when the normal approximation is plugged in
a fairly general distribution-invariant risk measure. We focus on the rate of convergence of the error relative to the number of
clients, we specify the relative error's asymptotic distribution, and we illustrate our results by means of a numerical example.
Regarding the risk measure, we take into account distortion risk measures as well as distribution-invariant coherent risk
measures.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
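A small numerical sketch in the spirit of the article's question (assumed individual claim model, not the authors' asymptotic analysis): compare the VaR of a simulated total claim distribution with the value obtained by plugging in its moment-matched normal approximation.

import numpy as np
from scipy import stats

rng = np.random.default_rng(8)
n_clients, p_claim, sev_shape, sev_scale = 2_000, 0.05, 2.0, 10.0   # assumed portfolio

# "True" total claims by simulation: binomial claim counts, gamma severities.
n_sim = 100_000
claim_counts = rng.binomial(n_clients, p_claim, size=n_sim)
totals = np.array([rng.gamma(sev_shape, sev_scale, size=k).sum() for k in claim_counts])

# Normal approximation with matched mean and variance.
mean_sev, var_sev = sev_shape * sev_scale, sev_shape * sev_scale**2
mu = n_clients * p_claim * mean_sev
sigma2 = n_clients * (p_claim * var_sev + p_claim * (1 - p_claim) * mean_sev**2)
normal = stats.norm(mu, np.sqrt(sigma2))

alpha = 0.995
print(f"VaR_99.5%: simulated {np.quantile(totals, alpha):.1f} "
      f"vs normal approximation {normal.ppf(alpha):.1f}")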

The strictest common relaxation of a family of risk measures. Roorda, Berend; Schumacher, J M [RKN: 13027]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (1) : 29-34.
Operations which form new risk measures from a collection of given (often simpler) risk measures have been used extensively in
the literature. Examples include convex combination, convolution, and the worst-case operator. Here we study the risk measure
that is constructed from a family of given risk measures by the best-case operator; that is, the newly constructed risk measure is
defined as the one that is as restrictive as possible under the condition that it accepts all positions that are accepted under any of
the risk measures from the family. In fact we define this operation for conditional risk measures, to allow a multiperiod setting. We
show that the well-known VaR risk measure can be constructed from a family of conditional expectations by a combination that
involves both worst-case and best-case operations. We provide an explicit description of the acceptance set of the conditional risk
measure that is obtained as the strictest common relaxation of two given conditional risk measures.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Tail distortion risk and its asymptotic analysis. Zhu, Li; Li, Haijun [RKN: 45728]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 115-121.
A distortion risk measure used in finance and insurance is defined as the expected value of potential loss under a scenario
probability measure. In this paper, the tail distortion risk measure is introduced to assess tail risks of excess losses modeled by the
right tails of loss distributions. The asymptotic linear relation between tail distortion and value-at-risk is derived for heavy-tailed
losses with the linear proportionality constant depending only on the distortion function and the tail index. Various examples
involving tail distortions for location-invariant, scale-invariant, and shape-invariant loss distribution families are also presented to
illustrate the results.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Worst case risk measurement: Back to the future?. Goovaerts, Marc J; Kaas, Rob; Laeven, Roger J A [RKN: 44940]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (3) : 380-392.
This paper studies the problem of finding best-possible upper bounds on a rich class of risk measures, expressible as integrals
with respect to measures, under incomplete probabilistic information. Both univariate and multivariate risk measurement problems
are considered. The extremal probability distributions, generating the worst case scenarios, are also identified.
The problem of worst case risk measurement has been studied extensively by Etienne De Vijlder and his co-authors, within the
framework of finite-dimensional convex analysis. This paper revisits and extends some of their results.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

RISK MODELS

The finite time ruin probability in a risk model with capital injections. Nie, Ciyu; Dickson, David C M; Li, Shuanming (2012). -
Victoria: University of Melbourne, 2012. - 21 pages. [RKN: 70581]
We consider a risk model with capital injections as described in Nie et al. (2011). We show that in the Sparre Andersen framework
the density of the time to ruin for the model with capital injections can be expressed in terms of the density of the time to ruin in an
ordinary Sparre Andersen risk process. In the special case of Erlang inter-claim times and exponential claims we show that there
exists a readily computable formula for the density of the time to ruin. When the inter-claim time distribution is exponential, we
obtain an explicit solution for the density of the time to ruin when the individual claim amount distribution is Erlang(2), and we
explain techniques to find the moments of the time to ruin. In the final section we consider the related problem of the distribution of
the duration of negative surplus in the classical risk model, and we obtain explicit solutions for the (defective) density of the total
duration of negative surplus for two individual claim amount distributions.
No.1 (1993) onwards available online. Download as PDF.
http://www.economics.unimelb.edu.au/ACT/papers.shtml
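As a hedged baseline, the following Monte Carlo sketch estimates a finite time ruin probability for the classical compound Poisson surplus process without capital injections; the parameters are illustrative assumptions, and neither the capital-injection mechanism nor the Sparre Andersen generality of the paper is reproduced.

import numpy as np

rng = np.random.default_rng(9)
u, c, lam, claim_mean, horizon = 10.0, 1.2, 1.0, 1.0, 100.0   # 20% premium loading (assumption)

def ruined_before(T):
    """One path of U(t) = u + c*t - S(t); ruin can only occur at claim instants."""
    t, aggregate_claims = 0.0, 0.0
    while True:
        t += rng.exponential(1.0 / lam)          # exponential inter-claim time
        if t > T:
            return False                         # survived to the horizon
        aggregate_claims += rng.exponential(claim_mean)
        if u + c * t < aggregate_claims:         # surplus drops below zero
            return True

n_sim = 20_000
psi_hat = np.mean([ruined_before(horizon) for _ in range(n_sim)])
print(f"estimated finite-time ruin probability psi(u={u}, T={horizon}) ~ {psi_hat:.4f}")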

RISK SHARING

Convex order and comonotonic conditional mean risk sharing. Denuit, Michel; Dhaene, Jan [RKN: 44785]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51(2) : 265-270.
Using a standard reduction argument based on conditional expectations, this paper argues that risk sharing is always beneficial
(with respect to convex order or second degree stochastic dominance) provided the risk-averse agents share the total losses
appropriately (whatever the distribution of the losses, their correlation structure and individual degrees of risk aversion).
Specifically, all agents hand their individual losses over to a pool and each of them is liable for the conditional expectation of his
own loss given the total loss of the pool. We call this risk sharing mechanism the conditional mean risk sharing. If all the conditional
expectations involved are non-decreasing functions of the total loss then the conditional mean risk sharing is shown to be
Pareto-optimal. Explicit expressions for the individual contributions to the pool are derived in some special cases of interest:
independent and identically distributed losses, comonotonic losses, and mutually exclusive losses. In particular, conditions under
which this payment rule leads to a comonotonic risk sharing are examined.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
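A hedged numerical illustration of the conditional mean risk sharing rule described above: each participant contributes E[Xi | S], estimated below by local averaging of simulated scenarios around a given pool total. The gamma loss assumptions are for illustration only; with independent and identically distributed losses the rule simply charges S/n to each agent.

import numpy as np

rng = np.random.default_rng(10)
n_sim = 400_000
x1 = rng.gamma(2.0, 1.0, n_sim)     # agent 1, larger expected loss (assumption)
x2 = rng.gamma(1.0, 1.0, n_sim)     # agent 2 (assumption)
s = x1 + x2

# Approximate E[X1 | S = s0] by averaging over scenarios with S close to s0.
for s0 in (2.0, 5.0, 10.0):
    window = np.abs(s - s0) < 0.1
    share1 = x1[window].mean()
    print(f"pool total ~ {s0}: agent 1 pays ~ {share1:.2f}, agent 2 pays ~ {s0 - share1:.2f}")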

RISK THEORY

Allais for all: Revisiting the paradox in a large representative sample. Huck, Steffen; Müller, Wieland Springer, - 33 pages. [RKN:
73974]
Shelved at: Per: J Risk Uncrtnty
Journal of Risk and Uncertainty (2012) 44 (3) : 261-293.
We administer the Allais paradox questions to both a representative sample of the Dutch population and to student subjects.
Three treatments are implemented: one with the original high hypothetical payoffs, one with low hypothetical payoffs and a third
with low real payoffs. Our key findings are: (i) violations in the non-lab sample are systematic and a large bulk of violations is likely
to stem from non-familiarity with large payoffs, (ii) we can identify groups of the general population that have much higher than
average violation rates; this concerns mainly the lowly educated and unemployed, and (iii) the relative treatment differences in the
population at large are accurately predicted by the lab sample, but violation rates in all lab treatments are about 15 percentage
points lower than in the corresponding non-lab treatments.

Decreasing absolute risk aversion, prudence and increased downside risk aversion. Meyer, Jack; Liu, Liqun Springer, - 18 pages.
[RKN: 73973]
Shelved at: Per: J Risk Uncrtnty
Journal of Risk and Uncertainty (2012) 44 (3) : 243-260.
Downside risk increases have previously been characterized as changes preferred by all decision makers u(x) with u'''(x) > 0. For
risk averse decision makers, u'''(x) > 0 also defines prudence. This paper finds that downside risk increases can also be
characterized as changes preferred by all decision makers displaying decreasing absolute risk aversion (DARA) since those
changes involve random variables that have equal means. Building on these findings, the paper proposes using 'more decreasingly absolute risk averse' or 'more prudent' as alternative definitions of increased downside risk aversion. These
alternative definitions generate a transitive ordering, while the existing definition based on a transformation function with a positive
third derivative does not. Other properties of the new definitions of increased downside risk aversion are also presented.
http://www.openathens.net

A new look at the homogeneous risk model. Lefèvre, Claude; Picard, Philippe [RKN: 44953]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (3) : 512-519.
The present paper revisits a previously investigated homogeneous risk model. First, a claim arrival process is defined on a fixed time interval by assuming that the arrival times satisfy an order statistic property. Then, the variability and the covariance of the aggregate claim amount process are discussed. The distribution of the aggregate discounted claims is also examined. Finally, a closed-form expression for the non-ruin probability is derived in terms of a family of Appell polynomials. This formula holds for all claim distributions, even dependent ones. It generalizes several results obtained so far.

Available via Athens: Palgrave MacMillan
http://www.openathens.net

Non-parametric estimation of the Gerber-Shiu function for the Wiener-Poisson risk model. Shimizu, Yasutaka [RKN: 44881]
Shelved at: Per: SAJ Shelved at: SCA/ACT
Scandinavian Actuarial Journal (2012) 1 : 56-69.
A non-parametric estimator of the Gerber-Shiu function is proposed for a risk process with a compound Poisson claim process plus a diffusion perturbation, i.e. the Wiener-Poisson risk model. The estimator is based on a regularized inversion of an empirical-type estimator of the Laplace transform of the Gerber-Shiu function. We show the weak consistency of the estimator in
the sense of an integrated squared error with the rate of convergence.
http://www.openathens.net

An operator-based approach to the analysis of ruin-related quantities in jump diffusion risk models. Feng, Runhuan [RKN: 40025]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (2) : 304-313.
Recent developments in ruin theory have seen the growing popularity of jump diffusion processes in modeling an insurer's assets and liabilities. Despite the variations of technique, the analysis of ruin-related quantities mostly relies on solutions to certain differential equations. In this paper, we propose in the context of Lévy-type jump diffusion risk models a solution method to a
general class of ruin-related quantities. Then we present a novel operator-based approach to solving a particular type of
integro-differential equations. Explicit expressions for resolvent densities for jump diffusion processes killed on exit below zero are
obtained as by-products of this work.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Randomized observation periods for the compound Poisson risk model: dividends. Albrecher, Hansjörg; Cheung, Eric C K;
Thonhauser, Stefan - 28 pages. [RKN: 74900]
Shelved at: Per: Astin Bull (Oxf) Shelved at: JOU
ASTIN Bulletin (2011) 41 (2) : 645-672.
In the framework of the classical compound Poisson process in collective risk theory, we study a modification of the horizontal
dividend barrier strategy by introducing random observation times at which dividends can be paid and ruin can be observed. This
model contains both the continuous-time and the discrete-time risk model as a limit and represents a certain type of bridge
between them which still enables the explicit calculation of moments of total discounted dividend payments until ruin. Numerical
illustrations for several sets of parameters are given and the effect of random observation times on the performance of the
dividend strategy is studied.
http://www.actuaries.org/index.cfm?lang=EN&DSP=PUBLICATIONS&ACT=ASTIN BULLETIN

Updating beliefs with imperfect signals: Experimental evidence. Poinas, François; Rosaz, Julie; Roussillon, Béatrice Springer, - 23
pages. [RKN: 73972]
Shelved at: Per: J Risk Uncrtnty
Journal of Risk and Uncertainty (2012) 44 (3) : 219-241.
We conduct an experiment on individual choice under risk in which we study belief updating when an agent receives a signal that
restricts the number of possible states of the world. Subjects observe a sample drawn from an urn and form initial beliefs about the
urn's composition. We then elicit how beliefs are modified after subjects receive a signal that restricts the set of the possible urns
from which the observed sample could have been drawn. We find that this type of signal increases the frequency of correct
assessments and that prediction accuracy is higher for lower levels of risk. We also show that prediction accuracy is higher after
invalidating signals (i.e. signals that contradict the initial belief). This pattern is explained by the lower level of risk associated with
invalidating signals. Finally, we find evidence for a lack of persistence of choices under high risk.
http://www.openathens.net

RUIN PROBABILITY

Alarm system for insurance companies : A strategy for capital allocation. Das, S; Kratz, M [RKN: 45723]
Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 53-65.
One possible way of risk management for an insurance company is to develop an early and appropriate alarm system before the
possible ruin. The ruin is defined through the status of the aggregate risk process, which in turn is determined by premium
accumulation as well as claim settlement outgo for the insurance company. The main purpose of this work is to design an effective
alarm system, i.e. to define alarm times and to recommend augmentation of capital of suitable magnitude at those points to reduce
the chance of ruin. To draw a fair measure of effectiveness of alarm system, comparison is drawn between an alarm system, with
capital being added at the sound of every alarm, and the corresponding system without any alarm, but an equivalently higher initial
capital. Analytical results are obtained in general setup and this is backed up by simulated performances with various types of loss
severity distributions. This provides a strategy for suitably spreading out the capital and yet addressing survivability concerns at
factory level.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Analysis of risk models using a level crossing technique. Brill, Percy H; Yu, Kaiqi [RKN: 44932]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (3) : 298-309.
This paper analyzes ruin-like risk models in insurance, which are variants of the Cramér-Lundberg (CL) model with a barrier or a
threshold. We consider three model variants, which have different portfolio strategies when the risk reserve reaches the barrier or
exceeds the threshold. In these models we construct a time-extended risk process defined on cycles of a specific renewal
process. The time until ruin is equal to one cycle of the specific renewal process. We also consider a fourth model, which is a variant of a previously proposed model. The analysis of each model employs a level crossing (LC) method to derive the steady-state
probability distribution of the time-extended risk process. From the derived distribution we compute the expected time until ruin,
the probability distribution of the deficit at ruin, and related quantities of interest.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Erlangian approximation to finite time ruin probabilities in perturbed risk models. Stanford, David A; Yu, Kaiqi; Ren, Jiandong
[RKN: 45148]
Shelved at: Per: SAJ Shelved at: SCA/ACT
Scandinavian Actuarial Journal (2011) 1 : 38-58.
In this paper, we consider a class of perturbed risk processes that have an underlying Markov structure, including
Markov-modulated risk processes, and Sparre-Andersen risk processes when both inter-claim times and claim sizes are
phase-type. We apply the Erlangization method to the risk process in the class in order to obtain an accurate approximation of the
finite time ruin probability. In addition, we develop an efficient recursive procedure by recognizing a repeating structure in the
probability matrices we work with. We believe the present work is among the first to either compute or approximate finite time ruin
probabilities in the perturbed risk model.

Explicit ruin formulas for models with dependence among risks. Albrecher, Hansjörg; Constantinescu, Corina; Loisel, Stéphane
[RKN: 40021]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (2) : 265-270.
We show that a simple mixing idea allows one to establish a number of explicit formulas for ruin probabilities and related quantities
in collective risk models with dependence among claim sizes and among claim inter-occurrence times. Examples include
compound Poisson risk models with completely monotone marginal claim size distributions that are dependent according to
Archimedean survival copulas as well as renewal risk models with dependent inter-occurrence times.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
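
As a hedged numerical illustration of the mixing idea referred to in this abstract (a simplified sketch, not the authors' derivation): if, conditionally on a frailty value b, the model is a classical compound Poisson model with exponential(b) claims and premium loading theta, the conditional ruin probability is exp(-theta*b*u/(1+theta))/(1+theta); mixing b over a Gamma distribution makes the claim sizes dependent, while the unconditional ruin probability is simply the average of the conditional ones. The Gamma parameters below are invented.

# Sketch of the mixing idea for ruin probabilities (illustrative assumptions only):
# conditionally on a frailty b, claims are exponential(b) and the classical ruin
# probability is psi_b(u) = exp(-theta*b*u/(1+theta)) / (1+theta); averaging
# psi_b(u) over a Gamma mixing density gives the ruin probability of the mixed model.
import numpy as np
from math import gamma as gamma_fn

def psi_conditional(u, b, theta):
    return np.exp(-theta * b * u / (1.0 + theta)) / (1.0 + theta)

def psi_mixed(u, theta, a=2.0, scale=0.5, n_grid=20_000):
    b = np.linspace(1e-6, 20.0 * a * scale, n_grid)              # integration grid
    density = b ** (a - 1) * np.exp(-b / scale) / (scale ** a * gamma_fn(a))
    return float(np.sum(psi_conditional(u, b, theta) * density) * (b[1] - b[0]))

for u in (0.0, 5.0, 10.0):
    print(f"u = {u:4.1f}   mixed ruin probability ~ {psi_mixed(u, theta=0.2):.4f}")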

A hybrid estimate for the finite-time ruin probability in a bivariate autoregressive risk model with application to portfolio
optimization. Tang, Qihe; Yuan, Zhongyi Society of Actuaries, - 20 pages. [RKN: 70667]
Shelved at: Per: NAAJ (Oxf) Per NAAJ (Lon) Shelved at: JOU
North American Actuarial Journal (2012) 16 (3) : 378-397.
Consider a discrete-time risk model in which the insurer is allowed to invest a proportion of its wealth in a risky stock and keep the
rest in a risk-free bond. Assume that the claim amounts within individual periods follow an autoregressive process with
heavy-tailed innovations and that the log-returns of the stock follow another autoregressive process, independent of the former
one. We derive an asymptotic formula for the finite-time ruin probability and propose a hybrid method, combining simulation with
asymptotics, to compute this ruin probability more efficiently. As an application, we consider a portfolio optimization problem in
which we determine the proportion invested in the risky stock that maximizes the expected terminal wealth subject to a constraint
on the ruin probability.
http://www.soa.org/news-and-publications/publications/journals/naaj/naaj-detail.aspx

Joint densities involving the time to ruin in the Sparre Andersen risk model under exponential assumptions. Landriault, David;
Shi, Tianxiang; Willmot, Gordon E [RKN: 44939]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (3) : 371-379.
Recent research into the nature of the distribution of the time of ruin in some Sparre Andersen risk models has resulted in series
expansions for the associated density function. Examples include the classical Poisson model with exponential interclaim times, and a duality argument in the case with exponential claim amounts. The aim of this paper is not only to unify previous methodology through the use of Lagrange's expansion theorem, but also to provide insight into the nature of the series expansions
by identifying the probabilistic contribution of each term in the expansion through analysis involving the distribution of the number
of claims until ruin. The (defective) distribution of the number of claims until ruin is then further examined. Interestingly, a
connection to the well-known extended truncated negative binomial (ETNB) distribution is also established. Finally, a closed-form
expression for the joint density of the time to ruin, the surplus prior to ruin, and the number of claims until ruin is derived. In the last
section, an earlier formula for the density of the time to ruin in the classical risk model is re-examined to identify its individual
contributions based on the number of claims until ruin.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

The joint distribution of the time to ruin and the number of claims until ruin in the classical risk model. Dickson, David C M (2011).
- Victoria: University of Melbourne, 2011. - 15 pages. [RKN: 74756]
We use probabilistic arguments to derive an expression for the joint density of the time of ruin and the number of claims until ruin in the classical risk model. From this we obtain a general expression for the probability function of the number of claims until ruin. We also consider the moments of the number of claims until ruin and illustrate our results in the case of exponentially distributed individual claims. We find a very strong correlation between the number of claims until ruin and the time of ruin in this case. Finally, we briefly discuss joint distributions involving the surplus prior to ruin and deficit at ruin.
No.1 (1993) onwards available online. Download as PDF.
http://www.economics.unimelb.edu.au/ACT/papers.shtml
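
The strong correlation reported in this abstract can be checked with a rough Monte Carlo sketch; the Poisson rate, loading, initial surplus and finite horizon below are invented, and surviving paths are simply discarded.

# Monte Carlo sketch: joint behaviour of the time of ruin and the number of
# claims until ruin in the classical risk model (invented parameters; paths
# that survive the finite horizon are ignored).
import numpy as np

rng = np.random.default_rng(2)
lam, mean_claim, theta, u0, horizon = 1.0, 1.0, 0.1, 2.0, 500.0
c = (1.0 + theta) * lam * mean_claim          # premium rate with loading theta

ruin_times, claim_counts = [], []
for _ in range(20_000):
    t, surplus, n_claims = 0.0, u0, 0
    while t < horizon:
        w = rng.exponential(1.0 / lam)        # inter-claim time
        t += w
        surplus += c * w - rng.exponential(mean_claim)
        n_claims += 1
        if surplus < 0.0:                     # ruin occurs at the n-th claim
            ruin_times.append(t)
            claim_counts.append(n_claims)
            break

rho = np.corrcoef(ruin_times, claim_counts)[0, 1]
print(f"ruined paths: {len(ruin_times)}, correlation(time of ruin, claims until ruin) ~ {rho:.3f}")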

Minimizing the probability of lifetime ruin under stochastic volatility. Bayraktar, Erhan; Hu, Xueying; Young, Virginia R [RKN: 44962]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (2) : 194-206.
The authors assume that an individual invests in a financial market with one riskless and one risky asset, with the latter's price
following a diffusion with stochastic volatility. Given the rate of consumption, we find the optimal investment strategy for the
individual who wishes to minimize the probability of going bankrupt. To solve this minimization problem, the authors use
techniques from stochastic optimal control.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

On absolute ruin minimization under a diffusion approximation model. Luo, Shangzhen; Taksar, Michael [RKN: 39935]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (1) : 123-133.
In this paper, we assume that the surplus process of an insurance entity is represented by a pure diffusion. The company can
invest its surplus into a Black-Scholes risky asset and a risk-free asset. We impose investment restrictions that only a limited
amount is allowed in the risky asset and that no short-selling is allowed. We further assume that when the surplus level becomes
negative, the company can borrow to continue financing. The ultimate objective is to seek an optimal investment strategy that
minimizes the probability of absolute ruin, i.e. the probability that the liminf of the surplus process is negative infinity. The
corresponding Hamilton-Jacobi-Bellman (HJB) equation is analyzed and a verification theorem is proved; applying the HJB
method we obtain explicit expressions for the S-shaped minimal absolute ruin function and its associated optimal investment
strategy. In the second part of the paper, we study the optimization problem with both investment and proportional reinsurance
control. There the minimal absolute ruin function and the feedback optimal investment-reinsurance control are found explicitly as
well.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Optimal commutable annuities to minimize the probability of lifetime ruin. Wang, Ting; Young, Virginia R [RKN: 45536]
Insurance: Mathematics & Economics (2012) 50 (1) : 200-216.
We find the minimum probability of lifetime ruin of an investor who can invest in a market with a risky and a riskless asset and who
can purchase a commutable life annuity. The surrender charge of a life annuity is a proportion of its value. Ruin occurs when the
total of the value of the risky and riskless assets and the surrender value of the life annuity reaches zero. We find the optimal
investment strategy and optimal annuity purchase and surrender strategies in two situations: (i) the value of the risky and riskless
assets is allowed to be negative, with the imputed surrender value of the life annuity keeping the total positive; (ii) the value of the
risky and riskless assets is required to be non-negative. In the first case, although the individual has the flexibility to buy or sell at
any time, we find that the individual will not buy a life annuity unless she can cover all her consumption via the annuity and she will
never sell her annuity. In the second case, the individual surrenders just enough annuity income to keep her total assets positive.
However, in this second case, the individual's annuity purchasing strategy depends on the size of the proportional surrender
charge. When the charge is large enough, the individual will not buy a life annuity unless she can cover all her consumption, the
so-called safe level. When the charge is small enough, the individual will buy a life annuity at a wealth lower than this safe level.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Refinements of two-sided bounds for renewal equations. Woo, Jae-Kyung [RKN: 40012]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (2) : 189-196.
Many quantities of interest in the study of renewal processes may be expressed as the solution to a special type of integral
equation known as a renewal equation. The main purpose of this paper is to provide bounds for the solution of renewal equations
based on various reliability classifications. Exponential and nonexponential types of inequalities are derived. In particular,
two-sided bounds with specific reliability conditions become sharp. Finally, some examples including ultimate ruin for the classical
Poisson model with time-dependent claim sizes, the joint distribution of the surplus prior to and at ruin, and the excess life time, are
provided.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
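
For orientation only (this is the textbook exponential bound, not the refined two-sided bounds derived in the paper): in the classical compound Poisson model, Lundberg's inequality states that the ultimate ruin probability satisfies psi(u) <= exp(-R*u), where the adjustment coefficient R solves lambda*(M_X(R) - 1) = c*R. A minimal numerical sketch, with exponential claims assumed purely so the answer can be checked in closed form:

# Numerical sketch of Lundberg's exponential bound psi(u) <= exp(-R*u) for the
# classical compound Poisson model; R solves lam * (M_X(R) - 1) = c * R.
# Exponential(beta) claims are assumed only so the root has a closed form.
from math import exp
from scipy.optimize import brentq

lam, beta, theta = 1.0, 1.0, 0.2
c = (1.0 + theta) * lam / beta                 # premium rate with loading theta

def lundberg_equation(r):
    m_x = beta / (beta - r)                    # MGF of an exponential(beta) claim
    return lam * (m_x - 1.0) - c * r

R = brentq(lundberg_equation, 1e-8, beta - 1e-8)   # root strictly between 0 and beta
print(f"adjustment coefficient R = {R:.4f} (closed form beta*theta/(1+theta) = {beta * theta / (1 + theta):.4f})")
for u in (1.0, 5.0, 10.0):
    print(f"u = {u:4.1f}   Lundberg bound exp(-R*u) = {exp(-R * u):.4f}")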

Risk processes with shot noise Cox claim number process and reserve dependent premium rate. Macci, Claudio; Torrisi, Giovanni
Luca [RKN: 39936]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (1) : 134-145.
We consider a suitable scaling, called the slow Markov walk limit, for a risk process with shot noise Cox claim number process and
reserve dependent premium rate. We provide large deviation estimates for the ruin probability. Furthermore, we find an
asymptotically efficient law for the simulation of the ruin probability using importance sampling. Finally, we present asymptotic
bounds for ruin probabilities in the Bayesian setting.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Ruin by dynamic contagion claims. Dassios, Angelos; Zhao, Hongbiao [RKN: 45726]
Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 93-106.
In this paper, we consider a risk process with the arrival of claims modelled by a dynamic contagion process, a generalisation of
the Cox process and Hawkes process introduced by Dassios and Zhao (2011). We derive results for the infinite horizon model that
are generalisations of the Cramér-Lundberg approximation, Lundberg's fundamental equation, some asymptotics as well as
bounds for the probability of ruin. Special attention is given to the case of exponential jumps and a numerical example is provided.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Ruin probabilities for a regenerative Poisson gap generated risk process. Asmussen, Søren; Biard, Romain [RKN: 44802]
Shelved at: online only
European Actuarial Journal (2011) 1(1) July : 3-22.
A risk process with constant premium rate c and Poisson arrivals of claims is considered. A threshold r is defined for claim
interarrival times, such that if k consecutive interarrival times are larger than r, then the next claim has distribution G. Otherwise,
the claim size distribution is F. Asymptotic expressions for the infinite horizon ruin probabilities are given for both the light- and the heavy-tailed cases. A basic observation is that the process regenerates at each G-claim. Also, an approach via Markov additive
processes is outlined, and heuristics are given for the distribution of the time to ruin.
Available via Athens: Springer

RUIN THEORY

Archimedean copulas in finite and infinite dimensions - with application to ruin problems. Constantinescu, Corina; Hashorva,
Enkelejd; Ji, Lanpeng [RKN: 44950]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (3) : 487-495.
In this paper the authors discuss the link between Archimedean copulas and Dirichlet distributions for both finite and infinite dimensions. Motivated by recent papers in this area, the authors apply their results to certain ruin problems.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Erlang risk models and finite time ruin problems. Dickson, David C M; Li, Shuanming [RKN: 44884]
Shelved at: Per: SAJ Shelved at: SCA/ACT
Scandinavian Actuarial Journal (2012) 3 : 183-202.
We consider the joint density of the time of ruin and deficit at ruin in the Erlang(n) risk model. We give a general formula for this
joint density and illustrate how the components of this formula can be found in the special case when n=2. We then show how the
formula can be implemented numerically for a general value of n. We also discuss how the ideas extend to the generalised
Erlang(n) risk model.
Available via Athens access
http://www.openathens.net/

A generalized penalty function in Sparre Andersen risk models with surplus-dependent premium. Cheung, Eric C K [RKN: 45133]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (3) : 384-397.
In a general Sparre Andersen risk model with surplus-dependent premium income, the generalization of the Gerber-Shiu function proposed by Cheung et al. (2010a) is studied. A general expression for this Gerber-Shiu function is derived, and it is shown that
its determination reduces to the evaluation of a transition function which is independent of the penalty function. Properties of and
explicit expressions for such a transition function are derived when the surplus process is subject to (i) constant premium; (ii) a
threshold dividend strategy; or (iii) credit interest. Extension of the approach is discussed for an absolute ruin model with debit
interest.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Joint densities involving the time to ruin in the Sparre Andersen risk model under exponential assumptions. Landriault, David;
Shi, Tianxiang; Willmot, Gordon E [RKN: 44939]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (3) : 371-379.
Recent research into the nature of the distribution of the time of ruin in some Sparre Andersen risk models has resulted in series
expansions for the associated density function. Examples include the classical Poisson model with exponential interclaim times, and a duality argument in the case with exponential claim amounts. The aim of this paper is not only to unify previous methodology through the use of Lagrange's expansion theorem, but also to provide insight into the nature of the series expansions
by identifying the probabilistic contribution of each term in the expansion through analysis involving the distribution of the number
of claims until ruin. The (defective) distribution of the number of claims until ruin is then further examined. Interestingly, a
connection to the well-known extended truncated negative binomial (ETNB) distribution is also established. Finally, a closed-form
expression for the joint density of the time to ruin, the surplus prior to ruin, and the number of claims until ruin is derived. In the last
section, an earlier formula for the density of the time to ruin in the classical risk model is re-examined to identify its individual
contributions based on the number of claims until ruin.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

On the absolute ruin problem in a Sparre Andersen risk model with constant interest. Mitric, Ilie-Radu; Badescu, Andrei L; Stanford,
David A [RKN: 45533]
Insurance: Mathematics & Economics (2012) 50 (1) : 167-178.
In this paper, we extend the work of Mitric and Sendova (2010), which considered the absolute ruin problem in a risk model with
debit and credit interest, to renewal and non-renewal structures. Our first results apply to MAP processes, which we later restrict to
the Sparre Andersen renewal risk model with interclaim times that are generalized Erlang (n) distributed and claim amounts
following a Matrix-Exponential (ME) distribution (see, e.g., Asmussen and O'Cinneide (1997)). Under this scenario, we present a general methodology to analyze the Gerber-Shiu discounted penalty function defined at absolute ruin, as a solution of
high-order linear differential equations with non-constant coefficients. Closed-form solutions for some absolute ruin related
quantities in the generalized Erlang (2) case complement the results obtained under the classical risk model by Mitric and
Sendova (2010).
Available via Athens: Palgrave MacMillan
http://www.openathens.net

On the analysis of a general class of dependent risk processes. Willmot, Gordon E; Woo, Jae-Kyung [RKN: 45730]
Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 134-141.
A generalized Sparre Andersen risk process is examined, whereby the joint distribution of the interclaim time and the ensuing
claim amount is assumed to have a particular mathematical structure. This structure is present in various dependency models
which have previously been proposed and analyzed. It is then shown that this structure in turn often implies particular functional
forms for joint discounted densities of ruin related variables including some or all of the deficit at ruin, the surplus immediately prior
to ruin, and the surplus after the second last claim. Then, employing a fairly general interclaim time structure which involves a
combination of Erlang type densities, a complete identification of a generalized Gerber-Shiu function is provided. An application of these results to a situation involving a mixed Erlang claim amount assumption is given. Various examples and
special cases of the model are then considered, including one involving a bivariate Erlang mixture model.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

An operator-based approach to the analysis of ruin-related quantities in jump diffusion risk models. Feng, Runhuan [RKN: 40025]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (2) : 304-313.
Recent developments in ruin theory have seen the growing popularity of jump diffusion processes in modeling an insurer's assets and liabilities. Despite the variations of technique, the analysis of ruin-related quantities mostly relies on solutions to certain differential equations. In this paper, we propose in the context of Lévy-type jump diffusion risk models a solution method to a
general class of ruin-related quantities. Then we present a novel operator-based approach to solving a particular type of
integro-differential equations. Explicit expressions for resolvent densities for jump diffusion processes killed on exit below zero are
obtained as by-products of this work.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

The time to ruin and the number of claims until ruin for phase-type claims. Frostig, Esther; Pitts, Susan M; Politis, Konstadinos
[RKN: 45720]
Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 19-25.
We consider a renewal risk model with phase-type claims, and obtain an explicit expression for the joint transform of the time to
ruin and the number of claims until ruin, with a penalty function applied to the deficit at ruin. The approach is via the duality
between a risk model with phase-type claims and a particular single server queueing model with phase-type customer interarrival
times; see Frostig (2004). This result specializes to one for the probability generating function of the number of claims until ruin.
We obtain explicit expressions for the distribution of the number of claims until ruin for exponentially distributed claims when the
inter-claim times have an Erlang-n distribution.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

SAMPLING

Adaptive Importance Sampling for simulating copula-based distributions. Bee, Marco [RKN: 40018]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (2) : 237-245.
In this paper, we propose a generalization of importance sampling, called Adaptive Importance Sampling, to approximate
simulation of copula-based distributions. Unlike existing methods for copula simulation that have appeared in the literature, this
algorithm is broad enough to be used for any absolutely continuous copula. We provide details of the algorithm including rules for
stopping the iterative process and consequently assess its performance using extensive Monte Carlo experiments. To assist in its
extension to several dimensions, we discuss procedures for identifying the crucial parameters in order to achieve desirable results
especially as the size of the dimension increases. Finally, for practical illustration, we demonstrate the use of the algorithm to price
a first-to-default credit swap, an important credit derivative instrument in the financial market. The method works extremely well
even for large dimensions making it a valuable tool for simulating from many different classes of copulas including those which
have been difficult to sample from using traditional techniques.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
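
As a point of comparison only, the sketch below draws from a Clayton (Archimedean) copula by the standard frailty construction; it is deliberately the plain, non-adaptive simulation that an adaptive importance sampling scheme of the kind described above is meant to improve upon, and the dimension and parameter are invented.

# Plain (non-adaptive) simulation from a Clayton copula via the usual frailty
# construction: V ~ Gamma(1/alpha), U_i = (1 + E_i / V) ** (-1/alpha) with
# E_i iid standard exponential. Parameters are purely illustrative.
import numpy as np

def clayton_sample(n, d, alpha, rng):
    v = rng.gamma(shape=1.0 / alpha, scale=1.0, size=(n, 1))   # common frailty
    e = rng.exponential(scale=1.0, size=(n, d))                # iid exponentials
    return (1.0 + e / v) ** (-1.0 / alpha)                     # uniform marginals

rng = np.random.default_rng(3)
u = clayton_sample(100_000, d=2, alpha=2.0, rng=rng)
print("correlation of the uniforms:", np.corrcoef(u[:, 0], u[:, 1])[0, 1].round(3))
print("joint lower-tail frequency P(U1 < 0.05, U2 < 0.05):",
      ((u[:, 0] < 0.05) & (u[:, 1] < 0.05)).mean().round(4))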

SCENARIO GENERATION

Multivariate stress scenarios and solvency. McNeil, Alexander J; Smith, Andrew D [RKN: 45634]
Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 50 (3) : 299-308.
We show how the probabilistic concepts of half-space trimming and depth may be used to define convex scenario sets Qα for stress testing the risk factors that affect the solvency of an insurance company over a prescribed time period. By choosing the scenario in Qα which minimizes net asset value at the end of the time period, we propose the idea of the least solvent likely event (LSLE) as a solution to the forward stress testing problem. By considering the support function of the convex scenario set Qα, we establish theoretical properties of the LSLE when financial risk factors can be assumed to have a linear effect on the net assets of an insurer. In particular, we show that the LSLE may be interpreted as a scenario causing a loss equivalent to the Value-at-Risk (VaR) at confidence level α, provided the α-quantile is a subadditive risk measure on linear combinations of the risk factors. In this
case, we also show that the LSLE has an interpretation as a per-unit allocation of capital to the underlying risk factors when the
overall capital is determined according to the VaR. These insights allow us to define alternative scenario sets that relate in similar
ways to coherent measures, such as expected shortfall. We also introduce the most likely ruin event (MLRE) as a solution to the
problem of reverse stress testing.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

SECURITIES

Ambiguity aversion and an intertemporal equilibrium model of catastrophe-linked securities pricing. Zhu, Wenge [RKN: 44973]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (1) : 38-46.
To explain several stylized facts concerning catastrophe-linked securities premium spread, the author proposes an intertemporal
equilibrium model by allowing agents to act in a robust control framework against model misspecification with respect to rare
events. The author has presented closed-form pricing formulas in some special cases and tested the model using empirical data
and simulation.


Available via Athens: Palgrave MacMillan
http://www.openathens.net

Pricing fixed-income securities in an information-based framework. Hughston, Lane P; Macrina, Andrea Routledge, [RKN: 45844]
Shelved at: Per. AMF
Applied Mathematical Finance (2012) 19 (3-4) : 361-379.
The purpose of this article is to introduce a class of information-based models for the pricing of fixed-income securities. We
consider a set of continuous-time processes that describe the flow of information concerning market factors in a monetary
economy. The nominal pricing kernel is assumed to be given at any specified time by a function of the values of information
processes at that time. Using a change-of-measure technique, we derive explicit expressions for the prices of nominal discount
bonds and deduce the associated dynamics of the short rate of interest and the market price of risk. The interest rate positivity
condition is expressed as a differential inequality. An example that shows how the model can be calibrated to an arbitrary initial
yield curve is presented. We proceed to model the price level, which is also taken at any specified time to be given by a function of
the values of the information processes at that time. A simple model for a stochastic monetary economy is introduced in which the
prices of the nominal discount bonds and inflation-linked notes can be expressed in terms of aggregate consumption and the
liquidity benefit generated by the money supply.
Available via Athens: Taylor & Francis Online
http://www.openathens.net

SHARE PRICES

An extension of the Whittaker-Henderson method of graduation. Nocon, Alicja S; Scott, William F [RKN: 44882]
Shelved at: Per: SAJ Shelved at: SCA/ACT
Scandinavian Actuarial Journal (2012) 1 : 70-79.
We present an outline and historical summary of the Whittaker-Henderson method of graduation (or data smoothing), together with an extension of the method in which the graduated values are obtained by minimising a Whittaker-Henderson criterion
subject to constraints. Examples are given, using data for the global average temperature anomaly and for a set of share prices, in
which the proposed method appears to give good results.
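
For readers unfamiliar with the criterion being extended, the classical (unconstrained, unit-weight) Whittaker-Henderson graduation minimises the fit-plus-smoothness objective sum((u - v)^2) + lambda * sum((Delta^z u)^2), whose minimiser is u = (I + lambda * D'D)^(-1) v with D the z-th order difference matrix. The sketch below uses artificial data and does not attempt the constrained extension proposed in the paper.

# Minimal unconstrained Whittaker-Henderson graduation with unit weights:
# graduated values solve (I + lam * D'D) u = v, where D takes order-th differences.
# The data below are artificial.
import numpy as np

def whittaker_henderson(v, lam=100.0, order=3):
    n = len(v)
    d = np.diff(np.eye(n), n=order, axis=0)        # order-th difference matrix D
    return np.linalg.solve(np.eye(n) + lam * d.T @ d, v)

rng = np.random.default_rng(4)
x = np.arange(40)
v = 0.002 * (x - 10.0) ** 2 + rng.normal(scale=0.3, size=x.size)   # noisy "observations"
u = whittaker_henderson(v, lam=100.0, order=3)
print("roughness (sum of squared third differences) before:",
      np.sum(np.diff(v, n=3) ** 2).round(3), "after:", np.sum(np.diff(u, n=3) ** 2).round(3))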

SIMULATION

Closed form approximations for spread options. Venkatramanan, Aanand; Alexander, Carol Routledge, [RKN: 45523]
Applied Mathematical Finance (2011) 18 (5-6) : 447-472.
This article expresses the price of a spread option as the sum of the prices of two compound options. One compound option is to
exchange vanilla call options on the two underlying assets and the other is to exchange the corresponding put options. This way
we derive a new closed form approximation for the price of a European spread option and a corresponding approximation for each
of its price, volatility and correlation hedge ratios. Our approach has many advantages over existing analytical approximations,
which have limited validity and an indeterminacy that renders them of little practical use. The compound exchange option
approximation for European spread options is then extended to American spread options on assets that pay dividends or incur
costs. Simulations quantify the accuracy of our approach; we also present an empirical application to the American crack spread
options that are traded on NYMEX. For illustration, we compare our results with those obtained using the approximation attributed
to Kirk (1996), 'Correlation in energy markets', in Managing Energy Price Risk, edited by V. Kaminski, pp. 71-78 (London: Risk Publications).


Available via Athens: Taylor & Francis Online
http://www.openathens.net
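
Since the abstract benchmarks against Kirk's (1996) approximation, a commonly quoted form of that benchmark is sketched below; this is the textbook statement rather than anything taken from the article, and all inputs are invented.

# Kirk's (1996) approximation for a European call spread option on F1 - F2 with
# strike K, in its commonly quoted form (inputs are purely illustrative).
from math import exp, log, sqrt
from statistics import NormalDist

def kirk_spread_call(F1, F2, K, sigma1, sigma2, rho, r, T):
    N = NormalDist().cdf
    z = F2 / (F2 + K)                              # weight applied to the second leg
    sigma = sqrt(sigma1 ** 2 - 2.0 * rho * sigma1 * sigma2 * z + (sigma2 * z) ** 2)
    d1 = (log(F1 / (F2 + K)) + 0.5 * sigma ** 2 * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return exp(-r * T) * (F1 * N(d1) - (F2 + K) * N(d2))

price = kirk_spread_call(F1=110.0, F2=100.0, K=5.0, sigma1=0.30, sigma2=0.25,
                         rho=0.7, r=0.02, T=1.0)
print(f"Kirk approximation to the spread call price: {price:.4f}")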

SINGAPORE

A copula approach to test asymmetric information with applications to predictive modeling. Shi, Peng; Valdez, Emiliano A [RKN:
44965]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (2) : 226-239.
In this article, we present a copula regression model for testing asymmetric information as well as for predictive modeling
applications in automobile insurance market. We use the Frank copula to jointly model the type of coverage and the number of
accidents, with the dependence parameter providing for evidence of the relationship between the choice of coverage and the
frequency of accidents. This dependence therefore provides an indication of the presence (or absence) of asymmetric information.
The types of coverage are in some sense ordered, with higher ordinals indicating more comprehensive coverage. Hence, a positive relationship would indicate that more coverage is chosen by high-risk policyholders, and vice versa. This
presence of asymmetric information could be due to either adverse selection or moral hazard, a distinction often made in the
economics or insurance literature, or both. We calibrated our copula model using a one-year cross-sectional observation of claims
arising from a major automobile insurer in Singapore. Our estimation results indicate a significant positive coverage-risk
relationship. However, when we correct for the bias resulting from possible underreporting of accidents, we find that the positive
association vanishes. We further used our estimated model for other possible actuarial applications. In particular, we are able to
demonstrate the effect of coverage choice on the incidence of accidents, and based on which, the pure premium is derived. In
general, a positive margin is observed when compared with the gross premium available in our empirical database.

Available via Athens: Palgrave MacMillan
http://www.openathens.net

SOLVENCY

Claims development result in the paid-incurred chain reserving method. Happ, Sebastian; Merz, Michael; Wüthrich, Mario V [RKN:
45724]
Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 66-72.
We present the one-year claims development result (CDR) in the paid-incurred chain (PIC) reserving model. The PIC reserving
model presented in Merz and Wüthrich (2010) is a Bayesian stochastic claims reserving model that considers simultaneously
claims payments and incurred losses information and allows for deriving the full predictive distribution of the outstanding loss
liabilities. In this model we study the conditional mean square error of prediction (MSEP) for the one-year CDR uncertainty, which
is the crucial uncertainty view under Solvency II.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Higher moments of the claims development result in general insurance. Salzmann, Robert; Wüthrich, Mario V; Merz, Michael - 30
pages. [RKN: 70755]
Shelved at: Per: Astin Bull (Oxf) Shelved at: JOU
ASTIN Bulletin (2012) 42 (1) : 355-384.
The claims development result (CDR) is one of the major risk drivers in the profit and loss statement of a general insurance
company. Therefore, the CDR has become a central object of interest under new solvency regulation. In current practice, simple
methods based on the first two moments of the CDR are implemented to find a proxy for the distribution of the CDR. Such
approximations based on the first two moments are rather rough and may fail to appropriately describe the shape of the
distribution of the CDR. In this paper we provide an analysis of higher moments of the CDR. Within a Bayes chain ladder
framework we consider two different models for which it is possible to derive analytical solutions for the higher moments of the
CDR. Based on higher moments we can e.g. calculate the skewness and the excess kurtosis of the distribution of the CDR and
obtain refined approximations. Moreover, a case study investigates and answers questions raised in IASB [9].
http://www.actuaries.org/index.cfm?lang=EN&DSP=PUBLICATIONS&ACT=ASTIN BULLETIN

The influence of non-linear dependencies on the basis risk of industry loss warranties. Gatzert, Nadine; Kellner, Ralf [RKN: 44983]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (1) : 132-144.
Index-linked catastrophic loss instruments represent an alternative to traditional reinsurance to hedge against catastrophic losses.
The use of these instruments comes with benefits, such as a reduction of moral hazard and higher transparency. However, at the
same time, it introduces basis risk as a crucial key risk factor, since the index and the company's losses are usually not fully dependent. The aim of this paper is to examine the impact of basis risk on an insurer's solvency situation when an industry loss
warranty contract is used for hedging. Since previous literature has consistently stressed the importance of a high degree of
dependence between the company's losses and the industry index, we extend previous studies by allowing for non-linear dependencies between relevant processes (high-risk and low-risk assets, insurance company's loss and industry index). The
analysis shows that both the type and degree of dependence play a considerable role with regard to basis risk and solvency
capital requirements and that other factors, such as relevant contract parameters of index-linked catastrophic loss instruments,
should not be neglected to obtain a comprehensive and holistic view of their effect upon risk reduction.

Available via Athens: Palgrave MacMillan
http://www.openathens.net

Optimal dividend and investing control of an insurance company with higher solvency constraints. Liang, Zongxia; Huang,
Jianping [RKN: 44952]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (3) : 501-511.
This paper considers the optimal control problem of a large insurance company under a fixed insolvency probability. The company
controls proportional reinsurance rate, dividend pay-outs and investing process to maximize the expected present value of the
dividend pay-outs until the time of bankruptcy. This paper aims at describing the optimal return function as well as the optimal
policy. As a by-product, the paper theoretically sets a risk-based capital standard to ensure the capital requirement that can cover
the total risk.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

SOLVENCY II

Multivariate stress scenarios and solvency. McNeil, Alexander J; Smith, Andrew D [RKN: 45634]
Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 50 (3) : 299-308.
We show how the probabilistic concepts of half-space trimming and depth may be used to define convex scenario sets Qα for stress testing the risk factors that affect the solvency of an insurance company over a prescribed time period. By choosing the scenario in Qα which minimizes net asset value at the end of the time period, we propose the idea of the least solvent likely event (LSLE) as a solution to the forward stress testing problem. By considering the support function of the convex scenario set Qα, we establish theoretical properties of the LSLE when financial risk factors can be assumed to have a linear effect on the net assets of an insurer. In particular, we show that the LSLE may be interpreted as a scenario causing a loss equivalent to the Value-at-Risk (VaR) at confidence level α, provided the α-quantile is a subadditive risk measure on linear combinations of the risk factors. In this
case, we also show that the LSLE has an interpretation as a per-unit allocation of capital to the underlying risk factors when the
overall capital is determined according to the VaR. These insights allow us to define alternative scenario sets that relate in similar
ways to coherent measures, such as expected shortfall. We also introduce the most likely ruin event (MLRE) as a solution to the
problem of reverse stress testing.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Repeat performance. Hursey, Chris Staple Inn Actuarial Society, - 2 pages. [RKN: 74930]
Shelved at: Per: Actuary (Oxf) Per: Actuary (Lon) Shelved at: JOU
The Actuary (2012) January/February : 32-33.
Chris Hursey describes a new and original theory on determining optimal calibration nodes for replicating formulae
http://www.theactuary.com/

The Solvency II square-root formula for systematic biometric risk. Christiansen, Marcus C; Denuit, Michel M; Lazar, Dorina [RKN:
45599]
Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 50 (2) : 257-265.
In this paper, we develop a model supporting the so-called square-root formula used in Solvency II to aggregate the modular life
SCR. Describing the insurance policy by a Markov jump process, we can obtain expressions similar to the square-root formula in
Solvency II by means of limited expansions around the best estimate. Numerical illustrations are given, based on German
population data. Even if the square-root formula can be supported by theoretical considerations, it is shown that the QIS
correlation matrix is highly questionable.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
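
For readers unfamiliar with the aggregation rule under discussion, the standard-formula "square-root" aggregation combines modular capital charges as SCR = sqrt(sum over i, j of corr(i, j) * SCR_i * SCR_j). The sketch below uses invented module labels, charges and correlations, not the QIS matrix criticised in the paper.

# The Solvency II "square-root" aggregation of modular capital charges:
# SCR = sqrt( sum_ij corr[i, j] * scr[i] * scr[j] ).
# Module labels, charges and the correlation matrix are invented examples.
import numpy as np

scr = np.array([120.0, 80.0, 50.0])              # e.g. mortality, longevity, lapse
corr = np.array([[1.00, -0.25, 0.00],
                 [-0.25, 1.00, 0.25],
                 [0.00, 0.25, 1.00]])

aggregate = np.sqrt(scr @ corr @ scr)
print(f"sum of stand-alone charges: {scr.sum():.1f}, aggregated SCR: {aggregate:.1f}")

The difference between the simple sum of the stand-alone charges and the aggregated figure is the diversification credit implied by the correlation matrix.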

SPARRE ANDERSEN MODEL

A generalized penalty function in Sparre Andersen risk models with surplus-dependent premium. Cheung, Eric C K [RKN: 45133]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (3) : 384-397.
In a general Sparre Andersen risk model with surplus-dependent premium income, the generalization of the Gerber-Shiu function proposed by Cheung et al. (2010a) is studied. A general expression for this Gerber-Shiu function is derived, and it is shown that
its determination reduces to the evaluation of a transition function which is independent of the penalty function. Properties of and
explicit expressions for such a transition function are derived when the surplus process is subject to (i) constant premium; (ii) a
threshold dividend strategy; or (iii) credit interest. Extension of the approach is discussed for an absolute ruin model with debit
interest.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

On the absolute ruin problem in a Sparre Andersen risk model with constant interest. Mitric, Ilie-Radu; Badescu, Andrei L; Stanford,
David A [RKN: 45533]
Insurance: Mathematics & Economics (2012) 50 (1) : 167-178.
In this paper, we extend the work of Mitric and Sendova (2010), which considered the absolute ruin problem in a risk model with
debit and credit interest, to renewal and non-renewal structures. Our first results apply to MAP processes, which we later restrict to
the Sparre Andersen renewal risk model with interclaim times that are generalized Erlang (n) distributed and claim amounts
following a Matrix-Exponential (ME) distribution (see, e.g., Asmussen and O'Cinneide (1997)). Under this scenario, we present a general methodology to analyze the Gerber-Shiu discounted penalty function defined at absolute ruin, as a solution of
high-order linear differential equations with non-constant coefficients. Closed-form solutions for some absolute ruin related
quantities in the generalized Erlang (2) case complement the results obtained under the classical risk model by Mitric and
Sendova (2010).
Available via Athens: Palgrave MacMillan
http://www.openathens.net

SPARRE ANDERSEN RISK MODEL

The distributions of some quantities for Erlang(2) risk models. Dickson, David C M; Li, Shuanming (2012). - Victoria: University of
Melbourne, 2012. - 18 pages. [RKN: 73947]
We study the distributions of [1] the first time that the surplus reaches a given level and [2] the duration of negative surplus in a
Sparre Andersen risk process with the inter-claim times being Erlang(2) distributed. These distributions can be obtained through
the inversion of Laplace transforms using the inversion relationship for the Erlang(2) risk model given by Dickson and Li (2010).
No.1 (1993) onwards available online. Download as PDF.
http://www.economics.unimelb.edu.au/ACT/papers.shtml

STAPLE INN

The intellectual and cultural world of the early modern Inns of Court. Archer, Jayne Elisabeth (ed); Goldring, Elizabeth (ed); Knight,
Sarah (ed) (2011). - Manchester: Manchester University Press, 2011. - xvi, 334 p. + 4 p. of plates: ill. (some col.) pages. [RKN: 44484]
Shelved at: EZG/6 (Lon)
Reference is made to Staple Inn (an Inn of Chancery), with an illustration, but the focus is on the history of the Inns of Court of London.

STATISTICAL ANALYSIS

Statistical analysis of model risk concerning temperature residuals and its impact on pricing weather derivatives. Ahcan, Ales
[RKN: 44998]
Insurance: Mathematics & Economics (2012) 50 (1) : 131-138.
In this paper we model the daily average temperature via an extended version of the standard Ornstein-Uhlenbeck process driven by a Lévy noise with a seasonally adjusted asymmetric ARCH process for volatility. More precisely, we model the disturbances with
the Normal inverse Gaussian (NIG) and Variance gamma (VG) distribution. Besides modelling the residuals we also compare the
prices of January 2010 out-of-the-money call and put options for two of Slovenia's largest cities, Ljubljana and Maribor, under
normally distributed disturbances and NIG and VG distributed disturbances. The results of our numerical analysis demonstrate
that the normal model fails to adequately capture tail risk, and consequently significantly misprices out-of-the-money options. On
the other hand prices obtained using NIG and VG distributed disturbances fit well to the results obtained by bootstrapping the
residuals. Thus one should take extreme care in choosing the appropriate statistical model.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
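
A heavily simplified sketch of the kind of daily temperature dynamics described above: a discretised mean-reverting process around a seasonal curve, with Gaussian noise standing in for the paper's NIG/VG disturbances and with no ARCH effect; every parameter is invented.

# Simplified daily average temperature model: Euler-type discretisation of a
# mean-reverting (Ornstein-Uhlenbeck style) process around a seasonal mean,
# with Gaussian noise in place of the NIG/VG disturbances used in the paper.
import numpy as np

rng = np.random.default_rng(5)
days = np.arange(3 * 365)
seasonal = 10.0 + 10.0 * np.sin(2.0 * np.pi * (days - 100) / 365.25)   # invented seasonal mean
kappa, sigma = 0.25, 2.0                         # mean-reversion speed and noise scale

temp = np.empty(days.size)
temp[0] = seasonal[0]
for t in range(1, days.size):
    temp[t] = temp[t - 1] + kappa * (seasonal[t] - temp[t - 1]) + sigma * rng.standard_normal()

residuals = temp - seasonal
print("residual standard deviation:", residuals.std().round(2),
      " lag-1 autocorrelation:", np.corrcoef(residuals[:-1], residuals[1:])[0, 1].round(2))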

Statistical analysis of the spreads of catastrophe bonds at the time of issue. Papachristou, Dimitris [RKN: 45308]
Shelved at: Per: Astin Bull (Oxf) Shelved at: JOU/AST
ASTIN Bulletin (2011) 41 (1) : 251-273.
In this paper the catastrophe bond prices, as determined by the market, are analysed. The limited published work in this area has
been carried out mainly by cat bond investors and is based either on intuition, on simple linear regression on a single factor, or on
comparisons of the prices of cat bonds with similar features. In this paper a Generalised Additive Model is fitted to the market data.
The statistical significance of different factors which may affect the cat bond prices is examined and the effect of these factors on
the prices is measured. A statistical framework and analysis could provide insight into the cat bond pricing and could have
applications among other things in the construction of a cat bond portfolio, cat bond price indices and in understanding changes of
the price of risk over time.
online access via International Actuarial Association:
http://www.actuaries.org/index.cfm?lang=EN&DSP=PUBLICATIONS&ACT=ASTIN BULLETIN

STATISTICS

Bayesian theory and methods with applications. Savchuk, Vladimir; Tsokos, Chris P (2011). Atlantis Press, 2011. - 317 pages.
[RKN: 74707]
Shelved at: 519.5
Bayesian methods are growing more and more popular, finding new practical applications in the fields of health sciences,
engineering, environmental sciences, business and economics and social sciences, among others. This book explores the use of
Bayesian analysis in the statistical estimation of the unknown phenomenon of interest. The contents demonstrate that where such
methods are applicable, they offer the best possible estimate of the unknown. Beyond presenting Bayesian theory and methods of
analysis, the text is illustrated with a variety of applications to real world problems.

STOCHASTIC INVESTMENT MODELS

Optimal asset allocation for DC pension plans under inflation. Han, Nan-wei; Hung, Mao-Wei [RKN: 45734]
Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 172-181.
In this paper, the stochastic dynamic programming approach is used to investigate the optimal asset allocation for a
defined-contribution pension plan with downside protection under stochastic inflation. The plan participant invests the fund wealth
and the stochastic interim contribution flows into the financial market. The nominal interest rate model is described by the
Cox-Ingersoll-Ross (Cox et al., 1985) dynamics. To cope with the inflation risk, the inflation indexed bond is included in the asset
menu. The retired individuals receive an annuity that is indexed by inflation and a downside protection on the amount of this
annuity is considered. The closed-form solution is derived under the CRRA utility function. Finally, a numerical application is
presented to characterize the dynamic behavior of the optimal investment strategy.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Portfolio selection through an extremality stochastic order. Laniado, Henry; Lillo, Rosa E; Pellerey, Franco; Romo, Juan [RKN:
45718]
Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 1-9.
In this paper, we introduce a new multivariate stochastic order that compares random vectors in a direction which is determined by
a unit vector, generalizing the previous upper and lower orthant orders. The main properties of this new order, together with its
relationships with other multivariate stochastic orders, are investigated and, we present some examples of application in the
determination of optimal allocations of wealth among risks in single period portfolio problems.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

STOCHASTIC MODELS

An application of comonotonicity theory in a stochastic life annuity framework. Liu, Xiaoming; Jang, Jisoo; Kim, Sun Mee [RKN:
40022]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (2) : 271-279.
A life annuity contract is an insurance instrument which pays pre-scheduled living benefits conditional on the survival of the
annuitant. In order to manage the risk borne by annuity providers, one needs to take into account all sources of uncertainty that
affect the value of future obligations under the contract. In this paper, we define the concept of annuity rate as the conditional
expected present value random variable of future payments of the annuity, given the future dynamics of its risk factors. The
annuity rate deals with the non-diversifiable systematic risk contained in the life annuity contract, and it involves mortality risk as
well as investment risk. While it is plausible to assume that there is no correlation between the two risks, each affects the annuity
rate through a combination of dependent random variables. In order to understand the probabilistic profile of the annuity rate, we
apply comonotonicity theory to approximate its quantile function. We also derive accurate upper and lower bounds for prediction
intervals for annuity rates. We use the Lee-Carter model for mortality risk and the Vasicek model for the term structure of interest
rates with an annually renewable fixed-income investment policy. Different investment strategies can be handled using this
framework.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
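
The computational convenience that comonotonicity buys can be seen in a toy setting: replacing a sum of dependent terms by its comonotonic counterpart gives a convex-order upper bound whose quantile function is simply the sum of the marginal quantile functions. The lognormal margins below are invented and the annuity context of the paper is not modelled.

# Comonotonic upper bound for a sum of dependent terms: the quantile function of
# the comonotonic sum is the sum of the marginal quantile functions (additivity
# of quantiles under comonotonicity). Margins below are invented lognormals.
import numpy as np
from scipy.stats import lognorm

sigmas = [0.3, 0.5, 0.7]
margins = [lognorm(s=s, scale=np.exp(0.0)) for s in sigmas]

def comonotonic_sum_quantile(p):
    return sum(m.ppf(p) for m in margins)

for p in (0.50, 0.95, 0.995):
    print(f"p = {p:5.3f}   quantile of the comonotonic sum = {comonotonic_sum_quantile(p):.3f}")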

Bayesian stochastic mortality modelling for two populations. Cairns, Andrew J G; Blake, David; Dowd, Kevin; Coughlan, Guy D;
Khalaf-Allah, Marwa [RKN: 45299]
Shelved at: Per: Astin Bull (Oxf) Shelved at: JOU/AST
ASTIN Bulletin (2011) 41 (1) : 29-59.
This paper introduces a new framework for modelling the joint development over time of mortality rates in a pair of related
populations with the primary aim of producing consistent mortality forecasts for the two populations. The primary aim is achieved
by combining a number of recent and novel developments in stochastic mortality modelling, but these, additionally, provide us with
a number of side benefits and insights for stochastic mortality modelling. By way of example, we propose an Age-Period-Cohort
model which incorporates a mean-reverting stochastic spread that allows for different trends in mortality improvement rates in the
short-run, but parallel improvements in the long run. Second, we fit the model using a Bayesian framework that allows us to
combine estimation of the unobservable state variables and the parameters of the stochastic processes driving them into a single
procedure. Key benefits of this include dampening down of the impact of Poisson variation in death counts, full allowance for
parameter uncertainty, and the flexibility to deal with missing data. The framework is designed for large populations coupled with
a small sub-population and is applied to the England & Wales national and Continuous Mortality Investigation assured lives males
populations. We compare and contrast results based on the two-population approach with single-population results.
online access via International Actuarial Association:
http://www.actuaries.org/index.cfm?lang=EN&DSP=PUBLICATIONS&ACT=ASTIN BULLETIN

Calibrating affine stochastic mortality models using term assurance premiums. Russo, Vincenzo; Giacometti, Rosella; Ortobelli,
Sergio; Rachev, Svetlozar; Fabozzi, Frank J [RKN: 44975]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (1) : 53-60.
In this paper, we focus on the calibration of affine stochastic mortality models using term assurance premiums. We view term
assurance contracts as a swap in which policyholders exchange cash flows (premiums vs. benefits) with an insurer analogous to
a generic interest rate swap or credit default swap. Using a simple bootstrapping procedure, we derive the term structure of
mortality rates from a stream of contract quotes with different maturities. This term structure is used to calibrate the parameters of
affine stochastic mortality models where the survival probability is expressed in closed form. The Vasicek, Cox-Ingersoll-Ross,
and jump-extended Vasicek models are considered for fitting the survival probabilities term structure. An evaluation of the
performance of these models is provided with respect to premiums of three Italian insurance companies.

Available via Athens: Palgrave MacMillan
http://www.openathens.net
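
The bootstrapping step can be conveyed with a deliberately simplified example. The sketch below assumes, hypothetically, net single premiums for term assurances of increasing maturity, a flat discount rate and a unit sum assured payable at the end of the year of death; under those assumptions the one-year death probabilities can be stripped out sequentially, maturity by maturity, much as a swap curve is bootstrapped. Real quotes would require expense loadings and the authors' affine model on top of this.

    import numpy as np

    def bootstrap_mortality(single_premiums, v):
        """Strip one-year death probabilities from net single premiums of term
        assurances with maturities 1, 2, ..., n years (unit benefit paid at the
        end of the year of death). A toy analogue of curve bootstrapping."""
        q = []
        survival = 1.0          # probability of surviving to the start of year t
        prev_premium = 0.0
        for t, p_t in enumerate(np.asarray(single_premiums, dtype=float), start=1):
            # The marginal premium covers death in year t only: v^t * survival * q_t.
            q_t = (p_t - prev_premium) / (v ** t * survival)
            q.append(q_t)
            survival *= (1.0 - q_t)
            prev_premium = p_t
        return np.array(q)

    # Hypothetical quotes for maturities 1..5 years and a 3% flat discount rate.
    v = 1 / 1.03
    quotes = [0.00194, 0.00398, 0.00613, 0.00840, 0.01079]
    print(bootstrap_mortality(quotes, v))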

Cash flow simulation for a model of outstanding liabilities based on claim amounts and claim numbers. Martinez Miranda, Maria
Dolores; Nielsen, Bent; Nielsen, Jens Perch; Verrall, Richard [RKN: 45302]
Shelved at: Per: Astin Bull (Oxf) Shelved at: JOU/AST
ASTIN Bulletin (2011) 41 (1) : 107-129.
In this paper we develop a full stochastic cash flow model of outstanding liabilities for the model developed in Verrall, Nielsen and
Jessen (2010). This model is based on the simple triangular data available in most non-life insurance companies. By using more
data, it is expected that the method will have less volatility than the celebrated chain ladder method. Eventually, our method will
lead to lower solvency requirements for those insurance companies that decide to collect counts data and replace their
conventional chain ladder method.
online access via International Actuarial Association:
http://www.actuaries.org/index.cfm?lang=EN&DSP=PUBLICATIONS&ACT=ASTIN BULLETIN

Classical and singular stochastic control for the optimal dividend policy when there is regime switching. Sotomayor, Luz R;
Cadenillas, Abel [RKN: 45128]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (3) : 344-354.
Motivated by economic and empirical arguments, we consider a company whose cash surplus is affected by macroeconomic
conditions. Specifically, we model the cash surplus as a Brownian motion with drift and volatility modulated by an observable
continuous-time Markov chain that represents the regime of the economy. The objective of the management is to select the
dividend policy that maximizes the expected total discounted dividend payments to be received by the shareholders. We study two
different cases: bounded dividend rates and unbounded dividend rates. These cases generate, respectively, problems of classical
stochastic control with regime switching and singular stochastic control with regime switching. We solve these problems, and
obtain the first analytical solutions for the optimal dividend policy in the presence of business cycles. We prove that the optimal
dividend policy depends strongly on macroeconomic conditions.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
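
A minimal simulation of the surplus process described above (a Brownian motion whose drift and volatility are modulated by a two-state Markov chain) might look as follows; the regime parameters and switching intensities are illustrative assumptions, and no optimal dividend barrier is computed here.

    import numpy as np

    rng = np.random.default_rng(1)

    # Illustrative two-regime parameters: (drift, volatility) per regime and the
    # intensities of leaving each regime of the observable Markov chain.
    mu = np.array([0.6, 0.1])           # expansion vs. recession drift
    sigma = np.array([0.3, 0.8])
    switch_rate = np.array([0.5, 1.0])

    dt, T, x0 = 1 / 252, 10.0, 5.0
    n = int(T / dt)
    x = np.empty(n + 1); x[0] = x0
    regime = 0
    for t in range(n):
        # Regime switch with probability approx. rate * dt over a small step.
        if rng.random() < switch_rate[regime] * dt:
            regime = 1 - regime
        dW = np.sqrt(dt) * rng.standard_normal()
        x[t + 1] = x[t] + mu[regime] * dt + sigma[regime] * dW

    print("terminal surplus:", x[-1], "minimum surplus:", x.min())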

Dynamic hedging of conditional value-at-risk. Melnikov, Alexander; Smirnov, Ivan [RKN: 45735]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 182-190.
In this paper, the problem of partial hedging is studied by constructing hedging strategies that minimize conditional value-at-risk
(CVaR) of the portfolio. Two dual versions of the problem are considered: minimization of CVaR with the initial wealth bounded
from above, and minimization of hedging costs subject to a CVaR constraint. The Neyman-Pearson lemma approach is used to
deduce semi-explicit solutions. Our results are illustrated by constructing CVaR-efficient hedging strategies for a call option in the
Black-Scholes model and also for an embedded call option in an equity-linked life insurance contract.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
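
For readers unfamiliar with the risk measure being minimized: conditional value-at-risk at level alpha is the expected loss beyond the alpha-quantile. The snippet below is a generic empirical estimator applied to a placeholder loss sample, not the Neyman-Pearson construction used in the paper.

    import numpy as np

    def empirical_cvar(losses, alpha=0.95):
        """CVaR_alpha: average of the worst (1 - alpha) share of the losses."""
        losses = np.asarray(losses, dtype=float)
        var = np.quantile(losses, alpha)
        return var, losses[losses >= var].mean()

    rng = np.random.default_rng(2)
    # Placeholder sample, e.g. simulated shortfall of a partially hedged position.
    losses = rng.lognormal(mean=0.0, sigma=0.6, size=100_000) - 1.0
    var95, cvar95 = empirical_cvar(losses, 0.95)
    print(f"VaR 95%: {var95:.3f}  CVaR 95%: {cvar95:.3f}")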

Mortality density forecasts: An analysis of six stochastic mortality models. Cairns, Andrew J G; Blake, David; Dowd, Kevin; Epstein,
David; Khalaf-Allah, Marwa; Coughlan, Guy D [RKN: 45129]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (3) : 355-367.
This paper develops a framework for forecasting future mortality rates. We discuss the suitability of six stochastic
mortality models for forecasting future mortality and estimating the density of mortality rates at different ages. In particular, the
models are assessed individually with reference to the following qualitative criteria that focus on the plausibility of their forecasts:
biological reasonableness; the plausibility of predicted levels of uncertainty in forecasts at different ages; and the robustness of the
forecasts relative to the sample period used to fit the model. An important, though unsurprising, conclusion is that a good fit to
historical data does not guarantee sensible forecasts. We also discuss the issue of model risk, common to many modelling
situations in demography and elsewhere. We find that even for those models satisfying our qualitative criteria, there are significant
differences among central forecasts of mortality rates at different ages and among the distributions surrounding those central
forecasts.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Prediction uncertainty in the Bornhuetter-Ferguson claims reserving method: revisited. Alai, D H; Merz, M; Wüthrich, Mario V
Faculty of Actuaries and Institute of Actuaries; Cambridge University Press, [RKN: 39998]
Shelved at: Per: AAS (Oxf) Per: AAS (Lon)
Annals of Actuarial Science (2011) 5(1) : 7-17.
We revisit the stochastic model of Alai et al. (2009) for the Bornhuetter-Ferguson claims reserving method, Bornhuetter &
Ferguson (1972). We derive an estimator of its conditional mean square error of prediction (MSEP) using an approach that is
based on generalized linear models and maximum likelihood estimators for the model parameters. This approach leads to simple
formulas, which can easily be implemented in a spreadsheet.
http://www.actuaries.org.uk/research-and-resources/pages/access-journals
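
As a reminder of the quantity whose prediction error the paper studies, the Bornhuetter-Ferguson reserve for an accident year combines an a priori ultimate with the estimated proportion of claims still to emerge. The sketch below computes the point estimate only, from a hypothetical cumulative paid triangle and hypothetical a priori ultimates; the MSEP formulas of the paper are not reproduced.

    import numpy as np

    # Hypothetical cumulative paid triangle (rows: accident years, NaN = future).
    C = np.array([
        [1000., 1800., 2100., 2200.],
        [1100., 1950., 2250., np.nan],
        [1200., 2100., np.nan, np.nan],
        [1300., np.nan, np.nan, np.nan],
    ])
    prior_ultimate = np.array([2300., 2450., 2600., 2750.])   # a priori (e.g. plan loss ratios)

    n = C.shape[0]
    # Chain-ladder development factors from the observed part of the triangle.
    f = np.array([np.nansum(C[:n-1-j, j+1]) / np.nansum(C[:n-1-j, j]) for j in range(n - 1)])
    # Cumulative factor to ultimate for each accident year, hence the reported share.
    cdf_to_ultimate = np.array([np.prod(f[n-1-i:]) for i in range(n)])
    reported_share = 1.0 / cdf_to_ultimate

    bf_reserve = prior_ultimate * (1.0 - reported_share)
    print("BF reserves by accident year:", np.round(bf_reserve, 1))
    print("total BF reserve:", round(bf_reserve.sum(), 1))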

A statistical basis for claims experience monitoring. Taylor, Greg (2011). - Victoria: University of Melbourne, 2011. - 32 pages.
[RKN: 74757]
By claims experience monitoring is meant the systematic comparison of the forecasts from a claims model with claims experience
as it emerges subsequently. In the event that the stochastic properties of the forecasts are known, the comparison can be
represented as a collection of probabilistic statements. This is stochastic monitoring.
The paper defines this process rigorously in terms of statistical hypothesis testing. If the model is a regression model (which is the
case for most stochastic claims models), then the natural form of hypothesis test is a number of likelihood ratio tests, one for each
parameter in the valuation model. Such testing is shown to be very easily implemented by means of GLM software.
This tests the formal structure of the claims model and is referred to as micro-testing. There may be other quantities (e.g. amount
of claim payments in a defined interval) that require testing for practical reasons. This sort of testing is referred to as macro-testing,
and its formulation is also discussed.
No.1 (1993) onwards available online. Download as PDF.
http://www.economics.unimelb.edu.au/ACT/papers.shtml

A statistical basis for claims experience monitoring. Taylor, Greg Society of Actuaries, - 18 pages. [RKN: 74921]
Shelved at: Per: NAAJ (Oxf) Per NAAJ (Lon) Shelved at: JOU
North American Actuarial Journal (2011) 15 (4) : 535-552.
By claims experience monitoring is meant the systematic comparison of the forecasts from a claims model with claims experience
as it emerges subsequently. In the event that the stochastic properties of the forecasts are known, the comparison can be
represented as a collection of probabilistic statements. This is stochastic monitoring. This paper defines this process rigorously in
terms of statistical hypothesis testing. If the model is a regression model (which is the case for most stochastic claims models),
then the natural form of hypothesis test is a number of likelihood ratio tests, one for each parameter in the valuation model. Such
testing is shown to be very easily implemented by means of generalized linear modeling software. This tests the formal structure
of the claims model and is referred to as microtesting. There may be other quantities (e.g., amount of claim payments in a defined
interval) that require testing for practical reasons. This sort of testing is referred to as macrotesting, and its formulation is also
discussed.
http://www.soa.org/news-and-publications/publications/journals/naaj/naaj-detail.aspx
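
The "micro-testing" idea (one likelihood ratio test per valuation-model parameter, run with GLM software) can be sketched generically. The example below fits a Poisson GLM to hypothetical claim-count data with and without one covariate and compares twice the log-likelihood difference against a chi-square(1) quantile; it illustrates the mechanics only, not the specific claims model of the paper.

    import numpy as np
    import statsmodels.api as sm
    from scipy.stats import chi2

    rng = np.random.default_rng(3)

    # Hypothetical data: claim counts driven by an intercept and one rating factor.
    n = 500
    x = rng.normal(size=n)
    counts = rng.poisson(np.exp(0.2 + 0.3 * x))

    X_full = sm.add_constant(x)            # intercept + factor
    X_restricted = np.ones((n, 1))         # intercept only (parameter under test set to 0)

    ll_full = sm.GLM(counts, X_full, family=sm.families.Poisson()).fit().llf
    ll_restr = sm.GLM(counts, X_restricted, family=sm.families.Poisson()).fit().llf

    lr_stat = 2.0 * (ll_full - ll_restr)
    p_value = chi2.sf(lr_stat, df=1)
    print(f"LR statistic {lr_stat:.2f}, p-value {p_value:.4f}")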

Stochastic comparisons for allocations of policy limits and deductibles with applications. Lu, ZhiYi; Meng, LiLi [RKN: 45127]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (3) : 338-343.
In this paper, we study the problem of comparing losses of a policyholder who has an increasing utility function when the form of
coverage is policy limit and deductible. The total retained losses of a policyholder [formula] are ordered in the usual stochastic
order sense when the Xi (i = 1, ..., n) are ordered with respect to the likelihood ratio order. The parallel results for the case of deductibles
are obtained in the same way. It is shown that the ordering of the losses is related to the characteristics (log-concavity or
log-convexity) of the distributions of the risks. As an application of the comparison results, the optimal allocation of policy
limits and deductibles is studied in the usual stochastic order sense and closed-form optimal solutions are obtained in some
special cases.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Stochastic comparisons of capital allocations with applications. Xu, Maochao; Hu, Taizhong [RKN: 45633]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 50 (3) : 293-298.
This paper studies capital allocation problems using a general loss function. Stochastic comparisons are conducted for general
loss functions in several scenarios: independent and identically distributed risks; independent but non-identically distributed risks;
comonotonic risks. Applications in optimal capital allocations and policy limits allocations are discussed as well.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Stochastic orders in time transformed exponential models with applications. Li, Xiaohu; Lin, Jianhua [RKN: 44974]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (1) : 47-52.
This paper studies expectations of a supermodular function of bivariate random risks following Time Transformed Exponential
(TTE) models. Comparisons of such expectations are conducted based on some stochastic orders of the involved univariate survival
functions in the models, and the upper orthant-convex order between two bivariate random risks in TTE models is also established. This
corrects Theorem 2.3 of Mulero et al. (2010) and invalidates some results there. Some applications in actuarial science are
presented as well.

Available via Athens: Palgrave MacMillan
http://www.openathens.net

Stochastic projection for large individual losses. Drieskens, Damien; Henry, Marc; Walhin, Jean-François; Wielandts, Jürgen [RKN:
44929]
Shelved at: Per: SAJ Shelved at: SCA/ACT
Scandinavian Actuarial Journal (2012) 1 : 1-39.
In this paper we investigate how to estimate ultimate values of large losses. The method is based on the development of individual
losses and therefore allows one to compute the netting impact of excess of loss reinsurance. In particular the index clause is properly
accounted for. A numerical example based on real-life data is provided.

Variance-optimal hedging for time-changed Lévy processes. Kallsen, Jan; Pauwels, Arnd [RKN: 45251]
Applied Mathematical Finance (2011) 18 (1-2) : 1-28.
In this article, we solve the variance-optimal hedging problem in stochastic volatility (SV) models based on time-changed Lévy
processes, that is, in the setup of Carr et al. (2003). The solution is derived using results for general affine models in the
companion article [Kallsen and Pauwels (2009)].

Available via Athens: Taylor & Francis Online
http://www.openathens.net

STOCHASTIC MORTALITY

Evolution of coupled lives' dependency across generations and pricing impact. Luciano, Elisa; Spreeuw, Jaap; Vigna, Elena
(2012). - London: Cass Business School, 2012. - 19 pages. [RKN: 73995]
This paper studies the dependence between coupled lives - both within and across generations - and its effects on prices of
reversionary annuities in the presence of longevity risk. Longevity risk is represented via a stochastic mortality intensity.
Dependence is modelled through copula functions. We consider Archimedean single and multi-parameter copulas. We find that
dependence decreases when passing from older generations to younger generations. Not only the level of dependence but also
its features - as measured by the copula - change across generations: the best-fit Archimedean copula is not the same across
generations. Moreover, for all the generations under examination the single-parameter copula is dominated by the two-parameter one.
The independence assumption produces quantifiable mispricing of reversionary annuities. The misspecification of the copula
produces different mispricing effects on different generations. The research is conducted using a well-known dataset of double life
contracts.
1995 onwards available online. Download as PDF.
http://www.cass.city.ac.uk/research-and-faculty/faculties/faculty-of-actuarial-science-and-insurance/publications/actuarial-resear
ch-reports

STOCHASTIC PROCESSES

Assessing the costs of protection in a context of switching stochastic regimes. Barrieu, Pauline; Bellamy, Nadine; Sahut,
Jean-Michel [RKN: 45880]
Shelved at: Per. AMF
Applied Mathematical Finance (2012) 19 (5-6) : 495-511.
We consider the problem of cost assessment in the context of switching stochastic regimes. The dynamics of a given asset include
a background noise, described by a Brownian motion and a random shock, the impact of which is characterized by changes in the
diffusion coefficients. A particular economic agent that is directly exposed to variations in the underlying asset price incurs some
costs, F(L), when the underlying asset price reaches a certain threshold, L. Ideally, the agent would make advance provision, or
hedge, for these costs at time 0. We evaluate the amount of provision, or the hedging premium, M(L), for these costs in the
disrupted environment, with changes in the regime for a given time horizon, and analyse the sensitivity of this amount to possible
model misspecifications.

Available via Athens: Taylor & Francis Online
http://www.openathens.net

Calibration of stock betas from skews of implied volatilities. Fouque, Jean-Pierre; Kollman, Eli [RKN: 45256]
Applied Mathematical Finance (2011) 18 (1-2) : 119-137.
We develop call option price approximations for both the market index and an individual asset using a singular perturbation of a
continuous-time capital asset pricing model in a stochastic volatility environment. These approximations show the role played by
the asset's beta parameter as a component of the parameters of the call option price of the asset. They also show how these
parameters, in combination with the parameters of the call option price for the market, can be used to extract the beta parameter.
Finally, a calibration technique for the beta parameter is derived using the estimated option price parameters of both the asset and
market index. The resulting estimator of the beta parameter is not only simple to implement but has the advantage of being
forward looking as it is calibrated from skews of implied volatilities.

Available via Athens: Taylor & Francis Online
http://www.openathens.net

Claims development result in the paid-incurred chain reserving method. Happ, Sebastian; Merz, Michael; Wüthrich, Mario V [RKN:
45724]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 66-72.
We present the one-year claims development result (CDR) in the paid-incurred chain (PIC) reserving model. The PIC reserving
model presented in Merz and Wüthrich (2010) is a Bayesian stochastic claims reserving model that considers simultaneously
claims payments and incurred losses information and allows for deriving the full predictive distribution of the outstanding loss
liabilities. In this model we study the conditional mean square error of prediction (MSEP) for the one-year CDR uncertainty, which
is the crucial uncertainty view under Solvency II.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Comparison and bounds for functionals of future lifetimes consistent with life tables. Barz, Christiane; Muller, Alfred [RKN: 45596]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 50 (2) : 229-235.
We derive a new crossing criterion of hazard rates to identify a stochastic order relation between two random variables. We apply
this crossing criterion in the context of life tables to derive stochastic ordering results among three families of fractional age
assumptions: the family of linear force of mortality functions, the family of quadratic survival functions and the power family.
Further, this criterion is used to derive tight bounds for functionals of future lifetimes that exhibit an increasing force of mortality
with given one-year survival probabilities. Numerical examples illustrate our findings.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Corrections to the prices of derivatives due to market incompleteness. German, David [RKN: 45258]
Applied Mathematical Finance (2011) 18 (1-2) : 155-187.
We compute the first-order corrections to marginal utility-based prices with respect to a 'small' number of random endowments in
the framework of three incomplete financial models. They are a stochastic volatility model, a basis risk and market portfolio model
and a credit-risk model with jumps and stochastic recovery rate.

Available via Athens: Taylor & Francis Online
http://www.openathens.net

Covariance of discounted compound renewal sums with a stochastic interest rate. Léveillé, Ghislain; Adékambi, Franck [RKN:
45356]
Shelved at: Per: SAJ Shelved at: SCA/ACT
Scandinavian Actuarial Journal (2011) 2 : 138-153.
Formulas have been obtained for the moments of the discounted aggregate claims process, for a constant instantaneous interest
rate, and for a claims number process that is an ordinary or a delayed renewal process. In this paper, we present explicit formulas
on the first two moments and the joint moment of this risk process, for a non-trivial extension to a stochastic instantaneous interest
rate. Examples are given for Erlang claims number processes, and for the Ho-Lee-Merton and the Vasicek interest rate models.

Hedging of spatial temperature risk with market-traded futures. Barth, Andrea; Benth, Fred Espen; Potthoff, Jurgen [RKN: 45255]
Applied Mathematical Finance (2011) 18 (1-2) : 93-117.
The main objective of this work is to construct optimal temperature futures from available market-traded contracts to hedge spatial
risk. Temperature dynamics are modelled by a stochastic differential equation with spatial dependence. Optimal positions in
market-traded futures minimizing the variance are calculated. Examples with numerical simulations based on a fast algorithm for
the generation of random fields are presented.

Available via Athens: Taylor & Francis Online
http://www.openathens.net

Laws of small numbers : Extremes and rare events. Falk, Michael; Hüsler, Jürg; Reiss, Rolf-Dieter (2011). - 3rd, revised and extended
ed. Birkhäuser, 2011. - 509 pages. [RKN: 73659]
Shelved at: 519.24 22 LAW
"The intention of the book is to give a mathematically oriented development of the theory of rare events underlying various
applications. This characteristic of the book was strengthened in the second edition by incorporating various new results. In this
third edition, the dramatic change of focus of extreme value theory has been taken into account: from concentrating on maxima of
observations it has shifted to large observations, defined as exceedances over high thresholds. One emphasis of the present third
edition lies on multivariate generalized Pareto distributions, their representations, properties such as their peaks-over-threshold
stability, simulation, testing and estimation."

Multi-period mean-variance portfolio selection with regime switching and a stochastic cash flow. Wu, Huiling; Li, Zhongfei [RKN:
45640]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 50 (3) : 371-384.
This paper investigates a non-self-financing portfolio optimization problem under the framework of multi-period mean-variance
with Markov regime switching and a stochastic cash flow. The stochastic cash flow can be explained as capital additions or
withdrawals during the investment process. In particular, the cash flow is the surplus process or the risk process of an insurer at each
period. The returns of assets and amount of the cash flow all depend on the states of a stochastic market which are assumed to
follow a discrete-time Markov chain. We analyze the existence of optimal solutions, and derive the optimal strategy and the
efficient frontier in closed-form. Several special cases are discussed and numerical examples are given to demonstrate the effect
of cash flow.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

On cross-currency models with stochastic volatility and correlated interest rates. Grzelak, Lech A; Oosterleeac, Cornelis W
Routledge, [RKN: 45793]
Shelved at: Per. AMF
Applied Mathematical Finance (2012) 19 (1-2) : 1-35.
We construct multi-currency models with stochastic volatility (SV) and correlated stochastic interest rates with a full matrix of
correlations. We first deal with a foreign exchange (FX) model of Heston-type, in which the domestic and foreign interest rates are
generated by the short-rate process of Hull-White (Hull, J. and White, A. [1990] Pricing interest-rate derivative securities, Review
of Financial Studies, 3, pp. 573-592). We then extend the framework by modelling the interest rate by an SV displaced-diffusion
(DD) Libor Market Model (Andersen, L. B. G. and Andreasen, J. [2000] Volatility skews and extensions of the libor market model,
Applied Mathematical Finance, 7(1), pp. 1-32), which can model an interest rate smile. We provide semi-closed form
approximations which lead to efficient calibration of the multi-currency models. Finally, we add a correlated stock to the framework
and discuss the construction, model calibration and pricing of equity-FX-interest rate hybrid pay-offs.

Available via Athens: Taylor & Francis Online
http://www.openathens.net

Optimal loss-carry-forward taxation for the Lévy risk model. Wang, Wen-Yuan; Hu, Yijun [RKN: 44997]
Insurance: Mathematics & Economics (2012) 50 (1) : 121-130.
In the spirit of Albrecher and Hipp (2007), Albrecher et al. (2008b) and Kyprianou and Zhou (2009), we consider the reserve
process of an insurance company which is governed by [formula unable to display], where X is a spectrally negative Lévy process
with the usual exclusion of negative subordinator or deterministic drift, [formula unable to display] the running supremum of X, and
[formula unable to display] the rate of loss-carry-forward tax at time t which is subject to the taxation allocation policy p and is a
function of [formula unable to display]. The objective is to find the optimal policy which maximizes the total (discounted) taxation
pay-out: [formula unable to display], where [formula unable to display] and [formula unable to display] refer to the expectation
corresponding to the law of X such that [formula unable to display], and the time of ruin, respectively. With the scale function of X
denoted by [formula unable to display] and [formula unable to display] allowed to vary in , two situations are considered.

Available via Athens: Palgrave MacMillan
http://www.openathens.net

Optimal time-consistent investment and reinsurance strategies for insurers under Heston's SV model. Li, Zhongfei; Zeng, Yan;
Lai, Yongzeng [RKN: 45736]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 191-203.
This paper considers the optimal time-consistent investment and reinsurance strategies for an insurer under Heston's stochastic
volatility (SV) model. Such an SV model applied to insurers' portfolio problems has not yet been discussed as far as we know. The
surplus process of the insurer is approximated by a Brownian motion with drift. The financial market consists of one risk-free asset
and one risky asset whose price process satisfies Heston's SV model. Firstly, a general problem is formulated and a verification
theorem is provided. Secondly, the closed-form expressions of the optimal strategies and the optimal value functions for the
mean-variance problem without precommitment are derived under two cases: one is the investment-reinsurance case and the
other is the investment-only case. Thirdly, economic implications and numerical sensitivity analysis are presented for our results.
Finally, some interesting phenomena are found and discussed.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
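
For orientation, Heston's SV model lets the variance of the risky asset follow a mean-reverting square-root process correlated with the price. The sketch below is a plain Euler (full-truncation) simulation with illustrative parameters; it shows the model dynamics only, not the time-consistent control problem solved in the paper.

    import numpy as np

    rng = np.random.default_rng(4)

    # Illustrative Heston parameters: dS = mu S dt + sqrt(v) S dW1,
    # dv = kappa(theta - v) dt + xi sqrt(v) dW2, corr(dW1, dW2) = rho.
    mu, kappa, theta, xi, rho = 0.05, 2.0, 0.04, 0.3, -0.7
    S0, v0, T, n = 100.0, 0.04, 1.0, 252
    dt = T / n

    S, v = S0, v0
    for _ in range(n):
        z1 = rng.standard_normal()
        z2 = rho * z1 + np.sqrt(1 - rho**2) * rng.standard_normal()
        v_pos = max(v, 0.0)                      # full truncation keeps the variance usable
        S += mu * S * dt + np.sqrt(v_pos) * S * np.sqrt(dt) * z1
        v += kappa * (theta - v_pos) * dt + xi * np.sqrt(v_pos) * np.sqrt(dt) * z2

    print("price after one simulated year:", round(S, 2),
          "terminal variance:", round(max(v, 0.0), 4))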

Stochastic comparisons of distorted variability measures. Sordo, Miguel A; Suarez-Llorens, Alfonso [RKN: 44970]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (1) : 11-17.
In this paper, the authors consider the dispersive order and the excess wealth order to compare the variability of distorted
distributions. We know from Sordo (2009a) that the excess wealth order can be characterized in terms of a class of variability
measures associated to the tail conditional distribution which includes, as a particular measure, the tail variance. Given that the tail
conditional distribution is a particular distorted distribution, a natural question is whether this result can be extended to include
other classes of variability measures associated to general distorted distributions. As the authors show in this paper, the answer is
yes, by focusing on distorted distributions associated to concave distortion functions. For distorted distributions associated to
more general distortions, the characterizations are stated in terms of the stronger dispersive order.

Available via Athens: Palgrave MacMillan
http://www.openathens.net

Stochastic expansion for the pricing of call options with discrete dividends. Etore, Pierre; Gobet, Emmanuel Routledge, [RKN:
45839]
Shelved at: Per. AMF
Applied Mathematical Finance (2012) 19 (3-4) : 233-264.
In the context of an asset paying affine-type discrete dividends, we present closed analytical approximations for the pricing of
European vanilla options in the Black-Scholes model with time-dependent parameters. They are obtained using a stochastic
Taylor expansion around a shifted lognormal proxy model. The final formulae are, respectively, first-, second- and third-order
approximations w.r.t. the fixed part of the dividends. Using Cameron-Martin transformations, we provide explicit representations
of the correction terms as Greeks in the Black-Scholes model. The use of Malliavin calculus enables us to provide tight error
estimates for our approximations. Numerical experiments show that this approach yields very accurate results, in particular
compared with known approximations of Bos, Gairat and Shepeleva (2003, Dealing with discrete dividends, Risk Magazine,
16: 109-112) and Veiga and Wystup (2009, Closed formula for options with discrete dividends and its derivatives, Applied
Mathematical Finance, 16(6): 517-531), and quicker than the iterated integration procedure of Haug, Haug and Lewis (2003,
Back to basics: a new approach to the discrete dividend problem, Wilmott Magazine, pp. 37-47) or than the binomial tree method
of Vellekoop and Nieuwenhuis (2006, Efficient pricing of derivatives on assets with discrete dividends, Applied Mathematical
Finance, 13(3), pp. 265-284).

Available via Athens: Taylor & Francis Online
http://www.openathens.net

A time-dependent variance model for pricing variance and volatility swaps. Goard, Joanna [RKN: 45253]
Applied Mathematical Finance (2011) 18 (1-2) : 51-70.
Analytic solutions are found for prices of variance and volatility swaps under a new time-dependent stochastic model for the
dynamics of variance. The main features of the new stochastic differential equation are (1) an empirically validated diffusion term
and (2) a free function of time as a moving target in a reversion term, allowing additional flexibility for model calibration against
market data.

Available via Athens: Taylor & Francis Online
http://www.openathens.net

TVaR-based capital allocation for multivariate compound distributions with positive continuous claim amounts. Cossette,
Helene; Mailhot, Melina; Marceau, Etienne [RKN: 45598]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 50 (2) : 247-256.
In this paper, we consider a portfolio of n dependent risks X1,...,Xn and we study the stochastic behavior of the aggregate claim
amount [unable to display]. Our objective is to determine the amount of economic capital needed for the whole portfolio and to
compute the amount of capital to be allocated to each risk X1,...,Xn. To do so, we use a top-down approach. For (X1,...,Xn), we
consider risk models based on multivariate compound distributions defined with a multivariate counting distribution. We use the
TVaR to evaluate the total capital requirement of the portfolio based on the distribution of S, and we use the TVaR-based capital
allocation method to quantify the contribution of each risk. To simplify the presentation, the claim amounts are assumed to be
continuously distributed. For multivariate compound distributions with continuous claim amounts, we provide general formulas for
the cumulative distribution function of S, for the TVaR of S and the contribution to each risk. We obtain closed-form expressions for
those quantities for multivariate compound distributions with gamma and mixed Erlang claim amounts. Finally, we treat in detail
the multivariate compound Poisson distribution case. Numerical examples are provided in order to examine the impact of the
dependence relation on the TVaR of S, the contribution to each risk of the portfolio, and the benefit of the aggregation of several
risks.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
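
The TVaR-based allocation used here has a convenient simulation analogue: the capital allocated to risk Xi is its average contribution to the aggregate loss S on the scenarios where S exceeds its VaR. The sketch below implements that empirical version for a hypothetical dependent portfolio (a common-shock Poisson frequency with gamma severities); the closed-form mixed Erlang results of the paper are not reproduced.

    import numpy as np

    rng = np.random.default_rng(5)
    n_sims, n_risks, alpha = 50_000, 3, 0.99

    # Hypothetical common-shock model: each line's claim count is its own Poisson
    # count plus a shared Poisson shock; severities are gamma distributed.
    common = rng.poisson(1.0, size=n_sims)
    X = np.empty((n_sims, n_risks))
    for i in range(n_risks):
        counts = rng.poisson(2.0 + i, size=n_sims) + common
        X[:, i] = np.array([rng.gamma(2.0, 1.0, size=c).sum() for c in counts])

    S = X.sum(axis=1)
    var_level = np.quantile(S, alpha)
    tail = S > var_level

    tvar_S = S[tail].mean()
    allocation = X[tail].mean(axis=0)        # E[X_i | S > VaR_alpha(S)]; sums to TVaR of S
    print("TVaR of S:", round(tvar_S, 2))
    print("allocations:", np.round(allocation, 2), "sum:", round(allocation.sum(), 2))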

STOCHASTIC RESERVING

Bootstrapping individual claim histories. Rosenlund, Stig - 34 pages. [RKN: 70752]
Shelved at: Per: Astin Bull (Oxf) Shelved at: JOU
ASTIN Bulletin (2012) 42 (1) : 291-324.
The bootstrap method BICH is given for estimating mean square prediction errors and predictive distributions of non-life claim
reserves under weak conditions. The dates of claim occurrence, reporting and finalization and the payment dates and amounts of
individual finalized historic claims form a claim set from which samples with replacement are drawn. We assume that all claims are
independent and that the historic claims are distributed as the object claims, possibly after inflation adjustment and segmentation
on a background variable, whose distribution could have changed over time due to portfolio change. Also we introduce the new
reserving function RDC, using all these dates and payments for reserve predictions. We study three reserving functions: chain
ladder, the Schnieper (1991) method and RDC. Checks with simulated cases obeying the assumptions of Mack (1999) for chain
ladder and Liu and Verrall (2009) for Schnieper's method, respectively, confirm the validity of our method. BICH is used to
compare the three reserving functions, of which RDC is found overall best in simulated cases.
http://www.actuaries.org/index.cfm?lang=EN&DSP=PUBLICATIONS&ACT=ASTIN BULLETIN
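
The resampling idea behind BICH can be conveyed with a much-reduced sketch: draw finalized historic claims with replacement, apply a reserving function to each pseudo claim set, and read off the spread of the resulting reserve predictions. The reserving function below is a deliberately naive placeholder (average finalized cost times an assumed number of open claims), not the RDC, chain ladder or Schnieper functions compared in the paper, and all inputs are hypothetical.

    import numpy as np

    rng = np.random.default_rng(6)

    # Hypothetical finalized claim amounts and an assumed number of open claims.
    finalized = rng.lognormal(mean=8.0, sigma=1.0, size=400)
    n_open = 120

    def naive_reserve(claims, n_open):
        # Placeholder reserving function: expected cost per claim times open claims.
        return claims.mean() * n_open

    n_boot = 2000
    reserves = np.array([
        naive_reserve(rng.choice(finalized, size=finalized.size, replace=True), n_open)
        for _ in range(n_boot)
    ])

    print("central reserve:", round(naive_reserve(finalized, n_open)))
    print("bootstrap standard error:", round(reserves.std()))
    print("90% predictive interval:", np.round(np.percentile(reserves, [5, 95])))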

STOCHASTIC VOLATILITY

On the approximation of the SABR model : A probabilistic approach. Kennedy, Joanne E; Mitra, Subhankar; Pham, Duy [RKN: 45883]
Shelved at: Per. AMF
Applied Mathematical Finance (2012) 19 (5-6) : 553-586.
In this article, we derive a probabilistic approximation for three different versions of the SABR model: Normal, Log-Normal and a
displaced diffusion version for the general case. Specifically, we focus on capturing the terminal distribution of the underlying
process (conditional on the terminal volatility) to arrive at the implied volatilities of the corresponding European options for all
strikes and maturities. Our resulting method allows us to work with a variety of parameters that cover long-dated options and
highly stressed market conditions. This is a different feature from other current approaches, which rely on the assumption of very small
total volatility and usually fail for maturities longer than 10 years or for large volatility of volatility (Volvol).

Available via Athens: Taylor & Francis Online
http://www.openathens.net

Options on realized variance in log-OU models. Drimus, Gabriel G [RKN: 45879]
Shelved at: Per. AMF
Applied Mathematical Finance (2012) 19 (5-6) : 477-494.
We study the pricing of options on realized variance in a general class of Log-OU (Ornstein-Uhlenbeck) stochastic volatility
models. The class includes several important models proposed in the literature. Because their common feature is the log-normal law
of instantaneous variance, the application of standard Fourier-Laplace transform methods is not feasible. We derive extensions of
Asian pricing methods, to obtain bounds, in particular, a very tight lower bound for options on realized variance.

Available via Athens: Taylor & Francis Online
http://www.openathens.net

The stochastic intrinsic currency volatility model : A consistent framework for multiple FX rates and their volatilities. Doust, Paul
Routledge, [RKN: 45876]
Shelved at: Per. AMF
Applied Mathematical Finance (2012) 19 (5-6) : 381-445.
The SABR and the Heston stochastic volatility models are widely used for foreign exchange (FX) option pricing. Although they are
able to reproduce the market's volatility smiles and skews for single FX rates, they cannot be extended to model multiple FX rates
in a consistent way. This is because when two FX rates with a common currency are described by either SABR or Heston
processes, the stochastic process for the third FX rate associated with the three currencies will not be a SABR or a Heston
process. A consistent description of the FX market should be symmetric so that all FX rates are described by the same type of
stochastic process. This article presents a way of doing that using the concept of intrinsic currency values. To model FX volatility
curves, the intrinsic currency framework is extended by allowing the volatility of each intrinsic currency value to be stochastic. This
makes the framework more realistic, while preserving all FX market symmetries. The result is a new SABR-style option pricing
formula and a model that can simultaneously be calibrated to the volatility smiles of all possible currency pairs under
consideration. Consequently, it is more powerful than modelling the volatility curves of each currency pair individually and offers a
methodology for comparing the volatility curves of different currency pairs against each other. One potential application of this is in
the pricing of FX options, such as vanilla options on less liquid currency pairs. Given the volatilities of less liquid currencies
against, for example, USD and EUR, the model could then be used to calculate the volatility smiles of those less liquid currencies
against all other currencies.

Available via Athens: Taylor & Francis Online
http://www.openathens.net

STOCK MARKET

Exponential change of measure applied to term structures of interest rates and exchange rates. Bo, Lijun [RKN: 44964]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (2) : 216-225.
In this paper, we study the term structures of interest rates and foreign exchange rates through establishing a state-price deflator.
The state-price deflator considered here can be viewed as an extension to the potential representation of the state-price density in
[Rogers, L.C.G., 1997. The potential approach to the term structure of interest rates and foreign exchange rates. Mathematical
Finance 7(2), 157-164]. We identify a risk-neutral probability measure from the state-price deflator by a technique of exponential
change of measure for Markov processes proposed by [Palmowski, Z., Rolski, T., 2002. A technique for exponential change of
measure for Markov processes. Bernoulli 8(6), 767-785] and present examples of several classes of diffusion processes
(jump-diffusions and diffusions with regime-switching) to illustrate the proposed theory. A comparison between the exponential
change of measure and the Esscher transform for identifying risk-neutral measures is also presented. Finally, we consider the
exchange rate dynamics by virtue of the ratio of the current state-price deflators between two economies and then discuss the
pricing of currency options.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Optimal proportional reinsurance and investment in a stock market with Ornstein-Uhlenbeck process. Liang, Zhibin; Yuen, Kam
Chuen; Guo, Junyi [RKN: 44963]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (2) : 207-215.
In this paper, the authors study the optimal investment and proportional reinsurance strategy when an insurance company wishes
to maximize the expected exponential utility of the terminal wealth. It is assumed that the instantaneous rate of investment return
follows an Ornstein-Uhlenbeck process. Using stochastic control theory and Hamilton-Jacobi-Bellman equations, explicit
expressions for the optimal strategy and value function are derived not only for the compound Poisson risk model but also for the
Brownian motion risk model. Further, we investigate the partially observable optimization problem, and also obtain explicit
expressions for the optimal results.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
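
Since the Ornstein-Uhlenbeck process recurs throughout this section as a model for investment returns and intensities, a short exact simulation scheme may be a useful reference. The parameters below are illustrative; the recursion uses the known Gaussian conditional distribution of the process.

    import numpy as np

    rng = np.random.default_rng(7)

    # dR = kappa (theta - R) dt + sigma dW, simulated with its exact Gaussian transition.
    kappa, theta, sigma, R0 = 1.5, 0.05, 0.02, 0.03
    dt, n = 1 / 12, 120
    R = np.empty(n + 1); R[0] = R0
    decay = np.exp(-kappa * dt)
    cond_sd = sigma * np.sqrt((1 - decay**2) / (2 * kappa))
    for t in range(n):
        R[t + 1] = theta + (R[t] - theta) * decay + cond_sd * rng.standard_normal()

    print("mean of simulated path:", R.mean(), "long-run mean theta:", theta)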

STOCKS AND SHARES

The herd behavior index : A new measure for the implied degree of co-movement in stock markets. Dhaene, Jan; Linders, Daniel;
Schoutens, Wim; Vyncke, David [RKN: 45639]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 50 (3) : 357-370.
We introduce a new and easy-to-calculate measure for the expected degree of herd behaviour or co-movement between stock
prices. This forward looking measure is model-independent and based on observed option data. It is baptized the Herd Behavior
Index (HIX). The degree of co-movement in a stock market can be determined by comparing the observed market situation with
the extreme (theoretical) situation under which the whole system is driven by a single factor. The HIX is then defined as the ratio of
an option-based estimate of the risk-neutral variance of the market index and an option-based estimate of the corresponding
variance in case of the extreme single factor market situation. The HIX can be determined for any market index provided an
appropriate series of vanilla options is traded on this index as well as on its components. As an illustration, we determine historical
values of the 30-days HIX for the Dow Jones Industrial Average, covering the period January 2003 to October 2009.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

STOP LOSS

Optimal reinsurance under VaR and CVaR risk measures. Chi, Yichun; Tan, Ken Seng - 23 pages. [RKN: 74744]
Shelved at: Per: Astin Bull (Oxf) Shelved at: JOU
ASTIN Bulletin (2011) 41 (2) : 487-509.
In this paper, we study two classes of optimal reinsurance models by minimizing the total risk exposure of an insurer under the
criteria of value at risk (VaR) and conditional value at risk (CVaR). We assume that the reinsurance premium is calculated
according to the expected value principle. Explicit solutions for the optimal reinsurance policies are derived over ceded loss
functions with increasing degrees of generality. More precisely, we establish formally that under the VaR minimization model, (i)
the stop-loss reinsurance is optimal among the class of increasing convex ceded loss functions; (ii) when the constraints on both
ceded and retained loss functions are relaxed to increasing functions, the stop-loss reinsurance with an upper limit is shown to be
optimal; (iii) and finally under the set of general increasing and left-continuous retained loss functions, the truncated stop-loss
reinsurance is shown to be optimal. In contrast, under CVaR risk measure, the stop-loss reinsurance is shown to be always
optimal. These results suggest that the VaR-based reinsurance models are sensitive with respect to the constraints imposed on
both ceded and retained loss functions while the corresponding CVaR-based reinsurance models are quite robust.
http://www.actuaries.org/index.cfm?lang=EN&DSP=PUBLICATIONS&ACT=ASTIN BULLETIN
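
The comparison between VaR- and CVaR-optimal contracts can be explored numerically. The sketch below evaluates, for a hypothetical loss distribution and a grid of stop-loss retentions, the insurer's total risk exposure (retained loss plus an expected-value-principle premium) under both measures; it is a brute-force illustration of the objective in the paper, not its analytical solution.

    import numpy as np

    rng = np.random.default_rng(8)
    alpha, loading = 0.95, 0.2                      # confidence level and premium loading

    losses = rng.pareto(2.5, size=200_000) * 10.0   # hypothetical heavy-tailed ground-up losses

    def total_exposure(d):
        """Retained loss under a stop-loss with retention d, plus the reinsurance
        premium computed with the expected value principle."""
        ceded = np.maximum(losses - d, 0.0)
        premium = (1 + loading) * ceded.mean()
        retained_total = np.minimum(losses, d) + premium
        var = np.quantile(retained_total, alpha)
        cvar = retained_total[retained_total >= var].mean()
        return var, cvar

    for d in [5.0, 10.0, 20.0, 40.0]:
        var, cvar = total_exposure(d)
        print(f"retention {d:5.1f}: VaR {var:7.2f}  CVaR {cvar:7.2f}")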

Optimality of general reinsurance contracts under CTE risk measure. Tan, Ken Seng; Weng, Chengguo; Zhang, Yi [RKN: 44960]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (2) : 175-187.
By formulating a constrained optimization model, we address the problem of optimal reinsurance design using the criterion of
minimizing the conditional tail expectation (CTE) risk measure of the insurer's total risk. For completeness, we analyze the optimal
reinsurance model under both binding and unbinding reinsurance premium constraints. By resorting to the Lagrangian approach
based on the concept of directional derivative, explicit and analytical optimal solutions are obtained in each case under some mild
conditions. We show that a pure stop-loss ceded loss function is always optimal. More interestingly, we demonstrate that ceded loss
functions, that are not always non-decreasing, could be optimal. We also show that, in some cases, it is optimal to exhaust the
entire reinsurance premium budget to determine the optimal reinsurance, while in other cases, it is rational to spend less than the
prescribed reinsurance premium budget.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

STRATEGIC PLANNING

Optimal time-consistent investment and reinsurance strategies for insurers under Heston's SV model. Li, Zhongfei; Zeng, Yan;
Lai, Yongzeng [RKN: 45736]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 191-203.
This paper considers the optimal time-consistent investment and reinsurance strategies for an insurer under Heston's stochastic
volatility (SV) model. Such an SV model applied to insurers' portfolio problems has not yet been discussed as far as we know. The
surplus process of the insurer is approximated by a Brownian motion with drift. The financial market consists of one risk-free asset
and one risky asset whose price process satisfies Heston's SV model. Firstly, a general problem is formulated and a verification
theorem is provided. Secondly, the closed-form expressions of the optimal strategies and the optimal value functions for the
mean-variance problem without precommitment are derived under two cases: one is the investment-reinsurance case and the
other is the investment-only case. Thirdly, economic implications and numerical sensitivity analysis are presented for our results.
Finally, some interesting phenomena are found and discussed.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

STRESS TESTS

A coherent aggregation framework for stress testing and scenario analysis. Kwiatkowski, Jan; Rebonato, Riccardo [RKN: 45257]
Applied Mathematical Finance (2011) 18 (1-2) : 139-154.
We present a methodology to aggregate in a coherent manner conditional stress losses in a trading or banking book. The
approach bypasses the specification of unconditional probabilities of the individual stress events and uses a linear
programming approach to ensure that the (subjective or frequentist) conditional probabilities chosen by the risk manager are internally
consistent. The admissibility requirement greatly reduces the degree of arbitrariness in the conditional probability matrix if this is
assigned subjectively. The approach can be used to address the requirements of the regulators on the Instantaneous Risk
Charge.

Available via Athens: Taylor & Francis Online
http://www.openathens.net

Multivariate stress scenarios and solvency. McNeil, Alexander J; Smith, Andrew D [RKN: 45634]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 50 (3) : 299-308.
We show how the probabilistic concepts of half-space trimming and depth may be used to define convex scenario sets Qa for
stress testing the risk factors that affect the solvency of an insurance company over a prescribed time period. By choosing the
scenario in Qa which minimizes net asset value at the end of the time period, we propose the idea of the least solvent likely event
(LSLE) as a solution to the forward stress testing problem. By considering the support function of the convex scenario set Qa, we
establish theoretical properties of the LSLE when financial risk factors can be assumed to have a linear effect on the net assets of
an insurer. In particular, we show that the LSLE may be interpreted as a scenario causing a loss equivalent to the Value-at-Risk
(VaR) at confidence level a, provided the a-quantile is a subadditive risk measure on linear combinations of the risk factors. In this
case, we also show that the LSLE has an interpretation as a per-unit allocation of capital to the underlying risk factors when the
overall capital is determined according to the VaR. These insights allow us to define alternative scenario sets that relate in similar
ways to coherent measures, such as expected shortfall. We also introduce the most likely ruin event (MLRE) as a solution to the
problem of reverse stress testing.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
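
When the net asset value is a linear function of the risk factors, NAV(x) = a + w'x, and the scenario set is taken, purely for illustration, to be the ellipsoid {x : x' Sigma^{-1} x <= r^2}, the least solvent likely event has a closed form: the minimizer of w'x over the ellipsoid is x* = -r Sigma w / sqrt(w' Sigma w), with loss r sqrt(w' Sigma w). The sketch below simply evaluates that formula for hypothetical inputs; the paper's half-space trimming construction of the scenario set is more general.

    import numpy as np

    # Hypothetical inputs: factor covariance, linear exposures, and an ellipsoid radius
    # chosen as the standard normal quantile for a 99.5% confidence level.
    Sigma = np.array([[0.04, 0.01, 0.00],
                      [0.01, 0.09, 0.02],
                      [0.00, 0.02, 0.16]])
    w = np.array([100.0, 50.0, 30.0])    # sensitivities of net assets to each factor
    r = 2.5758                           # approx. N^{-1}(0.995)

    denom = np.sqrt(w @ Sigma @ w)
    lsle = -r * (Sigma @ w) / denom      # least solvent likely event under these assumptions
    loss_at_lsle = -(w @ lsle)           # equals r * sqrt(w' Sigma w)

    print("LSLE scenario:", np.round(lsle, 4))
    print("loss at the LSLE:", round(loss_at_lsle, 2))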

SURPLUS

Analyzing surplus appropriation schemes in participating life insurance from the insurer's and the policyholder's perspective.
Bohnert, Alexander; Gatzert, Nadine [RKN: 44991]
Insurance: Mathematics & Economics (2012) 50 (1) : 64-78.
This paper examines the impact of three surplus appropriation schemes often inherent in participating life insurance contracts on
the insurer's shortfall risk and the net present value from an insured's viewpoint. (1) In case of the bonus system, surplus is used
to increase the guaranteed death and survival benefit, leading to higher reserves; (2) the interest-bearing accumulation increases
only the survival benefit by accumulating the surplus on a separate account; and (3) surplus can also be used to shorten the
contract term, which results in an earlier payment of the survival benefit and a reduced sum of premium payments. The pool of
participating life insurance contracts with death and survival benefit is modeled actuarially with annual premium payments;
mortality rates are generated based on an extension of the Lee-Carter (1992) model, and the asset process follows a geometric
Brownian motion. In a simulation analysis, we then compare the influence of different asset portfolios and shocks to mortality on
the insurer's risk situation and the policyholder's net present value for the three surplus schemes. Our findings demonstrate that,
even though the surplus distribution and thus the amount of surplus is calculated the same way, the type of surplus appropriation
scheme has a substantial impact on the insurer's risk exposure and the policyholder's net present value.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Risk analysis and valuation of life insurance contracts: combining actuarial and financial approaches. Graf, Stefan; Kling,
Alexander; Ruß, Jochen [RKN: 44981]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (1) : 115-125.
In this paper, we analyze traditional (i.e. not unit-linked) participating life insurance contracts with a guaranteed interest rate and
surplus participation. We consider three different surplus distribution models and an asset allocation that consists of money
market, bonds with different maturities, and stocks. In this setting, we combine actuarial and financial approaches by selecting a
risk minimizing asset allocation (under the real world measure P) and distributing terminal surplus such that the contract value
(under the pricing measure Q) is fair. We prove that this strategy is always possible unless the insurance contracts introduce
arbitrage opportunities in the market. We then analyze differences between the different surplus distribution models and
investigate the impact of the selected risk measure on the risk minimizing portfolio.


Available via Athens: Palgrave MacMillan
http://www.openathens.net

SURRENDER RATES

A joint valuation of premium payment and surrender options in participating life insurance contracts. Schmeiser, H; Wagner, J
[RKN: 44958]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (3) : 580-596.
In addition to an interest rate guarantee and annual surplus participation, life insurance contracts typically embed the right to stop
premium payments during the term of the contract (paid-up option), to resume payments later (resumption option), or to terminate
the contract early (surrender option). Terminal guarantees are on benefits payable upon death, survival and surrender. The latter
are adapted after exercising the options. A model framework including these features and an algorithm to jointly value the
premium payment and surrender options is presented. In a first step, the standard principles of risk-neutral evaluation are applied
and the policyholder is assumed to use an economically rational exercise strategy. In a second step, option value sensitivity on
different contract parameters, benefit adaptation mechanisms, and exercise behavior is analyzed numerically. The two latter are
the main drivers for the option value.

Available via Athens: Palgrave MacMillan
http://www.openathens.net

SURVIVAL ANALYSIS

Calibrating affine stochastic mortality models using term assurance premiums. Russo, Vincenzo; Giacometti, Rosella; Ortobelli,
Sergio; Rachev, Svetlozar; Fabozzi, Frank J [RKN: 44975]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (1) : 53-60.
In this paper, we focus on the calibration of affine stochastic mortality models using term assurance premiums. We view term
assurance contracts as a swap in which policyholders exchange cash flows (premiums vs. benefits) with an insurer analogous to
a generic interest rate swap or credit default swap. Using a simple bootstrapping procedure, we derive the term structure of
mortality rates from a stream of contract quotes with different maturities. This term structure is used to calibrate the parameters of
affine stochastic mortality models where the survival probability is expressed in closed form. The Vasicek, Cox-Ingersoll-Ross,
and jump-extended Vasicek models are considered for fitting the survival probabilities term structure. An evaluation of the
performance of these models is provided with respect to premiums of three Italian insurance companies.

Available via Athens: Palgrave MacMillan
http://www.openathens.net

Recursive methods for a multi-dimensional risk process with common shocks. Gong, Lan; Badescu, Andrei L; Cheung, Eric C K
[RKN: 44996]
Insurance: Mathematics & Economics (2012) 50 (1) : 109-120.
In this paper, a multi-dimensional risk model with common shocks is studied. Using a simple probabilistic approach via observing
the risk processes at claim instants, recursive integral formulas are developed for the survival probabilities as well as for a class of
Gerber-Shiu expected discounted penalty functions that include the surplus levels at ruin. Under the assumption of exponential or
mixed Erlang claims, the recursive integrals can be simplified to give recursive sums which are computationally more tractable.
Numerical examples including an optimal capital allocation problem are also given towards the end.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

SWAPS

Efficient algorithms for basket default swap pricing with multivariate Archimedean copulas. Choe, Geon Ho; Jang, Hyun Jin [RKN:
40014]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (2) : 205-213.
We introduce a new importance sampling method for pricing basket default swaps employing exchangeable Archimedean copulas
and nested Gumbel copulas. We establish more realistic dependence structures than existing copula models for credit risks in the
underlying portfolio, and propose an appropriate density for importance sampling by analyzing multivariate Archimedean copulas.
To justify efficiency and accuracy of the proposed algorithms, we present numerical examples and compare them with the crude
Monte Carlo simulation, and finally show that our proposed estimators produce considerably smaller variances.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Pricing catastrophe swaps: a contingent claims approach. Braun, Alexander [RKN: 44954]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (3) : 520-536.
In this paper, we comprehensively analyze the catastrophe (cat) swap, a financial instrument which has attracted little scholarly
attention to date. We begin with a discussion of the typical contract design, the current state of the market, as well as major areas
of application. Subsequently, a two-stage contingent claims pricing approach is proposed, which distinguishes between the main
risk drivers ex-ante as well as during the loss reestimation phase and additionally incorporates counterparty default risk.
Catastrophe occurrence is modeled as a doubly stochastic Poisson process (Cox process) with mean-reverting
Ornstein-Uhlenbeck intensity. In addition, we fit various parametric distributions to normalized historical loss data for hurricanes
and earthquakes in the US and find the heavy-tailed Burr distribution to be the most adequate representation for loss severities.
Applying our pricing model to market quotes for hurricane and earthquake contracts, we derive implied Poisson intensities which
are subsequently condensed into a common factor for each peril by means of exploratory factor analysis. Further examining the
resulting factor scores, we show that a first order autoregressive process provides a good fit. Hence, its continuous-time limit, the
Ornstein-Uhlenbeck process should be well suited to represent the dynamics of the Poisson intensity in a cat swap pricing model.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
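
The loss-arrival model used here, a doubly stochastic Poisson process whose intensity follows a mean-reverting Ornstein-Uhlenbeck process, can be simulated by thinning: simulate the intensity path, bound it from above, generate candidate event times from a homogeneous Poisson process at that bound, and accept each candidate with probability intensity/bound. The parameters below are illustrative assumptions, not the authors' calibrated values.

    import numpy as np

    rng = np.random.default_rng(9)

    # OU intensity: d(lambda) = kappa (theta - lambda) dt + sigma dW, floored at zero.
    kappa, theta, sigma, lam0, T, dt = 2.0, 3.0, 1.0, 3.0, 1.0, 1 / 1000
    n = int(T / dt)
    lam = np.empty(n + 1); lam[0] = lam0
    for t in range(n):
        lam[t + 1] = max(lam[t] + kappa * (theta - lam[t]) * dt
                         + sigma * np.sqrt(dt) * rng.standard_normal(), 0.0)

    # Thinning: candidates arrive at rate lam_max, each accepted w.p. lam(t)/lam_max.
    lam_max = lam.max()
    n_candidates = rng.poisson(lam_max * T)
    candidate_times = np.sort(rng.uniform(0.0, T, size=n_candidates))
    accept = rng.random(n_candidates) < lam[(candidate_times / dt).astype(int)] / lam_max
    event_times = candidate_times[accept]

    print("simulated catastrophe times:", np.round(event_times, 3))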

SYSTEMIC RISK

The Solvency II square-root formula for systematic biometric risk. Christiansen, Marcus C; Denuit, Michel M; Lazar, Dorina [RKN:
45599]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 50 (2) : 257-265.
In this paper, we develop a model supporting the so-called square-root formula used in Solvency II to aggregate the modular life
SCR. Describing the insurance policy by a Markov jump process, we can obtain expressions similar to the square-root formula in
Solvency II by means of limited expansions around the best estimate. Numerical illustrations are given, based on German
population data. Even if the square-root formula can be supported by theoretical considerations, it is shown that the QIS
correlation matrix is highly questionable.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
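
For reference, the square-root (variance-covariance) aggregation that the paper examines combines the stand-alone sub-module charges SCR_i through SCR = sqrt(sum_ij rho_ij SCR_i SCR_j). A direct implementation with an illustrative correlation matrix is given below; the charges and the matrix are placeholders, not the QIS values discussed in the paper.

    import numpy as np

    def aggregate_scr(scr, corr):
        """Solvency II style square-root aggregation: sqrt(s' R s)."""
        scr = np.asarray(scr, dtype=float)
        corr = np.asarray(corr, dtype=float)
        return float(np.sqrt(scr @ corr @ scr))

    # Illustrative stand-alone charges (e.g. mortality, longevity, lapse) and correlations.
    scr_modules = [120.0, 80.0, 150.0]
    corr = np.array([[1.00, -0.25, 0.00],
                     [-0.25, 1.00, 0.25],
                     [0.00, 0.25, 1.00]])

    print("aggregated SCR:", round(aggregate_scr(scr_modules, corr), 1))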

TAIL DEPENDENCE

Measurement and modelling of dependencies in economic capital. Shaw, R A; Smith, A D; Spivak, G S - 99 pages. [RKN: 73863]
Shelved at: Per: BAJ (Oxf) Per: BAJ (Lon) Shelved at: REF
BAJ (2011) 16 (3) : 601-699.
This paper covers a number of different topics related to the measurement and modelling of dependency within economic capital
models. The scope of the paper is relatively wide. We address in some detail the different approaches to modelling dependencies
ranging from the more common variance-covariance matrix approach, to the consideration of the use of copulas and the more
sophisticated causal models that feature feedback loops and other systems design ideas.
There are many data and model uncertainties in modelling dependency and so we have also endeavoured to cover topics such as
spurious relationships and wrong-way risk to highlight some of the uncertainties involved.
With the advent of the internal model approval process under Solvency II, senior management needs to have a greater
understanding of dependency methodology. We have devoted a section of this paper to a discussion of possible different ways to
communicate the results of modelling to the board, senior management and other interested parties within an insurance company.
We have endeavoured throughout this paper to include as many numerical examples as possible to help in the understanding of
the key points, including our discussion of model parameterisation and the communication to an insurance executive of the impact
of dependency on economic capital modelling results.
The economic capital model can be seen as a combination of two key components: the marginal risk distribution of each risk and
the aggregation methodology which combines these into a single aggregate distribution or capital number. This paper is
concerned with the aggregation part, the methods and assumptions employed and the issues arising, and not the determination of
the marginal risk distributions which is equally of importance and in many cases equally as complex.
http://www.actuaries.org.uk/research-and-resources/pages/members-access-journals
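
As a toy illustration of the contrast the paper draws between the variance-covariance approach and copula-based aggregation, the Python sketch below aggregates two risks once via the correlation formula applied to stand-alone charges and once by simulating a Gaussian copula with the same marginals; all distributions, correlations and confidence levels are invented.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(2)
    p, n, rho = 0.995, 200_000, 0.5

    # Invented lognormal marginals for two risks
    marginals = [stats.lognorm(s=0.8, scale=100.0), stats.lognorm(s=1.2, scale=50.0)]

    # (1) Variance-covariance aggregation of stand-alone charges (VaR minus mean)
    charges = np.array([m.ppf(p) - m.mean() for m in marginals])
    corr = np.array([[1.0, rho], [rho, 1.0]])
    var_covar_capital = np.sqrt(charges @ corr @ charges)

    # (2) Simulation with a Gaussian copula joining the same marginals
    z = rng.multivariate_normal(mean=[0.0, 0.0], cov=corr, size=n)
    u = stats.norm.cdf(z)
    total = marginals[0].ppf(u[:, 0]) + marginals[1].ppf(u[:, 1])
    copula_capital = np.quantile(total, p) - total.mean()

    print("variance-covariance capital:", round(var_covar_capital, 1))
    print("Gaussian-copula capital    :", round(copula_capital, 1))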

Measurement and modelling of dependencies in economic capital : Abstract of the London discussion. Shaw, Richard - 21 pages.
[RKN: 73864]
Shelved at: Per: BAJ (Oxf) Per: BAJ (Lon) Shelved at: REF
BAJ (2011) 16 (3) : 701-721.
This abstract relates to the following paper:

Shaw, R.A., Smith, A.D. & Spivak, G.S. Measurement and modelling of dependencies in economic
capital. British Actuarial Journal, 16 (3).
http://www.actuaries.org.uk/research-and-resources/pages/members-access-journals

TAIL RISK MEASURES

Bias-reduced estimators for bivariate tail modelling. Beirlant, J; Dierckx, G; Guillou, A [RKN: 44971]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (1) : 18-26.
Ledford and Tawn (1997) introduced a flexible bivariate tail model based on the coefficient of tail dependence and on the
dependence of the extreme values of the random variables. In this paper, we extend the concept by specifying the slowly varying
part of the model as done by Hall (1982) for the univariate case. Based on Beirlant et al. (2009), we propose a bias-reduced
estimator for the coefficient of tail dependence and for the estimation of small tail probabilities. We discuss the properties of these
estimators via simulations and a real-life example. Furthermore, we discuss some theoretical asymptotic aspects of this approach.


Available via Athens: Palgrave MacMillan
http://www.openathens.net

Characterization of upper comonotonicity via tail convex order. Nam, Hee Seok; Tang, Qihe; Yang, Fan [RKN: 45130]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (3) : 368-373.
In this paper, we show a characterization of upper comonotonicity via tail convex order. For any given marginal distributions, a
maximal random vector with respect to tail convex order is proved to be upper comonotonic under suitable conditions. As an
application, we consider the computation of the Haezendonck risk measure of the sum of upper comonotonic random variables
with exponential marginal distributions.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Estimating the distortion parameter of the proportional-hazard premium for heavy-tailed losses. Brahimi, Brahim; Meraghni,
Djamel; Necir, Abdelhakim; Zitikis, Ricardas [RKN: 44934]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (3) : 325-334.
The distortion parameter reflects the amount of loading in insurance premiums. A specific value of a given premium determines a
value of the distortion parameter, which depends on the underlying loss distribution. Estimating the parameter, therefore,
becomes a statistical inferential problem, which has been initiated by Jones and Zitikis [Jones, B.L., Zitikis, R., 2007. Risk
measures, distortion parameters, and their empirical estimation. Insurance: Mathematics and Economics, 41, 279-297] in the
case of the distortion premium and tackled within the framework of the central limit theorem. Heavy-tailed losses do not fall into
this framework as they rely on the extreme-value theory. In this paper, we concentrate on a special but important distortion
premium, called the proportional-hazard premium, and propose an estimator for its distortion parameter in the case of heavy-tailed
losses. We derive an asymptotic distribution of the estimator, construct a practically implementable confidence interval for the
distortion parameter, and illustrate the performance of the interval in a simulation study.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
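
A short numerical sketch of the proportional-hazard premium itself may help fix ideas (this is the premium whose distortion parameter the authors estimate, not their estimator): for a non-negative loss with survival function S, the premium with distortion parameter rho >= 1 is the integral of S(x)**(1/rho) over the support. The Pareto loss below is an assumption for illustration.

    import numpy as np
    from scipy import integrate, stats

    rho = 1.5                               # distortion parameter (illustrative)
    X = stats.pareto(b=3.0, scale=10.0)     # heavy-tailed loss supported on [10, inf)

    f = lambda x: X.sf(x) ** (1.0 / rho)    # distorted survival function
    premium = integrate.quad(f, 0.0, 10.0)[0] + integrate.quad(f, 10.0, np.inf)[0]

    print("expected loss        :", X.mean())      # 15.0 for these parameters
    print("PH premium, rho = 1.5:", premium)       # 20.0, i.e. a positive loading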

Fitting insurance claims to skewed distributions: are the skew-normal and skew-student good models? Eling, Martin [RKN:
44782]
Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51(2) : 239-248.
This paper analyzes whether the skew-normal and skew-student distributions recently discussed in the finance literature are
reasonable models for describing claims in property-liability insurance. We consider two well-known datasets from actuarial
science and fit a number of parametric distributions to these data. Also the non-parametric transformation kernel approach is
considered as a benchmark model. We find that the skew-normal and skew-student are reasonably competitive compared to other
models in the literature when describing insurance data. In addition to goodness-of-fit tests, tail risk measures such as value at risk
and tail value at risk are estimated for the datasets under consideration.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
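
A rough Python sketch of the kind of exercise described, fitting scipy's skew-normal and reading off tail risk measures; the simulated claims and parameters are placeholders rather than the actuarial datasets studied in the paper.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(3)
    claims = stats.gamma.rvs(a=2.0, scale=500.0, size=5_000, random_state=rng)  # stand-in data

    # Fit a skew-normal distribution by maximum likelihood
    a_hat, loc_hat, scale_hat = stats.skewnorm.fit(claims)
    fitted = stats.skewnorm(a_hat, loc=loc_hat, scale=scale_hat)

    # Tail risk measures at the 99% level
    p = 0.99
    var_99 = fitted.ppf(p)
    sims = fitted.rvs(size=200_000, random_state=rng)
    tvar_99 = sims[sims > var_99].mean()     # Monte Carlo tail value at risk

    print("VaR 99% :", round(var_99, 1))
    print("TVaR 99%:", round(tvar_99, 1))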

A new class of models for heavy tailed distributions in finance and insurance risk. Ahn, Soohan; Kim, Joseph H T; Ramaswami,
Vaidyanathan [RKN: 45722]
Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 43-52.
Many insurance loss data are known to be heavy-tailed. In this article we study the class of Log phase-type (LogPH) distributions
as a parametric alternative in fitting heavy tailed data. Transformed from the popular phase-type distribution class, the LogPH
introduced by Ramaswami exhibits several advantages over other parametric alternatives. We analytically derive its tail related
quantities including the conditional tail moments and the mean excess function, and also discuss its tail thickness in the context of
extreme value theory. Because of its denseness proved herein, we argue that the LogPH can offer a rich class of heavy-tailed loss
distributions without separate modeling for the tail side, which is the case for the generalized Pareto distribution (GPD). As a
numerical example we use the well-known Danish fire data to calibrate the LogPH model and compare the result with that of the
GPD. We also present fitting results for a set of insurance guarantee loss data.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
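
Since the GPD serves as the benchmark here, a minimal peaks-over-threshold sketch in Python (on simulated data, not the Danish fire losses) may be a useful reference point; the threshold choice and parameters are illustrative assumptions.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(4)
    losses = stats.lognorm.rvs(s=1.5, scale=1.0, size=10_000, random_state=rng)  # stand-in data

    # Peaks over threshold: keep exceedances above a high empirical quantile
    u = np.quantile(losses, 0.95)
    excess = losses[losses > u] - u

    # Fit a generalized Pareto distribution to the excesses (location fixed at zero)
    xi, _, beta = stats.genpareto.fit(excess, floc=0)
    print("threshold:", round(u, 3), " shape xi:", round(xi, 3), " scale beta:", round(beta, 3))

    # Tail probability estimate: P(X > x) = P(X > u) * GPD survival of (x - u)
    x = 5 * u
    p_tail = np.mean(losses > u) * stats.genpareto.sf(x - u, xi, loc=0, scale=beta)
    print("POT estimate of P(X > 5u):", p_tail, " empirical:", np.mean(losses > x))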

On the interplay between distortion, mean value and Haezendonck-Goovaerts risk measures. Goovaerts, Marc; Linders, Daniel;
Van Weert, Koen; Tank, Fatih [RKN: 45719]
Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 10-18.
In the actuarial research, distortion, mean value and Haezendonck-Goovaerts risk measures are concepts that are usually treated
separately. In this paper we indicate and characterize the relation between these different risk measures, as well as their relation
to convex risk measures. While it is known that the mean value principle can be used to generate premium calculation principles,
we will show how they also allow one to generate solvency calculation principles. Moreover, we explain the role provided for the
distortion risk measures as an extension of the Tail Value-at-Risk (TVaR) and Conditional Tail Expectation (CTE).
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Robust-efficient credibility models with heavy-tailed claims: A mixed linear models perspective. Dornheim, Harald; Brazauskas,
Vytaras [RKN: 39365]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (1) : 72-84.
In actuarial practice, regression models serve as a popular statistical tool for analyzing insurance data and tariff ratemaking. In this
paper, we consider classical credibility models that can be embedded within the framework of mixed linear models. For inference
about fixed effects and variance components, likelihood-based methods such as (restricted) maximum likelihood estimators are
commonly pursued. However, it is well-known that these standard and fully efficient estimators are extremely sensitive to small
deviations from hypothesized normality of random components as well as to the occurrence of outliers. To obtain better estimators
for premium calculation and prediction of future claims, various robust methods have been successfully adapted to credibility
theory in the actuarial literature. The objective of this work is to develop robust and efficient methods for credibility when
heavy-tailed claims are approximately log-location-scale distributed. To accomplish that, we first show how to express additive
credibility models such as the Bühlmann-Straub and Hachemeister ones as mixed linear models with symmetric or asymmetric
errors. Then, we adjust adaptively truncated likelihood methods and compute highly robust credibility estimates for the ordinary
but heavy-tailed claims part. Finally, we treat the identified excess claims separately and find robust-efficient credibility premiums.
Practical performance of this approach is examined, via simulations, under several contaminating scenarios. A widely studied
real-data set from workers' compensation insurance is used to illustrate functional capabilities of the new robust credibility
estimators.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Second order regular variation and conditional tail expectation of multiple risks. Hua, Lei; Joe, Harry [RKN: 44955]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (3) : 537-546.
For the purpose of risk management, the study of tail behavior of multiple risks is more relevant than the study of their overall
distributions. Asymptotic study assuming that each marginal risk goes to infinity is more mathematically tractable and has also
uncovered some interesting performance of risk measures and relationships between risk measures by their first order
approximations. However, the first order approximation is only a crude way to understand tail behavior of multiple risks, and
especially for sub-extremal risks. In this paper, we conduct asymptotic analysis on conditional tail expectation (CTE) under the
condition of second order regular variation (2RV). First, the closed-form second order approximation of CTE is obtained for the
univariate case. Then the CTE of a risk, conditional on a loss aggregating function of all the risks exceeding a high threshold, is
studied for a multivariate scale-mixture model in which the common random scale factor is independent of the remaining
components and its survivor function satisfies the condition of 2RV. Closed-form second order approximations of CTE for this multivariate form
have been derived in terms of corresponding value at risk. For both the univariate and multivariate cases, we find that the first
order approximation is affected by only the regular variation index of marginal survivor functions, while the second order
approximation is influenced by both the parameters for first and second order regular variation, and the rate of convergence to the
first order approximation is dominated by the second order parameter only. We have also shown that the 2RV condition and the
assumptions for the multivariate form are satisfied by many parametric distribution families, and thus the closed-form
approximations would be useful for applications.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Stochastic comparisons of distorted variability measures. Sordo, Miguel A; Suarez-Llorens, Alfonso [RKN: 44970]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (1) : 11-17.
In this paper, the authors consider the dispersive order and the excess wealth order to compare the variability of distorted
distributions. We know from Sordo (2009a) that the excess wealth order can be characterized in terms of a class of variability
measures associated to the tail conditional distribution which includes, as a particular measure, the tail variance. Given that the tail
conditional distribution is a particular distorted distribution, a natural question is whether this result can be extended to include
other classes of variability measures associated to general distorted distributions. As the authors show in this paper, the answer is
yes, by focusing on distorted distributions associated to concave distortion functions. For distorted distributions associated to
more general distortions, the characterizations are stated in terms of the stronger dispersive order.

Available via Athens: Palgrave MacMillan
http://www.openathens.net

Tail distortion risk and its asymptotic analysis. Zhu, Li; Li, Haijun [RKN: 45728]
Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 115-121.
A distortion risk measure used in finance and insurance is defined as the expected value of potential loss under a scenario
probability measure. In this paper, the tail distortion risk measure is introduced to assess tail risks of excess losses modeled by the
right tails of loss distributions. The asymptotic linear relation between tail distortion and value-at-risk is derived for heavy-tailed
losses with the linear proportionality constant depending only on the distortion function and the tail index. Various examples
involving tail distortions for location-invariant, scale-invariant, and shape-invariant loss distribution families are also presented to
illustrate the results.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

TAX

Optimal asset allocation for passive investing with capital loss harvesting. Ostrov, Daniel N; Wong, Thomas G [RKN: 45461]
Applied Mathematical Finance (2011) 18 (3-4) : 291-329.
This article examines how to quantify and optimally utilize the beneficial effect that capital loss harvesting generates in a taxable
portfolio. We explicitly determine the optimal initial asset allocation for an investor who follows the continuous time dynamic trading
strategy of Constantinides (1983). This strategy sells and re-buys all stocks with losses, but is otherwise passive. Our model
allows the use of the stock position's full purchase history for computing the cost basis. The method can also be used to rebalance
at later times. For portfolios with one stock position and cash, the probability density function for the portfolio's state corresponds
to the solution of a 3-space + 1-time dimensional partial differential equation (PDE) with an oblique reflecting boundary condition.
Extensions of this PDE, including to the case of multiple correlated stocks, are also presented.

We detail a numerical algorithm for the PDE in the single stock case. The algorithm shows the significant effect capital loss
harvesting can have on the optimal stock allocation, and it also allows us to compute the expected additional return that capital
loss harvesting generates. Our PDE-based algorithm, compared with Monte Carlo methods, is shown to generate much more
precise results in a fraction of the time. Finally, we employ Monte Carlo methods to approximate the impact of many of our model's
assumptions.

Available via Athens: Taylor & Francis Online
http://www.openathens.net

TERM ASSURANCE

Calibrating affine stochastic mortality models using term assurance premiums. Russo, Vincenzo; Giacometti, Rosella; Ortobelli,
Sergio; Rachev, Svetlozar; Fabozzi, Frank J [RKN: 44975]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (1) : 53-60.
In this paper, we focus on the calibration of affine stochastic mortality models using term assurance premiums. We view term
assurance contracts as a swap in which policyholders exchange cash flows (premiums vs. benefits) with an insurer analogous to
a generic interest rate swap or credit default swap. Using a simple bootstrapping procedure, we derive the term structure of
mortality rates from a stream of contract quotes with different maturities. This term structure is used to calibrate the parameters of
affine stochastic mortality models where the survival probability is expressed in closed form. The Vasicek, Cox-Ingersoll-Ross,
and jump-extended Vasicek models are considered for fitting the survival probabilities term structure. An evaluation of the
performance of these models is provided with respect to premiums of three Italian insurance companies.

Available via Athens: Palgrave MacMillan
http://www.openathens.net

TERM STRUCTURE

An affine two-factor heteroskedastic macro-finance term structure model. Spreij, Peter; Veerman, Enno; Vlaar, Peter [RKN: 45462]
Applied Mathematical Finance (2011) 18 (3-4) : 331-352.
We propose an affine macro-finance term structure model for interest rates that allows for both constant volatilities
(homoskedastic model) and state-dependent volatilities (heteroskedastic model). In a homoskedastic model, interest rates are
symmetric, which means that either very low interest rates are predicted too often or very high interest rates are not predicted often enough. This
undesirable symmetry for constant volatility models motivates the use of heteroskedastic models where the volatility depends on
the driving factors.

For a truly heteroskedastic model in continuous time, which involves a multivariate square root process, the so-called Feller
conditions are usually imposed to ensure that the roots have non-negative arguments. For a discrete time approximate model, the
Feller conditions do not give this guarantee. Moreover, in a macro-finance context, the restrictions imposed might be economically
unappealing. It has also been observed that even without the Feller conditions imposed, for a practically relevant term structure
model, negative arguments rarely occur.

Using models estimated on German data, we compare the yields implied by (approximate) analytic exponentially affine
expressions to those obtained through Monte Carlo simulations of very high numbers of sample paths. It turns out that the
differences are rarely statistically significant, whether the Feller conditions are imposed or not. Moreover, economically, the
differences are negligible, as they are always below one basis point.

Available via Athens: Taylor & Francis Online
http://www.openathens.net
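
For reference, the Feller condition mentioned above for a square-root factor dx = kappa*(theta - x) dt + sigma*sqrt(x) dW is 2*kappa*theta >= sigma**2; a one-line check with invented parameters:

    kappa, theta, sigma = 0.5, 0.04, 0.15            # illustrative square-root process parameters
    print("Feller condition holds:", 2 * kappa * theta >= sigma ** 2)   # 0.04 >= 0.0225 -> True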

Bonds and options in exponentially affine bond models. Bermin, Hans-Peter [RKN: 45881]
Shelved at: Per. AMF
Applied Mathematical Finance (2012) 19 (5-6) : 513-534.
In this article we apply the Flesaker-Hughston approach to invert the yield curve and to price various options by letting the
randomness in the economy be driven by a process closely related to the short rate, called the abstract short rate. This process is
a pure deterministic translation of the short rate itself, and we use the deterministic shift to calibrate the models to the initial yield
curve. We show that we can solve for the shift needed in closed form by transforming the problem to a new probability measure.
Furthermore, when the abstract short rate follows a Cox-Ingersoll-Ross (CIR) process we compute bond option and swaption
prices in closed form. We also propose a short-rate specification under the risk-neutral measure that allows the yield curve to be
inverted and is consistent with the CIR dynamics for the abstract short rate, thus giving rise to closed form bond option and
swaption prices.
Available via Athens: Taylor & Francis Online
http://www.openathens.net

TERRORISM

Calculating catastrophe. Woo, Gordon (2011). - London: Imperial College Press, 2011. - 355 pages. [RKN: 73989]
Shelved at: 363.34
Calculating Catastrophe has been written to explain, to a general readership, the underlying philosophical ideas and scientific
principles that govern catastrophic events, both natural and man-made. Knowledge of the broad range of catastrophes deepens
understanding of individual modes of disaster. This book will be of interest to anyone aspiring to understand catastrophes better,
but will be of particular value to those engaged in public and corporate policy, and the financial markets.

TIME

Bayesian modelling of the time delay between diagnosis and settlement for Critical Illness Insurance using a Burr
generalised-linear-type model. Ozkok, Erengul; Streftaris, George; Waters, Howard R; Wilkie, A David [RKN: 45600]
Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 50 (2) : 266-279.
We discuss Bayesian modelling of the delay between dates of diagnosis and settlement of claims in Critical Illness Insurance
using a Burr distribution. The data are supplied by the UK Continuous Mortality Investigation and relate to claims settled in the
years 1999-2005. There are non-recorded dates of diagnosis and settlement and these are included in the analysis as missing
values using their posterior predictive distribution and MCMC methodology. The possible factors affecting the delay (age, sex,
smoker status, policy type, benefit amount, etc.) are investigated under a Bayesian approach. A 3-parameter Burr
generalised-linear-type model is fitted, where the covariates are linked to the mean of the distribution. Variable selection using
Bayesian methodology to obtain the best model with different prior distribution setups for the parameters is also applied. In
particular, Gibbs variable selection methods are considered, and results are confirmed using exact marginal likelihood findings
and related Laplace approximations. For comparison purposes, a lognormal model is also considered.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Dynamic portfolio optimization in discrete-time with transaction costs. Atkinson, Colin; Quek, Gary Routledge, [RKN: 45840]
Shelved at: Per. AMF
Applied Mathematical Finance (2012) 19 (3-4) : 265-298.
A discrete-time model of portfolio optimization is studied under the effects of proportional transaction costs. A general class of
underlying probability distributions is assumed for the returns of the asset prices. An investor with an exponential utility function
seeks to maximize the utility of terminal wealth by determining the optimal investment strategy at the start of each time step.
Dynamic programming is used to derive an algorithm for computing the optimal value function and optimal boundaries of the
no-transaction region at each time step. In the limit of small transaction costs, perturbation analysis is applied to obtain the optimal
value function and optimal boundaries at any time step in the rebalancing of the portfolio.

Available via Athens: Taylor & Francis Online
http://www.openathens.net

Lévy risk model with two-sided jumps and a barrier dividend strategy. Bo, Lijun; Song, Renming; Tang, Dan; Wang, Tongjin; Yang,
Xuewei [RKN: 45601]
Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 50 (2) : 280-291.
In this paper, we consider a general Lévy risk model with two-sided jumps and a constant dividend barrier. We connect the ruin
problem of the ex-dividend risk process with the first passage problem of the Lévy process reflected at its running maximum. We
prove that if the positive jumps of the risk model form a compound Poisson process and the remaining part is a spectrally negative
Lévy process with unbounded variation, the Laplace transform (as a function of the initial surplus) of the upward entrance time of
the reflected (at the running infimum) Lévy process exhibits the smooth pasting property at the reflecting barrier. When the surplus
process is described by a double exponential jump diffusion in the absence of dividend payment, we derive some explicit
expressions for the Laplace transform of the ruin time, the distribution of the deficit at ruin, and the total expected discounted
dividends. Numerical experiments concerning the optimal barrier strategy are performed and new empirical findings are
presented.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

A maximum-entropy approach to the linear credibility formula. Najafabadi, Amir T. Payandeh; Hatami, Hamid; Najafabadi, Maryam
Omidi [RKN: 45738]
Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 216-221.
Payandeh [Payandeh Najafabadi, A.T., 2010. A new approach to credibility formula. Insurance: Mathematics and Economics 46,
334-338] introduced a new technique to approximate a Bayes estimator with the exact credibility's form. This article employs a
well known and powerful maximum-entropy method (MEM) to extend results of Payandeh Najafabadi (2010) to a class of linear
credibility, whenever claim sizes have been distributed according to the log-concave distributions. Namely, (i) it employs the
maximum-entropy method to approximate an appropriate Bayes estimator (with respect to either the square-error or the Linex
loss functions and general increasing and bounded prior distribution) by a linear combination of claim sizes; (ii) it establishes that
such an approximation coincides with the exact credibility formula whenever the required conditions for the exact credibility (see
below) hold. Some properties of such an approximation are discussed. An application to crop insurance is given.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
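
For orientation, the classical Bühlmann linear credibility premium that such approximations are compared against is Z*Xbar + (1 - Z)*mu with Z = n/(n + k), where k is the ratio of the expected process variance to the variance of the hypothetical means; a small Python sketch with invented structural parameters:

    import numpy as np

    # Invented claim history for one risk and invented structural parameters
    claims = np.array([120.0, 95.0, 140.0, 110.0, 105.0])
    mu = 100.0            # collective (prior) mean
    ev_process = 400.0    # expected process variance, E[s^2(theta)]
    var_hypoth = 50.0     # variance of hypothetical means, Var[m(theta)]

    n = len(claims)
    z = n / (n + ev_process / var_hypoth)            # credibility factor
    premium = z * claims.mean() + (1.0 - z) * mu     # linear credibility premium

    print("credibility factor Z:", round(z, 3))
    print("credibility premium :", round(premium, 2))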

Minimising expected discounted capital injections by reinsurance in a classical risk model. Eisenberg, Julia; Schmidli, Hanspeter
[RKN: 45487]
Shelved at: Per: SAJ Shelved at: SCA/ACT
Scandinavian Actuarial Journal (2011) 3 : 155-176.
In this paper we consider a classical continuous time risk model, where the claims are reinsured by some reinsurance with
retention level b taking values between zero and a maximal level, where the maximal retention level means no reinsurance and b=0 means full reinsurance. The insurer
can change the retention level continuously. To prevent negative surplus the insurer has to inject additional capital. The problem is
to minimise the expected discounted cost over all admissible reinsurance strategies. We show that an optimal reinsurance
strategy exists. For some special cases we will be able to give the optimal strategy explicitly. In other cases the method will be
illustrated only numerically.

On the analysis of a general class of dependent risk processes. Willmot, Gordon E; Woo, Jae-Kyung [RKN: 45730]
Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 134-141.
A generalized Sparre Andersen risk process is examined, whereby the joint distribution of the interclaim time and the ensuing
claim amount is assumed to have a particular mathematical structure. This structure is present in various dependency models
which have previously been proposed and analyzed. It is then shown that this structure in turn often implies particular functional
forms for joint discounted densities of ruin related variables including some or all of the deficit at ruin, the surplus immediately prior
to ruin, and the surplus after the second last claim. Then, employing a fairly general interclaim time structure which involves a
combination of Erlang type densities, a complete identification of a generalized Gerber-Shiu function is provided. An application
of these results is given to a situation involving a mixed Erlang type of claim amount assumption. Various examples and
special cases of the model are then considered, including one involving a bivariate Erlang mixture model.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Optimal time-consistent investment and reinsurance strategies for insurers under Heston's SV model. Li, Zhongfei; Zeng, Yan;
Lai, Yongzeng [RKN: 45736]
Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 191-203.
This paper considers the optimal time-consistent investment and reinsurance strategies for an insurer under Heston's stochastic
volatility (SV) model. Such an SV model applied to insurers' portfolio problems has not yet been discussed as far as we know. The
surplus process of the insurer is approximated by a Brownian motion with drift. The financial market consists of one risk-free asset
and one risky asset whose price process satisfies Heston's SV model. Firstly, a general problem is formulated and a verification
theorem is provided. Secondly, the closed-form expressions of the optimal strategies and the optimal value functions for the
mean-variance problem without precommitment are derived under two cases: one is the investment-reinsurance case and the
other is the investment-only case. Thirdly, economic implications and numerical sensitivity analysis are presented for our results.
Finally, some interesting phenomena are found and discussed.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

The time to ruin and the number of claims until ruin for phase-type claims. Frostig, Esther; Pitts, Susan M; Politis, Konstadinos
[RKN: 45720]
Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 19-25.
We consider a renewal risk model with phase-type claims, and obtain an explicit expression for the joint transform of the time to
ruin and the number of claims until ruin, with a penalty function applied to the deficit at ruin. The approach is via the duality
between a risk model with phase-type claims and a particular single server queueing model with phase-type customer interarrival
times; see Frostig (2004). This result specializes to one for the probability generating function of the number of claims until ruin.
We obtain explicit expressions for the distribution of the number of claims until ruin for exponentially distributed claims when the
inter-claim times have an Erlang-n distribution.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

TIME SERIES

Measurement and modelling of dependencies in economic capital. Shaw, R A; Smith, A D; Spivak, G S - 99 pages. [RKN: 73863]
Shelved at: Per: BAJ (Oxf) Per: BAJ (Lon) Shelved at: REF
BAJ (2011) 16 (3) : 601-699.
This paper covers a number of different topics related to the measurement and modelling of dependency within economic capital
models. The scope of the paper is relatively wide. We address in some detail the different approaches to modelling dependencies
ranging from the more common variance-covariance matrix approach, to the consideration of the use of copulas and the more
sophisticated causal models that feature feedback loops and other systems design ideas.
There are many data and model uncertainties in modelling dependency and so we have also endeavoured to cover topics such as
spurious relationships and wrong-way risk to highlight some of the uncertainties involved.
With the advent of the internal model approval process under Solvency II, senior management needs to have a greater
understanding of dependency methodology. We have devoted a section of this paper to a discussion of possible different ways to
communicate the results of modelling to the board, senior management and other interested parties within an insurance company.
We have endeavoured throughout this paper to include as many numerical examples as possible to help in the understanding of
the key points, including our discussion of model parameterisation and the communication to an insurance executive of the impact
of dependency on economic capital modelling results.
The economic capital model can be seen as a combination of two key components: the marginal risk distribution of each risk and
the aggregation methodology which combines these into a single aggregate distribution or capital number. This paper is
concerned with the aggregation part, the methods and assumptions employed and the issues arising, and not the determination of
the marginal risk distributions which is equally of importance and in many cases equally as complex.
http://www.actuaries.org.uk/research-and-resources/pages/members-access-journals

Measurement and modelling of dependencies in economic capital : Abstract of the London discussion. Shaw, Richard - 21 pages.
[RKN: 73864]
Shelved at: Per: BAJ (Oxf) Per: BAJ (Lon) Shelved at: REF
BAJ (2011) 16 (3) : 701-721.
This abstract relates to the following paper:

Shaw, R.A., Smith, A.D. & Spivak, G.S. Measurement and modelling of dependencies in economic
capital. British Actuarial Journal, 16 (3).
http://www.actuaries.org.uk/research-and-resources/pages/members-access-journals

TRADING

The British put option. Peskir, Goran; Samee, Farman Routledge, [RKN: 45527]
Applied Mathematical Finance (2011) 18 (5-6) : 537-563.
We present a new put option where the holder enjoys the early exercise feature of American options whereupon his payoff
(deliverable immediately) is the best prediction of the European payoff under the hypothesis that the true drift of the stock price
equals a contract drift. Inherent in this is a protection feature which is key to the British put option. Should the option holder believe
the true drift of the stock price to be unfavourable (based upon the observed price movements) he can substitute the true drift with
the contract drift and minimize his losses. The practical implications of this protection feature are most remarkable as not only can
the option holder exercise at or above the strike price to a substantial reimbursement of the original option price (covering the
ability to sell in a liquid option market completely endogenously) but also when the stock price movements are favourable he will
generally receive higher returns at a lesser price. We derive a closed form expression for the arbitrage-free price in terms of the
rational exercise boundary and show that the rational exercise boundary itself can be characterized as the unique solution to a
nonlinear integral equation. Using these results we perform a financial analysis of the British put option that leads to the
conclusions above and shows that with the contract drift properly selected the British put option becomes a very attractive
alternative to the classic American put.


Available via Athens: Taylor & Francis Online
http://www.openathens.net

The endogenous price dynamics of emission allowances and an application to CO2 option pricing. Chesney, Marc; Taschini, Luca
[RKN: 45878]
Shelved at: Per. AMF
Applied Mathematical Finance (2012) 19 (5-6) : 447-475.
Market mechanisms are increasingly being used as a tool for allocating somewhat scarce but unpriced rights and resources, and
the European Emission Trading Scheme is an example. By means of dynamic optimization in the context of firms covered by such
environmental regulations, this article generates endogenously the price dynamics of emission permits under asymmetric
information, allowing inter-temporal banking and borrowing. In the market, there are a finite number of firms and each firm's
pollution emission follows an exogenously given stochastic process. We prove the discounted permit price is a martingale with
respect to the relevant filtration. The model is solved numerically. Finally, a closed-form pricing formula for European-style options
is derived.

Available via Athens: Taylor & Francis Online
http://www.openathens.net

Mean-variance optimal adaptive execution. Lorenz, Julian; Almgren, Robert F Routledge, [RKN: 45521]
Applied Mathematical Finance (2011) 18 (5-6) : 395-422.
Electronic trading of equities and other securities makes heavy use of arrival price algorithms that balance the market impact cost
of rapid execution against the volatility risk of slow execution. In the standard formulation, mean-variance optimal trading
strategies are static: they do not modify the execution speed in response to price motions observed during trading. We show that
substantial improvement is possible by using dynamic trading strategies and that the improvement is larger for large initial
positions.

We develop a technique for computing optimal dynamic strategies to any desired degree of precision. The asset price process is
observed on a discrete tree with an arbitrary number of levels. We introduce a novel dynamic programming technique in which the
control variables are not only the shares traded at each time step but also the maximum expected cost for the remainder of the
program; the value function is the variance of the remaining program. The resulting adaptive strategies are
aggressive-in-the-money: they accelerate the execution when the price moves in the trader's favor, spending parts of the trading
gains to reduce risk.


Available via Athens: Taylor & Francis Online
http://www.openathens.net

TRANSACTION COSTS

Dynamic portfolio optimization in discrete-time with transaction costs. Atkinson, Colin; Quek, Gary Routledge, [RKN: 45840]
Shelved at: Per. AMF
Applied Mathematical Finance (2012) 19 (3-4) : 265-298.
A discrete-time model of portfolio optimization is studied under the effects of proportional transaction costs. A general class of
underlying probability distributions is assumed for the returns of the asset prices. An investor with an exponential utility function
seeks to maximize the utility of terminal wealth by determining the optimal investment strategy at the start of each time step.
Dynamic programming is used to derive an algorithm for computing the optimal value function and optimal boundaries of the
no-transaction region at each time step. In the limit of small transaction costs, perturbation analysis is applied to obtain the optimal
value function and optimal boundaries at any time step in the rebalancing of the portfolio.

Available via Athens: Taylor & Francis Online
http://www.openathens.net

The effect of correlation and transaction costs on the pricing of basket options. Atkinson, C; Ingpochai, P Routledge, [RKN:
45797]
Shelved at: Per. AMF
Applied Mathematical Finance (2012) 19 (1-2) : 131-179.
In this article, we examine the problem of evaluating the option price of a European call option written on N underlying assets when
there are proportional transaction costs in the market. Since the portfolio under consideration consists of multiple risky assets,
which makes numerical methods formidable, we use perturbation analyses. The article extends the model for option pricing on
uncorrelated assets, which was proposed by Atkinson and Alexandropoulos (2006. Pricing a European basket option in the
presence of proportional transaction cost. Applied Mathematical Finance, 13(3): 191-214). We determine optimal hedging
strategies as well as option prices on both correlated and uncorrelated assets. The option valuation problem is obtained by
comparing the maximized utility of wealth with and without option liability. The two stochastic control problems, which arise from
the transaction costs, are transformed to free boundary and partial differential equation problems. Once the problems have been
formulated, we establish optimal trading strategies for each of the portfolios. In addition, the optimal hedging strategies can be
found by comparing the trading strategies of the two portfolios. We provide a general procedure for solving N risky assets, which
shows that for small correlations the N asset problem can be replaced by N (N-1)/2 two-dimensional problems and give numerical
examples for the two risky assets portfolios.
Available via Athens: Taylor & Francis Online
http://www.openathens.net

UNCERTAINTY

Claims development result in the paid-incurred chain reserving method. Happ, Sebastian; Merz, Michael; Wüthrich, Mario V [RKN:
45724]
Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 66-72.
We present the one-year claims development result (CDR) in the paid-incurred chain (PIC) reserving model. The PIC reserving
model presented in Merz and Wüthrich (2010) is a Bayesian stochastic claims reserving model that considers simultaneously
claims payments and incurred losses information and allows for deriving the full predictive distribution of the outstanding loss
liabilities. In this model we study the conditional mean square error of prediction (MSEP) for the one-year CDR uncertainty, which
is the crucial uncertainty view under Solvency II.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Competitive insurance market in the presence of ambiguity. Anwar, Sajid; Zheng, Mingli [RKN: 44992]
Insurance: Mathematics & Economics (2012) 50 (1) : 79-84.
Within the context of a competitive insurance market, this paper examines the impact of ambiguity on the behavior of buyers and
sellers. Ambiguity is described through a probability measure on an extended state space that includes extra ambiguous states. It
is shown that if insurers face the same or less ambiguity than their customers, a unique equilibrium exists where customers are
fully insured. On the other hand, if insurers face more ambiguity than their customers, customers will be underinsured and it is
even possible that customers may not purchase any insurance.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

UNDERWRITING

Hierarchical structures in the aggregation of premium risk for insurance underwriting. Savelli, Nino; Clemente, Gian Paolo [RKN:
45489]
Shelved at: Per: SAJ Shelved at: SCA/ACT
Scandinavian Actuarial Journal (2011) 3 : 193-213.
In the valuation of the Solvency II capital requirement, the correct appraisal of risk dependencies acquires particular relevance.
These dependencies refer to the recognition of risk diversification in the aggregation process and there are different levels of
aggregation and hence different types of diversification. For instance, for a non-life company at the first level the risk components
of each single line of business (e.g. premium, reserve, and CAT risks) need to be combined in the overall portfolio, the second
level regards the aggregation of different kinds of risks, for example market and underwriting risk, and finally various solo legal
entities could be joined together in a group.

Solvency II allows companies to capture these diversification effects in capital requirement assessment, but the identification of a
proper methodology can represent a delicate issue. Indeed, while simulation-based internal models usually permit one to obtain
the portfolio multivariate distribution only in the independence case, the use of copula functions generally allows the multivariate
distribution to be obtained under dependence assumptions too.

However, the choice of the copula and the parameter estimation could be very problematic when only few data are available. So it
could be useful to find a closed formula, based on internal model results under independence, with the aim of obtaining the capital
requirement under dependence assumptions.

A simple technique to measure the diversification effect in capital requirement assessment is the formula proposed by the
Solvency II quantitative impact studies, which aggregates the capital charges of the single lines of business (LoBs), each equal to
a percentile minus the average of the total claims amount distribution of that LoB, using a linear correlation matrix.

On the other hand, this formula produces the correct result only for a restricted class of distributions, while it may underestimate
the diversification effect.

In this paper we present an alternative method, based on the idea of adjusting that formula with proper calibration factors
(proposed by Sandström (2007)), appropriately extended with the aim of handling very skewed distributions too.

In the last part, considering different non-life multi-line insurers, we compare the capital requirements for premium risk only
obtained by applying the aggregation formula with the results derived from elliptical copulas and hierarchical Archimedean copulas.

UNIT LINKED LIFE ASSURANCE

Optimal strategies for hedging portfolios of unit-linked life insurance contracts with minimum death guarantee. Nteukam T,
Oberlain; Planchet, Frédéric; Thérond, Pierre [RKN: 40010]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 48 (2) : 161-175.
In this paper, we are interested in hedging strategies which allow the insurer to reduce the risk to their portfolio of unit-linked life
insurance contracts with minimum death guarantee. Hedging strategies are developed in the Black and Scholes model and in the
Merton jump-diffusion model. According to the new frameworks (IFRS, Solvency II and MCEV), risk premium is integrated into our
valuations. We will study the optimality of hedging strategies by comparing risk indicators (Expected loss, volatility, VaR and CTE)
in relation to transaction costs and costs generated by the re-hedging error. We will analyze the robustness of hedging strategies
by stress-testing the effect of a sharp rise in future mortality rates and a severe depreciation in the price of the underlying asset.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

The uncertain mortality intensity framework: pricing and hedging unit-linked life insurance contracts. Li, Jing; Szimayer,
Alexander [RKN: 44949]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (3) : 471-486.
We study the valuation and hedging of unit-linked life insurance contracts in a setting where mortality intensity is governed by a
stochastic process. We focus on model risk arising from different specifications for the mortality intensity. To do so we assume that
the mortality intensity is almost surely bounded under the statistical measure. Further, we restrict the equivalent martingale
measures and apply the same bounds to the mortality intensity under these measures. For this setting we derive upper and lower
price bounds for unit-linked life insurance contracts using stochastic control techniques. We also show that the induced hedging
strategies indeed produce a dynamic superhedge and subhedge under the statistical measure in the limit when the number of
contracts increases. This justifies the bounds for the mortality intensity under the pricing measures. We provide numerical
examples investigating fixed-term, endowment insurance contracts and their combinations including various guarantee features.
The pricing partial differential equation for the upper and lower price bounds is solved by finite difference methods. For our
contracts and choice of parameters the pricing and hedging is fairly robust with respect to misspecification of the mortality
intensity. The model risk resulting from the uncertain mortality intensity is of minor importance.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

UNITED STATES

Bias reduction for pricing American options by least-squares Monte Carlo. Kan, Kin Hung Felix; Reesor, Mark R Routledge,
[RKN: 45837]
Shelved at: Per. AMF
Applied Mathematical Finance (2012) 19 (3-4) : 195-217.
We derive an approximation to the bias in regression-based Monte Carlo estimators of American option values. This derivation
holds for general asset-price processes of any dimensionality and for general pay-off structures. It uses the large sample
properties of least-squares regression estimators. Bias-corrected estimators result by subtracting the bias approximation from the
uncorrected estimator at each exercise opportunity. Numerical results show that the bias-corrected estimator outperforms its
uncorrected counterpart across all combinations of number of exercise opportunities, option moneyness and sample size. Finally,
the results suggest significant computational efficiency increases can be realized through trivial parallel implementations using the
corrected estimator.
Available via Athens: Taylor & Francis Online
http://www.openathens.net
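
To fix ideas about the estimator whose bias is being approximated, here is a bare-bones Longstaff-Schwartz least-squares Monte Carlo valuation of an American put under geometric Brownian motion in Python; the parameters are invented and no bias correction of the kind derived in the paper is applied.

    import numpy as np

    rng = np.random.default_rng(5)
    S0, K, r, sigma, T = 100.0, 100.0, 0.05, 0.2, 1.0
    steps, paths = 50, 50_000
    dt = T / steps
    disc = np.exp(-r * dt)

    # Simulate geometric Brownian motion paths on the exercise grid
    z = rng.standard_normal((paths, steps))
    S = S0 * np.exp(np.cumsum((r - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z, axis=1))

    # Backward induction with a quadratic regression for the continuation value
    cash = np.maximum(K - S[:, -1], 0.0)             # payoff at maturity
    for t in range(steps - 2, -1, -1):
        cash *= disc                                 # discount one step back
        itm = (K - S[:, t]) > 0.0                    # regress only on in-the-money paths
        if itm.sum() > 3:
            coeffs = np.polyfit(S[itm, t], cash[itm], deg=2)
            continuation = np.polyval(coeffs, S[itm, t])
            exercise = K - S[itm, t]
            ex_now = exercise > continuation
            cash[np.where(itm)[0][ex_now]] = exercise[ex_now]

    price = disc * cash.mean()                       # discount the first exercise date back to time 0
    print("LSMC American put price:", round(price, 3))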

A risk-based model for the valuation of pension insurance. Chen, An [RKN: 44942]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (3) : 401-409.
In the US, defined benefit plans are insured by the Pension Benefit Guaranty Corporation (PBGC). Taking account of the fact that
the PBGC covers only the residual deficits of the pension fund the sponsoring company is unable to cover and that the plans can
be prematurely terminated, we consider a model that accounts for the joint dynamics of the pension fund's and sponsoring firm's
assets in order to effectively determine the risk-based pension premium for the insurance provided by the PBGC. We obtain a
closed-form pricing formula for this risk-based premium. Its magnitude depends highly on the investment portfolio of the pension
fund and of the sponsoring company as well as the correlation between these two portfolios.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

UTILITY

A note on subadditivity of zero-utility premiums. Denuit, Michel; Eeckhoudt, Louis; Menegatti, Mario [RKN: 45307]
Shelved at: Per: Astin Bull (Oxf) Shelved at: JOU/AST
ASTIN Bulletin (2011) 41 (1) : 239-250.
Many papers in the literature have adopted the expected utility paradigm to analyze insurance decisions. Insurance companies
manage policies by growing, by adding independent risks. Even if adding risks generally ultimately decreases the probability of
insolvency, the impact on the insurer's expected utility is less clear. Indeed, it is not true that the risk aversion toward the additional
loss generated by a new policy included in an insurance portfolio always decreases with the number of contracts already
underwritten. The present paper derives conditions under which zero-utility premium principles are subadditive for independent
risks. It is shown that subadditivity is the exception rather than the rule: the zero-utility premium principle generates a
superadditive risk premium for most common utility functions. For instance, all completely monotonic utility functions generate
superadditive zero-utility premiums. The main message of the present paper is thus that the zero-utility premium for a marginal
policy is generally not sufficient to guarantee the formation of insurance portfolios without additional capital.
online access via International Actuarial Association:
http://www.actuaries.org/index.cfm?lang=EN&DSP=PUBLICATIONS&ACT=ASTIN BULLETIN
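
As a concrete instance of a zero-utility premium, exponential utility gives P = log(E[exp(a*X)])/a, the classical boundary case for which the premium of a sum of independent risks equals the sum of the individual premiums; a quick Python check with invented gamma risks:

    import numpy as np

    a = 0.01                                  # risk aversion parameter (illustrative)

    def exp_premium(shape, scale):
        """Zero-utility premium under exponential utility, via the gamma MGF (1 - scale*a)^(-shape)."""
        assert a * scale < 1.0                # the MGF exists only for a < 1/scale
        return -shape * np.log(1.0 - a * scale) / a

    # Two independent gamma-distributed risks with invented parameters
    p1 = exp_premium(2.0, 30.0)
    p2 = exp_premium(3.0, 20.0)

    # Premium of the aggregate risk, from the product of the two moment generating functions
    p_aggregate = (-2.0 * np.log(1.0 - a * 30.0) - 3.0 * np.log(1.0 - a * 20.0)) / a

    print("P(X1) + P(X2):", round(p1 + p2, 4))
    print("P(X1 + X2)   :", round(p_aggregate, 4))   # equal: additivity, the weak case of superadditivity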

UTILITY FUNCTION

Archimedean copulas derived from Morgenstern utility functions. Spreeuw, Jaap (2012). - London: Cass Business School, 2012.
- 21 pages. [RKN: 70094]
The (additive) generator of an Archimedean copula - as well as the inverse of the generator - is a strictly decreasing and convex
function, while Morgenstern utility functions (applying to risk-averse decision makers) are non-decreasing and concave. This
provides a basis for deriving either a generator of Archimedean copulas, or its inverse, from a Morgenstern utility function. If we
derive the generator in this way, dependence properties of an Archimedean copula that are often taken to be desirable, match with
generally sought after properties of the corresponding utility function. It is shown how well known copula families are derived from
established utility functions. Also, some new copula families are derived, and their properties are discussed. If, on the other hand,
we instead derive the inverse of the generator from the utility function, there is a link between the magnitude of measures of risk
attitude (like the very common Arrow-Pratt coefficient of absolute risk aversion) and the strength of dependence featured by the
corresponding Archimedean copula.
1995 onwards available online. Download as PDF.
http://www.cass.city.ac.uk/research-and-faculty/faculties/faculty-of-actuarial-science-and-insurance/publications/actuarial-resear
ch-reports

VALUATION

A risk-based model for the valuation of pension insurance. Chen, An [RKN: 44942]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (3) : 401-409.
In the US, defined benefit plans are insured by the Pension Benefit Guaranty Corporation (PBGC). Taking account of the fact that
the PBGC covers only the residual deficits of the pension fund the sponsoring company is unable to cover and that the plans can
be prematurely terminated, we consider a model that accounts for the joint dynamics of the pension funds and sponsoring firms
assets in order to effectively determine the risk-based pension premium for the insurance provided by the PBGC. We obtain a
closed-form pricing formula for this risk-based premium. Its magnitude depends highly on the investment portfolio of the pension
fund and of the sponsoring company as well as the correlation between these two portfolios.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Valuation of two-factor interest rate contingent claims using Green's theorem. Sorwar, Ghulam; Barone-Adesi, Giovanni [RKN:
45460]
Applied Mathematical Finance (2011) 18 (3-4) : 277-289.
Over the years a number of two-factor interest rate models have been proposed that have formed the basis for the valuation of
interest rate contingent claims. This valuation equation often takes the form of a partial differential equation that is solved using the
finite difference approach. In the case of two-factor models this has resulted in solving two second-order partial derivatives leading
to boundary errors, as well as numerous first-order derivatives. In this article we demonstrate that using Green's theorem,
second-order derivatives can be reduced to first-order derivatives that can be easily discretized; consequently, two-factor partial
differential equations are easier to discretize than one-factor partial differential equations. We illustrate our approach by applying
it to value contingent claims based on the two-factor CIR model. We provide numerical examples that illustrate that our approach
shows excellent agreement with analytical prices and the popular Crank-Nicolson method.

Available via Athens: Taylor & Francis Online
http://www.openathens.net

VALUE-AT-RISK (VAR)

Asymptotic analysis of multivariate tail conditional expectations. Zhu, Li; Li, Haijun Society of Actuaries, - 14 pages. [RKN: 70665]
Shelved at: Per: NAAJ (Oxf) Per NAAJ (Lon) Shelved at: JOU
North American Actuarial Journal (2012) 16 (3) : 350-363.
Tail conditional expectations refer to the expected values of random variables conditioning on some tail events and are closely
related to various coherent risk measures. In the univariate case, the tail conditional expectation is asymptotically proportional to
Value-at-Risk, a popular risk measure. The focus of this paper is on asymptotic relations between the multivariate tail conditional
expectation and Value-at-Risk for heavy-tailed scale mixtures of multivariate distributions. Explicit tail estimates of multivariate tail
conditional expectations are obtained using the method of regular variation. Examples involving multivariate Pareto and elliptical
distributions, as well as application to risk allocation, are also discussed.
http://www.soa.org/news-and-publications/publications/journals/naaj/naaj-detail.aspx
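
The univariate proportionality referred to above is exact for the classical single-parameter Pareto distribution: with tail index alpha > 1, the tail conditional expectation equals alpha/(alpha - 1) times the Value-at-Risk; a quick simulation check in Python (illustrative parameters):

    import numpy as np
    from scipy import stats

    alpha, p = 3.0, 0.99
    X = stats.pareto(b=alpha, scale=1.0)        # Pareto loss with tail index alpha

    var_p = X.ppf(p)
    sims = X.rvs(size=1_000_000, random_state=np.random.default_rng(6))
    cte_p = sims[sims > var_p].mean()           # empirical tail conditional expectation

    print("VaR_p                 :", round(var_p, 4))
    print("empirical CTE_p       :", round(cte_p, 4))
    print("alpha/(alpha-1)*VaR_p :", round(alpha / (alpha - 1) * var_p, 4))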

Asymptotic behavior of the empirical conditional value-at-risk. Gao, Fuqing; Wang, Shaochen [RKN: 44936]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (3) : 345-352.
The authors study asymptotic behavior of the empirical conditional value-at-risk (CVaR). In particular, the Berry-Esseen bound, the
law of iterated logarithm, the moderate deviation principle and the large deviation principle for the empirical CVaR are obtained.
The authors also give some numerical examples.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
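
For orientation, the empirical CVaR studied in the paper is simply the average of the losses at or beyond the empirical VaR; a minimal Python estimator on simulated data:

    import numpy as np

    rng = np.random.default_rng(7)
    losses = rng.lognormal(mean=0.0, sigma=1.0, size=100_000)   # stand-in loss sample
    p = 0.99

    var_hat = np.quantile(losses, p)                 # empirical Value-at-Risk
    cvar_hat = losses[losses >= var_hat].mean()      # empirical conditional Value-at-Risk

    print("empirical VaR 99% :", round(var_hat, 3))
    print("empirical CVaR 99%:", round(cvar_hat, 3))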

Asymptotics for risk capital allocations based on Conditional Tail Expectation. Asimit, Alexandru V; Furman, Edward; Tang, Qihe;
Vernic, Raluca [RKN: 44933]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (3) : 310-324.
An investigation of the limiting behavior of a risk capital allocation rule based on the Conditional Tail Expectation (CTE) risk
measure is carried out. More specifically, with the help of general notions of Extreme Value Theory (EVT), the aforementioned risk
capital allocation is shown to be asymptotically proportional to the corresponding Value-at-Risk (VaR) risk measure. The existing
methodology acquired for VaR can therefore be applied to a somewhat less well-studied CTE. In the context of interest, the EVT
approach is seemingly well-motivated by modern regulations, which openly strive for the excessive prudence in determining risk
capitals.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Capital allocation using the bootstrap. Kim, Joseph H T Society of Actuaries, - 18 pages. [RKN: 74919]
Shelved at: Per: NAAJ (Oxf) Per NAAJ (Lon) Shelved at: JOU
North American Actuarial Journal (2011) 15 (4) : 499-516.
This paper investigates the use of the bootstrap in capital allocation. In particular, for the distortion risk measure (DRM) class, we
show that the exact bootstrap estimate is available in analytic form for the allocated capital. We then theoretically justify the
bootstrap bias correction for the allocated capital induced from the concave DRM when the conditional mean function is strictly
monotone. A numerical example shows a tradeoff exists between the bias reduction and variance increase in bootstrapping the
allocated capital. However, unlike the aggregate capital case, the variance increase of the bias-corrected allocated capital
estimate substantially outweighs the benefit of bias correction, making the bootstrap bias correction at the allocated capital level
not as useful. Overall, the exact bootstrap without bias correction offers an efficient method for determining allocation over the
ordinary resampling bootstrap estimate and the empirical counterpart.
http://www.soa.org/news-and-publications/publications/journals/naaj/naaj-detail.aspx
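
For orientation, the CTE-type allocation that such bootstrap methods target assigns to business line i the amount
E[X_i | S > VaR_alpha(S)], where S is the aggregate loss, and the allocations then add up to the aggregate CTE. A plain
simulation version (not the exact bootstrap estimator of the paper; the lognormal inputs and level are illustrative):

    import numpy as np

    def cte_allocation(X, alpha=0.99):
        # Simulation-based CTE capital allocation.
        # X: (n_scenarios, n_lines) matrix of simulated losses by business line.
        X = np.asarray(X, dtype=float)
        S = X.sum(axis=1)                   # aggregate loss per scenario
        tail = S > np.quantile(S, alpha)    # scenarios beyond the aggregate VaR
        return X[tail].mean(axis=0)         # E[X_i | S > VaR_alpha(S)] per line

    rng = np.random.default_rng(1)
    X = rng.lognormal(mean=0.0, sigma=[0.5, 1.0, 1.5], size=(200_000, 3))
    alloc = cte_allocation(X, alpha=0.995)
    print(alloc, alloc.sum())               # line allocations and the aggregate CTE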

Conditional tail expectation and premium calculation. Heras, Antonio; Balbás, Beatriz; Vilar, José Luis - 18 pages. [RKN: 70753]
Shelved at: Per: Astin Bull (Oxf) Shelved at: JOU
ASTIN Bulletin (2012) 42 (1) : 325-342.
In this paper we calculate premiums which are based on the minimization of the Expected Tail Loss or Conditional Tail Expectation
(CTE) of absolute loss functions. The methodology generalizes well known premium calculation procedures and gives sensible
results in practical applications. The choice of the absolute loss becomes advisable in this context since its CTE is easy to
calculate and to understand in intuitive terms. The methodology also can be applied to the calculation of the VaR and CTE of the
loss associated with a given premium.
http://www.actuaries.org/index.cfm?lang=EN&DSP=PUBLICATIONS&ACT=ASTIN BULLETIN

Dynamic hedging of conditional value-at-risk. Melnikov, Alexander; Smirnov, Ivan [RKN: 45735]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 182-190.
In this paper, the problem of partial hedging is studied by constructing hedging strategies that minimize conditional value-at-risk
(CVaR) of the portfolio. Two dual versions of the problem are considered: minimization of CVaR with the initial wealth bounded
from above, and minimization of hedging costs subject to a CVaR constraint. The Neyman-Pearson lemma approach is used to
deduce semi-explicit solutions. Our results are illustrated by constructing CVaR-efficient hedging strategies for a call option in the
Black-Scholes model and also for an embedded call option in an equity-linked life insurance contract.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Extreme value behavior of aggregate dependent risks. Chen, Die; Mao, Tiantian; Pan, Xiaoming; Hu, Taizhong [RKN: 44995]
Insurance: Mathematics & Economics (2012) 50 (1) : 99-108.
Consider a portfolio of n identically distributed risks with dependence structure modeled by an Archimedean survival copula.
Wüthrich (2003) and Alink et al. (2004) proved that the probability of a large aggregate loss scales like the probability of a large
individual loss, times a proportionality factor. This factor depends on the dependence strength and the tail behavior of the
individual risk, and on whether the tail behavior belongs to the maximum domain of attraction of the Fréchet, the Weibull or the
Gumbel distribution. We investigate properties of these proportionality factors with respect to the dependence parameter and/or
the tail behavior parameter, and revisit the asymptotic behavior of conditional tail expectations of aggregate risks for the Weibull
and the Gumbel cases by using a different method. The main results strengthen and complement some results in Alink et al.
(2004), Alink et al. (2005), Barbe et al. (2006) and Embrechts et al. (2009).
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Haezendonck-Goovaerts risk measures and Orlicz quantiles. Bellini, Fabio; Gianin, Emanuela Rosazza [RKN: 45727]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 107-114.
In this paper, we study the well-known Haezendonck-Goovaerts risk measures on their natural domain, that is on Orlicz spaces
and, in particular, on Orlicz hearts. We provide a dual representation as well as the optimal scenario in such a representation and
investigate the properties of the minimizer (that we call Orlicz quantile) in the definition of the Haezendonck-Goovaerts risk
measure. Since Orlicz quantiles fail to satisfy an internality property, bilateral Orlicz quantiles are also introduced and analyzed.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Modelling losses and locating the tail with the Pareto Positive Stable distribution. Guillén, Montserrat; Prieto, Faustino; Sarabia,
José María [RKN: 44947]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (3) : 454-461.
This paper focuses on modelling the severity distribution. We directly model the small, moderate and large losses with the Pareto
Positive Stable (PPS) distribution and thus it is not necessary to fix a threshold for the tail behaviour. Estimation with the method of
moments is straightforward. Properties, graphical tests and expressions for value-at-risk and tail value-at-risk are presented.
Furthermore, we show that the PPS distribution can be used to construct a statistical test for the Pareto distribution and to
determine the threshold for the Pareto shape if required. An application to loss data is presented. We conclude that the PPS
distribution can perform better than commonly used distributions when modelling a single loss distribution for moderate and large
losses. This approach avoids the pitfalls of cut-off selection and it is very simple to implement for quantitative risk analysis.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

On approximating law-invariant comonotonic coherent risk measures. Nakano, Yumiharu - 11 pages. [RKN: 70754]
Shelved at: Per: Astin Bull (Oxf) Shelved at: JOU
ASTIN Bulletin (2012) 42 (1) : 343-353.
The optimal quantization theory is applied for approximating law-invariant comonotonic coherent risk measures. Simple Lp-norm
estimates for the risk measures provide the rate of convergence of that approximation as the number of quantization points goes
to infinity.
http://www.actuaries.org/index.cfm?lang=EN&DSP=PUBLICATIONS&ACT=ASTIN BULLETIN

One-year Value-at-Risk for longevity and mortality. Plat, Richard [RKN: 44948]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (3) : 462-470.
Upcoming new regulation on regulatory required solvency capital for insurers will be predominantly based on a one-year
Value-at-Risk measure. This measure aims at covering the risk of the variation in the projection year as well as the risk of changes
in the best estimate projection for future years. This paper addresses the issue how to determine this Value-at-Risk for longevity
and mortality risk. Naturally, this requires stochastic mortality rates. In the past decennium, a vast literature on stochastic mortality
models has been developed. However, very few of them are suitable for determining the one-year Value-at-Risk. This requires a
model for mortality trends instead of mortality rates. Therefore, we will introduce a stochastic mortality trend model that fits this
purpose. The model is transparent, easy to interpret and based on well known concepts in stochastic mortality modeling.
Additionally, we introduce an approximation method based on duration and convexity concepts to apply the stochastic mortality
rates to specific insurance portfolios.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Optimal reinsurance revisited - point of view of cedent and reinsurer. Hürlimann, Werner - 28 pages. [RKN: 74746]
Shelved at: Per: Astin Bull (Oxf) Shelved at: JOU
ASTIN Bulletin (2011) 41 (2) : 547-574.
It is known that the partial stop-loss contract is an optimal reinsurance form under the VaR risk measure. Assuming that market
premiums are set according to the expected value principle with varying loading factors, the optimal reinsurance parameters of
this contract are obtained under three alternative single and joint party reinsurance criteria: (i) strong minimum of the total retained
loss VaR measure; (ii) weak minimum of the total retained loss VaR measure and maximum of the reinsurer's expected profit; (iii)
weak minimum of the total retained loss VaR measure and minimum of the total variance risk measure. New conditions for
financing in the mean simultaneously the cedent's and the reinsurer's required VaR economic capital are revealed for situations of
pure risk transfer (classical reinsurance) or risk and profit transfer (design of internal reinsurance or reinsurance captive owned by
the captive of a corporate firm).
http://www.actuaries.org/index.cfm?lang=EN&DSP=PUBLICATIONS&ACT=ASTIN BULLETIN

Optimal reinsurance under VaR and CVaR risk measures. Chi, Yichun; Tan, Ken Seng - 23 pages. [RKN: 74744]
Shelved at: Per: Astin Bull (Oxf) Shelved at: JOU
ASTIN Bulletin (2011) 41 (2) : 487-509.
In this paper, we study two classes of optimal reinsurance models by minimizing the total risk exposure of an insurer under the
criteria of value at risk (VaR) and conditional value at risk (CVaR). We assume that the reinsurance premium is calculated
according to the expected value principle. Explicit solutions for the optimal reinsurance policies are derived over ceded loss
functions with increasing degrees of generality. More precisely, we establish formally that under the VaR minimization model, (i)
the stop-loss reinsurance is optimal among the class of increasing convex ceded loss functions; (ii) when the constraints on both
ceded and retained loss functions are relaxed to increasing functions, the stop-loss reinsurance with an upper limit is shown to be
optimal; (iii) and finally under the set of general increasing and left-continuous retained loss functions, the truncated stop-loss
reinsurance is shown to be optimal. In contrast, under CVaR risk measure, the stop-loss reinsurance is shown to be always
optimal. These results suggest that the VaR-based reinsurance models are sensitive with respect to the constraints imposed on
both ceded and retained loss functions while the corresponding CVaR-based reinsurance models are quite robust.
http://www.actuaries.org/index.cfm?lang=EN&DSP=PUBLICATIONS&ACT=ASTIN BULLETIN
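
As a concrete reading of the VaR-minimization objective: under a stop-loss treaty with retention d and an expected-value premium
with loading theta, the insurer's total retained loss is min(X, d) plus the premium, and one seeks the d minimizing its VaR. A
brute-force simulation sketch (not the closed-form solutions derived in the paper; the lognormal loss, loading and confidence level
are illustrative assumptions):

    import numpy as np

    rng = np.random.default_rng(5)
    X = rng.lognormal(mean=2.0, sigma=1.0, size=200_000)   # simulated aggregate loss
    theta, alpha = 0.2, 0.95                               # premium loading, VaR level

    def total_retained_var(d):
        premium = (1 + theta) * np.mean(np.maximum(X - d, 0.0))  # expected value principle
        retained = np.minimum(X, d) + premium                    # retained loss plus premium
        return np.quantile(retained, alpha)

    grid = np.linspace(0.0, np.quantile(X, 0.999), 200)          # candidate retentions
    best = min(grid, key=total_retained_var)
    print("retention minimizing VaR of total retained loss:", best)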

Parameter uncertainty in exponential family tail estimation. Landsman, Z; Tsanakas, A - 30 pages. [RKN: 70746]
Shelved at: Per: Astin Bull (Oxf) Shelved at: JOU
ASTIN Bulletin (2012) 42 (1) : 123-152.
Actuaries are often faced with the task of estimating tails of loss distributions from just a few observations. Thus estimates of tail
probabilities (reinsurance prices) and percentiles (solvency capital requirements) are typically subject to substantial parameter
uncertainty. We study the bias and MSE of estimators of tail probabilities and percentiles, with focus on 1-parameter exponential
families. Using asymptotic arguments it is shown that tail estimates are subject to significant positive bias. Moreover, the use of
bootstrap predictive distributions, which has been proposed in the actuarial literature as a way of addressing parameter
uncertainty, is seen to double the estimation bias. A bias corrected estimator is thus proposed. It is then shown that the MSE of the
MLE, the parametric bootstrap and the bias corrected estimators only differ in terms of order O(n^-2), which provides
decision-makers with some flexibility as to which estimator to use. The accuracy of asymptotic methods, even for small samples,
is demonstrated exactly for the exponential and related distributions, while other 1-parameter distributions are considered in a
simulation study. We argue that the presence of positive bias may be desirable in solvency capital calculations, though not
necessarily in pricing problems.
http://www.actuaries.org/index.cfm?lang=EN&DSP=PUBLICATIONS&ACT=ASTIN BULLETIN
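
The positive bias of tail estimates is easy to see by simulation in the exponential case: the MLE plug-in estimate exp(-t/x̄) of
P(X > t) is convex in the sample mean for sufficiently far tails, so it overshoots on average. A minimal Monte Carlo check
(illustrative parameters; this is not the paper's bias-corrected estimator):

    import numpy as np

    rng = np.random.default_rng(2)
    theta, t, n, n_rep = 1.0, 5.0, 20, 100_000
    true_p = np.exp(-t / theta)                      # true tail probability P(X > t)

    # Sample means of many repeated small exponential samples, and the plug-in estimates.
    means = rng.exponential(theta, size=(n_rep, n)).mean(axis=1)
    plug_in = np.exp(-t / means)                     # MLE plug-in estimates of P(X > t)

    print("true:", true_p)
    print("mean plug-in estimate:", plug_in.mean())
    print("relative bias:", plug_in.mean() / true_p - 1.0)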

Risk concentration of aggregated dependent risks: The second-order properties. Tong, Bin; Wu, Chongfeng; Xu, Weidong [RKN:
44999]
Insurance: Mathematics & Economics (2012) 50 (1) : 139-149.
Under the current regulatory guidelines for banks and insurance companies, the quantification of diversification benefits due to risk
aggregation plays a prominent role. In this paper we establish second-order approximation of risk concentration associated with a
random vector in terms of Value at Risk (VaR) within the methodological framework of second-order regular variation and the
theory of Archimedean copula. Moreover, we find that the rate of convergence of the first-order approximation of risk concentration
depends on the interplay between the tail behavior of the marginal loss random variables and their dependence structure.
Specifically, we find that the rate of convergence is determined by either the second-order parameter of the Archimedean copula
generator or the second-order parameter of the tail margins, leading to either the so-called dependence dominated case or the
margin dominated case.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Tail distortion risk and its asymptotic analysis. Zhu, Li; Li, Haijun [RKN: 45728]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 115-121.
A distortion risk measure used in finance and insurance is defined as the expected value of potential loss under a scenario
probability measure. In this paper, the tail distortion risk measure is introduced to assess tail risks of excess losses modeled by the
right tails of loss distributions. The asymptotic linear relation between tail distortion and value-at-risk is derived for heavy-tailed
losses with the linear proportionality constant depending only on the distortion function and the tail index. Various examples
involving tail distortions for location-invariant, scale-invariant, and shape-invariant loss distribution families are also presented to
illustrate the results.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Translation-invariant and positive-homogeneous risk measures and optimal portfolio management in the presence of a riskless
component. Landsman, Zinoviy; Makov, Udi E [RKN: 44994]
Insurance: Mathematics & Economics (2012) 50 (1) : 94-98.
Risk portfolio optimization, with translation-invariant and positive-homogeneous risk measures, leads to the problem of minimizing
a combination of a linear functional and a square root of a quadratic functional for the case of elliptical multivariate underlying
distributions.

This problem was recently treated by the authors for the case when the portfolio does not contain a riskless component. When it
does, however, the initial covariance matrix S becomes singular and the problem becomes more complicated. In the paper we
focus on this case and provide an explicit closed-form solution of the minimization problem, and the condition under which this
solution exists. The results are illustrated using data of 10 stocks from the NASDAQ Computer Index.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

Viterbi-based estimation for Markov switching GARCH model. Routledge, [RKN: 45838]
Shelved at: Per. AMF
Applied Mathematical Finance (2012) 19 (3-4) : 219-231.
We outline a two-stage estimation method for a Markov-switching Generalized Autoregressive Conditional Heteroscedastic
(GARCH) model modulated by a hidden Markov chain. The first stage involves the estimation of a hidden Markov chain using the
Viterbi algorithm given the model parameters. The second stage uses the maximum likelihood method to estimate the model
parameters given the estimated hidden Markov chain. Applications to financial risk management are discussed through simulated
data.

Available via Athens: Taylor & Francis Online
http://www.openathens.net
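
The first stage of the procedure is Viterbi decoding of the hidden regime path given model parameters. Below is a generic
log-domain Viterbi sketch for a two-state chain with Gaussian emissions, standing in for the regime-dependent GARCH densities
of the paper (all parameter values and the toy data are illustrative assumptions):

    import numpy as np

    def viterbi(obs, pi, A, log_emission):
        # Most likely hidden state path for an HMM (log-domain Viterbi).
        # obs: observations; pi: initial state probabilities; A: transition matrix;
        # log_emission(y): vector of log emission densities of y under each state.
        n, K = len(obs), len(pi)
        delta = np.log(pi) + log_emission(obs[0])
        back = np.zeros((n, K), dtype=int)
        logA = np.log(A)
        for t in range(1, n):
            cand = delta[:, None] + logA            # score of moving from state i to j
            back[t] = cand.argmax(axis=0)
            delta = cand.max(axis=0) + log_emission(obs[t])
        path = np.empty(n, dtype=int)
        path[-1] = delta.argmax()
        for t in range(n - 1, 0, -1):               # backtrack the optimal path
            path[t - 1] = back[t, path[t]]
        return path

    # Toy example: two regimes with low/high volatility, Gaussian returns.
    rng = np.random.default_rng(3)
    sig = np.array([0.01, 0.03])
    states = (rng.random(500) > 0.5).astype(int)    # placeholder "true" regimes
    returns = rng.normal(0.0, sig[states])
    log_em = lambda y: -0.5 * (y / sig) ** 2 - np.log(sig) - 0.5 * np.log(2 * np.pi)
    decoded = viterbi(returns, np.array([0.5, 0.5]),
                      np.array([[0.95, 0.05], [0.05, 0.95]]), log_em)
    print(decoded[:20])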

VARIABILITY

Tail distortion risk and its asymptotic analysis. Zhu, Li; Li, Haijun [RKN: 45728]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 115-121.
A distortion risk measure used in finance and insurance is defined as the expected value of potential loss under a scenario
probability measure. In this paper, the tail distortion risk measure is introduced to assess tail risks of excess losses modeled by the
right tails of loss distributions. The asymptotic linear relation between tail distortion and value-at-risk is derived for heavy-tailed
losses with the linear proportionality constant depending only on the distortion function and the tail index. Various examples
involving tail distortions for location-invariant, scale-invariant, and shape-invariant loss distribution families are also presented to
illustrate the results.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

VARIABLE ANNUITIES

The impact of stochastic volatility on pricing, hedging, and hedge efficiency of withdrawal benefit guarantees in variable
annuities. Kling, Alexander; Ruez, Frederik; Russ, Jochen - 35 pages. [RKN: 74745]
Shelved at: Per: Astin Bull (Oxf) Shelved at: JOU
ASTIN Bulletin (2011) 41 (2) : 511-545.
We analyze different types of guaranteed withdrawal benefits for life, the latest guarantee feature within variable annuities.
Besides an analysis of the impact of different product features on the clients' payoff profile, we focus on pricing and hedging of the
guarantees. In particular, we investigate the impact of stochastic equity volatility on pricing and hedging. We consider different
dynamic hedging strategies for delta and vega risks and compare their performance. We also examine the effects if the hedging
model (with deterministic volatility) differs from the data-generating model (with stochastic volatility). This is an indication for the
model risk an insurer takes by assuming constant equity volatilities for risk management purposes, whereas in the real world
volatilities are stochastic.
http://www.actuaries.org/index.cfm?lang=EN&DSP=PUBLICATIONS&ACT=ASTIN BULLETIN

No-good-deal, local mean-variance and ambiguity risk pricing and hedging for an insurance payment process. Delong, Lukasz -
30 pages. [RKN: 70749]
Shelved at: Per: Astin Bull (Oxf) Shelved at: JOU
ASTIN Bulletin (2012) 42 (1) : 203-232.
We study pricing and hedging for an insurance payment process. We investigate a Black-Scholes financial model with stochastic
coefficients and a payment process with death, survival and annuity claims driven by a point process with a stochastic intensity.
The dependence of the claims and the intensity on the financial market and on an additional background noise (correlated index,
longevity risk) is allowed. We establish a general modeling framework for no-good-deal, local mean-variance and ambiguity risk
pricing and hedging. We show that these three valuation approaches are equivalent under appropriate formulations. We
characterize the price and the hedging strategy as a solution to a backward stochastic differential equation. The results could be
applied to pricing and hedging of variable annuities, surrender options under an irrational lapse behavior and mortality derivatives.
http://www.actuaries.org/index.cfm?lang=EN&DSP=PUBLICATIONS&ACT=ASTIN BULLETIN

Valuing equity-linked death benefits and other contingent options : A discounted density approach. Gerber, Hans U; Shiu, Elias S
W; Yang, Hailiang [RKN: 45725]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 73-92.
Motivated by the Guaranteed Minimum Death Benefits in various deferred annuities, we investigate the calculation of the expected
discounted value of a payment at the time of death. The payment depends on the price of a stock at that time and possibly also on
the history of the stock price. If the payment turns out to be the payoff of an option, we call the contract for the payment a (life)
contingent option. Because each time-until-death distribution can be approximated by a combination of exponential distributions,
the analysis is made for the case where the time until death is exponentially distributed, i.e., under the assumption of a constant
force of mortality. The time-until-death random variable is assumed to be independent of the stock price process which is a
geometric Brownian motion. Our key tool is a discounted joint density function. A substantial series of closed-form formulas is
obtained, for the contingent call and put options, for lookback options, for barrier options, for dynamic fund protection, and for
dynamic withdrawal benefits. In a section on several stocks, the method of Esscher transforms proves to be useful for finding
among others an explicit result for valuing contingent Margrabe options or exchange options. For the case where the contracts
have a finite expiry date, closed-form formulas are found for the contingent call and put options. From these, results for De
Moivre's law are obtained as limits. We also discuss equity-linked death benefit reserves and investment strategies for maintaining
such reserves. The elasticity of the reserve with respect to the stock price plays an important role. Whereas in the most important
applications the stopping time is the time of death, it could be different in other applications, for example, the time of the next
catastrophe.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
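
The constant-force-of-mortality assumption makes the basic valuation transparent. If the time of death T is exponential with rate
lambda, independent of the stock price process, delta is the force of interest, and pi(·) denotes the payoff as a function of the
stock price at death, then (a direct consequence of independence, stated here only as orientation for the discounted-density
machinery of the paper)

    \mathbb{E}\bigl[e^{-\delta T}\,\pi(S_T)\bigr]
      = \int_0^{\infty} \lambda e^{-(\lambda+\delta)t}\,\mathbb{E}\bigl[\pi(S_t)\bigr]\,dt,

so pricing reduces to integrating ordinary fixed-maturity expected payoffs against a discounted exponential density.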

Valuing variable annuity guarantees with the multivariate Esscher transform. Ng, Andrew Cheuk-Yin; Li, Johnny Siu-Hang [RKN:
44941]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (3) : 393-400.
Variable annuities are usually sold with a range of guarantees that protect annuity holders from some downside market risk.
Although it is common to see variable annuity guarantees written on multiple funds, existing pricing methods are, by and large,
based on stochastic processes for one single asset only. In this article, we fill this gap by developing a multivariate valuation
framework. First, we consider a multivariate regime-switching model for modeling returns on various assets at the same time. We
then identify a risk-neutral probability measure for use with the model under consideration. This is accomplished by a multivariate
extension of the regime-switching conditional Esscher transform. We further extend our results to the situation when the
guarantee being valued is linked to equity indexes measured in foreign currencies. In particular, we derive a probability measure
that is risk-neutral from the perspective of domestic investors. Finally, we illustrate our results with a hypothetical variable annuity
guarantee.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
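
For readers unfamiliar with the construction, the univariate building block is the Esscher transform of a return density f with
parameter h (the paper develops a multivariate regime-switching extension of this idea):

    f_h(y) = \frac{e^{h y}\, f(y)}{\mathbb{E}\bigl[e^{h Y}\bigr]},

with h chosen so that the discounted asset price is a martingale under the transformed measure.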

Variable annuities: a unifying valuation approach. Bacinello, Anna Rita; Millossovich, Pietro; Olivieri, Annamaria; Pitacco, Ermanno
[RKN: 44931]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (3) : 285-297.
Life annuities and pension products usually involve a number of guarantees, such as minimum accumulation rates, minimum
annual payments or a minimum total payout. Packaging different types of guarantees is the feature of so-called variable annuities.
Basically, these products are unit-linked investment policies providing a post-retirement income. The guarantees, commonly
referred to as GMxBs (namely, Guaranteed Minimum Benefits of type x), include minimum benefits both in the case of death and
survival. In this paper we propose a unifying framework for the valuation of variable annuities under quite general model
assumptions. We compute and compare contract values and fair fee rates under static and mixed valuation approaches, via
ordinary and least squares Monte Carlo methods, respectively.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

VARIANCE ANALYSIS

Variance-optimal hedging for time-changed Lévy processes. Kallsen, Jan; Pauwels, Arnd [RKN: 45251]
Applied Mathematical Finance (2011) 18 (1-2) : 1-28.
In this article, we solve the variance-optimal hedging problem in stochastic volatility (SV) models based on time-changed Lévy
processes, that is, in the setup of Carr et al. (2003). The solution is derived using results for general affine models in the
companion article [Kallsen and Pauwels (2009)].

Available via Athens: Taylor & Francis Online
http://www.openathens.net

VOLATILITY

Assessing the performance of different volatility estimators : A Monte Carlo analysis. Cartea, Álvaro; Karyampas, Dimitrios [RKN:
45882]
Shelved at: Per. AMF
Applied Mathematical Finance (2012) 19 (5-6) : 535-552.
We test the performance of different volatility estimators that have recently been proposed in the literature and have been
designed to deal with problems arising when ultra high-frequency data are employed: microstructure noise and price
discontinuities. Our goal is to provide an extensive simulation analysis for different levels of noise and frequency of jumps to
compare the performance of the proposed volatility estimators. We conclude that the maximum likelihood estimator filter (MLE-F),
a two-step parametric volatility estimator proposed by Cartea and Karyampas (2011a; The relationship between the volatility of
returns and the number of jumps in financial markets, SSRN Working Paper Series), outperforms most of the well-known
high-frequency volatility estimators when different
assumptions about the path properties of stock dynamics are used.

Available via Athens: Taylor & Francis Online
http://www.openathens.net
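
Two of the standard high-frequency benchmarks in such comparisons are realized variance and bipower variation; the latter is
robust to jumps. A short sketch (these are the textbook estimators, not the MLE-F estimator of the paper; the simulated path and
parameters are illustrative):

    import numpy as np

    def realized_variance(prices):
        # Sum of squared log-returns: estimates integrated variance plus the jump contribution.
        r = np.diff(np.log(prices))
        return np.sum(r ** 2)

    def bipower_variation(prices):
        # (pi/2) * sum |r_t||r_{t-1}|: consistent for integrated variance even with jumps.
        r = np.abs(np.diff(np.log(prices)))
        return (np.pi / 2.0) * np.sum(r[1:] * r[:-1])

    # Simulated intraday path: constant volatility plus one jump.
    rng = np.random.default_rng(4)
    n, sigma = 23_400, 0.2 / np.sqrt(252)                  # one trading day, daily vol
    r = rng.normal(0.0, sigma / np.sqrt(n), size=n)
    r[n // 2] += 0.01                                      # a single price jump
    prices = 100 * np.exp(np.cumsum(np.r_[0.0, r]))
    print(realized_variance(prices), bipower_variation(prices), sigma ** 2)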

Calibration of stock betas from skews of implied volatilities. Fouque, Jean-Pierre; Kollman, Eli [RKN: 45256]
Applied Mathematical Finance (2011) 18 (1-2) : 119-137.
We develop call option price approximations for both the market index and an individual asset using a singular perturbation of a
continuous-time capital asset pricing model in a stochastic volatility environment. These approximations show the role played by
the asset's beta parameter as a component of the parameters of the call option price of the asset. They also show how these
parameters, in combination with the parameters of the call option price for the market, can be used to extract the beta parameter.
Finally, a calibration technique for the beta parameter is derived using the estimated option price parameters of both the asset and
market index. The resulting estimator of the beta parameter is not only simple to implement but has the advantage of being
forward looking as it is calibrated from skews of implied volatilities.

Available via Athens: Taylor & Francis Online
http://www.openathens.net

Corrections to the prices of derivatives due to market incompleteness. German, David [RKN: 45258]
Applied Mathematical Finance (2011) 18 (1-2) : 155-187.
We compute the first-order corrections to marginal utility-based prices with respect to a 'small' number of random endowments in
the framework of three incomplete financial models. They are a stochastic volatility model, a basis risk and market portfolio model
and a credit-risk model with jumps and stochastic recovery rate.

Available via Athens: Taylor & Francis Online
http://www.openathens.net

A general formula for option prices in a stochastic volatility model. Chin, Stephen; Dufresne, Daniel Routledge, [RKN: 45842]
Shelved at: Per. AMF
Applied Mathematical Finance (2012) 19 (3-4) : 313-340.
We consider the pricing of European derivatives in a Black-Scholes model with stochastic volatility. We show how Parseval's
theorem may be used to express those prices as Fourier integrals. This is a significant improvement over Monte Carlo simulation.
The main ingredient in our method is the Laplace transform of the ordinary (constant volatility) price of a put or call in the
Black-Scholes model, where the transform is taken with respect to maturity (T); this does not appear to have been used before in
pricing options under stochastic volatility. We derive these formulas and then apply them to the case where volatility is modelled as
a continuous-time Markov chain, the so-called Markov regime-switching model. This model has been used previously in stochastic
volatility modelling, but mostly with only a small number of states. We show how a larger number of states
can be handled. Numerical illustrations are given, including the implied volatility curve in two- and three-state models. The curves
have the smile shape observed in practice.

Available via Athens: Taylor & Francis Online
http://www.openathens.net

The impact of stochastic volatility on pricing, hedging, and hedge efficiency of withdrawal benefit guarantees in variable
annuities. Kling, Alexander; Ruez, Frederik; Russ, Jochen - 35 pages. [RKN: 74745]
Shelved at: Per: Astin Bull (Oxf) Shelved at: JOU
ASTIN Bulletin (2011) 41 (2) : 511-545.
We analyze different types of guaranteed withdrawal benefits for life, the latest guarantee feature within variable annuities.
Besides an analysis of the impact of different product features on the clients' payoff profile, we focus on pricing and hedging of the
guarantees. In particular, we investigate the impact of stochastic equity volatility on pricing and hedging. We consider different
dynamic hedging strategies for delta and vega risks and compare their performance. We also examine the effects if the hedging
model (with deterministic volatility) differs from the data-generating model (with stochastic volatility). This is an indication for the
model risk an insurer takes by assuming constant equity volatilities for risk management purposes, whereas in the real world
volatilities are stochastic.
http://www.actuaries.org/index.cfm?lang=EN&DSP=PUBLICATIONS&ACT=ASTIN BULLETIN

On cross-currency models with stochastic volatility and correlated interest rates. Grzelak, Lech A; Oosterlee, Cornelis W
Routledge, [RKN: 45793]
Shelved at: Per. AMF
Applied Mathematical Finance (2012) 19 (1-2) : 1-35.
We construct multi-currency models with stochastic volatility (SV) and correlated stochastic interest rates with a full matrix of
correlations. We first deal with a foreign exchange (FX) model of Heston-type, in which the domestic and foreign interest rates are
generated by the short-rate process of Hull-White (Hull, J. and White, A. [1990] Pricing interest-rate derivative securities, Review
of Financial Studies, 3, pp. 573-592). We then extend the framework by modelling the interest rate by an SV displaced-diffusion
(DD) Libor Market Model (Andersen, L. B. G. and Andreasen, J. [2000] Volatility skews and extensions of the Libor market model,
Applied Mathematical Finance, 7(1), pp. 1-32), which can model an interest rate smile. We provide semi-closed form
approximations which lead to efficient calibration of the multi-currency models. Finally, we add a correlated stock to the framework
and discuss the construction, model calibration and pricing of equity-FX-interest rate hybrid pay-offs.

Available via Athens: Taylor & Francis Online
http://www.openathens.net

Optimal time-consistent investment and reinsurance strategies for insurers under Heston's SV model. Li, Zhongfei; Zeng, Yan;
Lai, Yongzeng [RKN: 45736]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 191-203.
This paper considers the optimal time-consistent investment and reinsurance strategies for an insurer under Heston's stochastic
volatility (SV) model. Such an SV model applied to insurers' portfolio problems has not yet been discussed as far as we know. The
surplus process of the insurer is approximated by a Brownian motion with drift. The financial market consists of one risk-free asset
and one risky asset whose price process satisfies Hestons SV model. Firstly, a general problem is formulated and a verification
theorem is provided. Secondly, the closed-form expressions of the optimal strategies and the optimal value functions for the
mean-variance problem without precommitment are derived under two cases: one is the investment-reinsurance case and the
other is the investment-only case. Thirdly, economic implications and numerical sensitivity analysis are presented for our results.
Finally, some interesting phenomena are found and discussed.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

A time-dependent variance model for pricing variance and volatility swaps. Goard, Joanna [RKN: 45253]
Applied Mathematical Finance (2011) 18 (1-2) : 51-70.
Analytic solutions are found for prices of variance and volatility swaps under a new time-dependent stochastic model for the
dynamics of variance. The main features of the new stochastic differential equation are (1) an empirically validated diffusion term
and (2) a free function of time as a moving target in a reversion term, allowing additional flexibility for model calibration against
market data.

Available via Athens: Taylor & Francis Online
http://www.openathens.net
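
For context (a standard definition rather than a result specific to the paper's model): the fair strike of a variance swap with
maturity T is the risk-neutral expected average variance,

    K_{\mathrm{var}} = \mathbb{E}^{\mathbb{Q}}\!\left[\frac{1}{T}\int_0^T v_t\,dt\right],

and analytic prices of such swaps are what the paper derives under its time-dependent variance dynamics.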

Viterbi-based estimation for Markov switching GARCH model. Routledge, [RKN: 45838]
Shelved at: Per. AMF
Applied Mathematical Finance (2012) 19 (3-4) : 219-231.
We outline a two-stage estimation method for a Markov-switching Generalized Autoregressive Conditional Heteroscedastic
(GARCH) model modulated by a hidden Markov chain. The first stage involves the estimation of a hidden Markov chain using the
Viterbi algorithm given the model parameters. The second stage uses the maximum likelihood method to estimate the model
parameters given the estimated hidden Markov chain. Applications to financial risk management are discussed through simulated
data.

Available via Athens: Taylor & Francis Online
http://www.openathens.net

WEALTH

A characterization of the multivariate excess wealth ordering. Fernández-Ponce, J M; Pellerey, Franco; Rodríguez-Griñolo, M R
[RKN: 44943]
Shelved at: Per: IME (Oxf)
Insurance: Mathematics & Economics (2011) 49 (3) : 410-417.
In this paper, some new properties of the upper-corrected orthant of a random vector are proved. The univariate right-spread or
excess wealth function, introduced by Fernández-Ponce et al. (1996), is extended to multivariate random vectors, and some
properties of this multivariate function are studied. Later, this function was used to define the excess wealth ordering by Shaked
and Shanthikumar (1998) and Fernández-Ponce et al. (1998). The multivariate excess wealth function enables us to define a new
stochastic comparison which is weaker than the multivariate dispersion orderings. Also, some properties relating the multivariate
excess wealth order with stochastic dependence are described.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

WEATHER

The implied market price of weather risk. Härdle, Wolfgang Karl; Cabrera, Brenda López Routledge, [RKN: 45795]
Shelved at: Per. AMF
Applied Mathematical Finance (2012) 19 (1-2) : 59-95.
Weather derivatives (WD) are end-products of a process known as securitization that transforms non-tradable risk factors
(weather) into tradable financial assets. For pricing and hedging non-tradable assets, one essentially needs to incorporate the
market price of risk (MPR), which is an important parameter of the associated equivalent martingale measure (EMM). The majority
of papers so far has priced non-tradable assets assuming zero or constant MPR, but this assumption yields biased prices and has
never been quantified earlier under the EMM framework. Given that liquid-derivative contracts based on daily temperature are
traded on the Chicago Mercantile Exchange (CME), we infer the MPR from traded futures-type contracts (CAT, CDD, HDD and
AAT). The results show how the MPR significantly differs from 0, how it varies in time and changes in sign. It can be
parameterized, given its dependencies on time and temperature seasonal variation. We establish connections between the
market risk premium (RP) and the MPR.
Available via Athens: Taylor & Francis Online
http://www.openathens.net

Statistical analysis of model risk concerning temperature residuals and its impact on pricing weather derivatives. Ahčan, Aleš
[RKN: 44998]
Insurance: Mathematics & Economics (2012) 50 (1) : 131-138.
In this paper we model the daily average temperature via an extended version of the standard Ornstein-Uhlenbeck process driven
by a Lévy noise with a seasonally adjusted asymmetric ARCH process for volatility. More precisely, we model the disturbances with
the Normal inverse Gaussian (NIG) and Variance gamma (VG) distribution. Besides modelling the residuals we also compare the
prices of January 2010 out-of-the-money call and put options for two of the largest Slovenian cities, Ljubljana and Maribor, under
normally distributed disturbances and NIG and VG distributed disturbances. The results of our numerical analysis demonstrate
that the normal model fails to capture adequately tail risk, and consequently significantly misprices out of the money options. On
the other hand prices obtained using NIG and VG distributed disturbances fit well to the results obtained by bootstrapping the
residuals. Thus one should take extreme care in choosing the appropriate statistical model.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

WITH PROFITS LIFE ASSURANCE

A performance analysis of participating life insurance contracts. Faust, Roger; Schmeiser, Hato; Zemp, Alexandra [RKN: 45733]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 158-171.
Participating life insurance contracts are one of the most important products in the European life insurance market. Even though
these contract forms are very common, only very little research has been conducted in respect to their performance. Hence, we
conduct a performance analysis to provide a decision support for policyholders. We decompose a participating life insurance
contract in a term life insurance and a savings part and simulate the cash flow distribution of the latter. Simulation results are
compared with cash flows resulting from two benchmarks investing in the same portfolio of assets but without investment
guarantees and bonus distribution schemes, in order to measure the impact of these two product features. To provide a realistic
picture within the two alternatives, we take transaction costs and wealth transfers between different groups of policyholders into
account. We show that the payoff distribution strongly depends on the initial reserve situation and managerial discretion. Results
indicate that policyholders will in general profit from a better payoff distribution of the participating life insurance compared to a
mutual fund benchmark but not compared to an exchange-traded fund benchmark portfolio.
Available via Athens: Palgrave MacMillan
http://www.openathens.net

WITH PROFITS POLICIES

A performance analysis of participating life insurance contracts. Faust, Roger; Schmeiser, Hato; Zemp, Alexandra [RKN: 45733]
Shelved at: Online Only Shelved at: Online Only
Insurance: Mathematics & Economics (2012) 51 (1) : 158-171.
Participating life insurance contracts are one of the most important products in the European life insurance market. Even though
these contract forms are very common, only very little research has been conducted in respect to their performance. Hence, we
conduct a performance analysis to provide a decision support for policyholders. We decompose a participating life insurance
contract in a term life insurance and a savings part and simulate the cash flow distribution of the latter. Simulation results are
compared with cash flows resulting from two benchmarks investing in the same portfolio of assets but without investment
guarantees and bonus distribution schemes, in order to measure the impact of these two product features. To provide a realistic
picture within the two alternatives, we take transaction costs and wealth transfers between different groups of policyholders into
account. We show that the payoff distribution strongly depends on the initial reserve situation and managerial discretion. Results
indicate that policyholders will in general profit from a better payoff distribution of the participating life insurance compared to a
mutual fund benchmark but not compared to an exchange-traded fund benchmark portfolio.
Available via Athens: Palgrave MacMillan
http://www.openathens.net
