
Development of a Basel2-compliant LGD model:

a case study

Stefano Bonini
University of Rome, Tor Vergata
bonini.stefano@gmail.com

Giuliana Caivano*
University of Rome, Tor Vergata
giuliana.caivano@gmail.com

Abstract: The Basel2 Accord allows banks to calculate their capital requirements using
the advanced internal ratings based approach (IRBA), subject to supervisory review and
based on the estimation of three credit risk parameters: Probability of Default (PD),
Exposure at Default (EAD) and Loss Given Default (LGD). LGD, defined as the credit loss
incurred when extreme events occur that affect the obligor's ability to repay its debts,
is highly relevant to the credit and recovery process because of its direct impact on capital savings.
While an extensive academic and practitioners' literature exists on PD models, LGD
studies are at a less advanced stage because of the lack of data on recoveries and the
differences in recovery processes among commercial banks. In this paper a case study on a
real Basel2 compliant LGD model has been developed starting from a workout approach,
stressing the role and estimation of the financial components (Economic LGD), such as
the estimation of the discount rate, the univariate regression process applied to identify the
most predictive and meaningful variables, and the definition of the final multivariate
analysis. Finally, performance and backtesting analyses have been carried out in order to
verify the predictive power and the accuracy of the model.
The model has been developed on 10 years of historical real data of the Corporate and Retail
portfolio of an Italian commercial bank, one of the fifteen Italian banks that will be
supervised by the ECB. This paper adds real value to the existing literature because it
follows all the Basel2 requirements and is linked to the Italian banking context. Italy,
unlike the rest of Europe, can be considered a more general and complicated case of LGD
computation because of its specific recovery process and the presence of more than one default status
(doubtful loans, past-dues, charge-offs). This paper, finally, contributes to the
understanding of the Italian recovery process in order to define an even more common
European framework, looking forward to the ECB Banking Supervision.

* Corresponding author
Keywords: Loss Given Default, Rating Model, Basel2, Credit Risk Modeling,
Quantitative Finance.
JEL Reference: C13, C18, C51, C52, C53, G21

1. Introduction

In recent years the biggest European banking groups have started to assess the possibility of
adopting the Advanced Internal Rating Based Approach (AIRBA) under Basel2, in order
to save capital, thanks also to the possibility of a wider use of credit risk mitigants with
respect to the Standardized Approach. The AIRBA framework requires banks to develop
statistical models for estimating probability of default (PD), Loss Given Default (LGD)
and Exposure at Default (EAD). The New Basel2 Accord, implemented throughout the
banking world starting from 1 January 2007, made a significant difference to the use of
modeling within financial organizations, by highlighting the relevant role of Loss Given
Default (LGD) modeling [5]. While an extensive academic and practitioners' literature
exists on PD models, LGD studies are at a less advanced stage because of the lack of data
on recoveries and the differences in recovery processes among commercial banks.
The existing literature on LGD is for the most part related to corporate bonds, given the
public availability of data, and in most cases the existing papers only try to test different
statistical approaches in order to identify the most predictive variables and methods for
estimating recovery rates.
The aim of this paper is to show the results of a case study on a real Basel2 compliant
LGD model starting from a workout approach, applying a single statistical approach while
stressing the estimation of the discount rate as the main component of Economic LGD,
and the variable selection process adopted for the definition of the final short list of
drivers of recovery rates. The authors' main intent is to formalize a framework that is
easy to understand both from the academic and the practitioner's side, and also easy for
banks' IT departments to implement.
The model has been developed on 10 years of historical real data of the Corporate and Retail
portfolio of an Italian commercial bank, one of the fifteen Italian banks under the
supervision of the ECB. This paper adds real value to the existing literature because it
follows all the Basel2 requirements and contributes to the understanding of a general
recovery process in order to define an even more common European framework, looking
forward to the ECB Banking Supervision.

2. Literature Review
Most of the works related to the estimation of Loss Given Default have been concentrated
since 2004, when the Basel Committee on Banking Supervision definitively ruled on the
process of capital requirement calculation [5]. Table 1 summarizes the main information
of the papers analyzed:
Table 1 – Summary of Literature review
Title | Authors | Year of publication | Market | Methodology adopted | Journal of publication
Bank-Loan Loss Given Default | Lea V. Carty, Daniel Gates, Greg M. Gupton | 2000 | US | Empirical analysis | Moody's Investors Service, Global Credit Research
Default Recovery Rates in Credit Risk Modeling: A Review of the Literature and Empirical Evidence | Edward Altman, Andrea Resti, Andrea Sironi | 2003 | - | - | NYU Working Paper
Incorporating Collateral Value Uncertainty in Loss Given Default Estimates and Loan-to-value Ratios | Esa Jokivuolle, Samu Peura | 2003 | - | - | European Financial Management
Measuring LGD on Commercial Loans: An 18-Year Internal Study | Michel Araten, Michael Jacobs Jr., Peeyush Varshney | 2004 | US | Empirical analysis; linear regression; log-linear regression | The RMA Journal
What Do We Know About Loss Given Default? | Schuermann T. | 2004 | US | Look-up tables; linear regression; log-linear regression | Wharton Financial Institutions Center WP N. 04-01
LossCalc: Model for Predicting Loss Given Default (LGD) | Gupton G.M., Stein R.M. | 2005 | US | Linear regression | Moody's KMV
Bank Loss Given Default: a Case Study | J. Dermine, C. Neto de Carvalho | 2006 | Portugal | Linear regression; log-linear regression | Journal of Banking & Finance
Default Recovery Rates and LGD in Credit Risk Modeling and in Practice: An Updated Review of the Literature and Empirical Evidence | Edward I. Altman | 2008 | - | - | "Advances in Credit Risk Modeling and Corporate Bankruptcy Prediction", Cambridge University Press
Modeling Bank Loan LGD of Corporate and SME Segments: A Case Study | Chalupa R. et al. | 2008 | Czech Republic | Linear regression; log-linear regression | Czech Journal of Economics and Finance
Forecasting bank loans Loss Given Default | Bastos, J.A. | 2009 | Portugal | Decision trees | CEMAPRE Working Papers
Improvements in loss given default forecasts for bank loans | Marc Gürtler, Martin Hibbeln | 2009 | Germany | Linear regression | Journal of Banking & Finance
Recovery Rates of Commercial Lending: Empirical Evidence for German Companies | Jens Grunert, Martin Weber | 2009 | Germany | Linear regression | Journal of Banking & Finance
Modelling LGD for unsecured personal loans: Decision tree approach | Lyn C. Thomas, Christophe Mues, Anna Matuszyk | 2010 | UK | Decision trees | Journal of the Operational Research Society
The determinants of bank loan recovery rates | Hinh D. Khieu, Donald J. Mullineaux, Ha-Chin Yi | 2012 | US | Linear regression; log-linear regression | Journal of Banking & Finance
Predicting loss given default (LGD) for residential mortgage loans: A two-stage model and empirical evidence for UK bank data | Mindy Leow, Christophe Mues | 2012 | UK | Linear regression; log-linear regression | International Journal of Forecasting
A realistic approach for estimating and modeling loss given default | Rakesh Malkani | 2012 | - | Look-up tables | The Journal of Risk Model Validation
Modeling exposure at default and loss given default: empirical approaches and technical implementation | Yang H.B. et al. | 2012 | US | Linear regression; log-linear regression | Journal of Credit Risk
Loss given default modeling: a comparative analysis | Yashkir G. et al. | 2012 | US | Linear regression; Tobit model; log-linear regression | The Journal of Risk Model Validation
Survival analysis approach in Basel2 Credit Risk Management: modelling Danger Rates in Loss Given Default parameter | Bonini S., Caivano G. | 2013 | Italy | Survival analysis | Journal of Credit Risk
Going through the works presenting quantitative analyses, the authors highlight that most
papers use large samples (as in [17], where around 140,000 observations have been used
for the estimation), with some exceptions such as [14], where the analysis has been
performed on a sample of 120 observations, and [6], which uses about 350 observations.
In terms of the statistical techniques adopted, the authors have identified two main currents,
described below.
Current 1 includes all the works whose goal is the demonstration of some hypothesis or
the identification of the best drivers of recovery rates, as in [9], where empirical analyses
have been used; [13], [14] and [15] with linear regression, [21] with decision trees, [7]
with credibility theory, and [8] and [26] with survival analysis.
Summarizing these works, the authors conclude that loan characteristics (such as the
presence and value of credit risk mitigants) are the main drivers of recovery rates.
Current 2 includes all the other works with two goals: they want to identify the main
drivers of recovery rates, but they also aim to define the best methodology
for estimating recovery rates, as in [23], which tries to identify a possible set of statistical
techniques to be applied for LGD estimation (look-up tables, simple linear regressions,
advanced linear regressions, decision trees).
Analyzing these papers, the authors conclude that, even if some statistical techniques are
more predictive than others (as in [6]), it is not possible to define the best methodology, because
the choice is related both to the skill level of banks' Credit Risk Departments (as in [23])
and to the type of available data (as in [17]).
As underlined by the literature review, studies on LGD focus on the assessment of some
statistical techniques (e.g. linear regression, decision trees, look-up tables) in order to
identify the best methodology for fitting the recovery rate curve. In some cases, using
only one methodology, they try to investigate the potential drivers of recoveries. But no
study focuses on the normative aspects and on the whole potential process of LGD
development inside a bank. That is the focus of this paper, based on the development of a
Basel2 compliant model, in which only one statistical approach is used, but great
emphasis is given to the overall development process: the authors try to define a
development framework that is easy to understand from an economic point of view and easy to
adopt inside banking processes.

3. LGD Methodological Framework


The New Basel2 Accord, implemented by the European banking system starting from 1
January 2007, has highlighted the relevant role of Loss Given Default (LGD) modeling
for its impact on facility ratings, approval levels and the setting of loss reserves, as well as
on the development of credit capital (as in [20]).
The concept of LGD in the Basel2 Accord is quite close to the one used by researchers and
practitioners: it can be defined as the share of a defaulted exposure that will never be
recovered by the lender. Recoveries have to be assessed in an economic sense rather than
from a mere accounting perspective: when measuring them (ex post) or estimating them
(ex ante), all relevant factors that may reduce the final economic value of the recovered
portion of an exposure must be taken into account. This includes the discounting effect tied to
the time span between the emergence of default (charge-off status) and the actual recovery,
but also the various direct and indirect administrative costs associated with collecting
information on the exposure.
The efficiency level (in terms of costs and time) of a bank's workout department may affect
LGD quite significantly, and must be reflected in the estimates used to assess recovery risk
on future defaulters. Thus an improvement in the recovery procedure can lead to a reduction in
empirical LGDs and may subsequently reduce capital requirements for the following
years.

3.1 Workout LGD calculation


The best practice among European banks, in particular for retail portfolios, is to use a workout
approach. In fact, the CEBS Guidelines² specify that:
• “LGD estimates based on an institution’s own loss and recovery experience
should in principle be superior to other types of estimates, all other things
being equal, as they are likely to be most representative of future outcomes”;
• “The market and implied market LGD techniques can currently be used only in
limited circumstances. They may be suitable where capital markets are deep
and liquid. It is unlikely that any use of market LGD can be made for the bulk of
the loan portfolio ”;
• “The implied historical LGD technique is allowed only for the Retail exposure
class (…). The estimation of implied historical LGD is accepted in case where
institutions can estimate the expected loss for every facility rating grade or pool
of exposures, but only if all the minimum requirements for estimation of PD are
met”.
The workout LGD estimation is based on an economic notion of loss, including all the
relevant costs tied to the collection process, as well as the effect deriving from the discounting
of cash flows, as required by the CEBS Guidelines:
• “The measures of recovery rates used in estimating LGDs should reflect the
cost of holding defaulted assets over the workout period, including an
appropriate risk premium. When recovery streams are uncertain and involve
risk that cannot be diversified away, net present value calculations should
reflect the time value of money and an appropriate risk premium for the
undiversifiable risk.”
• “In establishing appropriate risk premiums for the estimation of LGDs consistent
with economic downturn conditions, the institution should focus on the
uncertainties in recovery cash flows associated with defaults that arise during
an economic downturn. When there is no uncertainty in recovery streams (e.g.,
recoveries are derived from cash collateral), net present value calculations
need only reflect the time value of money, and a risk free discount rate is
appropriate.”

² CEBS – GL10, Guidelines on the implementation, validation and assessment of Advanced Measurement (AMA) and Internal
Ratings Based (IRB) Approaches (4 April 2006).
Finally, the workout LGD calculation consists in the computation of empirical loss rates
through the observation of each charge-off at the end of the recovery process, according to the
following formula:

LGD_C = 1 - RR = 1 - \frac{\sum_{i} Rec_i \, \delta_i^T - \sum_{i} Cost_i \, \delta_i^T - \sum_{i} A_i \, \delta_i^T}{EAD} \qquad (1)

All the parameters used in the previous formula and their meanings are shown in Table 2:
Table 2 - List of factors for Workout LGD calculation
Parameter | Description
LGD_C | LGD estimated on charge-off positions
RR | Recovery rate on charge-offs
Rec_i | Recovery flow at date i
A_i | Increase flow at date i
Cost_i | Costs of litigation and collection procedures (e.g. legal expenses) at date i
EAD | Exposure at default at the charge-off opening date
i | Date on which each cash flow has been registered
T | Time before the charge-off opening date
δ_i^T | Discount rate of each flow at date i and opened before T (to be applied on a doubtful position, next moved into charge-off)

As highlighted above, in workout LGD estimation there are two important aspects to take
into account because of their relevance from a collection and economic point of view:
• Cash flows occurring during the recovery process;
• Discount rate to be used for including in LGD the “economic” notion of loss
(Economic LGD).
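As an illustration of formula (1), the following minimal Python sketch computes the workout LGD of a single charge-off position by discounting recovery, cost and increase flows back to the charge-off opening date. All cash flows, dates and the discount rate are purely hypothetical and only serve to make the mechanics of the formula concrete.

```python
# Minimal sketch of the workout LGD of formula (1); all figures are hypothetical.
def workout_lgd(flows, ead, annual_rate):
    """flows: list of (years_from_charge_off, recovery, cost, increase)."""
    npv = 0.0
    for t, rec, cost, inc in flows:
        delta = 1.0 / (1.0 + annual_rate) ** t      # discount factor for date i
        npv += (rec - cost - inc) * delta           # discounted net recovery
    rr = npv / ead                                  # recovery rate on the charge-off
    return 1.0 - rr                                 # LGD_C = 1 - RR

# Hypothetical 3-year recovery process on a 100,000 EUR exposure at default.
flows = [(0.5, 20_000, 1_500, 0.0),
         (1.5, 30_000, 2_000, 5_000),
         (3.0, 15_000, 1_000, 0.0)]
print(f"LGD_C = {workout_lgd(flows, ead=100_000, annual_rate=0.08):.1%}")
```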
3.1.1 Cash flow definition

According to the Italian best practice of the banking recovery process, the calculation of LGD,
in order to build the target variable for the statistical model, starts from the computation of
the amounts related to the accounting items occurring during the collection process and
representing recoveries, increases or losses, as in Table 3:

Table 3 – Accounting items for LGD calculation

Type of cash flow | Specific accounting items
RECOVERY FLOWS | Capital, Interests, Costs
INCREASE FLOWS | Capital, Interests, Costs
LOSS FLOWS | Capital, Interests, Costs

The amounts defined here, according to what is required by regulation, need to be discounted
in order to take into account the economic notion of loss, as described in the following
paragraph.

3.1.2 Discount rate estimation

The best Italian and international banking practice on discount rate calculation can be
summarized in the four possible approaches described in Table 4:

Table 4 – Best practice approaches for discount rate calculation


Approach | Description | Pros | Cons
Capital Asset Pricing Model (CAPM) | It analyses the link between risk and expected return | It is possible to differentiate the discount rate based on the recovery volatility | Not so easy from a methodological point of view
Contractual interest rate | It represents the cost-opportunity related to the replacement of return rates on defaulted positions | Easy to apply | It doesn't represent the expected return after the default event
Weighted Average Cost of Capital (WACC) | It estimates the average cost of funding through corporate assets | Losses represent a cost for the bank | It doesn't take into account the recovery volatility and it is often characterized by lots of assumptions for debt and equity cost
Cost of Equity | It assumes that shareholders pay the recapitalization cost of the bank balance sheet | It is used in business performance calculation | It underestimates the importance of the volatility of recoveries

According to the authors' knowledge of Italian banking practice, the volatility of portfolio
recoveries represents an important aspect to be taken into account in LGD estimation. For
this reason the authors chose to apply the CAPM to define the discount rate used in the
Economic LGD calculation.

The Capital Asset Pricing Model (CAPM) assumes that recoveries deriving from the
collection process are comparable to cash flows deriving from a defaulted corporate bond.
The theoretical assumptions underlying the model are:

• Investors are price-takers, so they cannot influence the share price with their
actions;
• Investors are rational with respect to a one-period horizon and they select
portfolios maximizing expected return and minimizing volatility;
• Assets are traded on the market;
• There are no transaction costs or taxes;
• There is an unlimited possibility of borrowing or lending at the risk-free rate.

These assumptions imply that all investors have the same expectations and, in
equilibrium, the market portfolio is the one on the efficient frontier.

In this context, the discount rate that makes the investor indifferent between an uncertain
recovery flow and a certain amount at time t0 is:

r  r f    rm  r f
(2)
where:
- r is the final discount rate;
- rf is the risk-free rate;
- rm is the return rate of market portfolio;
- β is the coefficient computed as:

\beta = \rho_{REC,MKT} \, \frac{\sigma_{REC}}{\sigma_{MKT}} \qquad (3)

- σ_REC is the volatility of recoveries;
- ρ_{REC,MKT} is the correlation between recoveries and market returns;
- σ_MKT is the volatility of market returns.


The CAPM states that the expected return of a risky asset is equal to the return of the
risk-free asset plus a premium proportional to the contribution of asset i to the risk of the
market portfolio, as expressed in formula (4):

E(R_i) = R_f + \beta_i \, [E(R_m) - R_f] \qquad (4)

where:
- E(R_i) is the expected return of asset i;
- R_f is the return rate of the risk-free asset;
- β_i is computed as:

\beta_i = \rho_{i,m} \, \frac{\sigma_i}{\sigma_m}

- σ_i is the volatility of asset i;
- σ_m is the volatility of the market portfolio;
- E(R_m) is the expected return of the market portfolio;
- E(R_m) - R_f is the risk premium.
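The following short Python sketch shows how formulas (2) and (3) combine into a single discount rate. All the inputs (risk-free rate, market risk premium, volatilities, correlation) are illustrative assumptions, not the values estimated by the authors.

```python
# Minimal sketch of the CAPM-based discount rate of formulas (2)-(3); inputs are hypothetical.
def capm_discount_rate(r_f, mrp, sigma_rec, sigma_mkt, rho_rec_mkt):
    beta = rho_rec_mkt * sigma_rec / sigma_mkt   # formula (3)
    return r_f + beta * mrp                      # formula (2), with mrp = r_m - r_f

# Hypothetical inputs: 2% risk-free rate, 5.6% market risk premium,
# 30% recovery volatility, 20% market volatility, 0.3 correlation.
r = capm_discount_rate(r_f=0.02, mrp=0.056, sigma_rec=0.30, sigma_mkt=0.20, rho_rec_mkt=0.30)
print(f"discount rate = {r:.2%}")   # -> 4.52%
```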

3.2 LGD Estimation

In this paragraph the authors describe the core of the model estimation, focusing on the
variable selection process and the methodology used to choose the final set of variables to be
used as drivers of recovery rates. The results of the methodological framework described here
are shown in Chapter 4.

3.2.1 Univariate variable selection process

The overall variable selection process that the authors applied for the model estimation
can be summarized in the following steps:
• Analysis of outliers and treatment of missing values;
• Assessment of each variable's goodness as a potential driver of recovery rates;
• Analysis of correlation.

3.2.1.1 Outliers and missing values

The authors considered data "normalization" a very important step, recognizing that a variable
distribution with a non-negligible number of missing values or outlier values in the tails can lead
to biased estimates.
Regarding the treatment of missing values, the authors excluded from the estimation
process all variables with a percentage of missing values greater than 80%. For variables with
less than 80% of missing values, the authors decided to create a specific class for them, thus
converting all continuous variables into discrete ones. In this way the variable distribution
and its relation with risk (given by the empirical recovery rate distribution on each class of
each single variable) can be preserved.

For the outlier treatment, the percentile distribution of each continuous variable included in the long
list has been analyzed, choosing to normalize by cutting the distribution below the 5th percentile and
above the 95th percentile. The authors chose these cut-offs by observing the distribution
values (see Chapter 4) and by cutting all the continuous variables at the same percentiles.
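A minimal pandas sketch of this treatment is shown below: it winsorizes a hypothetical continuous driver at the 5th and 95th percentiles, bins it into classes, and assigns missing values to a dedicated class. The column names and data are illustrative assumptions, not the bank's actual data model.

```python
import pandas as pd
import numpy as np

# Hypothetical data: a continuous driver with some missing values.
df = pd.DataFrame({"ead": np.r_[np.random.lognormal(10, 1, 995), [np.nan] * 5]})

# 1) Winsorize at the 5th and 95th percentiles to limit the effect of outliers.
lo, hi = df["ead"].quantile([0.05, 0.95])
df["ead_capped"] = df["ead"].clip(lower=lo, upper=hi)

# 2) Discretize into classes and add a dedicated "MISSING" class,
#    so the empirical recovery rate can be observed on each class.
df["ead_class"] = pd.qcut(df["ead_capped"], q=5, duplicates="drop").astype(str)
df.loc[df["ead"].isna(), "ead_class"] = "MISSING"
print(df["ead_class"].value_counts())
```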

3.2.1.2 Univariate analysis

On all variables (originally discrete, or made discrete after the missing value treatment), the authors
applied a framework consisting of the following analyses:

A. Empirical analysis, consisting of:

• Analysis of the percentage of observations in each class of each variable;
• Analysis of the empirical recovery rates on each class of each variable, in order to assess
the coherence of the empirical distribution with the economic meaning embedded in the
relation between each variable and LGD.

B. Statistical analysis, consisting of a univariate linear regression analysis in which the

authors assessed the predictive power of each variable in explaining
recovery rates. Details on the general theoretical framework of linear regression can be
found in the next paragraph. The statistical significance of the relationship of each variable
with the observed LGD has been assessed taking into account:
• The sign of estimated coefficient on each modality of each variable;
• The statistical significance of each modality for each variable (p-value);
• The predictive power of each variable through the Accuracy Ratio (AR).

In a generic linear regression:

Y    X (5)
α and β are, respectively, intercept and coefficient of linear regression. In particular 3:

3
X and Y are single observation points of variable X (dipendent variable) and Y (target variable).
(  Y )(  X )  (  X )(  XY )
2

  2
(6)
n( X )  (
2
X)
n (  XY )  (  X )(  Y )
  2
(7)
n ( X )  (
2
X)

The β coefficient represents the average variation of Y as a consequence of a unit change in

X, with all the other regressors held constant. The variation of Y can be positive or
negative according to the empirical relationship existing between each variable and, in this
case, the loss rate. This criterion is fundamental in the variable selection process because, if the
empirical relationship between X and Y does not reflect business expectations, the
variable cannot be selected as significant.
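As a concrete illustration of formulas (6) and (7), the following sketch computes the univariate intercept and slope on invented data and cross-checks them against numpy's least-squares fit; the data and variable names are purely illustrative.

```python
import numpy as np

# Hypothetical data: one candidate driver (x) and the observed loss rate (y).
rng = np.random.default_rng(0)
x = rng.uniform(0, 1, 200)
y = np.clip(0.2 + 0.5 * x + rng.normal(0, 0.1, 200), 0, 1)

n = len(x)
den = n * np.sum(x**2) - np.sum(x)**2
alpha = (np.sum(y) * np.sum(x**2) - np.sum(x) * np.sum(x * y)) / den   # formula (6)
beta = (n * np.sum(x * y) - np.sum(x) * np.sum(y)) / den               # formula (7)

# Cross-check against numpy's polynomial least-squares fit.
beta_np, alpha_np = np.polyfit(x, y, deg=1)
print(alpha, beta)          # closed-form estimates
print(alpha_np, beta_np)    # should match up to numerical precision
```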

3.2.1.3 Correlation analysis

In order to avoid overestimating the predictive power of the model because of information

redundancy, an assessment of the correlation between variables is required.

For the correlation analysis, the authors used the Spearman coefficient (suitable for discrete
variables), computed as:

\rho = \frac{\sum_i (x_i - \bar{x})(y_i - \bar{y})}{\sqrt{\sum_i (x_i - \bar{x})^2 \, \sum_i (y_i - \bar{y})^2}} \qquad (8)

where:
ρ = Spearman coefficient between variables x and y;
x_i = value of variable x for the i-th observation;
x̄ = average value of x on the overall population;
y_i = value of variable y for the i-th observation;
ȳ = average value of y on the overall population.
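A minimal sketch of such a redundancy screen is shown below: it computes the pairwise Spearman correlation matrix on a hypothetical set of discretized drivers (encoded as numeric class labels) and flags pairs whose absolute correlation exceeds 50%, the threshold reported in Section 4.2. The driver names and data are invented for illustration.

```python
import pandas as pd
import numpy as np

# Hypothetical discretized drivers, encoded as numeric class labels.
rng = np.random.default_rng(1)
drivers = pd.DataFrame({
    "ead_class": rng.integers(1, 6, 500),
    "product_class": rng.integers(1, 4, 500),
    "guarantee_class": rng.integers(0, 2, 500),
})

corr = drivers.corr(method="spearman")   # pairwise Spearman correlation matrix

# Flag pairs whose absolute correlation exceeds the 50% threshold.
threshold = 0.5
pairs = [(a, b, corr.loc[a, b])
         for i, a in enumerate(corr.columns)
         for b in corr.columns[i + 1:]
         if abs(corr.loc[a, b]) > threshold]
print(pairs or "no redundant pairs above threshold")
```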
3.2.2 Multivariate analysis and model definition

In order to identify the best combination of variables for maximizing the predictive power on
recovery rates, the authors have chosen to adopt a linear regression approach with a double
goal:
• Maximizing the information power embedded in each variable potentially tested
as driver of recovery rates;
• Ensuring an easy understanding of the model by all the potential users (e.g.
credit analysts, risk managers).

The authors estimated a linear regression model considering as target variable the
Economic LGD (as estimated following paragraph 3.1), which takes values in the range
[0%, 100%] and can therefore be treated as a continuous variable. For each discrete variable the
linear regression framework uses a maximum of n-1 modalities, on each of which it computes a
coefficient with positive or negative sign depending on the relation with risk of each modality
compared with the modality kept in the intercept. The value assigned by the regression model to
each coefficient increases or decreases the risk embedded in the model intercept.

The model has been built considering as intercept, for each variable, the modality
corresponding to the highest empirical recovery rate. For the coefficient estimation the authors
used the Ordinary Least Squares (OLS) method, using SAS Base software. Given the limited
number of variables selected by the univariate process (see par. 4.2), the authors did not use
automatic variable selection techniques (e.g. stepwise).
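Although the authors used SAS Base, the same estimation scheme can be sketched in Python: the snippet below fits an OLS regression of a hypothetical Economic LGD on dummy-coded drivers, using as reference (intercept) level for each variable the modality with the lowest empirical LGD, i.e. the highest empirical recovery rate. The data, variable names and modalities are invented for illustration only.

```python
import pandas as pd
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 1000
df = pd.DataFrame({
    "area": rng.choice(["North West", "North East", "Center", "South & Islands"], n),
    "recovery_type": rng.choice(["In court", "Out of court"], n),
})
df["lgd"] = np.clip(0.10 + 0.10 * (df["area"] == "Center")
                    + 0.20 * (df["recovery_type"] == "In court")
                    + rng.normal(0, 0.15, n), 0, 1)

# Use as reference level the modality with the lowest empirical LGD
# (i.e. the highest empirical recovery rate).
X_parts = []
for col in ["area", "recovery_type"]:
    ref = df.groupby(col)["lgd"].mean().idxmin()        # reference modality
    dummies = pd.get_dummies(df[col], prefix=col, dtype=float).drop(columns=f"{col}_{ref}")
    X_parts.append(dummies)

X = sm.add_constant(pd.concat(X_parts, axis=1))
model = sm.OLS(df["lgd"], X).fit()
print(model.summary())
```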

4. Main results

4.1 Sample description and Economic LGD calculation

The framework proposed in the previous chapter has been applied to a sample of around 26,000
charge-offs with a recovery process closed between 30/09/1999 and 31/12/2012, composed as follows:

Table 5 – Sample description


Customer segment | # obs | % obs | Average Nominal LGD
Retail customers | 15,000 | 57.80% | 44.00%
Small-medium size Corporate (Retail) | 3,500 | 13.50% | 47.00%
Medium-Large size Corporate | 7,500 | 28.70% | 50.00%
Following the methodological framework proposed in paragraph 3.1.2, the authors estimated a
discount rate to be applied to the nominal cash flows for calculating the Economic LGD and building the
target variable of the model. In particular, the historical data and time series considered for defining each
parameter required by the CAPM are described in Table 6:

Table 6 – Description of CAPM parameters


Parameter | Description
Market volatility (σ_M) | Standard deviation of the logarithmic returns of the market indices MIB30 and FTSE MIB
Asset volatility (σ_i) | Standard deviation of the logarithmic returns of cumulative annual recoveries
Asset and market correlation (R_i) | Assumption of the Basel2 asset correlation used for capital requirements calculation (k)
Market Risk Premium (MRP) | MRP set to a value of 5.6%, as in [12]
Risk-free rate (R_f) | Linear interpolation of the time curves of the following interest rates: Eonia, EUR001W, EUR001M, EUR003M, EUR006M, EUR012M, I05302Y, I05303Y, I05304Y, I05305Y, I05310Y, I05315Y, I05320Y, I05330Y
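As an illustration of the risk-free rate construction, the sketch below linearly interpolates a hypothetical term structure (tenors in years, annual zero rates) at an assumed average recovery horizon; the actual curve points used by the authors are the market quotes listed in Table 6, and both the rates and the horizon here are invented.

```python
import numpy as np

# Hypothetical zero-rate curve: tenors in years and annual rates.
tenors = np.array([1/52, 1/12, 0.25, 0.5, 1, 2, 3, 5, 10, 20, 30])
rates = np.array([0.010, 0.011, 0.012, 0.014, 0.016, 0.019, 0.021,
                  0.025, 0.030, 0.033, 0.034])

# Risk-free rate at a hypothetical 4.5-year average recovery horizon.
horizon = 4.5
r_f = np.interp(horizon, tenors, rates)   # linear interpolation of the curve
print(f"interpolated risk-free rate at {horizon} years: {r_f:.3%}")
```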

Applying these parameters, the authors obtained a final β of 9% for Medium-Large Corporates, and a
value of 8% for all the other segments. The final Economic LGD calculated on the sample after the
application of the discount rate to the nominal cash flows shows a bimodal distribution, as in Figure 1,
with an overall average value at sample level of around 45%:

Figure 1 – Economic LGD distribution


4.2 Univariate analysis

The authors applied the univariate framework described in paragraph 3.2.1.2 to the whole
set of available variables. The results on the statistical significance of each variable in
explaining loss rates are shown in Table 7:

Table 7 – Univariate analysis main results


Type of information | Name of variable | Univariate analysis result | % missing | Accuracy Ratio
Information on counterparty | Region of localization | KO - p-value not significant | 53.00% | 16.00%
Information on counterparty | Macro area of localization | OK | 0.00% | 29.00%
Information on counterparty | Specific type of economic activity | OK | 2.00% | 7.40%
Information on counterparty | Sector of activity | OK | 5.00% | 16.50%
Information on counterparty | Type of counterparty | OK | 0.00% | 6.10%
Information on charge-off | EAD | OK | 0.00% | 29.30%
Information on charge-off | Type of product | OK | 1.00% | 23.20%
Information on charge-off | Presence of personal guarantees | OK | 0.00% | 18.30%
Information on charge-off | Presence of mortgages | OK | 0.00% | 20.20%
Information on charge-off | Presence of pledge | OK | 0.00% | 5.20%
Information on charge-off | Value to Loan of personal guarantees | KO - too many missing values | 82.00% | 21.40%
Information on charge-off | Value to Loan of mortgages | KO - too many missing values | 85.00% | 21.00%
Information on charge-off | Value to Loan of pledges | KO - low Accuracy Ratio | 88.00% | 2.90%
Information on charge-off | Type of recovery process | OK | 11.00% | 25.30%
Information on charge-off | Vintage of recovery process | KO - p-value not significant | 0.00% | 24.40%

A correlation analysis has been performed among all the variables selected by the univariate
phase, showing that no pair of variables has a correlation higher than 50% (or lower than -50%).
The univariate analysis shows that the vintage of the recovery process is not statistically
significant, because of a general inhomogeneity of the recovery process among Italian regions.

4.3 Final Model

A multivariate linear regression analysis has been run in order to find the best combination
of variables maximizing the predictive power on recovery rates. Given the limited number of
variables, no automatic variable selection criteria have been applied; instead, combinations
of all the variables have been tested. The final model identified is described in Table 8:
Table 8 – Final model description
Variable | Grouping | Coefficient | p-value | Variable weight
Intercept | | 0.1001 | <.0001 |
Macro-geographical area | Center | 0.2145 | <.0001 | 25.00%
Macro-geographical area | North East | 0.1113 | |
Macro-geographical area | South & Islands | 0.0788 | |
Macro-geographical area | North West | 0 | |
Exposure at Default | EAD | 0.1567 | <.0001 | 18.00%
Type of product | Mortgages | 0.1876 | <.0001 | 9.00%
Type of product | Other products | 0 | |
Presence of personal guarantees | Absence | 0.1134 | <.0001 | 10.00%
Presence of personal guarantees | Presence | 0 | |
Presence of mortgages | Absence | 0.1609 | <.0001 | 15.00%
Presence of mortgages | Presence | 0 | |
Presence of pledge | Absence | 0.0789 | <.0001 | 3.00%
Presence of pledge | Presence | 0 | |
Type of recovery | In court | 0.2021 | <.0001 | 20.00%
Type of recovery | Out of court | 0 | |

The backtesting performed on the development sample has shown an Accuracy Ratio of the
final model of 57%.
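The paper does not detail how the Accuracy Ratio is computed for a continuous target; one common convention, sketched below on hypothetical data, is a CAP/Gini-type measure that compares the cumulative observed loss captured when positions are ranked by predicted LGD against the ranking of a perfect model. Both the data and this specific convention are assumptions made for illustration.

```python
import numpy as np

def accuracy_ratio(lgd_pred, lgd_obs):
    """CAP/Gini-type accuracy ratio for a continuous target (one possible convention)."""
    lgd_obs = np.asarray(lgd_obs, dtype=float)
    def cap_area(scores):
        order = np.argsort(-np.asarray(scores))        # rank positions by score, worst first
        cum_loss = np.cumsum(lgd_obs[order]) / lgd_obs.sum()
        return cum_loss.mean()                         # area under the CAP curve (rectangle rule)
    random_area = 0.5                                  # area of a purely random ranking
    return (cap_area(lgd_pred) - random_area) / (cap_area(lgd_obs) - random_area)

# Hypothetical backtest: predicted vs. observed LGD on 1,000 charge-offs.
rng = np.random.default_rng(3)
lgd_obs = np.clip(rng.beta(0.5, 0.5, 1000), 0, 1)
lgd_pred = np.clip(lgd_obs + rng.normal(0, 0.25, 1000), 0, 1)
print(f"Accuracy Ratio = {accuracy_ratio(lgd_pred, lgd_obs):.0%}")
```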

5. Conclusions

This paper has presented an LGD case study in which, according to the requirements of
Basel2, the model has been developed on 10 years of historical real data of the Corporate and
Retail portfolio of an Italian commercial bank, one of the fifteen Italian banks that will be
supervised by the ECB. Placing particular stress on the economic component of the model,
the presented model highlights the determinant role of credit risk mitigants as recovery drivers, but
also of the geographical localization of loans, the type of product, the exposure at default and,
above all, the type of recovery process. This paper adds real value to the existing
literature because it follows all the Basel2 requirements and is linked to the Italian
banking context. Italy, unlike the rest of Europe, can be considered a more general and
complicated case of LGD computation because of its specific recovery process and the presence of
more than one default status (past-due, doubtful, and charge-off). The proposed model, underlining
the statistical significance of the type of recovery process as one of the main drivers of
recoveries, contributes to the understanding of the Italian recovery process in order to
define an even more common European framework, looking forward to the ECB Banking
Supervision.
References

1. Altman, E., Brady, E., Resti, A. and Sironi, A. (2005). The Link between Default
and Recovery Rates: Theory, Empirical Evidence, and Implications. The Journal
of Business, Vol. 78, No. 6, Chicago Press
2. Altman, E., (2008). Default Recovery Rates and LGD in Credit Risk Modeling
and Practice. Advances in Credit Risk Modelling and Corporate Bankruptcy
Prediction, (Stewart Jones ed.), Cambridge University Press.
3. Araten, M., Jacobs, M., and Varshney, P. (2004). Measuring LGD on
Commercial Loans: An 18-Year Internal Study. The RMA Journal
4. Banca d’Italia (2006). Circolare n. 263, Titolo 2 – Capitolo 1 (latest update).
5. Basel Committee on Banking Supervision (2004). International convergence of
capital measurement and capital standards. A revised framework, June.
6. Bastos J.A. (2009). Forecasting bank loans Loss Given Default. CEMAPRE
Working Papers
7. Bonini. S. and Caivano, G. (2014). Estimating loss-given default through
advanced credibility theory. The European Journal of Finance
8. Bonini. S. and Caivano, G. (2013). Survival analysis approach in Basel2 Credit
Risk Management: modelling Danger Rates in Loss Given Default parameter.
Journal of Credit Risk, Vol 9/1, Spring 2013
9. Carty L.V., Gates D. and Gupton G.M. (2000). Bank Loss Given Default.
Moody's Investors Service, Global Credit Research
10. Chalupa, R., and Kopecsni, J. (2009). Modeling Bank Loan LGD of Corporate
and SME Segments: A Case Study. Czech Journal of Economics and Finance,
Vol. 4 – 59
11. Dermine, J., and Neto de Carvalho, C. (2005). Bank loan losses-given-default: A
case study. The Journal of Banking & Finance – Elsevier, Ed.
12. Fernandez, P., Aguirreamalloa, J. and Corres, L. (2012). “Market Risk Premium
used in 82 countries in 2012: a survey with 7,192 answers”. IESE Business
School publication
13. Gupton, G. M., and Stein, R. M. (2005). LossCalc: Model for Predicting Loss
Given Default (LGD). Moody’s KMV
14. Gürtler, M. and Hibbeln, M (2011). Improvements in Loss Given Default
Forecasts for Bank Loans. Midwest Finance Association 2012 Annual Meetings
Paper
15. Grunert, J., and Weber, M. (2009). Recovery rates of commercial lending:
empirical evidence for German companies. Journal of Banking and Finance 33,
505–513
16. Jokivuolle, E. and Peura, S. (2003). Incorporating Collateral Value Uncertainty in
Loss Given Default Estimates and Loan-to-value Ratios. European Financial
Management
17. Khieu H.D., Mullineaux D.M. and Yi H. (2012). The determinants of bank loan
recovery rates. Journal of Banking & Finance
18. Leow M. and Mues C. (2012). Predicting loss given default (LGD) for
residential mortgage loans: A two-stage model and empirical evidence for UK
bank data. International Journal of Forecasting
19. Malkani R. (2012). A realistic approach for estimating and modeling loss given
default. The Journal of Risk Model Validation
20. Maclachlan, I. (2004). Choosing the Discount Factor for Estimating Economic
LGD, in Recovery Risk: The Next Challenge in Credit Risk Management by
Altman, E., Resti, A. & Sironi A.
21. Matuszyk A., Mues C., Thomas L.C. (2010). Modeling LGD for unsecured
personal loans: Decision tree approach. Journal of the Operational Research
Society.
22. Morrison, J.S. (2004), Preparing for Basel II Common Problems, Practical
Solutions, The RMA Journal.
23. Schuermann, T. (2004), What Do We Know About Loss Given Default?, Working
paper on Wharton Financial Institutions Center
24. Yang H.B. (2012). Modeling exposure at default and loss given default:
empirical approaches and technical implementation. Journal of Credit Risk
25. Yashkir, O., and Yashkir, Y. (2012). Loss given default modeling: a comparative
analysis. Journal of Risk Model Validation, Volume 7/Number 1, Spring 2013
(25–59)
26. Witzany, J., Rychnovský, M., Charamza, P. (2010). Survival Analysis in LGD
Modeling. IES Working Paper 2/2010.
