
Risk Chartis Market Report

IFRS 9
Risk.net October 2016



Contents

Overview
2 History of IFRS 9 and the three-stage approach
3 Time is tight: Implementing the three-stage approach
Rohit Verma, head of strategy for risk analytics at Oracle, discusses the challenges associated with the implementation of IFRS 9

Classification & measurement
5 The IFRS 9 classification and measurement model
6 Interview: Co-operation, the new imperative for risk and finance functions
Jean-Bernard Caen, subject matter expert, AxiomSL
8 Feature: A complex nut to crack
Move to expected loss impairment regime brings major challenges, say banks and accountants

Impairment
12 Impairment and the three-stage approach
The expected credit loss model under IFRS 9
13 A strategic approach to IFRS 9 impairment
Burcu Guner, senior director at Moody's Analytics, discusses the factors on which success in impairment calculation depends

Challenges
15 Building on current infrastructure
17 Roundtable: Data challenges in IFRS 9
Four experts in credit loss explore how IFRS 9 is set to fundamentally change the way banks do their accounting, and the data challenges associated with its implementation

Model building
22 Model building for IFRS 9
Incorporating the right techniques
23 Whitepaper: Managing earnings volatility and uncertainty in the supply and demand for regulatory capital: The impact of IFRS 9
A novel approach to modelling from Moody's Analytics allows better management of the interplay of supply and demand dynamics for regulatory capital, combining an economic framework with regulatory capital and new loss recognition rules

Cutting edge
26 Cutting edge: Loan classification under IFRS 9
IFRS 9 requires classifying non-defaulted loans in two stages depending on their credit quality evolution since initial recognition by the bank. Vivien Brunel proposes an optimal way to perform this classification

IT infrastructure
30 Implications for IT systems
31 Whitepaper: Preparing for the IFRS 9 game-changer: How to reconcile the demands of risk and finance on a single, flexible platform
How the challenges to IFRS 9 for financial firms can be overcome by using a single, integrated platform with a flexible data model

Profiles
34 Sponsor profiles
35 About Chartis Research Services

Risk Chartis
IFRS 9
Market report

Hassan Machmouchi, Commercial Director
hassan.machmouchi@incisivemedia.com
Hugh Stewart, Research Director, Chartis
hugh@chartis-research.com
Stuart Willes, Commercial Editorial Manager
stuart.willes@incisivemedia.com
James Hundleby, Commercial Subeditor
james.hundleby@incisivemedia.com
Celine Infeld, Managing Director
celine.infeld@incisivemedia.com
Lee Hartt, Group Publishing Director
lee.hartt@incisivemedia.com
Ben Cornish, Senior Production Executive
ben.cornish@incisivemedia.com

Incisive Media (UK)
Haymarket House, 28-29 Haymarket, London SW1Y 4RX
Tel: +44 (0)20 7316 9000

Incisive Media (US)
55 Broad Street, 22nd Floor, New York, NY 10004-2501
Tel: +1 646 736 1888

Incisive Media (Hong Kong)
14th Floor (Unit 1401-3), Devon House, Taikoo Place, 979 King's Road, Quarry Bay, Hong Kong
Tel: +852 3411 4900

Twitter: @riskdotnet
Facebook: facebook.com/riskdotnet
LinkedIn: Risk.net

Cover image: nednapa/Shutterstock

The Chartis content published in this report is fully independent. The content is based on research conducted by Chartis over the course of nine months, by a team of senior analysts. All findings in the report and published herein remain the intellectual property of Chartis Research. For further materials and research please visit www.chartis-research.com

Published by Incisive Risk Information Ltd
© Incisive Risk Information (IP) Ltd, 2016
All rights reserved. No part of this publication may be reproduced, stored in or introduced into any retrieval system, or transmitted, in any form or by any means, electronic, mechanical, photocopying, recording or otherwise, without the prior written permission of the copyright owners. RISK is registered as a trade mark at the US Patent Office.



Overview

History of IFRS 9 and the


three-stage approach
International Financial Reporting Standard 9 (IFRS 9) is a high-impact symbolic, operational, IT and organisational transformation
event for finance and risk: an arranged marriage that is turning an uncomfortable courtship and good intentions into a powerful,
successful partnership that is greater than the sum of its parts. It is one of a few interlinked, unavoidable initiatives in finance,
regulation, compliance and risk management that are catalysts to invest in sustainable best practice

IFRS 9 has foundations in common with a number of other key regulatory trends. Therefore, the foundations for an easier implementation of IFRS 9 can be achieved if an organisation has performed well, for example, in implementing:
• Basel Committee on Banking Supervision regulation 239 (BCBS 239) for data management
• Comprehensive Capital Analysis and Review, Dodd-Frank Act Stress Test and European Banking Authority stress testing
• Rigorous enterprise credit and counterparty risk management that is internal ratings-based (IRB)
• Close working practices, with risk management and finance sharing a common culture with regard to risk-adjusted performance management

Searching for solutions


Organisational support for implementing and running IFRS 9 will require change, through greater involvement of different departments that so far have not been as directly active in finance activities, particularly risk and regulatory reporting.
The marketplace, including large Tier 1 financial institutions, is turning to software vendors for solutions. However, this new marriage of finance and risk is not reflected in most of the software vendors' previous experience. There are very few one-stop shops that encompass the whole process from transaction origination to audited profit and loss and balance sheet. Therefore, there are also many integrated, multi-vendor solutions.
There are few fully complete software packages that reflect the target state required by 2018, with deliverables still occurring during 2016, which makes some proofs of concept reliant on vendor credibility and trust, or successes for the early deliverers.
The complex structures of large financial institutions demand large in-house development, implementation and operations teams, as well as extra support from external professional advisers. All other financial institutions can rely on the packaged software marketplace, but they require close support from the large audit firms, as well as extra consultancy, development and integration resources. Throughout 2016 and 2017, there will be a shortfall in suitably qualified and experienced support services teams in this market sector, which, as mentioned earlier, faces new methodological and organisational challenges.
Data to support impairment modelling and calculations is a critical success factor. If not assembled comprehensively, aggregated and normalised rigorously within a formal data management and well-engineered IT architecture, companies' results will be negatively affected. Many companies, particularly those that have not been through the IRB experience, will have to upgrade their IT architecture or rely on a vendor's software-as-a-service infrastructure.

History and status of the IFRS 9 standard

During the financial crisis, the Group of 20 tasked global accounting standard-setters with working towards creating a single set of high-quality global standards. In response to this request, the International Accounting Standards Board (IASB) and the Financial Accounting Standards Board (FASB) began working together on the development of new financial instruments standards. The IASB decided to accelerate its project to replace International Accounting Standard 39 (IAS 39), and subdivided it into three main phases:
• Classification and measurement
• Impairment
• Hedge accounting

At the beginning of the project, the FASB and IASB worked jointly on both the classification and measurement and the impairment projects. However, due to the lack of support for a three-stage approach for the recognition of impairment losses in the US, the FASB developed a single measurement model, while the IASB decided to continue with the three-stage model. In addition, the FASB decided it would not pursue a classification and measurement model similar to the IASB's. As a consequence, IFRS 9 is not a converged standard, and is therefore not applicable under US Generally Accepted Accounting Principles; US financial firms should refer to the guidelines from the FASB.
On 24 July 2014, the IASB published the complete version of IFRS 9, Financial Instruments, which replaced most of the guidance in IAS 39 and is applicable to all jurisdictions operating under IFRS. IFRS 9 is effective for annual periods beginning on or after 1 January 2018, subject to endorsement in certain territories.

The three-stage approach

Time is tight
Implementing the three-stage approach
A cultural shift is occurring beneath banks as they will be expected to accurately handle vast amounts of granular data and adopt
a new flexible and forward-thinking outlook. Rohit Verma, head of strategy for risk analytics at Oracle, discusses the challenges
associated with the implementation of IFRS 9

One of the aims of IFRS 9 was to reduce pro-cyclicality in provisioning for losses on loans and other assets. How far do you think it has achieved that ambition, or will it once it is implemented?
Rohit Verma: I think the jury is still out on that particular question, but all the experts agree it is going to make the financial statements much more relevant and much closer to reality than they were prior to implementation. Whether or not it reduces pro-cyclicality, whether or not it makes it less volatile, is something only time will tell once IFRS 9 is implemented and there are numbers available to analyse and to see whether it has, in fact, reduced cyclicality.

It is a big standard, a lot is involved and there is not much time to implement it. Can you outline the broader standard, which splits into classification, impairment and hedge accounting?
Rohit Verma: Classification is essentially the phase where banks identify which assets go for fair valuation and which assets go for amortised cost. This is important because the whole idea is that there has to be minimal opportunity for banks to move assets around and gain some kind of an arbitrage benefit.
Phase two is impairment, where the maximum impact has been in coming up with a completely new model for calculating the provisions. We had the incurred-loss model that was being followed, and now banks are required to move to an expected-loss model, which is more forward-looking and incorporates projections about the economy into the calculation process.
The third phase is hedge accounting, where the effectiveness of a hedge is essentially made one of the key criteria of deciding whether or not the hedge gets used in the accounting process.

To focus on impairment: that's also split into three stages?
Rohit Verma: Yes, so the first step of the impairment process is to classify each and every account into one of the three stages: stage one is an account that is active and is going fine; stage two is an account that is showing signs of distress; and stage three is an account or asset that has objective evidence of impairment.
This entire classification process is critical because the stage to which an account gets mapped will determine the provision calculated for that account. That is crucial: you have to bring qualitative aspects into the assessment and forward-looking projections into the assessment of a stage, as well as actual information about the counterparty that goes to determine whether an account is stage one, stage two or stage three.
There are many manual workflow processes that need to be implemented. It's not just a rules-based process: there are people who will have to actually step into the process and validate that the stage classification has gone as per the model.

With IFRS 9 being implemented at the start of 2018, how well understood at this time is the three-stage process?
Rohit Verma: I think most banks have understood it pretty well; they know what needs to be done. The challenge is in implementing the understanding, because there are so many aspects and 18 months is just not enough time. But from a purely understanding perspective, I think most organisations know what they want to do.

How will it be handled if people are not already uniformly at the starting line? Will there potentially be a kind of phased approach?
Rohit Verma: Absolutely. I think the stage identification process is going to start off with some basic rules-based approach. The stage identification process establishes whether there has been deterioration in the quality of the asset from the date of its inception, except for assets that are purchased or originated credit impaired.
This rules-based approach to identify a stage is going to evolve over time. It's going to become more sophisticated; it's going to bring more workflows into the process so that the classification becomes consistent over time and across portfolios. To begin with, most banks may implement a rather more simple version of the model. Once they are comfortable they have the systems in place, they're going to start tweaking the model and make it more sophisticated.

How would you summarise the changes that banks will need to make to their data internally, and the way they project to meet that requirement?
Rohit Verma: The biggest change to data that banks will have to make is to start handling granular account-level data. Finance organisations are typically not used to granular data; they're very comfortable with aggregated data, but with the stage classification and expected loss calculations of IFRS 9 they will have to learn to handle granular data. That will be a cultural shift.
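As a purely illustrative sketch of the kind of rules-based stage identification described above (the thresholds, field names and days-past-due backstop below are assumptions made for the example, not Oracle's methodology):

```python
from dataclasses import dataclass

@dataclass
class Exposure:
    pd_at_origination: float  # 12-month PD estimated at initial recognition
    pd_current: float         # 12-month PD at the reporting date
    days_past_due: int
    credit_impaired: bool     # objective evidence of impairment

def allocate_stage(exp: Exposure,
                   deterioration_multiple: float = 2.0,
                   backstop_days: int = 30) -> int:
    """Assign an IFRS 9 stage using simple, illustrative rules.

    Stage 3: objective evidence of impairment.
    Stage 2: significant deterioration since initial recognition, proxied here
             by the current PD exceeding a multiple of the origination PD, or
             by a days-past-due backstop.
    Stage 1: everything else.
    """
    if exp.credit_impaired:
        return 3
    deteriorated = (exp.pd_current >= deterioration_multiple * exp.pd_at_origination
                    or exp.days_past_due > backstop_days)
    return 2 if deteriorated else 1

# A loan whose 12-month PD has tripled since origination lands in stage 2.
loan = Exposure(pd_at_origination=0.01, pd_current=0.03,
                days_past_due=0, credit_impaired=False)
print(allocate_stage(loan))  # -> 2
```

In practice the deterioration test would also draw on the qualitative, forward-looking inputs Verma describes, rather than a single PD multiple.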


What role will technology play in the transition to IFRS 9, and how
should banks be approaching their vendors?
Rohit Verma: Now that you have to handle millions of records as an institution,
there is no option other than having a really robust technology solution in
place to go live with IFRS 9. In the old world it was fine: you were running on
largely aggregated data and you could probably run your entire provisioning
calculations on a spreadsheet. But now, having to perform granular-level
calculations, you need to have a fairly advanced solution in place if you have any
ambition of going live with the 2018 date.
As you implement this solution, a couple of things need to be kept in mind.
First is that IFRS 9 itself is going to evolve over time. There could be changes
brought in by the regulator or by the International Accounting Standards Board;
there could be changes in the model that you, as an organisation, will bring
to the calculation. As you implement the solution, you must ensure you have a
solution in place that can evolve as you, as an organisation, change the model,
or changes are pushed onto you from the external world.

So, flexibility and customisation are going to be key?


Rohit Verma: Yes, the ability to make changes and tweaks to the methodology
as we move along. The third thing of some relevance is the fact that a black box
point solution for IFRS 9 would really be a waste of an opportunity. I recommend
a fairly strategic approach to the whole project so that once you are past the
2018 deadline you can start thinking of leveraging this ecosystem to do things
such as enterprise stress testing, regulatory capital and economic capital, and in
that process comply with regulations like BCBS 239.

In that sense, this also provides an opportunity for banks to align risk and finance, because risk has operated on granular data for years. It's the finance organisation that now has to adapt, and it provides an opportunity for banks to bring risk and finance data together into a single environment.

This is obviously going to impact on finance departments because they're going to need to have a more granular view of data. What impact will the standard have on banks' other business functions such as risk, accounting, audit and governance arrangements?
Rohit Verma: Regulations such as BCBS 239 talk about bringing data together. Not just for the sake of data, but to ensure that there is consistency in the use of data across the organisation and that results are much more accurate and reconciled than they were previously.
There are short-term implications and then there is the long-term strategic impact of such a project. In the short term, what I think we're going to see is the beginning of this convergence of data across different streams: risk and finance. There is going to be more emphasis on the governance aspect of data.
In the long term we will see the calculations or the IFRS 9 project starting to have an impact on the business aspect as well. So, product design will have to incorporate the impact of IFRS 9 and banks will have an opportunity to fine-tune product pricing because the expected loss numbers are closer to reality; there is going to be an impact on the capital plans because of the new numbers that come out of IFRS 9. It is going to have a long-term impact on all three of these areas.

The way accounting standards are going to have to change and with the impact on finance and risk, it will be quite easy to get bogged down in compliance. From a business perspective, how should banks be positioning themselves? What are the changes they might need to make ahead of this standard?
Rohit Verma: I think it will give them an opportunity to make improvements in their businesses. Take product design and product pricing as an example: in the old world you were dependent on an incurred-loss model, which was really backward-looking. Now we are talking about a forward-looking expected-loss model, which is a lot closer to reality.
That means you can make the pricing a lot finer for your customers than it was under the old approach, because the number you have for your expected loss is a lot closer to reality and is also reflective of the kind of economy that you may expect in the future, rather than backward-looking.

Rohit Verma
Rohit Verma is senior director at Oracle, where he is responsible for risk analytics strategy. He currently manages strategy and customer relationships for applications covering liquidity risk, IFRS 9 and FRTB. Previously, he was product manager for capital adequacy, enterprise stress testing and asset/liability management applications. In his tenure at Oracle, Rohit has also consulted on various risk and finance projects spanning complex multi-jurisdictional enterprise programs to tactical deployments. Prior to Oracle, Rohit held a number of positions in capital markets and corporate banking organisations.

Classification & measurement

The IFRS 9 classification


and measurement model
IFRS 9 provides a new model for the classification and measurement of debt financial assets, driven by the business model in which the assets are held and their cash flow characteristics, thus dictating the applicable accounting mechanism

Any equity financial instruments are required to be carried on the balance sheet at fair value. The rules relating to the classification and measurement of financial liabilities contained in IAS 39 are carried almost unchanged into IFRS 9, with one exception relating to the recognition of own credit gain, whereby any fair value gains and losses on liability measurement are reported.
For the mixed-measurement model, the three main accounting mechanisms are:
• Amortised cost
• Fair value through profit and loss (P&L)
• Fair value through other comprehensive income

Organisational support for implementing and running IFRS 9 will require change, through greater involvement of different departments that, until now, have not been as directly active in finance activities. These particularly include risk and regulatory reporting.
The marketplace, including some large Tier 1 financial institutions, is turning towards the software vendors for solutions. However, this new marriage between risk and finance is not reflected in most software vendors' previous experience. There are very few one-stop shops that encompass the whole process from transaction origination to audited P&L and balance sheet. Therefore, there are also many integrated, multi-vendor solutions.
There are few fully complete software packages that reflect the target state required by 2018, with deliverables still occurring during 2016, which makes some proofs of concept reliant on vendor credibility and trust, or successes for the early deliverers.
Large financial institutions' complex structures demand large in-house development, implementation and operations teams, as well as extra support from all of their external professional advisers. All other financial institutions can rely on the packaged software marketplace, but require close support from the large audit firms as well as extra consultancy, development and integration resources. Throughout 2016 and 2017, there will be a shortfall in suitably qualified, experienced support services teams in this market sector, which, as mentioned earlier, has some new methodological and organisational challenges.
Data to support impairment modelling and calculations is a critical success factor. If this data is not assembled comprehensively, aggregated and normalised rigorously within a formal data management and well-engineered IT architecture, then firms' results will be negatively affected. Many firms, particularly those that have not been through the internal ratings-based experience, will have to upgrade their IT architecture or rely on a vendor's software-as-a-service infrastructure.
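As a minimal sketch of how the business-model and cash flow tests could drive the mixed-measurement classification listed above (the function, its input labels and the "solely payments of principal and interest" flag are simplifications introduced for illustration, not a prescribed decision tree):

```python
def classify_debt_instrument(business_model: str, sppi: bool) -> str:
    """Map a debt instrument to an IFRS 9 measurement category.

    business_model: assumed to be 'hold_to_collect', 'hold_to_collect_and_sell'
                    or 'other'.
    sppi:           True if contractual cash flows are solely payments of
                    principal and interest (the cash flow characteristics test).
    """
    if not sppi:
        return "fair value through profit and loss"
    if business_model == "hold_to_collect":
        return "amortised cost"
    if business_model == "hold_to_collect_and_sell":
        return "fair value through other comprehensive income"
    return "fair value through profit and loss"

print(classify_debt_instrument("hold_to_collect", sppi=True))  # -> amortised cost
```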


Co-operation: the new imperative for risk and finance functions
In the face of pressures from the incoming IFRS 9 regime, walls between risk and finance divisions must tumble and co-operation
and communication take centre stage, says Jean-Bernard Caen, subject matter expert at AxiomSL, who fleshes out this challenge in a
conversation with Risk.net

IFRS 9 has been designed to value assets and liabilities in a more risk-sensitive manner than under the prevailing IAS 39 framework. The pressures of the incoming regime can only be met if the risk and finance divisions within banks learn to work more closely together. Presently, risk and accounting data is aggregated separately and obeys different rules and norms. Yet inputs from across these divisions are needed to generate IFRS 9 reports.

How would you characterise the challenge of implementing IFRS 9 for financial institutions?
Jean-Bernard Caen: Banks have assigned a high priority to their implementation projects and allocated significant budgets to them. However, these projects are not proceeding as efficiently as they should. The major reason is the difficulty getting risk and finance professionals to communicate. They have been accustomed to working independently, and they have their own language, culture and data frameworks that do not fit easily together. This is a real constraint.
In this market landscape of analysis and regulatory compliance, financial institutions that are successfully implementing BCBS 239 and data management within their organisations are better prepared for IFRS 9. AxiomSL enterprise data management capabilities include the ability to map, aggregate and enrich source data and models across risk and finance, automate workflow process, reconcile and validate, as well as delivering visualisation, collaboration and reporting tools.

How will the relationship between the chief risk officer (CRO) and chief financial officer (CFO) need to change in an IFRS 9 environment?
Jean-Bernard Caen: They will have to learn to communicate. At present, they have very clear and separate responsibilities. The CFO signs off on the accounts and is used to talking in terms of assets and liabilities, precise values. In contrast, the CRO is in charge of ensuring that the firm is compliant with regulatory requirements and he speaks the language of probabilities and statistics. Up to now, accounting has been mainly backward-looking; with the introduction of risk sensitivity, forward-looking risk items will change the nature of accounting. Under IFRS 9 they will have to co-operate. The CRO will be delegated responsibility to set up the credit risk models to assess asset impairments, while the CFO will take charge of mapping those model outputs onto the published accounts.
At present, accounting is straightforward for an asset: you book the initial value and, under certain conditions, show amortisation. Under IFRS 9, accounting becomes much more complex. The value of an asset will be the discount of projected cashflows. Moreover, these projections will need to exhibit a granularity beyond what the risk function is used to producing under the exposure-at-default approach. Both functions will have to embrace this new logic and come to a shared understanding.

What are the challenges involved in producing the correct assessment of expected credit loss?
Jean-Bernard Caen: With IFRS 9, the International Accounting Standards Board is demanding that firms take an economic approach to credit risk. This means calculating the risk of loss on an asset up to maturity with proper recovery estimates and a forward-looking stance, rather than relying on simplistic models, such as the regulatory internal ratings-based model. Essentially, banks must completely review how they assess credit risk: an annual review of credit ratings will no longer be sufficient.
The second challenge will be to assess the probability of default to maturity, rather than simply to a one-year horizon, which requires access to vast amounts of data to generate these estimates. The third challenge will be for banks to include forward-looking scenarios, as under IFRS 9 they will have to be explicit about how they see the economy evolving.
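As a rough illustration of that second challenge (the notation is ours, not AxiomSL's): if $PD_i$ is the probability of default in year $i$, conditional on survival to the start of that year, the cumulative probability of default out to maturity $T$ is

$$PD_{\mathrm{cum}}(T) = 1 - \prod_{i=1}^{T}\bigl(1 - PD_i\bigr),$$

so a lifetime assessment needs the full term structure $PD_1, \dots, PD_T$ rather than a single one-year figure.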


What should financial institutions prioritise: modelling expected losses under IFRS 9 or ensuring data governance is up to scratch?
Jean-Bernard Caen: Data. Without the right data, model outputs will be flawed. Modelling is more "sexy". Banks have plenty of people skilled at modelling and too few dedicated to data management. The large amounts of data needed to satisfy IFRS 9 requirements are either not stored in firms' existing databases or, if they are, have not been used for years and may prove unreliable.
A focus on modelling can set banks down the wrong path. For example, they have to transition from using their one-year through-the-cycle probability-of-default models to point-in-time models and lifetime models for assessing impairments. However, what I have learned from industry surveys is that many banks are considering making this transition by merely installing a multiplier into their one-year models. That is not the right way to go about this. The right way is to start with the data: analysing the term structure of projected defaults by product and counterparty type, and sourcing useful economic data to produce forward-looking scenarios. Some banks are considering asking rating agencies to provide this data, which is a possible first step.

What is the relationship between BCBS 239 and IFRS 9, and how should institutions factor this into their IFRS 9 implementation plans?
Jean-Bernard Caen: BCBS 239 sets out rules and principles that banks must apply to their risk data aggregation processes and reporting. To be compliant, banks must adopt processes to ensure the quality and reliability of their risk data, conduct more granular analyses of this data and increase the frequency of data reviews. In short, firms have to update their IT infrastructures.
However, this revamp does not apply to accounting data. Yet IFRS 9 will require the use of data that is being constrained by BCBS 239. Once again, this means there will need to be greater co-operation between risk and accounting functions. The ideal scenario is one where BCBS 239-constrained data is easy for the accounting function to access and any questions as to the origin of the data are readily answerable.

How can risk and accounting data be reconciled to fulfil IFRS 9 requirements?
Jean-Bernard Caen: This data will always be articulated differently. The goal should be for banks to be able to articulate accounting data with risk data. This requires a uniform segmentation of assets and liabilities into products and portfolios, one that is coherent across risk and finance.
From a technology perspective, the IFRS 9 requirement to converge risk and finance environments is now more compelling than ever. Duplication of work, such as parallel processes making the same set of calculations without co-ordination with other teams, should be minimised, and these processes should be automated to improve the soundness and reliability of the bank's results.
AxiomSL's data-driven structure enables banks to adopt a strategic posture beyond merely complying with minimum reporting and compliance requirements. AxiomSL's change management platform acts as a catalyst to move to a converged risk and finance data environment, where close co-ordination with other data-related initiatives, such as Basel III, stress testing, BCBS 239, Comprehensive Capital Analysis and Review regulation, and the Fundamental Review of the Trading Book, provides senior managers with the ability to monitor redesigned processes and enforce control frameworks at business unit and product levels.

How important is it for institutions to be able to trace the lineage of data inputs into their IFRS 9 systems?
Jean-Bernard Caen: The way most banks work today is to originate data in their production systems and run them through an extract, transform and load (ETL) process. The data is then fed into an intermediary database from which the bank produces its regulatory and management reports. The problem with this approach is that it is difficult to maintain data quality and lineage when data is transformed by the process. Furthermore, if there is a change at the production system level, then the ETL processes have to be modified accordingly, meaning that frequent and painful reviews are the norm.
This also means that the data trail can go cold. If a production system is changed, and data generated under the previous system is warehoused separately, you lose the ability to data mine and trace that data to its original source. This limits the flexibility and resilience of the data environment.
AxiomSL delivers a change management approach that demonstrates full data and process lineage across enterprise core financial, treasury, operations and risk management functions. Generated reports can be programmed so a user can trace a data point straight to the initiating production system. Put simply, you can take any current or historical report and determine where the data originally comes from, allowing for seamless data mining. You must still reconcile changes in the production system with the reporting system, but this automation makes isolating a data point's lineage much simpler.
Today, banks face important choices about how to respond strategically to the convergence of risk and finance data environments. They have to reduce business-as-usual costs and achieve regulatory and accounting compliance, all the while enhancing business decision processes; adopting temporary tactical solutions will impose structural drags on a firm's ability to meet its expected results and manage its future obligations.

About the author
Jean-Bernard Caen is a subject matter expert at AxiomSL specialising in risk and finance. He is responsible for supporting AxiomSL's client base in analytical and reporting issues on IFRS 9 and providing direction and insights into the firm's professional services and product development teams.
Prior to joining AxiomSL, Jean-Bernard was head of economic capital and strategy for Dexia Group for 12 years, where he was in charge of Basel II and Pillar 2 implementation, and risk/finance co-operation.
In 1990, Jean-Bernard founded Finance & Technology Management, a management consulting firm he ran for 12 years as chief executive officer. As such, he directed numerous assignments for European financial institutions in the areas of shareholder value, risk management, capital allocation and asset-liability management.
Jean-Bernard is a member of the Professional Risk Managers' International Association France executive committee, of the Association Française de la Gestion Financière management board and teaches at the French National School of Economics and Statistics. He is a French civil engineer and graduated from Massachusetts Institute of Technology.


A complex nut to crack


Move to expected loss impairment regime brings major challenges, say banks and accountants. By Michael Hegarty

Need to know
• From January 2018, IFRS 9 will usher in a forward-looking expected loss accounting regime for assets subject to impairment, such as loans.
• IFRS 9 will increase banks' loan-loss provisions, but it is also proving tough to implement due to the wide-ranging changes needed and a lack of detail on how this should be done.
• Broadly, banks' approaches fit into two categories: some are choosing to use complex models involving Monte Carlo simulations, while others are assigning weights to future scenarios using expert judgement.
• "The whole thing is crystal ball gazing at the end of the day," complains one IFRS 9 project manager at a large European bank. "The longer you go out, the more uncertain it becomes, and if everyone had perfect hindsight we'd all be on a beach drinking tequila."
• Some banks and policy experts speculate that a three-stage system for classifying assets under IFRS 9 could increase secondary market liquidity in loans.

With perfect hindsight, banks might have been spared the pain of the financial crisis. But in the absence of a crystal ball, magic mirror or time machine, such a level of clairvoyance is hard to achieve. In its absence, banks and regulators have searched for ways of making sure the possibility of future downturns is considered, for instance, by using counter-cyclical capital buffers, more pessimistic modelling assumptions and regular supervisory stress tests.
Accountants are also doing their bit. From January 2018, IFRS 9 will usher in a forward-looking expected loss accounting regime for assets subject to impairment, such as loans. The idea is to force banks to consider the impact of potential adverse scenarios before they occur and ensure adequate reserves are set aside to cover them. That would be a contrast to banks' response to the 2008 financial crisis, when they were criticised for being too slow to recognise losses.
Once implemented, IFRS 9 is expected to significantly increase banks' loan-loss provisions (see box: Capital impact). But even getting to that point is going to be tough, say banks. For one thing, there is confusion over what will pass muster under the rules, with the largest global accounting firms said to be offering varying interpretations. Putting the standard in place will mean revamping accounting systems and the way banks are organised, and it could also entail modifying credit risk models, hiring quants and improving governance.
"I don't see much of an industry convergence," says Wolfgang Reitgruber, head of credit risk modelling at UniCredit in Austria. "I would expect accountants will see pretty diverse environments after 2018, so lots of different approaches, which will make it increasingly difficult to compare one bank to the other."
Sources close to the International Accounting Standards Board (IASB) say the principles-based nature of the standard means it is up to banks, regulators and auditors to agree on best practice. But for one IFRS 9 project manager at a large European bank, that approach brings little comfort. "It's frustrating in some sense that we're at this level of the project and suddenly these issues are still dangling like this with 18 months to go-live," he says.
The new expected loss regime is among a number of changes in IFRS 9 that are expected to have a big impact on banks. The new standard is the culmination of a decade-long project to reform accounting for financial instruments, which covers three areas: classification and measurement; impairment; and hedge accounting. Impairment is the last of these to be addressed.


Originally, IFRS 9 was a joint project between the two major accounting standard-setters, the IASB and the US Financial Accounting Standards Board (FASB), and part of a wider attempt at convergence between US and global accounting rules. But following disagreements between the two groups, the FASB has since been working on its own version of the standard that includes its own rules on impairment, which are expected to be unveiled in the first half of this year.
Under the IASB's prior standard on accounting for financial instruments, known as IAS 39, losses on financial assets subject to impairment are not recognised until there is evidence that they have become impaired. IFRS 9 represents a radical departure from this philosophy, forcing banks to make greater and earlier provisions against losses.
Under IFRS 9, banks will have to immediately set aside 12-month expected credit losses from the time any unimpaired asset is originated or purchased. They must then track the asset's changing credit risk at each financial reporting date using a three-stage process.
For stage one assets, banks would set aside 12-month expected credit losses and calculate interest revenue based on the gross carrying amount. If a stage one asset were to undergo a significant increase in credit risk, it would move to stage two and the bank would have to begin setting aside expected losses for the entire lifetime of the asset. If the asset were to become credit impaired, as reflected by a missed payment or a broken covenant, for example, it would move into stage three. In addition to setting aside lifetime expected losses, stage three requires that interest revenue is calculated on the net carrying amount, meaning expected losses must be taken into account (see table A).

A. The three stages of IFRS 9
Stage 1: 12-month expected losses; interest calculated on gross carrying amount
Stage 2: lifetime expected losses; interest calculated on gross carrying amount
Stage 3: lifetime expected losses; interest calculated on net carrying amount
Source: IASB

Once IFRS 9 is in place, some banks and policy experts believe this three-stage approach could encourage banks to trade their loan portfolios, because stage two assets may revert back to stage one by being purchased by a rival bank, allowing the buyer to take lower loss provisions (see box: IFRS 9: good news for loan liquidity?).
Where things get complicated is when working out expected losses. IFRS 9 says banks should provide an estimate of expected credit losses to reflect "an unbiased and probability-weighted amount that is determined by evaluating a range of possible outcomes". Moreover, banks will need to back their estimates up by considering all "reasonable and supportable information", including that which is forward-looking.
It is not clear what this means in practice. Some had hoped a single best estimate of expected losses might fit the bill, but those expectations were dashed at a December 2015 meeting of the IFRS Transition Resource Group for Impairment of Financial Instruments, a discussion forum for banks, auditors and the IASB, which seeks to tackle implementation challenges. The group concluded that banks should incorporate multiple scenarios, rather than the most probable future one, because loan losses would be distributed asymmetrically around the most likely outcome. In other words, a slightly more negative economic outlook would imply a larger increase in defaults than the decline that would be experienced with an equally positive forecast.
"It's going to be unlikely in many cases that you can just have a single best estimate forecast," affirms Chris Spall, London-based global leader on IFRS financial instruments at accountancy firm KPMG and a member of the IFRS group. "You're going to have to think about what the other possibilities are and probabilities that attach to them, and to what extent those different outcomes could have an asymmetric impact on your estimate."
The group reiterated the principles-based nature of IFRS 9, and said banks should consider information from different sources and look at several different scenarios. However, it stopped short of specifying exactly how many scenarios they should use, saying instead that a "representative sample" would be needed.
It is the lack of detail in IFRS 9 that is confounding banks, says Scott Aguais, a London-based credit risk consultant who helped to build Basel II credit risk models at Barclays and Royal Bank of Scotland. Compared with the implementation of Basel II, for example, the level of guidance being offered to banks on IFRS 9 compares poorly, he says.
"A year and a half before go-live there was a three-inch-thick Basel II bible that everybody had sitting on their desk, and they could go and check the specific requirements," he says. "IFRS 9 is not as far along and the key parts are kind of written down on two pieces of paper."
It means banks are pursuing different ways of generating unbiased and probability-weighted estimates of expected losses. Broadly, the approaches fit into two categories: some are choosing to invest in complex models using Monte Carlo simulations to estimate losses using forward-looking information, while others are assigning weights to various future scenarios, using expert judgement and techniques borrowed from scenario analysis and stress-testing.
Aguais and other credit risk modellers advocate using Monte Carlo simulations as the purists' solution to IFRS 9. They argue the technique's ability to simulate a huge number of potential future outcomes and come up with best estimates of credit risk metrics stands a better chance of meeting the standard's language.
Monte Carlo simulations involve random sampling being carried out many times over, based on historical data, allowing thousands of potential future scenarios to be mapped, with expected losses calculated for each one. The probability of these expected losses would be incorporated and all the losses aggregated to give a single, probability-weighted average.
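A minimal sketch of the Monte Carlo route described above, assuming a single macroeconomic factor that scales baseline PDs (the one-factor form and every parameter are invented for illustration and are not drawn from any bank's model):

```python
import numpy as np

rng = np.random.default_rng(seed=42)

# Illustrative three-loan portfolio: exposure at default, loss given default
# and a baseline 12-month probability of default (all figures made up).
ead = np.array([1_000_000.0, 250_000.0, 500_000.0])
lgd = np.array([0.45, 0.60, 0.40])
pd_base = np.array([0.02, 0.05, 0.01])

n_scenarios = 100_000

# One assumed macro shock per scenario; a positive shock raises every PD,
# so losses are distributed asymmetrically around the baseline outlook.
shock = rng.normal(loc=0.0, scale=0.25, size=n_scenarios)
pd_scen = np.clip(pd_base * np.exp(shock[:, None]), 0.0, 1.0)

# Simulate defaults in each scenario and aggregate the portfolio loss.
defaults = rng.random((n_scenarios, ead.size)) < pd_scen
losses = (defaults * ead * lgd).sum(axis=1)

# The probability-weighted expected credit loss is the average simulated loss.
print(f"Expected credit loss: {losses.mean():,.0f}")
```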


Notwithstanding this, most banks that spoke to Risk.net had gone down the second, judgement-based route. Both types of approach entail challenges. "There are lots of ideas and lots of things being tried, but I don't think there is any industry consensus yet on the right way to do it," says Tom Millar, London-based accounting partner and head of the global IFRS banking survey at Deloitte. "It's extremely difficult to practically do something that is as detailed, granular and data-based as they might like to do, partly because of the time, and partly because information and the methodologies simply don't exist yet."
While both approaches require changes to accounting systems, firms choosing to use Monte Carlo simulations must ensure they have the right IT in place to do the computational heavy lifting. The use of complex modelling techniques may appear to be a black box to senior management, says the IFRS 9 programme manager, who warns that a reliance on historical data could make the models vulnerable to future market crises.
On the other hand, banks going down the judgement-based route also have work to do. Vivien Brunel, head of risk and capital modelling at Societe Generale in Paris, says there is a debate going on about how to blend several different scenarios into the bank's models with the required probability weighting. "In theory [the forward-looking information] should account for all the possible outcomes in any type of scenario, but it's very difficult to implement like this, because you need to assign probabilities to the future economic scenarios, which is difficult in terms of model accuracy," he says.
At the moment, banks tend to base their loan-loss estimates on a single baseline scenario, so working with multiple visions of the future is a new frontier for bank economists, and one that makes some uncomfortable. "Our economists are saying they are unable to assign these probabilities," says Brunel.
Once banks have come up with probabilities for various scenarios, they must decide how to incorporate these as expected losses. One approach being taken is to attach probabilities to loan-loss metrics, such as probability of default (PD) and loss-given default (LGD). Another is the use of probabilities to weight macroeconomic variables.
To obtain unbiased estimates from a panel of economists, a strict challenge process is needed, say experts. "There are huge amounts of new risk and estimation processes and revised governance procedures that need to be added, because of the judgemental nature of forward-looking inputs, to have unbiased estimates," says David Schraa, regulatory counsel at the Institute of International Finance in London. "It's a very complex nut."
Potentially, one way to make IFRS 9 compliance easier is to reuse existing credit risk models. Banks following the internal ratings-based (IRB) approach to Basel II use several standard credit risk parameters, such as PD, LGD and exposure-at-default. The expected credit loss is the mathematical product of these measures.
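To make that arithmetic concrete (a schematic restatement, not wording from the standard): for a single exposure the 12-month expected credit loss is often approximated as the product of the three parameters, and the scenario-weighted figure is an average across economic scenarios $s$ with weights $w_s$:

$$\mathrm{ECL}_{12\mathrm{m}} \approx PD_{12\mathrm{m}} \times LGD \times EAD, \qquad \mathrm{ECL} = \sum_{s} w_s\,\mathrm{ECL}_s, \quad \sum_{s} w_s = 1.$$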

CAPITAL IMPACT
The immediate impact of IFRS 9 will be to increase the provisions held by banks against future losses. In a speech to a London conference in September last year, Hans Hoogervorst, chairman of the International Accounting Standards Board, said the increase would be of the order of around 35%, although this is expected to vary widely between banks.
Critically, the effect will also depend on how the regulators respond to the new standard.
Banks using the Basel II internal ratings-based (IRB) approach to credit risk can obtain limited relief against loan-loss provisions. Using a mechanism known as the provision miss, banks must compare their expected losses over a 12-month period with any accounting provisions that correspond to those losses. Banks with provisions larger than their expected losses can recognise the difference in Tier 2 capital for up to 0.6% of their credit risk-weighted assets, although this number may be constrained by their national regulator. Conversely, if banks' 12-month expected losses are larger than the relevant provisions, they must put aside additional Tier 1 capital to cover the difference.
Under IFRS 9, banks say the requirement to set aside lifetime expected losses for assets that are impaired could leave them out of pocket, because only provisions applying to 12-month expected losses are considered. Banks using the standardised approach to credit risk would fare even worse, because there is no provision-miss calculation in the simpler rules.
"What we will have is no recognition of these additional provisions that have been taken, other than the potential under the current rules for excess provisions," says Adrian Docherty, head of the financial institutions advisory team at BNP Paribas in London. "My current understanding is that IFRS 9 will deplete common equity Tier 1 capital dollar for dollar."
Banks say the Basel Committee on Banking Supervision should change its credit risk rules to account for the mismatch, but the committee is not expected to take any action until after the Financial Accounting Standards Board (FASB) has released its own impairment rules in the US, expected in the first half of this year.
On January 27, the European Banking Authority (EBA) launched its own impact assessment of IFRS 9. David Grünberger, head of IFRS 9 enforcement at the Austrian regulator, Finanzmarktaufsicht, who is closely involved with the EBA's work on IFRS 9, says the exercise will involve hundreds of Europe's largest banks and will be used by the Basel Committee to help inform its decision on future capital rules. The first results are expected in mid-2016, he says, with a second round of results probably in early 2017.
Grünberger is adamant that any changes to credit risk capital rules must be finished by the time IFRS 9 is implemented, and says the European Union should be prepared to go it alone if necessary. "If the Basel Committee does not do it in time we will do it on a European scale," he says. "There is pressure on the Basel Committee to solve the issues during the next year or so, because IFRS 9 is going to be implemented in less than two years' time and we need a solution."
If the committee fails to act, a possible EU solution might involve narrow amendments to the Capital Requirements Regulation or guidance from the EBA on how regulators should interpret existing rules, says Grünberger.
The Basel Committee declined to comment.


IFRS 9: GOOD NEWS FOR LOAN LIQUIDITY?

Some banks and policy experts speculate that the three-stage system used under IFRS 9 could increase secondary market liquidity in loans.
Under IFRS 9, banks must set aside 12-month expected credit losses from the moment an unimpaired asset is originated or purchased. If it undergoes a significant increase in credit risk, the asset moves from stage one to stage two, and the bank will have to begin setting aside expected losses for the entire lifetime of the asset.
If the asset were to become credit impaired, as reflected by a missed payment or a broken covenant, for example, it would move into stage three.
Moving from stage two to stage three has no impact on expected losses, but means interest revenue must be calculated on the net, rather than gross, carrying amount.
Theoretically, if a bank held a loan in stage two and it was purchased by another bank, it would be reset to stage one. This would allow the purchasing bank to set aside 12-month expected losses and not the full lifetime provisions that would have been held by the selling institution.
"The innovation here is that if someone steps in, the new owner of the asset potentially has a dramatically different expected loss calculation than the originator," says a US-based policy expert and former government official.
Some say the resultant increase in loan liquidity could be a positive development for European banks, which have struggled to shed their non-performing loans (NPLs) in recent years.
"If you're looking to incentivise banks to unload non-performing assets there has to be a mechanism to ensure the acquisition of NPLs and related provisioning has some benefit. That 12-month reset on acquisition could be very useful in incentivising asset acquisition at a discounted rate," adds the policy expert.
Adrian Docherty, head of the financial institutions advisory team at BNP Paribas in London, says some more sophisticated European banks are already preparing loan portfolios to be traded as a result of IFRS 9. However, he refuses to be drawn on which particular firms are involved. He says the rules on lifetime provisions will incentivise banks to hold loans in the trading book rather than the banking book, and he expects some trades involving these assets to emerge as early as next year.
Greater liquidity in NPLs is something that would be welcomed by European regulators, says David Grünberger, head of IFRS 9 enforcement at Finanzmarktaufsicht, the Austrian regulator. But he does not believe the impact will be as significant as others hope. High transaction costs are likely to impede liquidity, he says, while there is also a problem of information asymmetry between buyers and sellers.
"For us, as banking supervisors, this would be a nice result, because we like higher liquidity on loan markets, but I am not sure whether this is going to happen," Grünberger says. "It might, in some areas or niches, produce a market, but I don't expect this to be a really striking effect of IFRS 9."

One problem is that the numbers used for regulatory capital may not be entirely suited to IFRS 9. KPMG's Spall admits it is a "massive effort" to re-engineer the systems to get banks from existing Basel II IRB models to what is required by IFRS 9.
For instance, the IRB approach requires the use of through-the-cycle PD, which attempts to average out the peaks and troughs in default rates caused by the credit cycle. Under IFRS 9, banks will have to use point-in-time numbers, which can make a huge difference.
"If those commercial and corporate [through-the-cycle] PD models that everybody is using for Basel II don't reflect the credit cycle, they aren't fit for purpose for IFRS 9," says Aguais. "Your credit losses can vary across the credit cycle by a large factor of up to 10 times relative to using [through-the-cycle] models."
Obtaining point-in-time numbers from through-the-cycle models means building the credit cycle into them, by altering the core econometrics or overlaying cyclical variations on to the model by adjusting its outputs. Both will require more work for banks, but Aguais says adjusting the core econometrics is a tougher job than adding an overlay.
In a similar way, some banks taking the judgement-based route are looking to make use of work they have already completed for supervisory stress tests, such as those required annually by the US Federal Reserve Board and the European Banking Authority. Such firms are centralising their pool of future scenarios so they can dip into them whenever they need to, whether for stress testing or IFRS 9, says Burcu Guner, a London-based director in the stress-testing team at Moody's Analytics.
But here, as elsewhere, many of the figures used by banks are through-the-cycle, as opposed to point-in-time. "I spent most of my career trying to document what I think is a point-in-time versus a through-the-cycle probability of default, but not all banks have the same definition," says the IFRS 9 project manager. "And, for a lot of banks, the whole concept of point-in-time risk measures is quite alien to them."
Whichever path banks choose, it is clear there is a lot of work to do, and it's important to find the right people to do it. But this is a struggle, say firms. "Most banks are investing tens of millions of pounds over this four-year effort, and most of that is going on talent and skills," says the IFRS 9 project manager. "Resourcing is one of the biggest challenges at the moment, because everyone's competing at the same time for what is a very niche pool of quantitative skills."
Once IFRS 9 is implemented, will all of the effort be worthwhile? Banks and industry observers seem sceptical. Although the aim of the IFRS 9 impairment rules is to encourage faster recognition of possible future losses, UniCredit's Reitgruber warns not to expect too much from the resulting predictions.
"My personal opinion is that accountants are overly optimistic with respect to achievable model quality," he says. "They strongly focus on forward-looking [scenarios], as if anybody would know if the next year is going to be a crisis or not."
The large European bank's IFRS 9 project manager puts it more bluntly: "The whole thing is crystal ball gazing at the end of the day. The trouble with forecasting is the longer you go out, the more uncertain it becomes, and if everyone had perfect hindsight we'd all be on a beach drinking tequila."
Now the IASB has spoken, banks say they urgently need more clarification from regulators on which approaches to modelling would be acceptable. With the January 2018 deadline looming, firms say they do not want to invest time and money developing new systems and processes only to find they don't meet the supervisors' interpretation of IFRS 9.
"Clearly, the timelines are very short," says the IFRS 9 project manager. "That's the big concern of banks at the moment: there isn't much room for manoeuvre if the goalposts were to change significantly over the course of the next six months."
Deloitte's Millar agrees there is more that regulators and accountants can do. "What I hope we are going to see is more guidance from the accounting firms, [and] hopefully a slightly more practical approach from the regulator on what needs to be done in order to comply."

Previously published on Risk.net

11 risk.net October 2016


Impairment

Impairment and the


three-stage approach
A summary of the expected credit
loss model under IFRS 9
The revision of hedge accounting rules, the restructuring of credit risk models and a new approach to impairment measurement
market participants must prepare for the transformational effects of IFRS 9
The expected credit loss (ECL) model constitutes
a significant change, which seeks to address the 1 The three-stage approach
criticisms of the incurred loss model. Entities will be
required to record impairment almost immediately
Stage 2
equal to the 12-month expected loss after the
initial recognition of financial assets that are not Assets where there Assets where there is Assets where there is
credit-impaired. is no identified credit deterioration in credit more than significant
deterioration since quality since initial deterioration in credit
ECL forecasts a probability-weighted estimate initial recognition recognition, but where quality since initial
of credit losses. A credit loss is the difference there may not be an recognition, and there
between the cashflows that are due to an entity objective evidence of is objective evidence

Source: Chartis, IFRS 9 Technology Solutions, 2016


impairment of impairment
in accordance with the contract and the cashflows
Interest on gross Interest on gross
that the entity expects to receive discounted at carrying amount carrying amount
the original effective interest rate. Firms should Stage 3
Stage 1
discount the cashflows that they expect to receive
at the effective interest rate determined at initial 12-month ECL Lifetime ECL Lifetime ECL
recognition, or an approximate figure, in order to
calculate ECL.
Increase in the probability of default
An ECL estimate of loan commitments should be
consistent with expectations of drawdowns on that
loan commitment. Management should consider
the expected portion of the loan commitment
that will be drawn down within 12 months of the information available. Financial institutions are Firms can group financial instruments on the
reporting date when estimating 12-month ECL. It required to gather significant historical data about basis of shared credit risk characteristics such
should also consider the expected portion of the their credit exposures to enable application of the as instrument type, credit risk ratings, remaining
loan commitment that will be drawn down over the relative credit quality assessment. term to maturity, industry, and so on. Determining
expected life of the loan commitment. It is important to note that the credit risk of appropriate segmentation of credit exposures
IFRS 9 contains a three-stage approach based on the instrument needs to be evaluated without based on shared risk characteristics is a very
the change in credit quality of financial assets since consideration of collateral. This means that important element of the application of IFRS
initial recognition. Assets move through the three financial instruments are not considered to have requirements.
stages as credit quality changes, and the stages low credit risk simply because that risk is mitigated Extensive disclosures are required to conform to
dictate how an entity measures impairment losses by collateral. IFRS 9 ECL, including:
and applies the effective interest rate method. The standard requires the use of both forward- Reconciliations from opening to closing amounts
Where there has been a significant increase in looking and historical information to determine if of the ECL provision, assumptions and inputs.
credit risk, impairment is measured using lifetime a significant increase in credit risk has occurred. A reconciliation on transition of the original
ECL rather than 12-month ECL with operational Lifetime ECL is expected to be recognised before a classification categories IAS 39 to the new
simplifications for lease and trade receivables financial asset becomes delinquent. classification categories in IFRS 9.
(figure 1). The model can be applied granularly at an
The standard requires qualitative management individual or portfolio level. However, some Auditors and main boards are emphasising that the
when determining whether the credit risk on a factors or indicators may not be identifiable at required data and systems for all these needs will be
financial instrument has increased significantly an instrument level. In such cases, the factors or critical to ensure the completeness of IFRS9 project
by considering all reasonable and supportable indicators should be assessed at a portfolio level. planning, implementation and production.

risk.net 12
Impairment

A strategic approach to
IFRS 9 impairment
Success in impairment calculation depends on flexible modelling techniques, a scenario-driven approach to forecasting and the
transition towards a traceable and controlled technology architecture, writes Burcu Guner, senior director at Moodys Analytics

What is IFRS 9 and why is it important? What broad approaches are firms taking to
IFRS 9, and particularly the new impairment standards, are a incorporate forward-looking information about
response to the last financial crisis and reflect the intention expected losses and, in particular, the required use
of the International Accounting Standards Board (IASB) to of multiple economic scenarios?
overcome the too little, too late recognition of credit losses The success in IFRS 9 forward-looking impairment calculation
inherent in IAS 39. depends on flexible modelling techniques, forward-looking
IFRS 9 impairment introduces new forward-looking models and data available for model development and
expected credit loss (ECL) models that will require more benchmarking, all of which need to be compliant with the
timely recognition of changes in ECLs, and require institutions regulatory and audit requirements. To ensure consistency
to account for them from the point at which a significant across all scenario-based aspects of IFRS 9 and other risk
deterioration of the credit quality occurs. management concerns, we propose that firms put in place a
From our experience on IFRS 9 engagements, the flexible modelling approach where models and scenarios can
importance of IFRS 9 is twofold: be leveraged for both IFRS 9, the internal capital adequacy
While provisions are expected to increase significantly, we Burcu Guner process (ICAAP), stress testing and other strategic, capital and
anticipate the impact on earnings level and volatility to business forecasting and planning purposes.
be non-trivial, leading to implications on loan pricing and There is increasing scrutiny of the ongoing validation and
availability of credit. Firms will need to think about stakeholder management independency requirements as to the multiple scenarios used for IFRS 9, and this
ahead of the January 2018 deadline, as well as in an is making firms seek external help in the construction of such scenarios as well
ongoing capacity. as their validation.
S ignificant multi-year implementation efforts will require a rethink of and To ensure forward-looking estimations, firms should initially start by
changes to the data, systems, models and validation as well as to future estimating point-in-time (PIT) credit measures. Typically, firms tend to have the
monitoring, reporting and business decision-making. through-the-cycle (TTC) measures that they wish to convert to PIT measures by
credit cycle-driven conversion/adjustment factors.
What are the biggest operational challenges in IFRS 9? Moodys Analytics has worked on wholesale portfolios where it was able to
Based on our observations, there are two key operational challenges: convert TTC measures into PIT by using country- and industry-specific credit
The tactical challenges in achieving compliance by the January 2018 deadline. cycles. In the case of retail portfolios, vintage elements can be further analysed
Banks are addressing this by leveraging and ensuring scalability of existing to apply conversions. The vintage approach can be used at both granular and
internal or external capabilities, such as using existing off-the-shelf modelling, portfolio segment level, depending on data availability.
data and calculation capabilities to meet the tight deadlines. When firms do not have TTC measures due to data scarcity, we overcome the
The strategic challenges to meet the ongoing monitoring, reporting and issue by using industry standard off-the-shelf models. In most cases, we added
validation requirements of the new standard, as well as to manage the expert-driven elements to the off-the-shelf scorecards, as well as specific factors
anticipated impact on business and earnings volatility through advanced affecting the credit quality through the clients internal data, when available.
analytics and infrastructure. For forward-looking lifetime measures, we observe banks using a scenario-
based model to convert 12-month probability of default/loss-given default (PD/
In an age of increasing regulatory demands and new accounting standards, LGD) to conditional PD and LGD term structure for a single and/or multiple set
coupled with declining margins and low profitability, many banks of all sizes of macroeconomic scenarios. These conditional PD/LGD values can be applied
are seeing the benefits of the Moodys Analytics suite of IFRS 9 solutions as an for IFRS 9 stage allocation and ECL calculations. Most clients capture portfolio-
opportunity to minimise manual intervention and improve cost savings while level default and migration dynamics across different macroeconomic conditions
enhancing analytics. when projecting the lifetime ECL estimations.

13 risk.net October 2016


Impairment

IFRS 9 requires firms to use multiple scenarios to produce probability- When credit risk assessments and parameterisations are widely varied or
weighted lifetime ECLs. To help firms comply with this requirement, firms should deficient and not remedied on a timely basis, the supervisors can consider
produce multiple, fully fledged, upside and downside economic scenarios that whether such deficiencies and variations should be reflected in supervisory
align with the scenarios probability distribution and our deep understanding of ratings or through a higher capital requirement under ICAAP Pillar 2 of the Basel
the global economy and potential key threats. These scenarios need to extend capital framework.
through long future horizons to satisfy the IFRS 9 lifetime requirements. This in itself encourages the firms to go through ongoing independent
Another approach being taken is the scenario-driven approach to validation process of the IFRS 9 models and frameworks and ensures the staging
forecasting this begins with our baseline forecast. We define this as the most criteria is appropriate, considering the existing supervisory guidelines, accounting
likely outcome based on current conditions and our view of where the economy firm interpretations and evolving industry practices.
is headed. From this, we develop the basic outlines of our alternative scenarios
by running multiple simulations to develop a probability distribution of economic Can Basel II credit risk models be redeployed for use with IFRS 9?
outcomes. As a result, this allows for the identification of scenarios that are According to the Basel Committees GAECL, firms should use information
associated to customer-defined percentiles. consistently across the bank, and common models and data should be used
for both capital and provisioning purposes. This will ensure consistency in the
What questions surround the three-stage impairment approach? interpretation and application of the new IFRS 9 standard and reduce the
One of the most important dimensions of the new accounting standard is the extent of bias. Therefore, banks typically take Basel credit risk models and make
definition of the transferring criteria or bucketing allocation. adjustments to these PIT calibrations, scaling up to lifetime measures that
The general principle is to incorporate both quantitative and qualitative take into account multiple scenarios, and so on. This approach is compliant and
assessments within the determination of significant deterioration of credit consistent with that applied across the industry and is in line with guidelines
risk and to use lifetime PD as the primary quantitative measure for such from the supervisory authorities.
transferring criteria. The questions vary across firms from the weighting between Beyond credit risk models, the guidelines encourage firms to ensure
the quantitative and qualitative parameters to the consistency with the rating consistency in macroeconomic forecasts used across the organisation,
systems, credit policies, monitoring and forbearance processes, and credit from ICAAP scenario analysis, stress testing business forecasting, strategic
strategies. This, in itself, presents certain governance challenges, especially in a planning to IFRS 9 models and leveraging existing governance processes in
context of all changes needing to be traceable and justifiable. place for macroeconomic scenarios in ICAAP. This will ensure that economic
Firms are exploring various approaches at a tactical level, clients have forecasts are disclosed transparently and consistently among ICAAP reports
recently considered embedding the criteria within other relevant processes at and IFRS 9 disclosures.
the organisation. Moodys Analytics has been involved in formulating a strategic If the firms choose not to extend existing credit risk models, systems and
approach, where clients transition towards a more traceable, controlled and information for the purposes of IFRS 9 ECL calculations, then the underlying
automated workflow and technology architecture. rationale for differences should be documented and approved by senior
management. In our opinion, this will create additional unnecessary burden for
How will IFRS 9 affect banks incentives to do certain types clients and should be avoided.
of business?
Stakeholders pay close attention to earnings as they have significant impacts on Can IFRS 9 be viewed as an opportunity to revamp your modelling,
stock prices. Typically, they prefer higher earnings with lower earnings volatility. model governance and data?
With IFRS 9, the earnings volatility is expected to increase significantly across IFRS 9 can be viewed as a catalyst to optimise and seek synergies across the
the portfolio and firms will be looking to do one of the following: firms modelling landscape and data frameworks.
Minimise the portfolios earnings volatility given a certain level of expected Leveraging the multi-year IFRS 9 programmes, many firms have started to
earnings equivalent to maximising expected earnings to earnings volatility ratio. review and extend their modelling capabilities, revise their governance structure
Minimise the loss in portfolio earnings under extreme conditions given a and enhance data and technical infrastructure capabilities in this regard.
certain level of expected earnings under normal conditions equivalent to For example, in developing multiple scenario-based credit loss estimates
maximising expected earnings to earnings tail risk. for financial reporting, many of our clients are starting to initially consider the
experience and lessons learnt from similar exercises they have conducted for
As a result, firms need to explore ways to enable business in the new era of earnings regulatory purposes and then extend the forward-looking information and
of volatility by advancing their analytic capabilities as well as their infrastructure. related credit risk factors for IFRS 9 ECL purposes.
We anticipate further advancements in the analytics space as firms move
Does the three-stage impairment model create the opportunity for away from the tactical mind-set towards a strategic one and begin to focus on
regulatory arbitrage? the management of earnings volatility and capital implications.
Given the principle-based nature of IFRS 9, there is much room for interpretation,
especially in the adoption of staging criteria. This is expected to narrow down
as supervisors and auditors review firms IFRS 9 practices, including the
determination of staging criteria. Contact
According to the Basel Committee on Banking Supervisions guidance on Burcu Guner Senior Director
accounting for expected credit losses (GAECL), when assessing capital adequacy, T +44 (0)20 7772 1344
supervisors consider how a banks accounting and credit risk assessment policies E burcu.guner@moodys.com
and practices affect the quality of the banks reported earnings and, therefore, its www.moodysanalytics.com
capital position.

risk.net 14
Challenges

Challenges in IFRS 9
Building on current infrastructure
Upgrading to the latest IFRS 9 is a significant transformational event for all financial institutions, regardless of their size and complexity
Miss Kanithar Aiumla-Or/Shutterstock

15 risk.net October 2016


Challenges

Market commentators have compared the changes required as being similar E xpected life can be greater than contractual life for example, revolving
in scale to the re-engineering required for Basel, but in practice financial credit facilities which can affect the quantum in the EAD calculation.
institutions will be affected unevenly. For example, a bank that has an advanced
BCBS239 project is likely to have less trouble mining for historic data and Overview of the hedge accounting rules under IFRS 9
linking it back to the general ledger than a bank that has invested less in its data IFRS 9 has revised the existing rules relating to hedge accounting contained in
infrastructure and governance processes. IAS 39, viewed by some as disconnected from the practice of risk management.
Similarly, financial institutions with a mature portfolio of internal ratings- The rules on hedge accounting in IAS 39 have frustrated many, as the
based (IRB) credit risk models will be better able to evaluate IFRS9 exposures requirements have often not been linked to common risk management practices,
and re-engineer these models than financial institutions following the Basel and have made the process impossible or very costly.
standardised approach with its simpler rules and governance. IFRS 9 improves this by better aligning hedge accounting with the risk
Due to the significance of the changes expected, the International Accounting management activities of an entity. IFRS 9 addresses many of the issues
Standards Board (IASB) provided a multi-year period to facilitate the necessary in IAS39 that have frustrated treasury and asset-liability management
changes and parallel test the new provisions and monitoring systems ahead of departments. In doing so, it makes fundamental changes to the current
formal adoption in January 2018. requirements by removing or amending some of the key prohibitions and rules
under IAS 39. The main changes in IFRS 9 in relation to hedge accounting are
Point-in-time probability of default presented in figure 1.
All financial institutions need to repurpose or build their main credit models to It is important to note that the IFRS 9 hedge accounting rules do not
incorporate probability of default (PD), loss-given default (LGD) and exposure apply to fair-value hedges of the interest rate exposure of a portfolio of
at default (EAD). Currently, credit loss provisions are posted on an incurred loss financial assets or financial liabilities that is, fair-value macro hedges. This
basis. Now models will need to predict credit exposure at point-in-time (PIT) is because the IASB carved out the macro hedge accounting part of the
rather than through-the-cycle (TTC), which is the basis for Basel IRB. overall hedge accounting project, which will be issued separately outside
Twelve-month expected credit losses used for regulatory purposes are of IFRS 9. At the moment there is no clarity over when the rules relating to
normally based on TTC probabilities of a default in cycle-neutral economic macro hedge accounting will be finalised. In the meantime, until the macro
conditions. PD used for IFRS 9 should be PIT PD in current economic conditions, hedge accounting rules are finalised, companies applying the IFRS 9 hedge
and will therefore change as an entity moves through the economic cycle. accounting framework can continue to apply IAS 39 requirements for fair-
Historic data will be required especially origination data to build value macro hedges. This is an important issue for the banking sector as banks
12-month and lifetime estimates of PD, LGD and EAD. generally take a portfolio view of interest rate risk that is, when hedging the
interest rate risk on mortgages.
IFRS 9 model validation
Model validation will follow many existing IRB processes, but will diverge from
them in these key areas:
There is likely to be more diversity in the models that require testing for
1 Hedge accounting changes
example, the complexities in validating low default portfolios and expert
judgement models, especially LGD calculations.
Non-derivatives can be hedging instruments when
IRB models are tested as TTC and IFRS 9 models are PIT, so validation for Hedging hedging fair-value risk
IFRS9 will be a parallel and separate process to IRB. instruments Simplifications in hedge accounting, using options and

forwards as hedging instruments


IRB does not require full coverage of the balance sheet; however, IFRS9
coverage is much higher, so more models will require validation.
More components of risks can be hedged
Hedge items Group/net and aggregate exposures can also be hedged
Model validation will be required at a minimum to cover the following:
Review of model documentation methodology, delivery of models
and testing. Hedge No bright lines when assessing hedge effectiveness
Governance process status, compliance and appropriateness. effectiveness Testing more qualitative then quantitative in nature
Only performed prospectively
testing
Methodology review challenges to the techniques used and focus on
weaknesses and limitations. IRB models often cater for weaknesses by being Voluntary De-designation of hedging relationships is only allowed
conservative; however, IFRS 9 models are not meant to be conservative, but

de-designation
Source: Chartis, IFRS 9 Technology Solutions, 2016

in the case of a change in risk management objectives


best estimates. not allowed
Review of model performance through backtesting, historic model testing
for each period under review, reperforming models, comparison of model Link with risk Direct link established between the practice of risk
performance using other models, etc. management management and hedge accounting thereof

IRB calibration tests will need to be enhanced for IFRS 9 calibration, including: Enhanced disclosures to provide more meaningful
C  alibration for a maximum of 90 days past due (DPD) for IFRS 9 (with some Disclosures information about the hedging strategies applied and
their financial impact
exceptions) versus possible 180 DPD for IRB.
Conservatism in IRB models to adjust for model error or uncertainty versus
IFRS 9 measures that are meant to be the current best estimates.

risk.net 16
Challenges

Data challenges in IFRS9


Amid the unrelenting focus on regulation in recent years, it is sometimes easy to forget that new accounting standards may have
an equal, if not greater, impact on the way in which banks operate. In a roundtable discussion convened by Risk.net and sponsored
by Oracle, four experts in credit loss explore how the introduction of IFRS9 is set to fundamentally change the way banks do their
accounting, and specifically the data challenges associated with its implementation

The implementation of IFRS9 early in 2018 will be split into three principal
buckets classification and measurement, impairment, and hedge accounting THE PANEL
and moves to a forward-looking approach to calculating expected credit losses. David Grnberger, Head of Accounting Enforcement,
A research paper published by Chartis Research in March 2016 states: Financial Market Authority Austria
IFRS9 is a high-impact, symbolic, operational IT and organisational
Wolfgang Reitgruber, FVP, Group Credit Risk Modelling, UniCredit
transformational event for risk and finance, an arranged marriage that is turning
Hugh Stewart, Research Director, Chartis Research
an uncomfortable courtship and good intentions into a powerful, successful
partnership that is greater than the sum of its parts. Our panel aims to Rohit Verma, Head of Strategy for Risk Analytics, Oracle
determine just how well this arranged marriage of finance and risk is working
in the face of IFRS9.
Rohit Verma, Oracle: In terms of the projects, there have been many
With little more than a year until IFRS9 is implemented, how well developments in implementing IFRS9 at different banks. All the banks are
is the project going and what are the challenges that lie ahead? paying very close attention to these projects, simply because they realise that this
David Grnberger, Financial Market Authority Austria: The project is is a game-changer it fundamentally alters the earnings and the balance sheet
well ahead and we, from the supervisors perspective, have started to align and that banks report.
enter into an intensive dialogue with the banks, both from a national level and There are challenges that are being addressed today in terms of methodology,
from the EU level. With the European Banking Authority (EBA) and the European data availability, data quality and governance. These challenges will continue
Central Bank, we have all set up our national and European projects. We are to be faced; it is not a one-time effort that will no longer be required after
looking into the banks, querying their implementation and asking for numbers, implementation in 2018. These challenges will continue, but most banks realise
impacts and challenges. if they implement this project correctly it provides a huge opportunity for them to
Our activities have given us a pretty good picture of where we stand and improve the transparency of their financial statements which, in turn, is good
where the banks stand. The projects are advancing well; we are seeing some as most of them report their numbers to the Street and will also improve the
impacts, usually not large but still considerable, and we are seeing the challenges capital planning processes within their banks in the long term.
ahead quite clearly, which relate mostly to model performance and the reaction
of supervisors with regard to banks IFRS9 implementation. Hugh Stewart, Chartis Research: Reality has set in, and there is that sense
As supervisors, we are currently designing and drafting strategies about how when you go walking and you think youre going to hit the summit. Then you
to react and how to approach banks during the next couple of months. realise that many of your targets are false summits. New impact analysis is
identifying there is more to do with regard to modelling, finding missing data,
Wolfgang Reitgruber, UniCredit: Regulatory internal ratings-based (IRB) and so on.
frameworks are now expanding nicely into IFRS9 impairment concepts on the Therefore, there is talk of a little more forgiveness in 2018 when IFRS9
one hand, to enforce backtesting expected loss measurements and, on the other, becomes live, but there is also a sense of excitement with regard to how
to link the new upcoming provision quantities and steering concepts with an economic capital, regulatory capital and the loan loss provisions can be worked
improved understanding of credit risk measures. together and optimised a sense of threat, but also a sense of opportunity.

17 risk.net October 2016


Challenges

The concept of moving to forward-looking provisioning for We discuss the policy choices they have made and provide initial feedback on
expected losses was a major change to accounting standards in whether or not, from a supervisory perspective, they would be in line with policy
the wake of the financial crisis. Will it succeed in dampening the choices and whether nor not we would accept simplifications.
pro-cyclicality of accounting standards, just as weve seen similar The most important contribution we will be making in the coming months is
efforts in regulatory capital? How confident are you that the new how we expect them to validate and backtest IFRS9 results in the future because,
standards are going to achieve that? as supervisors, we at least have some experience in validation and backtesting
David Grnberger: This is a common misunderstanding I find when talking to from our IRB-model inspections. We want to extend this experience and the
practitioners. The aim of the IFRS9 project is not really to dampen pro-cyclicality. measures we are familiar with from the IRB approach to IFRS9 implementation.
In fact, our simulations show that it will increase the volatility and probably also
the pro-cyclicality of accounting provisions. Can you give us a general sense of the role of data in IFRS9
The main aim and this should be the merit of IFRS9 is to increase the obviously its a big issue, but what is the overall picture here?
relevance of accounting provisions to transfer information from the bank to Rohit Verma: Data is going to play a crucial role in the success or failure of
the investors and to the regulators about those events that can be reliably any IFRS project. There are three aspects of data that Id like to discuss, the first
forecast, reducing the noise from arbitrary effects from errors and really making of which is model calibration. These new-age, forward-looking models are only
accounting provisions relevant, as far as possible, to investors. as good as the data used to calibrate them. If you dont have the right quality
Relevance means statistical relevance and model performance, but it also or the right amount of data, the calibration of these models is not going to be
means practical relevance and judgement. All of a banks efforts should be relevant and will therefore affect things like backtesting, validation, and so on.
steered towards making numbers and forecasts as reliable as possible, reducing The second aspect is from a processing perspective. IFRS9 requires processing
all arbitrary effects from personal estimates and human judgement, and to happen at an instrument level, so millions of records will be processed on a
improving provisioning to the maximum relevant extent. periodic basis. You are not only obliged to process this under baseline conditions,
Investors and banking supervisors will hold banks responsible for preparing but also under different scenario conditions, which creates a huge demand in
those forecasts based on data and experience, having a high standard of models, terms of data processing. Running millions of records through multiple scenarios
improving the data and using high-quality data for their estimates because, on a monthly or quarterly basis is a huge task for any banking organisation.
whenever a bank deviates in practice from its forecasts, banking supervisors The third aspect is convergence between risk and finance, and how
and analysts will dig into the issue and ask why. It would present a pretty bad organisations can leverage it for addressing not just the immediate IFRS9
scenario if the bank cannot explain the deviations, and bad model quality and requirements, but also some of the other requirements such as regulatory
bad data quality are revealed. capital, economic capital and loss forecasting. Once you have this account level
of granular data available and reconciled to both risk and finance organisations,
What role are regulators playing now in helping banks to prepare you can do a lot more to it than just IFRS9.
for the standard?
David Grnberger: Right now the regulators role is more or less an advisory one. What is your perspective on the additional granularity of data
We are trying to pool the data we receive from the banks from the initial trial rounds that banks will be required to collect to fulfil the impairment
of IFRS9. We are feeding what has come out of the analysis back to the banks, at requirements in this standard?
least on a qualitative level, from the EBA impact studies. We are talking with the Hugh Stewart: Its very interesting, not just on the financial engineering and
banks and showing them where they deviate from European sample averages. data management side, but also on the cultural side. It has been alluded to

risk.net 18
Challenges

On the micro-data level, there was a lot happening in the past, and IFRS 9 is
therefore not something where we are starting from zero. We have already been
building up the infrastructure, which is continuously being improved and enhanced.
Im more concerned about managing the macro data, the forecasting element
of IFRS 9. We will enter an environment where forecasts may drive provisions,
and proper governance and data quality are required to ensure the ongoing
soundness of provisions.
We all know that economic forecasts tend not to be very reliable, but you need
the framework and the governance in place to justify that whether one or five
years later at a certain point in time, we had reason to believe in this forecast.
The world is changing, but there are real data challenges on the macro side.

Obviously, theres no point generating the data for this standard if


its not trusted or robust data. How can banks have confidence in
the data theyre generating for IFRS9?
Rohit Verma: Banks need to establish a couple of things so that, over a period
of time, they become more confident about the data that has been used, not just
Rohit Verma for IFRS9 but for any regulatory or reporting purposes.
First, banks can establish a single repository of raw data that has the capability
already, but I think the accountancy and finance world is less forgiving than the to ingest data from multiple sources and at multiple levels of granularity. For
risk management reporting world, and that a sense of detail, provability and IFRS9, you need account-level data, which is literally millions of records for large
verification, absence of error, and the locating of missing data are much more banks. You also need economic forecasts, which could be a few lines or a few
important within IFRS9. There needs to be a lot more historical data provided hundred records, each of them a different forecast period and forecast horizon.
than there is currently. So, you need to set up a repository that can input data from different sources,
Some of the less sophisticated banks will either be required to mine their keeping in mind these business requirements. Then, this repository becomes the
data more thoroughly probably manual data or very unstructured data or source of data for all of the regulatory requirements and reporting.
will need to make use of various types of proxy data. Second, establish strong data governance processes. This includes a data
A different set of bucketing is needed for IFRS9 compared with some of quality process that identifies trends in data quality from different source
the regulatory and economic capital reporting areas. The data pack should be systems or a reconciliation process to ensure your granular data is reconciled
reshuffled, but should still run in harmony with other operational and regulatory with your aggregated general ledger information. You could also establish a data
activities. This vast increase has performance issues and data management issues, repository [of definitions], so that everyone in the organisation refers to the same
but I think the biggest issue is a change in culture with regard to accuracy. data in a uniform manner.
These processes will help improve the quality of data over a period of time
Looking into this over recent months, how would you say banks are and ensure that, going forward, many data quality issues are addressed at
dealing with that? How far advanced are they, and do they have source, not at an intermediate level.
the resources they need?
Hugh Stewart: There are elements of denial and hope within some So, you should ensure the data is of good quality to start with,
organisations. I think the big Tier 1 banks understand the principles but have the rather than having to go back for it.
challenge of pulling their organisation together; they have the problems of size, Rohit Verma: Exactly if you know there is a portfolio in which there is a
complexity and scale, but are addressing them well. secular trend of data problems, you could go back to the source and fix it there,
The large Tier 2 banks are well on target. They copycat the sophistication rather than having to fix it when the data lands in the repository, which is a lot
of Tier 1s, but do not have the same levels of size and complexity. Some of more expensive.
the smaller organisations are way ahead of target or are in denial and hope,
relying on either existing or third-party data which are not proving as granular What role are validation and backtesting going to play here? How
and complete as they hoped and on core banking systems that are less risk should the quality of data be checked and validated?
sophisticated. I think there is an element of danger here. These are the people David Grnberger: Validation, or designing models that are accessible to
that are negotiating now for forgiveness throughout 2018. validation, is crucial to the entire project. You should start from the top down,
and decide in the first step, when you have a good set of data: what kind
Can you give some internal perspective on the challenges of of data, which factors and which parameters performed well in forecasting
IFRS 9 impairment? expected losses, especially over a 12-month horizon.
Wolfgang Reitgruber: I would say, as a main advanced IRB bank in the If you select those data sets variables that have a good performance there
market, like many of the large companies, we are well on the way. We have been will also be a need to differentiate between time horizons. Which data sets and
working for over 10 years in Basel IRB environments with different regulators, which indicators have a good model performance or explanatory value for six
recently contributing actively to regulatory benchmarking exercises and various months, 18 months or 48 months in the future? Then you can design your model
quantitative impact studies. in that way.

19 risk.net October 2016


Challenges

that is making predictions, you really want to avoid using different


predictions for IFRS9. There are stress-testing environments, there are certain
assumptions and there are also asset-liability management frameworks.
All of these are already using and building up certain types of predictions;
therefore, consistency really is the most important aspect of where to get the
data from.
There are two philosophies of how to include data in the models themselves
One philosophy is to use a probability of default (PD) model with factors that
reflect the economic status and the economic cycle. The second is more of an
IRB-type process through the cycle models, and adjusting them for point-in-time/
forward-looking-type features.
I prefer the second option as it allows much easier reconciliation
between Basel frameworks and IFRS9 frameworks. It also allows for much
easier correction. If youre assuming that the economic cycle information
is changing relatively quickly, you probably have a less reliable forecast
than you are typically used to in a PD framework, for example. The relative
risk ranking can be described pretty nicely, but the absolute risk ranking
Hugh Stewart is probably tricky and every year you have to make, probably significant,
adjustments.
This means only selecting those kinds of data and data sources that have a Personally, Id go for this second approach, but I know a couple of banks that
good track record in explaining future losses. Deselect those sets of data and are already including economic information directly into their systems.
variables that create noise or have been shown to have high autocorrelation,
and try to establish a simple set of variables not too many that have good What is your view on the need for consistency with regulatory
performance. Understanding model performance is the first step to creating a capital data standards?
model that can be validated later. Rohit Verma: Its a given that you need consistency in data; looking at a
For example, if your macro data or macro variables do not have a good track regulatory capital number versus the provision, they need to be generated from
record, include them only if they have a very low weight or with regard to short- the same source, otherwise the two numbers are not really in sync and present
term time horizons. Do not include macro projections for a time horizon of, for an inaccurate scenario to stakeholders to the bank.
example, four years in the future if they are not relevant for that time horizon, Most banks are going with the second approach, taking the regulatory
but include them for predicting expected losses and defaults for, for example, a capital PD and adjusting it for forward-looking measures. Maybe in the
six-month time horizon. long run the trend would be towards incorporating the forward-looking
This should also solve the problem of macro data seeming to have little projections into the PD model, into the transition matrix that drives the
relevance, which is always determined based on time horizons. The time horizon transition of credit quality for each and every account. This is a kind of
and the historical performance should be the driving factors when you design a evolution of the model that we might see in the future when incorporating
model and validate them later on. forward-looking projections.
We have also seen the inclusion of a lot of forward-looking projections in the
Wolfgang Reitgruber: I would focus much more on backtesting. The purpose identification process stage, as that will alter the provision generated for each
of IFRS 9 impairment is to get a good view of the value of the balance sheet. and every account. The forward-looking projection of the economy will have a
What part of this value do we control? The provisioning part, so in the end its big impact on the stage to which each account is classified.
about backtesting the provisions.
That is now the main challenge in IFRS9: there will be a lot of profit-and-loss What impact will all this have on legacy business, for which banks
volatility driven by risk results, and a lot of point-in-time/forward-looking types may not have relevant data available?
of assumption in our estimates. But we definitely have to ensure that they are Hugh Stewart: This is really important. Banks that havent been through the
neither overly conservative nor overly aggressive. BCBS 239 or Comprehensive Capital Analysis and Review dont necessarily have
In the end, it should be a fair picture of the value and that should basically the same discipline as the larger banks. This is a matter of data mining and its a
drive the decisions of what type of data and what type of data quality are difficult job. So maybe the models will have to be adjusted or replaced according
needed, and how to end up at a fair representation, probably after a couple of to the valid data available.
years of running IFRS9. We havent yet mentioned the word disclosures. This is going to be very
important, especially as this first step transfers from IAS 39 to IFRS9. There will
From a data perspective, how are you planning to incorporate be a huge body of disclosures to explain opening and closing from one state to
forward-looking projections into expected credit loss? another and provisioning. That is just a vast body of work that will be necessary
Wolfgang Reitgruber: One of the main topics, which is not unique to our in the first year.
group, is where to take these predictions and these forward-looking opinions IFRS9 Paragraph 35H, a hot topic among auditors and accountants, deals
from. And one of the main points was that it must be consistent, at least with the need for very heavy, accurate and forensically validated disclosures,
across the legal entity of the group. So, if you have an economics department which are all a part of the data quality initiatives.

risk.net 20
Challenges

What kinds of technology awareness to this part of the credit


might banks need to consider process. We achieve greater focus on
to overcome some of these the inception point and need lifetime
challenges? loss estimates, which are not required
Rohit Verma: Banks will definitely within the Basel environment. This is
need a technology solution that can probably laying the groundwork for
address the data challenges the better pricing decisions and better
www.charakter.photos/Philipp Monihart

ability to store and process large strategy in underwriting, which


volumes of data. A technology partner will have a positive impact on the
that is looking not just at the traditional business side.
ways of processing, but also some of There is much to be done and it
the newer big data technologies for is a huge project. There are a lot of
managing the whole processing aspect, new methodological dimensions that
is definitely required. have to be covered, but I do not see
Another aspect is that a technology the type of bureaucracy part of the
solution should be ready to meet the regulatory framework to approve
requirements as they stand today, as these models for IFRS 9. We will need
Wolfgang Reitgruber David Grnberger
well as be flexible enough to evolve to strong internal governance to keep
meet the specific requirements each everything under control, but we have
bank has around IFRS9. IFRS9 is not really prescriptive in nature, and every yet to see how the internal procedures are going to look and how the regulators
situation will have its own aspects unique to IFRS 9. You need a technology will review processes and models. It might be easier from that perspective than
solution that can handle these nuances in each bank in terms of methodology, the IRB framework.
organisational structure, and so on.
Finally, you need a technology solution that is not just focused on IFRS9, How is the culture within organisations changing, and what is the
but able to address IFRS9 projects other areas, such as enterprise stress impact of this standard on the business decisions banks are making?
testing, liquidity risk, regulatory capital, and so on. These are some of the Hugh Stewart: It is definitely having a major effect, and theres a big learning
ideas around technology that banks should be looking at when selecting experience going on just with risk and finance working together, as well as
a partner. different IT teams and financial engineering, pricing and risk analytics groups
working together. It is raising the bar holistically across a number of different
This is a huge lift for banks; some have seen it as even bigger and disciplines within the bank.
more impactful than Basel III, for example. To what extent is this an I agree that the focus on lifetime losses is changing things. This is great in
opportunity to improve modelling, model governance and data as terms of being in parallel to Basel with a similar focus on model governance,
well as being a compliance burden? but its definitely a burden. Although its leading to a promised land and were
David Grnberger: Whether its an opportunity or a compliance burden all going to be better people as a result of it, it is tough medicine. There is going
depends mostly on how banks decide to implement IFRS9. The supervisor to be a real sense of resource pressure on all financial services organisations to
expectation and were looking for this in the future is that banks reach these summits, or false summits, before planting the flag at the top.
improve the models not only for the accounting purpose, but also for the risk
management function, for capital purposes and to integrate as far as possible Rohit Verma: David mentioned earlier that IFRS9 makes the financial
the IFRS9 improvements into the existing risk management framework a bank statements more relevant. That is a huge benefit that banks will gain if they
already uses. implement the project correctly.
A vast majority of the banks told us they expect IFRS9 to be used in product Its not like an on/off switch. Its not that, in 2018, your financial statements
design and in decisions to grant or extend loans, and that they are actually will suddenly become more relevant, but rather it is the continuous improvement
trying to integrate the IFRS9 improvements into their risk management and in the relevance of the financial statements and the numbers that is going to
governance structure. If this is the case, it will be an opportunity to improve have an impact on product design and pricing. Its going to have an impact on
modelling overall, and this is what I expect from most banks. the capital planning process, and also on the credibility of the numbers and how
they are perceived by the markets, which in turn will have an effect on the cost
How should banks be positioning their businesses, as well as their of capital for banks.
accounting standards, ahead of IFRS9? Overall, if handled correctly and there are definitely challenges the key
Wolfgang Reitgruber: I agree that a couple of banks seem to use IFRS9 thing, if you implement it correctly from a data management perspective and a
for granting processes. When comparing accounting standards now with the modelling perspective, is that there are benefits that can be derived in the long
Basel environment, Basel is actually more focused on the stock and the on- run because the numbers are now much more relevant.
book portfolio. So there was always a little more pressure towards behaviour-
scoring systems, but not so much on the application scores, if youre just
taking this comparison. The commentary and responses to this forum are personal and do not necessarily reflect the views
IFRS 9 focuses more on the PD at inception, which is triggering a lot of and opinions of the panellists respective organisations.

21 risk.net October 2016


Model building

shooarts/Shutterstock

Model building for IFRS 9


Incorporating the right techniques
An illustration of the process for building an IFRS 9 model, outlining approaches for both wholesale/corporate and retail

Wholesale/corporate IFRS 9 model build R elations with the bank: length of the co-operation, average amount of loans,
The main activities are typically: product mix, and so on.
Determine the client segmentation. Interactions between complex variables: age and income, region and income,
Define the targeted model structure: group/local/multiple models. and so on.
D efine the input data and identify candidate variables for the analysis between Variables based on credit bureau data, describing repayment history: debt level
quantitative factors for example, return on assets, return on equity, debt and number of DPD.
equity ratio and qualitative factors for example, quality of the management History of loan origination: frequency, level of debt, loan types.
board, market share, market structure (monopoly versus competitive), hurdles History of queries to credit bureau.
to entry.
Definition of default: 30, 60 or 90 days past due (DPD). Behavioural models:
Perform data quality verification. Behaviour on current accounts.
Define the historic data population for development and validation. History of delinquency.
Development of subsequent modules: univariate or multivariate analyses. L evel of exposure, exposure amount divided by exposure as at
Development of the joint model from separately developed modules. origination data.
Analyses of the prepared models and final model selection. Frequency of loan origination how frequently the client takes new loans.
Pre-implementation tests. Delinquency value to exposure value.
Usage of the available off-balance limit in case of revolving products.
Retail IFRS 9 model build N  umber, value, frequency or share of cash transactions and cashless
From a risk perspective, retail models are driven more by scorecards and transactions.
homogenous groups. Such segmentation has a significant effect on the analytics; Repayment patterns for subsequent instalments.
therefore, vital areas for consideration are statistical methods, governance and History of co-operation with the bank.
model validation/calibration. A summary of variables is offered below.
Best practice for choosing a model for retail probability of default, loss-given
Credit application models default and exposure at default is to use several techniques or a combination of
S ocio-demo variables: income, profession, region, age, marital status, these. Final selection of the model is then based on its statistics and not on the
education, and so on. assumptions of modelling techniques.

risk.net 22
Model building: Whitepaper

Managing earnings volatility and


uncertainty in the supply and
demand for regulatory capital
The impact of IFRS 9
This novel approach to modelling from Moodys Analytics allows better management of the interplay of supply and demand dynamics
for regulatory capital, combining an economic framework with regulatory capital and new loss recognition rules
Introduction
The framework is particularly relevant in understanding the extent to which IFRS 9 can lead to more aggressive provisioning, which feeds into earnings volatility. The approach provides guidance on how organisations can better manage their capital buffer, considering investment concentration, its impact on earnings volatility and relationship to regulatory capital requirements. Imperative to portfolio management, the framework recognises the likelihood of a capital shortfall being significantly impacted by portfolio name, industry, geography and asset class concentration, as extreme fluctuations in capital supply and demand occur more often for institutions holding more concentrated portfolios. Finally, we discuss integrated investment and strategic decision-making measures that account for the full spectrum of economic risks and interactions with regulatory and accounting rules, as well as the instruments' contribution to earnings volatility and capital surplus dynamics.

With stringent regulatory and accounting requirements, risk managers can struggle with incorporating regulatory capital requirements and loss accounting rules into investment decisions. An economically appealing approach considers stakeholders who maximise return while recognising the risks and who face regulatory capital constraints; investment decision rules recognise both regulatory capital and economic risks¹. Intuitively, an investment with less concentration risk, all else being equal, is better diversifying and more appealing. Similarly, an investment that attracts less regulatory capital, all else being equal, is less constraining and more appealing. Accordingly, institutions can use integrated metrics that account for both regulatory capital and economic risks, such as regulatory-adjusted return on risk-adjusted capital (Rorac), concentration-adjusted return on regulatory capital, or composite capital measures that reflect regulatory capital constraints, as well as economic risks.

In addition to considering regulatory requirements and economic risks related to concentration effects, the question of being able to fulfil future regulatory requirements is material. In reality, credit deterioration flows into earnings along with increases in regulatory capital, resulting in a potential capital breach. The likelihood of such a breach depends very much on a portfolio's composition, the degree to which it is diversified and the supply of equity. Ideally, investment decision rules should account for the likelihood and cost of breaching future regulatory capital requirements. The key to managing the dynamics in regulatory capital requirements is to quantify the likelihood that the supply of capital is sufficient to address future regulatory capital requirements. This probability is determined by the dynamics of regulatory capital supply and demand, with the supply affected by earnings and loss recognition rules. Compared to its predecessor, IAS 39, the new IFRS 9 accounting standard for financial instruments requires institutions to set aside provisions at origination. In addition, the staging rule for IFRS 9 requires institutions to update loss allowance to reflect changes in credit quality at each reporting date, which can increase earnings volatility that flows into the supply of capital.

This article examines how IFRS 9 affects regulatory capital supply and demand. We provide an overview of how an institution can utilise integrated measures that account for economic risks and regulatory capital for better capital management. Our approach leverages an economic capital framework similar to the one proposed by Levy, Kaplin, Meng and Zhang (LKMZ) in 2012¹, in which stakeholders maximise return per unit of risk while facing regulatory capital constraints. In addition to recognising current regulatory capital requirements, this article incorporates uncertainty in the supply of and demand for regulatory capital coming from changes in the credit environment.

Impact of IFRS 9 on regulatory capital management
1. How IFRS 9 affects the dynamics of regulatory capital at horizon
IFRS 9 affects the supply and demand for regulatory capital in at least two ways. First, IFRS 9 generally requires an institution to recognise 12-month expected credit loss of a financial instrument as soon as the instrument is originated or purchased. Meanwhile, IFRS 9's predecessor, IAS 39, generally requires material credit events to trigger loss provision. Thus, IFRS 9 can cause an initial reduction in the Tier 1 capital supply, driving required regulatory capital to be more constraining for banks using a standardised approach to compute regulatory capital². In addition, IFRS 9 staging rules can result in further reduction in the capital supply when lifetime losses must be considered.
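To make the day-one and staging effects concrete, here is a minimal sketch with purely illustrative numbers that are not taken from the article: it contrasts an incurred-loss provision at origination with the IFRS 9 12-month ECL and the lifetime ECL that applies once a loan migrates to stage 2. Each provision flows through earnings and therefore reduces the Tier 1 capital supply.

```python
# Illustrative sketch (assumed figures, not the article's model): day-one and
# post-migration provisions for a single loan under an incurred-loss regime
# versus IFRS 9, and hence the hit to retained earnings / Tier 1 capital supply.
ead = 1_000_000        # exposure at default (assumed)
lgd = 0.45             # loss-given default (assumed)
pd_12m = 0.015         # 12-month probability of default (assumed)
pd_lifetime = 0.06     # cumulative lifetime probability of default (assumed)

# Incurred-loss regime (IAS 39 style): no provision until a credit event occurs.
provision_incurred_loss_day1 = 0.0

# IFRS 9 stage 1: 12-month expected credit loss recognised at origination.
provision_ifrs9_stage1 = pd_12m * lgd * ead

# IFRS 9 stage 2: lifetime expected credit loss after a significant increase
# in credit risk since initial recognition.
provision_ifrs9_stage2 = pd_lifetime * lgd * ead

for label, p in [("incurred loss, day one", provision_incurred_loss_day1),
                 ("IFRS 9 stage 1", provision_ifrs9_stage1),
                 ("IFRS 9 stage 2", provision_ifrs9_stage2)]:
    print(f"{label:22s} provision = {p:10,.0f}  ({p / ead:.2%} of exposure)")
```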


1 Loss allowance at horizon of diversified portfolio versus concentrated portfolio

Second, IFRS 9 can increase the volatility in capital supply. IAS 39 requires provisioning under significantly negative credit triggers, which generally dampens the impact of credit migration on capital supply volatility. In contrast, under IFRS 9, institutions update loss allowance to reflect changes in credit risk on every reporting date, resulting in credit migration being accounted for in capital supply. An important corollary to this observation is that more concentrated portfolios will, in general, be more impacted by the IFRS 9 volatility increase. Intuitively, a perfectly diversified and granular portfolio exhibits no volatility. Figure 1 compares the IFRS 9 loss allowance at horizon of a well-diversified portfolio and a portfolio with high concentration in the oil industry. It can be seen that, for the simulated trials, loss allowance for the well-diversified portfolio never exceeds 10%. In contrast, the concentrated portfolio's loss allowance at horizon exceeds 10% in 3.7% of trials. All else being equal, organisations' loss provisions will be more extreme for concentrated credit portfolios, driving a higher volatility in earnings and likelihood of facing a regulatory capital shortfall.

2. Quantifying change in capital surplus
As discussed above, IFRS 9 loss provision affects the supply of capital, potentially impinging on an institution's ability to meet regulatory capital requirements. The dynamics of loss allowance, with each reporting date, can further constrain the organisation, as it should consider buffering for a deteriorated credit environment. Capital surplus measures the gap between capital supply and demand. In the context of Basel III and IFRS 9, the change in capital surplus is driven by the change in regulatory capital required by Basel III, and earnings that are driven by interest income, default losses and provisions, either 12-month or lifetime, depending on the asset's stage.

Since the change in capital surplus captures the dynamics of both required regulatory capital and earnings, it provides a foundation for measuring how much capital must be set aside. With that said, the expected change in capital surplus associated with each individual instrument does not account for concentration and diversification risks. Consequently, an institution should not use it as the only measure when making investment decisions. For example, all else being equal, an instrument with a 4% expected increase in capital surplus may be more attractive than one with a 6% expected increase, if its credit risk is less correlated with other instruments in the portfolio.

3. Leveraging an economic framework to manage earnings dynamics and the demand and supply of regulatory capital
As discussed in the introduction, the LKMZ framework accounts for regulatory capital constraints at the time of investment; in reality, future credit deterioration results in changes to regulatory capital requirements and its potential violation. The likelihood of such a breach depends greatly on the portfolio composition, the degree to which it is diversified and its capital surplus. Investment decision rules should account for the likelihood and the cost of breaching future regulatory capital requirements. Institutions have addressed this issue by adopting buffers beyond their stated required regulatory capital requirements. The challenge is in quantifying the buffer, how portfolio composition can improve managing that buffer, and how all this should feed into investment decision rules. Intuitively, institutions should set aside capital buffers, so the likelihood of a capital breach does not exceed a target probability. In addition, institutions should assign an additional capital buffer to each instrument according to the expected change in capital surplus associated with the instrument, and how that change contributes to the overall likelihood of a breach. The distribution of changes in capital surplus is depicted in figure 2.

The left-hand side of figure 2 shows the distribution of changes in capital surplus over a one-year horizon for a sample loan portfolio. Limiting the probability of a capital breach to 10bp requires a 2.2% additional capital buffer to be set aside beyond what is needed to address current regulatory capital requirements. The right-hand side depicts the capital surplus distribution for a more concentrated portfolio that does not benefit from country and industry diversification. In this case, the probability of a capital breach more than doubles, to 26 basis points, if the same 2.2% additional capital buffer is set aside³.
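The buffer-sizing logic can be illustrated with a small Monte Carlo sketch. The one-factor copula, the parameter values and the earnings proxy below are assumptions made for illustration only, not the article's calibrated portfolios; the sketch simply shows how a more concentrated portfolio fattens the left tail of the change in capital surplus and therefore requires a larger additional buffer to keep the breach probability at a 10bp target.

```python
# Illustrative Monte Carlo sketch (assumed parameters, not the article's model):
# distribution of one-year changes in capital surplus for a granular versus a
# concentrated portfolio, and the additional buffer needed to keep the
# probability of a capital breach below a 10bp target.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
N_TRIALS, N_NAMES = 100_000, 200
PD, LGD, SPREAD = 0.02, 0.45, 0.02        # per-name PD, LGD, margin (assumed)
W = 1.0 / N_NAMES                          # equal exposure weights

def change_in_surplus(rho):
    """One-factor Gaussian copula defaults; the change in capital surplus is
    proxied by earnings (spread income minus credit losses), holding required
    regulatory capital fixed for simplicity."""
    z = rng.standard_normal((N_TRIALS, 1))               # systematic factor
    eps = rng.standard_normal((N_TRIALS, N_NAMES))       # idiosyncratic shocks
    asset = np.sqrt(rho) * z + np.sqrt(1.0 - rho) * eps
    losses = (asset < norm.ppf(PD)).sum(axis=1) * W * LGD
    return SPREAD - losses

def buffer_for(delta_surplus, target=0.001):
    """Smallest buffer b with P(b + delta_surplus < 0) <= target (10bp)."""
    return max(0.0, -np.quantile(delta_surplus, target))

for label, rho in [("diversified", 0.05), ("concentrated", 0.35)]:
    ds = change_in_surplus(rho)
    print(f"{label:12s} buffer for a 10bp breach probability: {buffer_for(ds):.2%}")
```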


2 Probability of capital breach: diversified portfolio versus concentrated portfolio

3 Change in capital surplus versus change in capital risk

It is important to note that, even though the two portfolios shown in figure 2 have different capital breach probabilities, both have the same expected change to their capital surplus: 1.95%. Therefore, it is clear that the expected change to the capital surplus by itself is not sufficient to describe an instrument's risk, as it does not account for portfolio concentration and diversification effects. This trait is similar to expected loss measures not being impacted by diversification and concentration.

Figure 3 provides an additional perspective to the dynamics of capital surplus by comparing it with portfolio fair value loss. While the change in portfolio capital surplus has a general inverse relationship with portfolio loss, there is a reasonable amount of dispersion. One primary reason behind this observation is that fair value portfolio loss, which includes both default loss and credit migration loss, is entirely driven by the migration of point-in-time probability of default, while the change in capital surplus is partly determined by the migration in through-the-cycle probability of default, which feeds into regulatory capital calculations.

To account for the full spectrum of economic risks and interactions with regulatory and accounting rules, one can leverage the LKMZ framework and associate an additional capital buffer charge to each instrument as the organisation ensures capital solvency in the future. The resulting investment decision rules account for capital surplus dynamics as well as the concentration and diversification risks associated with each instrument.

Conclusion
The introduction of IFRS 9 changes the dynamics of capital supply and demand and affects institutions' investment decisions. In particular, the new loss recognition rule under IFRS 9 can make regulatory capital requirements more stringent and can increase the uncertainty of capital adequacy in the future. IFRS 9 can also introduce significant concentration risk into capital planning. These implicit costs should be accounted for in investment decisions and capital allocation. An extended LKMZ model leverages an economic framework and derives investment decision rules based on the full spectrum of risk, and it accounts for regulatory capital as well as future dynamics in capital supply and demand.

The authors
Amnon Levy, Managing Director, Research. E: amnon.levy@moodys.com
Jing Zhang, Managing Director, Head of Research. E: jing.zhang@moodys.com
Andriy Protsyk, Associate Director, Research. E: andriy.protsyk@moodys.com
Pierre Xu, Associate Director, Research. E: pierre.xu@moodys.com
www.moodysanalytics.com

¹ For example, Levy, Kaplin, Meng and Zhang (LKMZ) (2012) introduce a regulatory capital-adjusted Rorac measure by integrating economic capital with regulatory capital under a capital asset pricing model framework. Xu and Levy (2015) extend LKMZ's model and create a composite capital measure that serves as a capital allocation measure accounting for both regulatory capital requirements and economic risk.
² Under the Basel III rule for advanced IRB banks.
³ The required regulatory capital for each instrument is computed based on the Basel III advanced IRB approach in all examples in this article.



Cutting edge: Credit risk

Loan classification under IFRS 9


IFRS 9 requires classifying non-defaulted loans in two stages depending on their credit quality evolution since initial
recognition by the bank. In this paper, Vivien Brunel proposes an optimal way to perform this classification. Target values of
some key performance indicators of the provisioning model emerge from the implementation of this process. In particular he
computes the target value of the stage 2 hit rate and the size of the stage 2 portfolio

Scoring and rating models have been used in the field of the granting of credit and in credit risk management for some time. In 2001, the Basel Committee required the use of internal models to be extended to capital charge measurement (Basel Committee on Banking Supervision 2001). Since then, banks and regulators have both developed statistical tools to evaluate the quality of internal rating models because bad performance can lead to inefficient allocation of capital.

In 2014, the International Accounting Standards Board (2014) published the final version of the IFRS 9 accounting standards, which aim to overcome the problems that arose during the financial crisis because of the previous IAS 39 incurred loss model. The new requirement is to recognise loss allowances or provisions on all loans, including performing loans. This is done in a two-stage process for non-defaulted loans.
■ Stage 1: if the credit risk of a financial instrument has not increased significantly since initial recognition, the loss allowance is equal to the 12-month expected credit loss (ECL).
■ Stage 2: if the credit risk of a financial instrument has increased significantly since initial recognition, the loss allowance is equal to the lifetime ECL.
In general, stage 1 loans are of better credit quality than stage 2 loans. Paragraphs 5.5.10 and 5.5.11 of the norm (International Accounting Standards Board 2014) provide some requirements about the transfer criteria. However, the norm is principle based and does not detail how to determine which instruments should be in stage 1 or in stage 2. Implementing the standards is subject to interpretation and to some subjective choices in terms of credit risk quantification.

In this paper, we propose how assets should be assigned to the two stages. Our proposal involves the following three assumptions.
■ We assume that the transfer criteria from stage 1 to stage 2 are based on scoring and rating systems of the bank and that an instrument is transferred to stage 2 when its absolute level of risk has gone beyond a given threshold; this is a good approximation when the bank does not originate any loan with a score under a given cutoff value.
■ Loan classification aims to accurately predict defaults over a given time horizon, and its performance will be assessed accordingly. Even if the ECL is measured over the lifetime of an instrument, we assume that the performance of rating and scoring models is measured over a one-year horizon as this is usually the case in practice.
■ In the specific case of retail exposures, we assume that the scores and ratings under consideration incorporate some specific issues such as multiple defaults, restructured loans or default contagion.
As we will emphasise, efficient provisioning in the IFRS 9 framework is based on measuring credit risk and model performance accurately. The main statistical tools used to assess the performance of a rating or scoring tool are based on the cumulative accuracy profile (CAP) or the receiver operating characteristic (ROC), and on their summary statistics, namely the accuracy ratio (AR) and the area under the ROC curve (AUC), respectively (see Sobehart, Keenan & Stein 2000). We mention that the AR and AUC are criticised for being flawed, particularly when expressed in terms of misclassification costs, and that a more objective measure exists (Hand 2009).

By considering that the two-stage classification process of performing loans is based on a scoring model, we link the two-stage accuracy ratio of a portfolio of loans to the underlying score accuracy ratio. We show that optimality in the two-stage classification can be reached by appropriately choosing the hit rate (called the stage 2 hit rate hereafter) that we target, ie, the proportion of defaulted loans that come from stage 2 loans. In many realistic cases the optimal target stage 2 hit rate is in the range 70–90%. Additionally, we derive a formula for the size of the stage 2 portfolio. We show that, for a given portfolio of loans, the main driver of the provision is the stage 2 hit rate and not the size of the stage 2 portfolio.

To make our paper self-contained, we describe the main statistical tools used to assess the performance of a rating or scoring system in the next section. Later we show how these performance indicators are shifted when loans are classified into only two buckets, and we compute the optimal target stage 2 hit rate. We go on to derive the formula that links the size of the stage 2 portfolio to the other parameters, and in the final section we provide a simple proxy formula for the total provision.

The measures of discriminatory power
A scoring model aims to rank the clients of a bank according to their creditworthiness, ie, their ability to pay back the loan they have been granted. Whatever it is based on, either a mathematical model or an expert-based judgment, or both, the performance of a scoring model is measured by the concordance of low scores with the occurrences of defaults. When a scoring model is random, ie, contains no information about the likelihood of a default, the conditional default probabilities of the clients are uncorrelated with their scores. Conversely, for a perfect scoring model, the scores perfectly rank the risk of the clients: the clients that go to default are assigned the worst scores prior to default. We mention that the IFRS 9 norm focuses on the notion of credit risk deterioration and does not provide any definition of the default event. Banks usually have only one definition of default, which coincides with the Basel definition.

We consider a homogeneous portfolio of loans, meaning that the loans have the same risk drivers. These loans are granted to the same types of client in the same geographic area and belong to the same asset class (for instance, prime residential mortgages in the UK originated


by entity X of the bank). We call p the one-year average unconditional probability of default within the loan portfolio. We consider a rating model that produces a continuous score over the set of debtors in the portfolio. The higher the score assigned to a loan, the lower its probability of default. We rank the debtors according to their creditworthiness, starting with those that have the lowest scores and going to those with the highest scores. Let us consider the fraction x of the debtors having the lowest scores. Among all the defaulters in the portfolio, we call HR(x) the hit rate function, which is the proportion of defaulters that have been classified correctly regarding the score value corresponding to x. Similarly, we call FAR(x) the false alarm rate function, which is the proportion of non-defaulters that have been classified incorrectly regarding the score value corresponding to x.

1 The ROC curve (the hit rate as a function of the false alarm rate)
[Figure: hit rate (%) versus false alarm rate (%) for a perfect model, a scoring model with AR = 70%, the corresponding two-stage model and a random model; the points O, A, B, C, D, E and z used in the appendix are marked on the two-stage curve]

The cumulative accuracy profile (the CAP curve) is obtained by plotting HR(x) when x ranges from 0% to 100%. The receiver operating characteristic (the ROC curve; see figure 1) is obtained by plotting HR(x) as a function of FAR(x) when x ranges from 0% to 100%. In what follows, we call the ROC function R(u), where u = FAR(x) is the false alarm rate. For a random scoring model, the hit rate is equal to the false alarm rate for all x, and the ROC curve is the diagonal of the unit square; the area under the ROC curve, called AUC, is equal to 1/2. For a perfect model, the hit rate is always equal to 100% and AUC is equal to 1. The CAP and ROC curves are closely related to each other, as well as to their associated summary statistics (see Engelmann et al 2003):

$$\mathrm{AR} = 2\,\mathrm{AUC} - 1 = 2\int_0^1 R(u)\,\mathrm{d}u - 1 \qquad (1)$$

When considering real data, both the CAP and ROC curves are noisy. The most popular fit of the ROC curve used in statistics is the binormal approach, which is based on a two-parameter family of ROC functions (Hanley 1996):

$$R^{B}_{a,b}(u) = N\bigl(a + b\,N^{-1}(u)\bigr) \qquad (2)$$

where the function N(·) is the cumulative normal distribution function. This type of ROC function corresponds to a normal distribution of the scores of both the defaulters and the survivors, which is a reasonable assumption in practice. The case b = 1, for which the score volatility is the same for defaulters and survivors, is often used in the statistical literature because it often leads to very good fits (Hanley 1996). We will set b = 1 hereafter; as emphasised by Tasche (2012) and Cramer (2003), this special case motivates the choice of modelling the probability of default curves with the inverse logit function. Additionally, b = 1 is the only value of the parameter b for which the binormal ROC curve is concave over the whole interval [0, 1].

Another fit uses an exponential shape for the CAP curve. It was proposed by Van der Burgt (2008) in the context of low-default portfolios (sovereign credit risk in his paper). In this paper, we introduce the exponential fit of the ROC curve, which is similar to Van der Burgt's fit:

$$R^{E}_{k}(u) = \frac{1 - \mathrm{e}^{-ku}}{1 - \mathrm{e}^{-k}} \qquad (3)$$

In what follows we will assume that the ROC function R(u) is concave and has either a binormal or an exponential shape. Other shapes are of course possible and our results can be extended straightforwardly.

From (1), we derive the value of AR as a function of the parameters of the ROC function in the particular cases of the binormal (see equation (3.14) in Tasche (2010)) and exponential approaches:

$$\text{Binormal fit } (b=1):\ \ \mathrm{AR} = 2N\!\left(\frac{a}{\sqrt{2}}\right) - 1, \qquad \text{Exponential fit:}\ \ \mathrm{AR} = 2\left(\frac{1}{1-\mathrm{e}^{-k}} - \frac{1}{k} - \frac{1}{2}\right) \qquad (4)$$

The stage 2 hit rate target value
In contrast to the Basel framework, in which rating systems are required to have at least seven grades for non-defaulted loans, the IFRS 9 norms introduce an unusual classification for non-defaulted loans with only two grades: stage 1 and stage 2. The resulting classification of loans performs less well than the original score because of the loss of information in the bucketing process. Indeed, to assess the two-stage classification rule, we assume that the ranking of the loans within each stage is random, but loans in stage 1 all have a better ranking than loans in stage 2. The resulting ROC curve, named R₂(u), where u is the false alarm rate obtained with the new rankings, is an affine function per interval (the purple line in figure 1).

Let us call AR₂ the two-stage accuracy ratio and α the stage 2 hit rate, ie, the proportion of defaults captured by the stage 2 portfolio. We show in the appendix that AR₂ is linked to the stage 2 hit rate α:

$$\mathrm{AR}_2 = \alpha - R^{-1}(\alpha) \qquad (5)$$

We obtain a direct relationship between the proportion of defaults that stage 2 catches and the performance of the two-stage model. We see geometrically that AR₂ ≤ AR due to the loss of information in the bucketing process; we propose to undertake the bucketing in such a way as to minimise this loss of information. From (5), we show that AR₂ reaches a maximum for α = α*, which is the solution of:

$$\left.\frac{\partial R^{-1}(\alpha)}{\partial \alpha}\right|_{\alpha=\alpha^{*}} = 1 \qquad (6)$$
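As a quick numerical cross-check of equations (1)–(4), the following sketch integrates the binormal (b = 1) and exponential ROC functions and compares the result with the closed-form accuracy ratios; the parameter values a and k are arbitrary illustrations.

```python
# Numerical check of equations (1)-(4): AR = 2 * integral of R(u) - 1 for the
# binormal (b = 1) and exponential ROC fits, versus their closed forms.
# The parameter values (a, k) are arbitrary illustrations.
import numpy as np
from scipy.stats import norm
from scipy.integrate import quad

def roc_binormal(u, a):                  # equation (2) with b = 1
    return norm.cdf(a + norm.ppf(u))

def roc_exponential(u, k):               # equation (3)
    return (1.0 - np.exp(-k * u)) / (1.0 - np.exp(-k))

a, k = 1.5, 8.0

ar_bin_numeric = 2.0 * quad(roc_binormal, 0.0, 1.0, args=(a,))[0] - 1.0
ar_bin_closed = 2.0 * norm.cdf(a / np.sqrt(2.0)) - 1.0                # equation (4)

ar_exp_numeric = 2.0 * quad(roc_exponential, 0.0, 1.0, args=(k,))[0] - 1.0
ar_exp_closed = 2.0 * (1.0 / (1.0 - np.exp(-k)) - 1.0 / k - 0.5)      # equation (4)

print(f"binormal    AR: numeric {ar_bin_numeric:.4f}, closed form {ar_bin_closed:.4f}")
print(f"exponential AR: numeric {ar_exp_numeric:.4f}, closed form {ar_exp_closed:.4f}")
```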




The selected value of α depends on the calibration of the transfer criterion from stage 1 to stage 2. Transfer criteria for which α > α* are not relevant because the two-stage accuracy ratio decreases for this range of parameters. After some algebra, we obtain a one-to-one relationship between the underlying scoring model accuracy ratio and the maximum target hit rate from (6) and (3):

$$\text{Binormal fit } (b=1):\ \ \alpha^{*} = N\!\left(\frac{\sqrt{2}}{2}\,N^{-1}\!\left(\frac{1+\mathrm{AR}}{2}\right)\right), \qquad \text{Exponential fit:}\ \ \alpha^{*} = \frac{1}{1-\mathrm{e}^{-k}} - \frac{1}{k} = \frac{1+\mathrm{AR}}{2} \qquad (7)$$

We note that in the binormal case we obtain the same result as that obtained by Tasche (2012) using a criterion based on misclassification cost. We plot the values of the maximum target stage 2 hit rate α* and of the associated two-stage accuracy ratio AR₂* as functions of AR in figure 2.

2 Optimal hit rate and two-stage accuracy ratio as functions of the score accuracy ratio (AR)
[Figure: target hit rate and AR₂ plotted against the score accuracy ratio (%), for the binormal (b = 1) and exponential fits]

For realistic values of the accuracy ratio AR (between 60% and 80%, say) the maximum value of AR₂ is reached when the stage 2 hit rate is in the range 70–80% in the binormal case and in the range 80–90% in the exponential case. For the exponential ROC function, the link between AR₂* and AR is implicit, with intermediate variable k. This link is explicit in the case of the binormal approach, and we obtain from (5) and (7):

$$\mathrm{AR}_2^{B} = 2N\!\left(\frac{\sqrt{2}}{2}\,N^{-1}\!\left(\frac{1+\mathrm{AR}}{2}\right)\right) - 1 \qquad (8)$$

The maximum attainable values of AR₂ are in the range 45–65% when the underlying score accuracy ratio is in the range 60–80%, and they are very similar for both approaches. We also observe the maximum attainable values of AR₂ depend only on AR and not on the portfolio default probability.

From a practical point of view, we propose using the stage 2 hit rate and the two-stage accuracy ratio to assess the calibration and performance of the transfer criterion. People in conferences and working groups sometimes refer to the value of 70% as a target for the stage 2 hit rate, which looks sound at first sight. However, the above equations show that the relevant targets for the stage 2 hit rate depend on the two-stage accuracy ratio or on the underlying score accuracy ratio, ie, on the quality of the scoring model. In most cases, a stage 2 hit rate of 70% is suboptimal.

Stage 2 portfolio size formula
Let us call B₂ the size of the stage 2 portfolio expressed as a percentage of the total portfolio exposure. The default probability within bucket 2 is then equal to αp/B₂. The stage 2 hit rate is equal to the probability that a loan is in stage 2 prior to default; from the definition introduced earlier, we have HR = α. The stage 2 false alarm rate is equal to the amount of surviving loans in stage 2 divided by the total amount of surviving loans of the portfolio:

$$\mathrm{FAR} = \frac{B_2 - \alpha p}{1 - p} \qquad (9)$$

These values of the stage 2 hit rate and stage 2 false alarm rate correspond to the co-ordinates of point A in figure 1. By computing the area under the two-stage ROC curve (the area under the R₂(u) function), we obtain the following relationship (see the appendix):

$$\mathrm{AR}_2\,(1 - p) = \alpha - B_2 \qquad (10)$$

We notice that this formula remains true whatever the transfer criterion, even if it is not based on a score or a rating system. Let us focus on the binormal ROC function from now on in order to get some orders of magnitude. When the score accuracy ratio is equal to AR = 80% and the stage 2 hit rate is equal to α = 70%, we get AR₂ = 60.1%; when p < 15%, the size of the stage 2 portfolio is then in the range 10–20%. We plot the size of the stage 2 portfolio corresponding to the optimal target stage 2 hit rate as a function of the score accuracy ratio AR for several values of p in figure 3:

$$B_2(\mathrm{AR}) = (2p - 1)\,N\!\left(\frac{\sqrt{2}}{2}\,N^{-1}\!\left(\frac{1+\mathrm{AR}}{2}\right)\right) + 1 - p \qquad (11)$$

We observe B₂ is quite stable when the portfolio default probability changes and remains considerably above the default probability for realistic values of p.

3 Target size of the stage 2 portfolio as a function of the score accuracy ratio
[Figure: target size of the stage 2 portfolio (%) plotted against the score accuracy ratio (%), for p = 1%, 5% and 10%]
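The closed-form expressions above are easy to evaluate. The sketch below implements equations (5), (7), (8) and (11) for the binormal (b = 1) fit; with AR = 80% it reproduces the 60.1% two-stage accuracy ratio quoted for a 70% stage 2 hit rate, and it prints the optimal hit rate and the stage 2 portfolio size at the optimum for a few illustrative values of p.

```python
# Equations (5), (7), (8) and (11) for the binormal (b = 1) ROC fit: two-stage
# accuracy ratio, optimal stage 2 hit rate and stage 2 portfolio size.
# The inputs (AR, alpha, p) are illustrative values.
import numpy as np
from scipy.stats import norm

def a_from_ar(ar):                # invert equation (4): AR = 2N(a / sqrt(2)) - 1
    return np.sqrt(2.0) * norm.ppf((1.0 + ar) / 2.0)

def ar2(alpha, ar):               # equation (5) with R binormal, b = 1
    return alpha - norm.cdf(norm.ppf(alpha) - a_from_ar(ar))

def alpha_star(ar):               # equation (7), binormal case
    return norm.cdf(norm.ppf((1.0 + ar) / 2.0) * np.sqrt(2.0) / 2.0)

def ar2_star(ar):                 # equation (8)
    return 2.0 * alpha_star(ar) - 1.0

def b2_star(ar, p):               # equation (11): stage 2 size at the optimum
    return (2.0 * p - 1.0) * alpha_star(ar) + 1.0 - p

ar = 0.80
print(f"alpha* = {alpha_star(ar):.1%}, AR2* = {ar2_star(ar):.1%}, "
      f"AR2(alpha = 70%) = {ar2(0.70, ar):.1%}")
for p in (0.01, 0.05, 0.10):
    print(f"p = {p:.0%}: stage 2 portfolio size B2 = {b2_star(ar, p):.1%}")
```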




IFRS 9 provision proxy formula
The amount of provision is equal to the ECL over one year for all loans in stage 1 and to the ECL at maturity for all loans in stage 2 (International Accounting Standards Board 2014). We assume that the losses given default are the same within stage 1 and stage 2 (this assumption can easily be relaxed). As the stage 1 portfolio has a size equal to B₁ = 1 − B₂, a proxy of the IFRS 9 provision is given by the following formula:

$$P = \mathrm{LGD}\left[B_1\,\frac{(1-\alpha)p}{B_1}\,D_1 + B_2\,\frac{\alpha p}{B_2}\,D_2\right] = p\,\mathrm{LGD}\bigl[(1-\alpha)D_1 + \alpha D_2\bigr] \qquad (12)$$

where D₁ (respectively, D₂) is the IFRS 9 risky duration within the stage 1 (respectively, stage 2) portfolio, defined as the probability-weighted value of 1 invested in the stage 1 (respectively, stage 2) portfolio over a one-year horizon (respectively, lifetime). Because of the small probability of default in stage 1, we have D₁ ≈ DF(1), where DF(T) is the discount factor associated with horizon T. Conversely, defaults and maturity effects are no longer negligible in stage 2; we take into account the survival rate of the loans thanks to the one-year default probability within the stage 2 portfolio, which is equal to αp/B₂, and we get approximately:

$$D_2 \approx \mathrm{WAL}\left(1 - \frac{\alpha p}{B_2}\right)^{\mathrm{WAL}} \mathrm{DF}(\mathrm{WAL}) \qquad (13)$$

To obtain this formula we have computed the duration of a bullet portfolio with maturity WAL (equal to the weighted average life of the stage 2 portfolio). In general, banking book loans have a maturity higher than one year on average and we have D₂ > D₁. In such a case, we see that the provision is not necessarily a decreasing function of the stage 2 hit rate α. A more careful study should then be made to establish when an accurate model (with a high value of AR) generates lower provisions (this situation would correspond to negative misclassification costs).

Conclusion
We obtain two important results in this paper. First, we derived a quantitative criterion to determine loans that should go into either stage 1 or stage 2. We obtain the optimal target stage 2 hit rate of the two-stage classification at a one-year horizon, which is around 70–80% for realistic scoring models, corresponding to the binormal case. Second, we obtained a formula that links the size of the stage 2 portfolio with risk and performance indicators. We showed that the IFRS 9 provision is driven by the stage 2 hit rate and not by the size of the stage 2 portfolio, whatever the transfer criterion between stage 1 and stage 2.

The proposed approach sets some targets for the stage 2 hit rate and the two-stage accuracy ratio that can be helpful for calibrating and backtesting the transfer criteria that banks are using. The proxy formula may also be used as a simplified approach to computing IFRS 9 provisions when too little data are available for calibration. This formula can, at least, be used as a benchmark for IFRS 9 provisions.

Appendix: proof of equations (5) and (10)
The area under the ROC curve of the two-stage model, called AUC₂, is equal to the sum of the areas of the triangle OAB, the rectangle ADEB and the triangle ADC in figure 1. The co-ordinates of point A are (z, α), where, from (1), we have α = R(z). The geometry of figure 1 leads to:

$$\mathrm{AUC}_2 = \frac{\alpha z}{2} + \alpha(1 - z) + \frac{(1-\alpha)(1-z)}{2} = \frac{1 + \alpha - z}{2}$$

We obtain AR₂ = 2AUC₂ − 1 = α − z = α − R⁻¹(α). From (9), the stage 2 false alarm rate links the size of the stage 2 portfolio and the value of z:

$$z = \mathrm{FAR} = \frac{B_2 - \alpha p}{1 - p} = R^{-1}(\alpha) = \alpha - \mathrm{AR}_2$$

This last equation leads to (10). ∎
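For readers who want a worked number, the following sketch evaluates the proxy formula (12) together with the duration approximation (13); every input (p, LGD, α, B₂, WAL and the discount rate) is a hypothetical value chosen for illustration, not one of the paper's examples.

```python
# Illustrative evaluation of the provision proxy, equations (12)-(13); every
# input below is an assumed value, not taken from the paper.
import math

p, lgd = 0.02, 0.40            # portfolio one-year PD and LGD (assumed)
alpha, b2 = 0.75, 0.15         # stage 2 hit rate and stage 2 size (assumed)
wal, r = 5.0, 0.03             # weighted average life and discount rate (assumed)

def df(t):                     # discount factor DF(T), continuous compounding
    return math.exp(-r * t)

d1 = df(1.0)                                              # D1 ~ DF(1)
d2 = wal * (1.0 - alpha * p / b2) ** wal * df(wal)        # equation (13)

provision = p * lgd * ((1.0 - alpha) * d1 + alpha * d2)   # equation (12)
print(f"D1 = {d1:.3f}, D2 = {d2:.3f}, proxy provision = {provision:.2%} of exposure")
```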
Vivien Brunel is the head of risk and capital modelling at Société Générale and professor of finance at Léonard de Vinci Pôle Universitaire in Paris. The author is grateful to two anonymous referees who provided very valuable comments and suggestions. He also thanks Benoît Sureau and Yann Tréguer from Société Générale's risk department. This article reflects the author's opinions and not necessarily those of his employers. Email: vivien.brunel@socgen.com. Previously published on Risk.net

REFERENCES
Basel Committee on Banking Supervision, 2001. The internal ratings-based approach. Consultative Document, January.
Cramer JS, 2003. Logit Models from Economics and Other Fields. Cambridge University Press.
Engelmann B, E Hayden and D Tasche, 2003. Testing rating accuracy. Risk January, pages 82–86.
Hand DJ, 2009. Measuring classifier performance: a coherent alternative to the area under the ROC curve. Machine Learning 77, pages 103–123.
Hanley JH, 1996. The use of the binormal model for parametric ROC analysis of quantitative diagnostic tests. Statistics in Medicine 15(14), pages 1575–1585.
International Accounting Standards Board, 2014. IFRS 9 financial instruments. July.
Sobehart J, S Keenan and R Stein, 2000. Benchmarking quantitative default risk models: a validation methodology. Moody's Rating Methodology.
Tasche D, 2010. Estimating discriminatory power and PD curves when the number of defaults is small. Working paper, available at http://arxiv.org/pdf/0905.3928.pdf
Tasche D, 2012. Bounds for rating override rates. Journal of Credit Risk 8(4), pages 3–29.
Van der Burgt M, 2008. Calibrating low-default portfolios using the cumulative accuracy profile. Journal of Risk Model Validation 1(4), pages 17–33.

IT infrastructure

Implications for IT systems


Joined-up approaches are needed to couple risk and finance spaces in an effort at continuous default risk monitoring

1 Affected IT infrastructure for IFRS 9 projects
[Figure: schematic of the bank IT stack touched by IFRS 9, spanning banking operations (cards, treasury, auto loans, core banking), data sourcing from trusted sources, data cleansing and enrichment, changed data capture, operational data storage, the data warehouse, data marts and OLAP cubes, reference and master data, prices and client and account data; analytic engines for credit risk, market risk, liquidity risk and RAROC; scenario and stress testing, early warning systems, scorecards and metrics, data mining and analysis, and reporting and dashboards; all framed by IT governance, internal audit, information security, financial controls, compliance, policy and operational risk. Source: Chartis, IFRS 9 Technology Solutions, 2016]

IFRS 9 applies further pressure to financial institutions already faced with increasing regulatory reporting requirements. These cause particular stress to legacy IT systems.
The impact analysis of IT design and architecture required for IFRS 9 covers:
• Historic data analysis and retention
• Analytics
• Calibration
• Monitoring
Figure 1 summarises the affected IT infrastructure for a typical IFRS 9 project; those parts most affected are coloured yellow.
Apart from the obvious changes to internal ratings-based (IRB) models required for IFRS 9, workflow across the enterprise is a key area for analysis. The extensive reworking of models is shadowed by the requirements for continuous monitoring of relative changes since origination of loans and exposures, in probability of default, loss-given default and exposure at default. Early warning systems and limits and exposure measurement systems will all require review to ensure they are IFRS 9-enabled.
The technologies required for IFRS 9 will need to bridge the gap between risk and finance. As an accounting initiative, the controls and reconciliations that exist for other accounting processes will be a priority. Any repurposing of IRB and risk technology will therefore have to address these emerging functional requirements.

IT infrastructure: Whitepaper

Preparing for the IFRS 9 game-changer


How to reconcile the demands of risk
and finance on a single, flexible platform
IFRS 9 is a new accounting standard for financial instruments, which firms around the world must implement by January 1, 2018 at the
latest. It has been designed to value assets and liabilities in a more risk-sensitive manner than the incumbent IAS 39. Despite being an
accounting standard and not a supervisory measure, it introduces a new approach to credit risk

IFRS 9 presents many challenges for financial firms, including accommodating the differing demands of risk and finance, and managing large volumes of risk and financial data, which must be refreshed at more regular intervals than ever before. This whitepaper reveals how these challenges can be overcome by using a single, integrated platform with a flexible data model.

Overview of IFRS 9 implementation challenges
The major challenges of implementing IFRS 9 revolve around the requirement to accommodate the perspectives of both risk and finance. Historically, these two functions have operated in isolation from each other, developing very different cultures. To grasp the uncertainties of the future, the risk world is fond of statistics, which it uses to analyse historical patterns and spot recurring behaviour. In essence, it is a world based on principles rather than formal rules. In contrast, the world of finance and, more specifically, accounting is characterised by many detailed rules. At times, the excessive complexity of these rules results in situations that defy common sense. These very different cultures have deep roots in the bank, and they translate into different types of organisations: typically, risk management has a degree of liberty as long as it is compliant with regulations; accounting, on the other hand, has to strictly follow accounting rules with no kind of freedom.

Accordingly, risk and accounting data is organised differently, and the goal of combining these different types of data is a real challenge. This whitepaper analyses the technical challenges of implementing IFRS 9 and suggests effective ways for firms to maximise their return on investment in data management.

The common feature of the solutions proposed in this whitepaper is the ability to feed processes and reports with data that is dynamically collected from a number of internal and external sources, with no duplication of data and no need to change the IT infrastructure. This enables a rapid implementation at limited cost, all the while maintaining strong data lineage for mining and auditing. This is exactly what AxiomSL's state-of-the-art platform offers. Proper data management will produce significant benefits by integrating risk and finance, but only if the following challenges are addressed:
• Data handling must be upgraded to support joint risk and accounting compliance.
• The nature of internal databases must be changed.
• The divergent goals of chief risk officers (CROs) and chief financial officers (CFOs) must be addressed.
• Conflicting regulatory and accounting views of risk must be reconciled.
Each of these issues is described below and appropriate solutions are proposed.

1. Upgrading data handling for joint risk and accounting compliance
The Basel Committee on Banking Supervision's regulation 239 (BCBS 239), Principles for effective risk data aggregation and risk reporting, refers to risk data, not accounting data, which is managed based on accounting standards. Nevertheless, implementing IFRS 9 requires a large amount of data that will have to fulfil both sets of requirements. So, what is the underlying logic of each framework? How do they conflict? And how can they be accommodated?

Accounting aims to provide accurate and auditable figures. Billion-dollar balance sheets must be published with figures that are precise down to the last dollar, and it must be possible to trace these financial figures back to their sources. Moreover, the figures must be disclosed within tight timelines and must be accurate and transparent. All of this financial data is intended to inform stakeholders how well the company is performing, so they can make informed decisions about its future. The financial figures are also used to calculate taxes, a serious matter. This is why the disclosures bear the signatures of both the chief executive officer (CEO) and CFO. Inaccuracies, even if inadvertent, may result in a prison sentence. Consequently, accounting is performed in a very controlled, structured and rigid production environment.

Risk works in a different, less structured ecosystem because it is intended to support decision-making in an uncertain environment. The risk function tries to reduce the uncertainties of the future by clarifying the alternatives. The resulting insights are used to support decisions on which risks to take and to what extent, and which risks to hedge or avoid. Outputs are holistic by nature: risk appetite, risk tolerance, risk hedging, etc.

Despite the efforts of regulators to formalise and industrialise risk assessment, practitioners need vision, anticipation, openness and reactivity when measuring risk and supporting risk/return decision-making. This calls for flexible and easy-to-change decision-support systems with an underlying IT environment that should also share these characteristics.

Trying to combine the goals of accountants and risk managers leads to the following question: can data be managed in a controlled, structured and rigid production environment while simultaneously being available for open, flexible and evolving decision support?

AxiomSL's platform provides both the controlled environment required by accounting and the flexibility required by risk. By imposing no constraints on where the data is located and by avoiding duplication and double storage, data can be used in a rigid production structure as well as in a flexible,




adaptable decision-support environment. Complex logic can be defined graphically and understood by both IT and business users. The platform enriches the data, but retains the links to all sources, providing full data lineage. Thus, all data changes, whether due to human interaction or system logic, are tracked and auditable. Full data lineage is also critical during testing and production to understand the results. The ability to drill down to the raw data sources, which are already known to the users, is essential for establishing trust in the system and for reassuring management about the reliability of the final results.

1 Options, main benefits and main challenges
• Option: CFO owns the data. Main benefits: first-hand IFRS 9 user; controls everything; provisions managed dynamically. Main challenge: understanding and using risk data.
• Option: CRO owns the data. Main benefits: maintains coherence; regulatory compliance; knows risk systems. Main challenge: complying with accounting requests.
• Option: duplicate sets of data, one for the CRO and one for the CFO. Main benefits: both CRO and CFO own the data; they can implement different frequencies, perimeters and content constraints. Main challenge: the possibility of discrepancies.
2. Changing the nature of internal databases
In order to implement IFRS 9, databases will need to be larger, more dynamic and open to external data. Under IFRS 9, the correct assessment of expected credit loss (ECL) will require significant changes to internal databases. The fact that almost all assets will have to be impaired will dramatically increase data granularity and data volumes; the point-in-time (PIT) requirement will make high-frequency updates compulsory; and the forward-looking approach will require a significant inflow of external financial and economic data.

Why will data need to be more granular? Assessing ECL under the IFRS 9 framework means calculating impairments for all assets in the amortised cost or fair value through other comprehensive income categories. These two accounting categories should comprise the bulk of all assets. Therefore, the number of assets to be impaired will increase dramatically, from thousands to millions. This is a whole new ball game: in terms of data management, the processes of purifying and enriching the data, and the operations of extraction and transfer will have to be massively enhanced and automated.

Why will data need to be updated more frequently? The spirit of IFRS 9 is to identify credit risk increases as early as possible. This is apparent in the PIT requirement and the changes to the buckets in which assets are grouped. PIT measures such as PIT probability of default (PD), loss-given default (LGD) and joint default correlations will probably have to be updated at least once a month. This is the update frequency for many macroeconomic indicators affecting the markets. Higher frequencies must also be considered: the supervisory guidance specifies that ECL must capture all significant increases in credit risk (Guidance #45), so ECL assessment must be linked to market information and must be ready to be updated at short notice.

Why will external data be needed? ECL is assessed as a probability-weighted discounted cash shortfall. The weighting refers to possible future scenarios. These scenarios have to include information that is 'available ... at the reporting date about past events, current conditions and future economic conditions' (IFRS 9 5.5.17). As ECL changes will have to be disclosed and explained, a significant mass of external economic and market data, both qualitative and quantitative, will have to be collected and stored for future retrieval and analysis.

How can financial firms design an open solution that can handle this granularity, these frequent updates and large volumes of external data? AxiomSL's platform can store, manage and control data at different levels of granularity; and the frequency at which this data is updated can be tailored to meet the needs of individual parts of the business. The AxiomSL platform is unified and can be used to control all data and processes. A single environment, team and architecture can be leveraged to manage large volumes of historical, retail and statistical data and precise balance-sheet information for corporate clients. This unique approach allows financial firms to reuse resources, including hardware, software and teams, leading to significant reductions in the total cost of ownership (TCO).

3. Addressing the goals of both CROs and CFOs
IFRS 9 compliance will require risk data. Both the CRO and CFO will want to own this data because it is critical raw material they need to meet their own current and emerging requirements. The CFO signs off the financial accounts and is liable to the shareholders and the auditors with regard to the transparency and fairness of the disclosures. Consequently, s/he is used to controlling accounting data from A to Z. The new issues faced by the CFO include a massive increase in the number of assets to impair and a higher volatility of provisions, which s/he will have to dynamically forecast and manage. Securing the quality and availability of data will be a critical success factor.

The concern is the same for the CRO: s/he is on the frontline managing regulatory pressure, including compliance with BCBS 239, Basel III, the Asset Quality Review (AQR) and European Banking Authority (EBA) stress tests. The CRO must deliver complex reports at high frequency. To that end, s/he has an explicit responsibility from the board to fulfil all upcoming supervisory requirements. Moreover, s/he will undoubtedly be summoned by finance to provide the credit risk data required for IFRS 9 compliance. As a result, the new issues facing CROs include managing more granular data down to the transaction level, moving towards PIT and introducing forward-looking components into their analysis.

So, should the CFO or CRO own the credit risk data used for ECL assessment? Should the data be duplicated or can it be shared? Will CEO arbitrage be necessary or should there be a chief data officer (CDO) position to decouple data sourcing and usage (figure 1)?

The current trend is to give ownership of the data to the CFO because s/he owns the impairment process and because the data used to assess PIT measures and the forward-looking stance are mostly related to the accounting standard. Nevertheless, CRO supervision is usually granted. AxiomSL's state-of-the-art dynamic data model makes it possible to implement any of these options. For instance, giving the CFO ownership of credit risk data can be done without changing the risk IT infrastructure.

4. Reconciling conflicting regulatory and accounting demands
IFRS 9 introduces a naive and refreshing view of how to assess credit risk. This includes using past and present data as well as available forecasts, assessing risk based on the whole life of transactions and properly including diversification effects.

In contrast, even the most sophisticated current regulatory approach, the internal ratings-based (IRB) approach, does not offer a proper way to assess credit risk. This methodology, unchanged for the past 12 years, assesses credit risk using through-the-cycle (i.e. backward-looking) measures. It only looks at a one-year time horizon, regardless of the real maturity of the transactions. Diversification and concentration effects are still based on an oversimplistic asset correlation methodology. Recovery risk assessment is collapsed into one single figure, the downturn LGD.
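The conceptual gap can be made concrete with a toy calculation contrasting a regulatory-style one-year expected loss (through-the-cycle PD, downturn LGD) with an IFRS 9-style lifetime ECL (scenario-weighted, point-in-time PDs, discounted over the remaining maturity), in the spirit of the comparison in figure 2; all parameters below are illustrative assumptions.

```python
# Toy contrast between a regulatory-style one-year expected loss (through-the-
# cycle PD, downturn LGD) and an IFRS 9-style lifetime ECL (point-in-time PDs,
# probability-weighted scenarios, discounted). All inputs are assumptions.
import math

ead, maturity, r = 100.0, 5, 0.03
pd_ttc, lgd_downturn = 0.020, 0.55

# Regulatory view: one-year horizon, through-the-cycle parameters.
el_regulatory = pd_ttc * lgd_downturn * ead

# IFRS 9 view: scenario-weighted lifetime ECL with point-in-time annual PDs.
scenarios = {                 # weight, annual PIT PD, LGD (assumed forward-looking)
    "upside":   (0.25, 0.010, 0.40),
    "baseline": (0.50, 0.020, 0.45),
    "downside": (0.25, 0.045, 0.55),
}

def lifetime_ecl(weight, pd_pit, lgd):
    survival, ecl = 1.0, 0.0
    for t in range(1, maturity + 1):
        marginal_pd = survival * pd_pit            # probability of defaulting in year t
        ecl += marginal_pd * lgd * ead * math.exp(-r * t)
        survival *= 1.0 - pd_pit
    return weight * ecl

el_ifrs9 = sum(lifetime_ecl(*v) for v in scenarios.values())
print(f"regulatory one-year EL: {el_regulatory:.2f}")
print(f"IFRS 9 lifetime ECL   : {el_ifrs9:.2f}")
```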


Facing the tsunami of changes that are to come, competent authorities are trying to stay in control. The Guidance on credit risk and accounting for ECLs was published by the Basel Committee in December 2015. Regarding loan portfolios, it adds to and reinforces the IFRS 9 requirements. The guidance was developed on the principle of non-objection by the International Accounting Standards Board (IASB) and, consequently, it remains quite soft. In particular, even though it mentions the discrepancies between the approaches ('...regulatory capital models may not be directly usable in the measurement of accounting ECL due to differences between the objectives of and inputs used for each of these purposes' [Guidance #9]), it gives no indication of how to overcome them.

Moreover, the same vocabulary is used across the accounting and regulatory frameworks, and is certain to be a source of confusion. Commonly used parameters of credit risk assessment, such as exposure at default (EAD), PD, LGD, correlations and expected loss, are used indiscriminately, despite their different concepts and realities (figure 2).

2 Regulatory view versus IFRS 9 view
• EAD: expected value one year ahead (typically nominal plus one year of interest) versus EAD as net present value (NPV) for two of the three asset classes
• PDs: through-the-cycle versus point-in-time
• One-year PD versus multi-year PDs
• Downturn LGD versus LGD distribution
• Best worst case versus weighted average scenarios
• One-year time horizon versus one-year and lifetime time horizons

It is therefore essential to have explicit definitions, a well-defined data model and attributes that remain attached to the data wherever it is sent or used. Two features of AxiomSL's state-of-the-art, flexible data model are particularly important for fulfilling the above requirement: on the one hand, every data source is clearly identified, named and mapped in the data model; on the other, even when delivered within complex reporting, the data remains in its source environment and is not duplicated. This data lineage avoids the risk of losing track of the data's nature and identity. It also facilitates data mining and auditing, and guarantees the reliability of the reporting, all the while allowing for quick checks and controls at any stage of the reporting process.

3 AxiomSL process flow for IFRS 9
[Figure: financial assets (retail, corporate, customers, financials, bank exposures, guarantees and collaterals) flow through measurement classification (business model and SPPI criteria into amortised cost, fair value through OCI or fair value through P&L), then through collective impairment (portfolio segmentation and risk drivers, staging on a significant change in credit risk, through-the-cycle and point-in-time PD with multi-period transition matrices and statistical or expert-based adjustment) and individual impairment (Monte Carlo simulation of the balance sheet via AlgoSave, producing correlated default scenarios and point-in-time PD and LGD), together with market and macroeconomic data, the effective interest rate and an amortisation and cashflow generator, to the expected credit loss (one-year ECL for stage 1, lifetime ECL for stages 2 and 3) and on to disclosures and reporting]

Conclusion: IFRS 9 calls for a game-changer: AxiomSL
Fintech is a game-changer. Data has become a critical asset for all financial institutions, and fintech companies enable firms to manage it more efficiently than ever.

IFRS 9 is premised on widespread adoption of fintech; its requirements would have been unthinkable just a few years ago, before technology played such a significant role at financial firms. As demonstrated above, IFRS 9 will require:
• an ever-increasing quantity of granular data;
• more frequent data updates to support the PIT approach;
• the challenge of sharing the same data between users with different objectives and concerns; and
• the automation of complex processes, such as ECL calculations for impairments, without compromising data lineage and auditability.

AxiomSL, the leading global provider of regulatory calculation and reporting solutions, has anticipated these challenges and offers a robust technology platform that is fully equipped to implement IFRS 9. The fully integrated platform is designed to enable financial firms to make the best decisions in terms of organisation and modelling, while reducing the time and effort needed to access and manage the relevant data.

External ECL models can be easily integrated into the AxiomSL platform. AxiomSL has formed a partnership with AlgoSave and has integrated its model, which assesses ECL using historical financial data, current market data and economic forecasts (figure 3). The model has been field-tested in the asset management industry and is now being made available to other types of financial institutions. In this way, AxiomSL offers a compelling solution to the challenges presented by IFRS 9.

The author
Jean-Bernard Caen is a subject matter expert at AxiomSL specialising in risk and finance. He is responsible for supporting AxiomSL's client base in analytical and reporting issues on IFRS 9 and providing direction and insights into the firm's professional services and product development teams.
Prior to joining AxiomSL, Jean-Bernard was head of economic capital and strategy for Dexia Group for 12 years, where he was in charge of Basel II and Pillar 2 implementation, and risk/finance co-operation.
In 1990, Jean-Bernard founded Finance & Technology Management, a management consulting firm he ran for 12 years as chief executive officer. As such, he directed numerous assignments for European financial institutions in the areas of shareholder value, risk management, capital allocation and asset-liability management.
Jean-Bernard is a member of the Professional Risk Managers' International Association France executive committee and of the Association Française de la Gestion Financière management board, and teaches at the French National School of Economics and Statistics. He is a French civil engineer and graduated from Massachusetts Institute of Technology.



Sponsor profiles

Oracle Financial Services Analytical Applications (OFSAA) provides financial institutions solutions for risk, treasury, finance, financial crime, compliance and the front office. The applications are built on the integrated and unified Oracle Financial Services Data Foundation. This draws on a comprehensive understanding of the industry over many years and brings together various interlinked disciplines in a single source of truth: cleansed, standardised and reconciled, and ensuring completeness, accuracy and financial integrity of data. Shared data, metadata, computations and business rules enable institutions to meet emerging business and regulatory requirements with reduced expenses.

A common analytical infrastructure underpins the OFSAA suite with metadata-driven R modelling capabilities and the industry-leading Oracle Business Intelligence platform for management reporting. The OFSAA infrastructure, built on open technologies, provides a set of tools to build and maintain applications using a common business language across the analytical applications suite and helps financial institutions leverage existing investments.

OFSAA offers a single, unified IFRS solution that is compliant with IASB standards while using a modularised design approach. This allows for easier integration and client-specific extension and allows banks to perform granular calculations on a common set of data. The solution is integrated with other Oracle applications related to regulatory and accounting purposes, such as Basel reporting, regulatory capital reporting, Oracle Financial Accounting Hub and Oracle General Ledger. Oracle's approach provides the following benefits:
• Maximisation of existing investments in risk and finance systems, allowing reutilisation of existing data, business rules and technology infrastructure, reducing costs and time-to-market.
• Consolidation of technology and data repositories for risk and finance.
• The availability of a comprehensive, end-to-end solution to address IFRS 9, from data management and computations to accounting and reporting.

OFSAA's IFRS 9 solution allows banks to actively incorporate risks into their decision-making and to deliver actionable customer, business-line and profitability insights. In addition, it helps the bank promote a transparent risk management culture and pervasive intelligence across its departments.
www.oracle.com

Moody's Analytics helps capital markets and risk management professionals worldwide respond to an evolving marketplace with confidence. Its expertise and experience in credit analysis, economic research and financial risk management enable the company to offer unique tools and best practices for measuring and managing risk. By providing leading-edge software, advisory services and research, including proprietary analyses from Moody's Investors Service, Moody's Analytics integrates and customises its offerings to address specific business challenges.

Moody's Analytics' suite of credit risk models and data, economic forecasts, advisory services and infrastructure solutions can assist with the granular and dynamic methods required for implementing expected credit loss (ECL) and impairment analysis under the new IFRS 9 accounting standard.

Moody's Analytics offers a wide range of solutions for implementing IFRS 9 requirements. Its advisory services support clients with IFRS 9 roadmap design, model deployment, simulation of IFRS 9 provisions and implementation. Credit modelling solutions allow firms the opportunity to implement internally developed or off-the-shelf models and incorporate forward-looking information into existing frameworks. Comprehensive and granular credit risk, economic and financial data sets help capture and collect historical data required for building a forward-looking impairment model, developing quantitative credit risk models and benchmarking. Credit impairment analysis software facilitates the end-to-end process of ECL calculations by centralising data from numerous sources, co-ordinating and managing a wide variety of models, evaluating changes in credit risk and calculating expected losses and provisions accordingly for export to external accounting systems.

Moody's Analytics has more than 20 years of experience forecasting challenges and helping financial institutions successfully address their credit loss estimation. This experience, combined with deep domain expertise, in-house economists, extensive data sets and modelling capabilities, and award-winning regulatory and enterprise risk management software, is the foundation for successful credit loss and impairment analysis.

Moody's Analytics has been recognised for its award-winning tools for measuring and managing risk. Notably, Moody's Analytics was voted #1 in enterprise-wide credit risk management in the Risk Technology Rankings 2015, is a Preferred Vendor in the IDC Financial Insights FinTech Technology 100 Rankings, and is recognised as a category leader in the FinTech Quadrant for IFRS 9 technology solutions 2016.
www.moodysanalytics.com

Sponsor profile

AxiomSL is the leading global provider of regulatory reporting and risk management solutions for financial services firms, including banks, broker dealers, asset managers and insurance companies. Its unique enterprise data management platform delivers data lineage, risk aggregation, analytics, workflow automation, validation and audit functionality.

The AxiomSL platform seamlessly integrates clients' source data from disparate systems and geographic locations without forcing data conversion. It enriches and validates the data and runs it through risk and regulatory calculations to produce both internal and external reports. The platform supports disclosures in multiple formats, including XBRL. The unparalleled transparency offered by the high-performance platform enables users to drill down on their data to any level of granularity.

AxiomSL's platform supports compliance with a wide range of global and local regulations, including Basel III capital and liquidity requirements, the Dodd-Frank Act, Fatca, AEI (CRS), Emir, Corep/Finrep, CCAR, FDSF, BCBS 239, Solvency II, AIFMD, IFRS, central bank disclosures and both market and credit risk management requirements. The enterprise-wide approach offered by AxiomSL enables clients to leverage their existing data and risk management infrastructure, and reduces implementation costs, time-to-market and complexity.

AxiomSL was voted best reporting system provider in the 2015 Waters Rankings and was highlighted as a category leader by Chartis Research in its 2015 sell-side risk management technology report. The company's work has also been recognised through a number of other accolades, including success in the best reporting initiative category of the American Financial Technology Awards and the customer satisfaction category of the Chartis RiskTech100 rankings.

www.axiomsl.com

About Chartis

Chartis is the leading provider of research and analysis for risk technology on the global market. Part of Incisive Risk Information, which owns market-leading brands such as Risk and WatersTechnology, Chartis' goal is to support enterprises as they drive business performance through better risk management, corporate governance and compliance, and to help clients make informed technology and business decisions by providing in-depth analysis and actionable advice on virtually all aspects of risk technology.

Chartis is focused solely on risk and compliance technology, providing a significant advantage over generic market analysts.

Chartis has brought together a leading team of analysts and advisers from the risk management and financial services industries. This team has hands-on experience of implementing and developing risk management systems and programmes for Fortune 500 companies and leading consulting houses.

Our areas of expertise include:
• Credit risk
• Operational risk and governance, and risk and compliance
• Market risk
• Asset-liability management and liquidity risk
• Energy and commodity trading risk
• Financial crime, including trader surveillance, anti-fraud and anti-money laundering
• Cyber-risk management
• Insurance risk
• Regulatory requirements, including Basel II/III, Dodd-Frank, Emir and Solvency II

Visit www.chartis-research.com for more information, or join our global online community at www.risktech-forum.com
