IMPLEMENTING BASEL II IN THE
COMPLIANCE CONTINUUM

 

By
Arun Pingaley and Kiran Narsu

Faced with impending Basel II deadlines in key jurisdictions and engaged in frenetic implementation of compliance initiatives, several banks are now beginning to realize that their perceptions of the Accord have, after all, been off the mark.

In research carried out by the writers’ organization, several stages of a Basel II compliance programme were identified. Banks can benefit enormously from understanding this “continuum” of stages in their journey to Basel II compliance.

Internationally active banks seeking Basel II compliance typically proceed through three distinct stages along the continuum. While banks may have implemented varying types of vendor solutions or built technological frameworks during each of the stages outlined below, the technology platforms erected during one stage do not appear to be a barrier to progress along subsequent stages of the continuum.


Stage I – Data

In the first stage of compliance, banks often wrestle with data challenges:

   Data availability challenges
   Data movement and consolidation challenges
   Data quality challenges

Data Availability Challenges

Banks' data requirements vary with the Basel approach they undertake. For instance, Bank A, adopting an Internal Ratings-based Advanced Approach for credit risk, will have to wrestle with significantly more data than Bank B, which plans to adopt the Standardized Approach. Common to both the Standardized Approach and the Internal Ratings-based Approach is the validation of internal ratings data, for which banks would depend on default data supplied by an external data provider such as Moody's KMV. However, only the Internal Ratings-based Approach requires data about recoveries – data that is not required under the Standardized Approach.

In all cases, however, data availability begins with an understanding of the data required for the chosen approach, as well as the ability to source incremental data when mandated by the home or host country (say, Advanced or Foundation Internal Ratings-based in addition to Standardized). Data element identification ensures that the bank knows the data that is required for its preferred approach.
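
To make this concrete, the sketch below shows one simple way a bank might catalogue the data elements each approach demands and surface the incremental data a move to a new approach would require. The element names and groupings are illustrative assumptions only, not a list drawn from the Accord.

```python
# Illustrative sketch: mapping Basel II approaches to the data elements
# they require. The element names are examples only, not an exhaustive
# or authoritative list from the Accord.

REQUIRED_DATA_ELEMENTS = {
    "standardized": {"exposure_amount", "external_rating", "collateral_value"},
    "foundation_irb": {"exposure_amount", "internal_rating", "default_flag",
                       "collateral_value"},
    "advanced_irb": {"exposure_amount", "internal_rating", "default_flag",
                     "collateral_value", "recovery_amount",
                     "exposure_at_default"},
}

def missing_elements(approach: str, available: set[str]) -> set[str]:
    """Return the data elements the chosen approach needs but the bank lacks."""
    return REQUIRED_DATA_ELEMENTS[approach] - available

# Example: a bank moving from Standardized to Advanced IRB discovers the
# incremental data (e.g. recoveries) it must start collecting.
print(missing_elements("advanced_irb", REQUIRED_DATA_ELEMENTS["standardized"]))
```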


Data Movement and Consolidation Challenges

Once data is identified and available, banks need to ensure that the proper "pipes" are laid to move data out of the appropriate transaction systems into a repository or central "risk warehouse". Many banks have several such warehouse investments, but the research referred to earlier found that these often do not contain nearly enough of the risk data elements required, and even when they do, the data elements are often not at the level of granularity required for the downstream computation of risk-weighted assets and capital. Consequently, the pain of enhancing such repositories to include the appropriate data is a challenge that must be properly addressed.
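
As a minimal illustration of the gatekeeping this implies when loading a central risk repository, the sketch below moves exposure-level records out of a source system and quarantines any record missing a required risk data element. All field names are hypothetical; a real pipeline would of course be far richer.

```python
# Illustrative ETL sketch: loading exposure records into a central risk
# repository and quarantining records that lack required risk data
# elements, so they can be fixed at source. Field names are hypothetical.

REQUIRED_FIELDS = ("exposure_id", "obligor_id", "exposure_amount",
                   "product_type", "collateral_value")

def load_into_risk_warehouse(records, warehouse):
    """Append well-formed, exposure-level records; return the rest."""
    rejected = []
    for rec in records:
        if all(rec.get(f) is not None for f in REQUIRED_FIELDS):
            warehouse.append(rec)
        else:
            rejected.append(rec)  # fix at source, not in the warehouse
    return rejected

warehouse: list[dict] = []
bad = load_into_risk_warehouse(
    [{"exposure_id": "E1", "obligor_id": "O1", "exposure_amount": 1_000_000,
      "product_type": "term_loan", "collateral_value": 400_000},
     {"exposure_id": "E2", "obligor_id": "O2", "exposure_amount": 250_000}],
    warehouse)
print(len(warehouse), len(bad))  # 1 loaded, 1 quarantined
```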


Data Quality Challenges

While availability and completeness of data are critical, banks must also ensure that the data used for capital computation is accurate. Banks, therefore, must seek to reconcile accounting system data with general ledger data, and pass the necessary adjustment entries to ensure that the two are synchronised. In general, banks should not underestimate the importance of strong data quality processes during the build and delivery of the central risk repository.
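
The sketch below illustrates the reconciliation idea: comparing warehouse totals against general ledger balances account by account and flagging the adjustment entries needed to synchronise them. The account names and tolerance are assumptions for illustration.

```python
# Illustrative reconciliation sketch: compare risk-warehouse exposure
# totals against general-ledger balances per ledger account and emit
# the adjustments needed to tie the two. Account codes are hypothetical.

def reconcile(warehouse_totals: dict[str, float],
              gl_balances: dict[str, float],
              tolerance: float = 0.01) -> list[tuple[str, float]]:
    """Return (account, adjustment) pairs where the sources disagree."""
    adjustments = []
    for account, gl_amount in gl_balances.items():
        diff = gl_amount - warehouse_totals.get(account, 0.0)
        if abs(diff) > tolerance:
            adjustments.append((account, diff))
    return adjustments

print(reconcile({"loans_corporate": 9_950_000.0},
                {"loans_corporate": 10_000_000.0}))
# [('loans_corporate', 50000.0)] -> a break to investigate and adjust
```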

Banks will potentially be serving multiple regulatory masters during the Basel II supervisory review process, and bad data entering a capital computation system, no matter how powerful the system, will result in extra work for the finance organization in correcting the numbers.

Typically, the primary owners of these challenges during the first stage of the Basel II compliance continuum are the bank's technology staff, who have in-depth knowledge of its data elements.

Stage II – Modelling

As banks grow more confident in the quality, sufficiency and availability of their data, and move along the continuum, they face a new set of challenges. Inherent to the Basel II compliance process is an accurate means of estimating the Probability of Default (PD) of an obligor or the Loss Given Default (LGD) of an exposure. Here, numerous landmines await most banks. Given that models for PD must be based on historical default data, the lack of such data severely compromises a bank's ability to estimate and validate its PD models. Many banks attempt to solve this problem by purchasing published default data from various vendors, but they must then ensure that their portfolios correlate with the external vendor's data in order to determine just how applicable the external default data is to their own customer base.
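
As an illustration of the simplest estimator consistent with this requirement, the sketch below computes long-run average default frequencies per internal rating grade from obligor-year observations. Real PD models involve smoothing, cycle adjustment and formal validation; the grades and data here are invented.

```python
# Illustrative PD estimation: the long-run average one-year default
# rate per internal rating grade, computed from historical obligor-year
# observations. A real model adds smoothing, cycle adjustment and
# back-testing; the history below is made up.

from collections import defaultdict

def pd_by_grade(observations):
    """observations: iterable of (grade, defaulted) pairs, one per
    obligor-year. Returns grade -> observed default frequency."""
    defaults = defaultdict(int)
    counts = defaultdict(int)
    for grade, defaulted in observations:
        counts[grade] += 1
        defaults[grade] += int(defaulted)
    return {g: defaults[g] / counts[g] for g in counts}

history = [("A", False)] * 995 + [("A", True)] * 5 \
        + [("B", False)] * 950 + [("B", True)] * 50
print(pd_by_grade(history))  # {'A': 0.005, 'B': 0.05}
```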

However, for some businesses, such as Private Banking, such data is difficult to collect as there is seldom sufficient internal default history or any external data provider. This restricts a bank’s ability to use validated IRB Models for estimating PDs or LGDs.

The above problems also lead to the possibility of supervisors being unwilling to accept the PD/LGD estimates provided by the bank since they would be unable to back-test and validate them.

In most cases, the risk group within the bank is the primary owner of the development and implementation of the statistical models used in the estimation of PD, LGD and other parameters.


Stage III – Capital Computation

The Basel II compliance initiative at a bank is complete when it is able to compute regulatory capital and provide the appropriate capital adequacy returns (reports) to its regulator(s). Banks which have progressed through the first two stages of the continuum begin exploring in detail, at this stage, the capital computation process, which they perceive to be both well defined (which it is) and easy to implement using simple tools such as spreadsheets (which it is not).

In fact, one may consider the capital computation process to be the most challenging stage of the compliance continuum, due in large part to the gulf between early expectations and the late-stage realization of the nitty-gritty details, compounded by the ever-shrinking number of days a bank has left to comply. These capital computations span credit, market and operational risk (though there is precious little that is new on market risk in the Basel II Accord).


Credit Risk Computational Challenges

While many banks believe that the only decision they will need to take is to pick one of three approaches for calculating Unexpected Loss – Advanced Internal Ratings-based (AIRB), Foundation IRB (FIRB), or Standardized – it needs to be kept in mind that the Accord prescribes an abundance of options within these three basic approaches. For instance, there are two options for assigning risk weights and two collateral handling options to choose from within the Standardized Approach itself. Further, within the IRB approaches, there are two different options for treating specialized lending and equity exposures. When one considers all the permutations, and couples this with the requirement for multi-jurisdictional reporting (e.g., reporting as per CAD III in Europe and OCC requirements in the USA) and the necessity to support different approaches for each jurisdiction (including all the options within each approach), the computation process becomes immensely challenging.

Therefore, any computational engine, whether bought or built, must be able to support all of these approaches simultaneously, and to switch between them when necessary.
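
A minimal sketch of such a switchable engine follows. Only the Standardized calculation is filled in (risk-weighted assets as exposure times risk weight, with minimum capital at 8% of RWA, per the Accord); the IRB class is a placeholder, the corporate risk-weight table is abbreviated, and the jurisdiction mapping is a hypothetical illustration, not a statement of any regulator's requirements.

```python
# Minimal sketch of a capital computation engine that can switch between
# approaches per jurisdiction and portfolio. Only Standardized is
# implemented; the risk-weight table is abbreviated for illustration.

from abc import ABC, abstractmethod

class CapitalApproach(ABC):
    @abstractmethod
    def rwa(self, exposure: dict) -> float: ...

class Standardized(CapitalApproach):
    # Abbreviated corporate risk-weight table keyed by external rating
    # (Basel II Standardized Approach: 20%/50%/100%/150%, unrated 100%).
    RISK_WEIGHTS = {"AAA": 0.20, "A": 0.50, "BBB": 1.00, "B": 1.50,
                    "unrated": 1.00}

    def rwa(self, exposure):
        return exposure["amount"] * self.RISK_WEIGHTS[exposure["rating"]]

class FoundationIRB(CapitalApproach):
    def rwa(self, exposure):
        raise NotImplementedError("IRB risk-weight function goes here")

# Hypothetical (jurisdiction, portfolio) -> approach mapping; swapping
# an entry switches the approach without touching the calculations.
ENGINES = {("EU", "corporate"): Standardized(),
           ("US", "corporate"): Standardized()}

exp = {"amount": 1_000_000, "rating": "A"}
rwa = ENGINES[("EU", "corporate")].rwa(exp)
print(rwa, 0.08 * rwa)  # 500000.0 RWA -> 40000.0 minimum capital
```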

Moreover, many banks caught up in the mechanics of the Pillar I calculations overlook the need to address Pillar II and Pillar III adequately, to the satisfaction of their supervisors. With Pillar III, the requirement to support multiple reporting formats brings its own challenges. For example, supervisors may at any given point, for market discipline purposes, call on a bank to share new information about the risk being carried on its books, which may in turn imply a new set of reports at very short notice. This would require the creation of a new computational set that was previously not available in the engine. Banks must ensure that the appropriate level of Pillar III capability is provided in the solution they implement.

Similarly, the challenge of supporting supervisory oversight of the Pillar I calculations is significant, as the computational processes need to be transparent, accessible and detailed enough to explain all aspects of the calculations.

The finance department at a bank is typically tasked with performing the capital computations and the necessary reporting to supervisors.


Conclusion

Banks today are discovering the above continuum the hard way, encountering the problems as they stumble along the path to compliance. However, if they can benchmark the stage they have reached along the continuum and brace themselves for the challenges that lie ahead, it will not only serve them well in their endeavors towards Basel II compliance but also help them enjoy the benefits of maintaining reduced capital.

 

 

Arun Pingaley is Head - Functional Solutions Expert Group, Reveleus, i-flex Group of Companies, India

Kiran Narsu is a Vice President, Business Development, Reveleus, i-flex Group of Companies, India

     
 
 