
MEETING THE REQUIREMENTS OF BCBS 239 – AGGREGATING AND REPORTING RISK DATA

Suranjan Som, Joint Practice Head, Business Intelligence at IMGROUP

The global financial crisis highlighted that most large banks lacked appropriate facilities to aggregate, assess and analyse risks in a timely and flexible fashion. A new set of regulations, from the Basel Committee on Banking Supervision, known as BCBS 239, aims to strengthen banks’ risk data aggregation capabilities and internal risk reporting practices.
While the seemingly never-ending stream of new regulations and frameworks may seem overwhelming, banks should recognise that the aims of BCBS 239 are consistent with what they should already be doing to strengthen their risk management data and processes. The insights that effective data analysis can deliver for risk management purposes will enable banks to make better-informed strategic and investment decisions and improve communication across departments.

The regulation

The BCBS 239 framework can be broadly categorised into four main pillars. Pillar 1 covers “Governance and Infrastructure”. Governance entails putting in place the appropriate organisational and process structures to ensure that risk aggregation receives the same strategic importance as any other business-critical process in the bank. This ranges from day-to-day management structures to senior management and C-level ownership of risk data. This pillar also requires organisations to put in place the right technology and process infrastructure, not only for current risk aggregation requirements, but also to offer an extensible framework that allows easy incorporation of newer forms of risk and can absorb sudden spikes in computational demand in stress or crisis scenarios.

Pillar 2 covers the core “Risk aggregation” capability of the bank. Among other things, the bank will need to ensure that it has in place the right resources to provide:

  • Accuracy and reliability via data quality processes
  • Adherence to an “enterprise data dictionary”
  • Well documented unambiguous processes – either automatic or manual
  • Completeness in terms of data usage and coverage
  • Consistent latency for aggregating risk within agreed SLAs
  • Flexibility and adaptability to provide new aggregations easily
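The Pillar 2 requirements above can be sketched in code. This is a minimal, hypothetical illustration (not anything prescribed by BCBS 239): a tiny “enterprise data dictionary” of required fields, a completeness/accuracy check, and an aggregation function flexible enough to roll exposures up by any dimension. All field names and figures are invented for the example.

```python
# Hypothetical sketch of Pillar 2 capabilities: a data dictionary,
# a data-quality gate, and flexible aggregation. Illustrative only.
DATA_DICTIONARY = {"trade_id", "desk", "asset_class", "exposure_usd"}

def quality_check(record: dict) -> bool:
    """Completeness and reliability: every dictionary field present and populated."""
    return DATA_DICTIONARY <= record.keys() and all(
        record[f] is not None for f in DATA_DICTIONARY
    )

def aggregate(records: list[dict], by: str) -> dict:
    """Sum exposures by any requested dimension (flexibility/adaptability)."""
    totals: dict = {}
    for r in records:
        if not quality_check(r):
            continue  # in practice, route failures to a remediation queue
        totals[r[by]] = totals.get(r[by], 0.0) + r["exposure_usd"]
    return totals

trades = [
    {"trade_id": "T1", "desk": "Rates", "asset_class": "IRS", "exposure_usd": 1_200_000.0},
    {"trade_id": "T2", "desk": "Rates", "asset_class": "Swaption", "exposure_usd": 800_000.0},
    {"trade_id": "T3", "desk": "Credit", "asset_class": "CDS", "exposure_usd": None},  # fails quality check
]
print(aggregate(trades, by="desk"))  # {'Rates': 2000000.0}
```

The same `aggregate` call works unchanged for `by="asset_class"`, which is the kind of flexibility the final bullet asks for.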

Pillar 3 aims to strengthen the bank’s “Risk reporting” capabilities. Supervisors will need to be confident that the bank has in place a suitable risk reporting infrastructure that is:

  • Accurate with appropriate data quality processes
  • Comprehensive – covering all agreed risks across the agreed organisational dimensions, i.e. asset classes, organisational structures, locations, counterparties, etc.
  • Clear, intuitive and useful for the end users to easily comprehend
  • Available and refreshed at agreed frequencies
  • Distributed to the users using appropriate content distribution processes
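The comprehensiveness requirement above lends itself to an automated pre-distribution check. The sketch below is a hypothetical illustration, not part of the regulation: it compares a report’s contents against the agreed coverage per dimension and surfaces any gaps before the report is released. Dimension and value names are invented.

```python
# Hypothetical Pillar 3 comprehensiveness check: verify a risk report
# covers every agreed value of every agreed reporting dimension.
AGREED_COVERAGE = {
    "asset_class": {"IRS", "CDS", "FX Forward"},
    "location": {"London", "New York"},
}

def coverage_gaps(report_rows: list[dict]) -> dict:
    """Return the agreed values missing from the report, per dimension."""
    gaps = {}
    for dim, required in AGREED_COVERAGE.items():
        present = {row[dim] for row in report_rows if dim in row}
        missing = required - present
        if missing:
            gaps[dim] = missing
    return gaps

rows = [
    {"asset_class": "IRS", "location": "London"},
    {"asset_class": "CDS", "location": "New York"},
]
print(coverage_gaps(rows))  # {'asset_class': {'FX Forward'}}
```

An empty result would gate the report through to distribution; a non-empty one flags exactly which entities are missing.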

Pillar 4 ensures that the bank has the capability to “Review, Collaborate and Act” on the aggregated risk data as quickly and seamlessly as possible. In other words, the supervisors should have appropriate means to review the aggregated risk output, make any remedial changes as part of the workflow and interact collaboratively with peers.

The approach


One of the key individuals looking after BCBS 239 on the business side at a large UK bank put it this way: “Everything in this regulation should be common sense. But that worries me, as common sense is not that common these days.” While this was a tongue-in-cheek comment, it does reflect the general approach taken towards risk management in most large banks: siloed, departmental, and full of gaps when it comes to integrating information for regulatory purposes across the bank. Quantifying and generating granular risk numbers is a well-solved problem now; aggregating them across all key entities, however, is a herculean data management exercise that requires well-defined, clean data (and master/metadata) on everything that enriches those risk numbers, from organisational hierarchies to product structures to risk attributes.
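The enrichment problem described above can be made concrete with a small sketch. It is purely illustrative, with invented names and figures: granular risk numbers only roll up cleanly if every record joins to well-maintained master data, and a single missing hierarchy entry breaks the aggregation.

```python
# Hypothetical sketch: rolling desk-level risk up an organisational
# hierarchy held as master data. Names and numbers are illustrative.
ORG_HIERARCHY = {  # master data: desk -> (division, legal entity)
    "Rates": ("Macro", "Bank PLC"),
    "Credit": ("Credit Trading", "Bank PLC"),
}

def roll_up(risk_by_desk: dict) -> dict:
    """Aggregate desk-level risk to legal-entity level via master data."""
    entity_totals: dict = {}
    for desk, value in risk_by_desk.items():
        if desk not in ORG_HIERARCHY:
            # A gap in master data halts the roll-up rather than
            # silently dropping risk from the aggregate.
            raise KeyError(f"desk {desk!r} missing from master data")
        _, entity = ORG_HIERARCHY[desk]
        entity_totals[entity] = entity_totals.get(entity, 0.0) + value
    return entity_totals

print(roll_up({"Rates": 2.0e6, "Credit": 0.8e6}))  # {'Bank PLC': 2800000.0}
```

Failing loudly on a missing desk mirrors the point in the text: the hard part is not the arithmetic but keeping the enriching master data complete and clean.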

While most G-SIBs (Global Systemically Important Banks) have been addressing some of the key requirements as a matter of normal course, not every organisation is at the same state of readiness to achieve full compliance in the next 24-36 months. Banks must still look at how to accelerate the assessment of their current readiness against the Principles of BCBS 239, and establish a practical roadmap that helps them meet the regulation and implement a seamless risk data management operating model. For this to happen, banks need the right people in the right roles, the right processes and governance to ensure a quality-controlled data management operating model, and the appropriate technology to support the scale, speed and agility of flexible risk data aggregation and delivery.

The opportunity

According to a former MD at a large Swiss investment bank, “Banks should see this as a major opportunity to overhaul their existing systems and processes rather than treat it as a ‘box-ticking’ exercise”. While it is true that what BCBS 239 prescribes overlaps with a number of initiatives supporting other regulatory frameworks (e.g. Basel III, MiFID II, COREP, CRD IV), banks will miss a significant opportunity if they do not use it to fundamentally change the way risk is generated, stored, aggregated, distributed and monitored. The insights that effective data analysis can deliver for risk management purposes will enable banks to make better-informed strategic and investment decisions, making them not only safer but also more profitable in the long run.
