UNDERSTANDING THE PRINCIPLES FOR EFFECTIVE RISK DATA AGGREGATION AND RISK REPORTING (PERDARR)

The Principles for Effective Risk Data Aggregation and Risk Reporting (PERDARR), proposed by the Basel Committee, are due to come into force in 2016, yet many large banks are struggling even to survey their existing data and data architecture. The principles push banks towards unified systems and real-time reporting, but owing to mergers and acquisitions, diverse offerings and an explosion in data volumes, the systems and processes that underpin these goals are complex, out of date and under strain.

Peter Walker

Information Builders, a business intelligence and analytics specialist, and Capital Markets Consulting (CMC), experts in regulatory risk and finance change delivery, have partnered to provide banks with an integration, governance and management platform for risk data aggregation and risk reporting.

Steve Wyer, Managing Director at CMC, and Peter Walker, Country Manager for Information Builders UK, discuss with us the data challenges and opportunities that banks face in aligning themselves with the Basel Committee’s principles.

What do the PERDARR principles mean for the banking sector?

Steve Wyer: “On one level it’s a very good thing, post-crisis, to take a closer look at all forms of risk reporting, particularly where risk concentrations or contagion caused problems in the past. To get a consolidated view it’s important to look at the underlying risk data and ensure that it is complete, accurate and consistent. It is also critical that all underlying data can be accessed in a timely manner. This data can then act as the foundation for the risk control mechanisms suitable and necessary to manage large and complex organisations such as Systemically Important Banks (SIBs).

PERDARR provides a great chance for banks and regulators to align, create data reporting standards and secure the required focus on data, particularly at a senior level within these organisations.

That said, it will be a huge challenge for banks to reach this position. Historically banks have siloed data for many reasons, whether intentionally for the purposes of arbitrage or as a symptom of limited integration following events such as mergers and acquisitions. General industry consolidation over the past 30 years has also contributed to this practice.”

Peter Walker: “Banks are often very keen to build long-term strategic data warehousing projects. The issue is that these projects are slow to deliver and even slower to achieve a return on investment.

Steve Wyer CMC

The PERDARR principles reinforce competitive imperatives: banks have realised they cannot afford to wait for lengthy data architecture projects to be implemented and delivered. To remain competitive, modern banks need a far greater level of agility in their operational systems. To achieve this they need access to all operational sources – not just the databases of modern systems, but legacy and transactional data as well. In this way banks can put the right data in the right hands to drive performance and also comply with regulation.”

How quickly do banks need to move to bring themselves in line with the principles?

Steve Wyer: The principles are already in effect, so most banks, if not all, have already begun the process of self-assessment. The consultation paper was issued in the middle of last year, so banks that haven’t begun to implement change should start immediately. For those that have already initiated change programmes, we would suggest they engage trusted data specialists with the knowledge, skill, innovative techniques and tooling to accelerate delivery and guarantee results. In our experience this new breed of data professional is not generally found resident within these institutions.

For large, complex organisations it can take as long as five years to refocus the organisation on data centricity. It is not purely a technical exercise; it is much more a cultural process. Banks can use tools to assist, but creating the required level of understanding means focusing on all employees and how they regard and work with data.

Systemic risks caused by opaque markets and poor data can lead to major issues and market movements, such as those we saw during the crisis. As such, there will be significant penalties if regulators conclude that institutions are not doing all they can to support systemic change and limit systemic risk.

There have already been significant fines for banks found to be non-compliant in areas of regulation, risk and regulatory data. If you are not compliant there is a real chance of facing similar penalties.

Peter Walker: “Speed is of the essence here for many banks. The nature of the challenge requires technology that allows quick access to data at source, without the long lead times involved in ripping and replacing data systems. All banks are going to be challenged by the need to move quickly and the need to be seen to have effective data governance and management in place. Some banks will naturally be in a better place than others, but it’s vital that each uses the right tools and strategies to assess their current status, the integrity of their data, the number of sources they are using and what value they can get from new data processes and systems.”

How far away are banks from real-time, integrated data?

Steve Wyer: In the majority of cases, banks are a long way from real-time, integrated data. Some are better than others, but the average level of preparedness is certainly low. From a risk reporting perspective banks have operated on a daily cycle for some time, but other business functions, particularly finance, are frequently not as well positioned. This is a particular issue when considering the boundaries, overlaps and interconnectivity between risk and finance data and reporting.

When delivering integrated data infrastructure, another consideration is the sheer volume of data that must be managed and processed within core systems. Take, for instance, trade transparency reporting under Dodd-Frank, which now requires banks to report all trades within 15 minutes of execution. This has already placed considerable demands on stretched resources, both technical and business, and the pressure will only grow when the reporting window tightens to 60 seconds in 2014.
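A reporting-window control of this kind is straightforward to express in code. The sketch below is purely illustrative – the record layout, field names and 15-minute threshold are assumptions for the Dodd-Frank example above, not a real feed format:

```python
from datetime import datetime, timedelta

# Illustrative threshold: the 15-minute window mentioned above.
REPORTING_WINDOW = timedelta(minutes=15)

def late_reports(trades):
    """Return IDs of trades reported outside the allowed window."""
    return [
        t["trade_id"]
        for t in trades
        if t["reported_at"] - t["executed_at"] > REPORTING_WINDOW
    ]

trades = [
    {"trade_id": "T1",
     "executed_at": datetime(2013, 6, 3, 9, 0),
     "reported_at": datetime(2013, 6, 3, 9, 10)},  # 10 minutes: on time
    {"trade_id": "T2",
     "executed_at": datetime(2013, 6, 3, 9, 0),
     "reported_at": datetime(2013, 6, 3, 9, 20)},  # 20 minutes: late
]

print(late_reports(trades))  # ['T2']
```

Tightening the window to 60 seconds is then a one-line configuration change – the operational challenge is feeding the check with timely source data, not the check itself.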

This additional reporting data will also need to be subsumed into risk and finance support systems alongside existing underlying data, and checked and managed to avoid duplication and inconsistency. This is yet another complexity brought about by increased regulatory change and enhanced reporting requirements. To manage all of this change effectively, institutions need to consider the environment and requirements in a structured manner.

Peter Walker: Banks are already used to dealing with real-time data from a trading perspective. This minute-by-minute approach doesn’t yet translate to the operational systems of the business, and it’s here that banks will be found out.

For most banks it would be more appropriate to stop developing new systems for each and every regulatory report and instead focus on reusable architectural components and more effective reuse of data, to limit further data siloing.

By reusing the same approach, banks will be able to address new requirements for data access and delivery, supporting current and future regulatory and commercial requirements.

How can banks go about bringing themselves up to date?

Peter Walker: By and large, banks will have to live with what they have in terms of data architecture if they hope to deliver in time. Rip-and-replace strategies are not feasible in the available time frame. The most successful approach is to integrate with existing architecture and infrastructure; selecting tools that can coexist makes time to value that much quicker.

For a large banking organisation, a rip-and-replace project is simply not feasible. With an approach like virtualisation we typically see projects of 6–18 months, dependent on complexity; comparable data warehousing and rip-and-replace projects can take two or three times as long.

Steve Wyer: Banks are now realising that, in order to reach the required level, they need to address data and its quality management and governance at board level. Many are looking to appoint a chief data officer to give the issue the focus it needs.

But whilst that addresses the issue at a strategic level, there is tactical work to be done by everyone within the enterprise. Banks need to appoint the right custodians, data managers and data stewards, and outline the right processes, to ensure that edicts made at the highest level are followed through on a daily basis by all employees, whether they are handling, inputting or managing data.
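One way stewards can turn board-level edicts into daily practice is to codify them as executable quality rules that run wherever data is handled. The sketch below is a minimal, hypothetical example – the record fields and rules are invented for illustration, not drawn from any bank’s actual standards:

```python
# Hypothetical data quality rules a steward might codify.
# Each rule maps a name to a predicate over a record (a dict here).
rules = {
    "counterparty_present": lambda r: bool(r.get("counterparty")),
    "notional_positive":    lambda r: r.get("notional", 0) > 0,
    "currency_iso":         lambda r: len(r.get("currency", "")) == 3,
}

def check_record(record, rules):
    """Return the names of any rules the record fails."""
    return [name for name, rule in rules.items() if not rule(record)]

record = {"counterparty": "ACME", "notional": -5, "currency": "GBP"}
print(check_record(record, rules))  # ['notional_positive']
```

The value of the pattern is that the rules live in one governed place: when the board tightens a standard, the rule set changes once and every handling point picks it up.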

Peter Walker: “We’re beginning to see chief data officer and data custodian roles emerging, combining executive board focus with stewards at an operational level. This is vital, particularly if you’re seeking to extend beyond the pure demands of regulatory and compliance-oriented solutions to the opportunities presented by monetising that data or providing it back to the customer.”

Steve Wyer: In my experience there is a lot of intransigence in the sector, the approach is often ‘we’ve always done it this way’, and overcoming this requires wholesale cultural and technical change. Banks generally have a long way to go to get from their current state to their target, and not a great deal of time to do it.

Data architecture is often an area where there isn’t a huge amount of internal expertise in banks. Given the nature of the change, getting an external perspective that is free of politics and inflexibility – and has the advantage of experience with new techniques – can be vital to success.

However, this external view must come with deep knowledge and experience of banking, trading and the FS sector in general. Change is about aligning with the business and its nuances, and you can’t achieve the complex changes that are necessary without that understanding.

What challenges do banks face in unravelling existing data architecture?

Steve Wyer: Frankly, at an enterprise level many banks simply cannot fully complete what is necessary in the time that’s available. It is unrealistic to expect banks to re-engineer entire systems and data architectures in so short a period of time.

What the majority of banks really need to do is embrace the necessary change by utilising the innovative technologies and methods now available, such as virtualisation. This approach allows banks to stage systems and data solutions whilst making the transition from current to target state; it delivers immediate benefits while also making headway with long-standing data challenges. These solutions will, however, need to be delivered within a considered and methodical strategic framework, so that the results accommodate long-term architectural goals whilst ensuring consistent, timely reporting and coherent regulatory compliance.

Peter Walker: “Data virtualisation is the most viable solution for banks. It is one of the only ways that banks can achieve the change needed in the time available. For this approach to be successful it’s vital that the new technology can co-exist with the existing legacy systems.

Through this approach, data can be gathered at source from any operational or transactional system before being pushed into the virtualised environment. Controlled transformations and enrichment take place under strict quality management and governance rules, and reporting can then be undertaken according to individual requirements. This creates an agile method of delivery, operating under strict controls and producing timely, consistent, coherent reports with minimal interruption to critical systems.”
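The essence of the virtualisation pattern described here – data stays in the source systems and is joined into a consolidated view at query time, rather than being copied into a warehouse – can be sketched in a few lines. The two in-memory SQLite databases below are stand-ins for, say, a trading system and a finance system; the schemas and figures are invented for illustration:

```python
import sqlite3

# Stand-in for a trading system holding risk exposures.
trading = sqlite3.connect(":memory:")
trading.execute("CREATE TABLE positions (book TEXT, exposure REAL)")
trading.executemany("INSERT INTO positions VALUES (?, ?)",
                    [("rates", 120.0), ("credit", 80.0)])

# Stand-in for a finance system holding limits per book.
finance = sqlite3.connect(":memory:")
finance.execute("CREATE TABLE limits (book TEXT, limit_amt REAL)")
finance.executemany("INSERT INTO limits VALUES (?, ?)",
                    [("rates", 100.0), ("credit", 150.0)])

def consolidated_view():
    """Join exposures against limits at query time; nothing is copied
    into a central store, so there is no second version to decay."""
    limits = dict(finance.execute("SELECT book, limit_amt FROM limits"))
    return [
        {"book": book, "exposure": exp, "limit": limits.get(book),
         "breach": exp > limits.get(book, float("inf"))}
        for book, exp in trading.execute(
            "SELECT book, exposure FROM positions")
    ]

for row in consolidated_view():
    print(row)
```

A real virtualisation layer adds governed transformations, entitlements and caching on top, but the design choice is the same: each query reflects the source systems as they stand at that moment.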

Steve Wyer: “This way banks can get on top of reporting data issues while leaving data in its current location. This will ensure that bad data is not further copied around the organisation and therefore we are able to limit data decay. Data can be accessed in real time as it is pulled from source systems as required.”

What are the competitive advantages from effective data management and risk reporting?

Steve Wyer: The responsiveness and accuracy of reporting that is achieved through effective data management creates enhanced trust, credibility and reputation with regulators and clients.

Having a timely and complete understanding of the consolidated risk position in times of stress is essential, and without accurate, complete and timely data it is impossible.

High-quality data, managed and governed by a business in the right way, can also create a competitive edge. Real-time or near real-time knowledge of your risk position, and the ability to manage opportunities accordingly, allows you to take advantage of market opportunities in the most timely and cost-effective manner – often ahead of other market participants.

Peter Walker: “Decision making throughout the business can also improve substantially. High data integrity has an immediate knock-on effect, putting the right data in the right hands to make effective business decisions. By applying this approach throughout the organisation, new opportunities can be identified, particularly around customer service, retention and overall performance.”
