Ted Rees, partner at Crossbridge, looks at the challenges still facing banks surrounding Basel III.
Banks are faced with sizable challenges in achieving the Basel III deadlines and objectives. The sheer scale of the change required results in high resource costs of implementation that, together with the impact of the actual capital requirements, will significantly dent banks’ return on equity over the coming years.
Firms will incur huge expense implementing Basel III and the other major US and EU regulations, diverting resources from areas that typically drive revenue growth, such as product development, improved infrastructure and enhanced customer service. As a result, profitability will suffer and the net effect, ironically for some banks, will be a vicious circle that weakens their capital buffers.
At Crossbridge, we see challenges in three key areas – project delivery complexities, data quality, and resource constraints. How banks manage the balance-sheet impact of the new capital charges is also critical to their continuing success.
The Big Game
The scale and complexity of large regulatory projects add considerably to their delivery risk. Unlike similarly sized internal projects, failure or postponement is not an option. Simply managing software release schedules presents headaches when there are already busy release windows for risk and finance systems and parallel changes being made under other regulations. Co-ordinating such a large programme of work requires careful planning and an understanding of the real dependencies, as well as the potential for leveraging and re-using work undertaken across other regulatory programmes – a bit like ensuring the roads only get dug up once when gas, electricity and water repairs are needed.
To give a flavour of the scale of change, one of the Basel III requirements obliges a bank to calculate value at risk (VaR) on individual client/counterparty exposures. In many cases, this will require a wholesale re-engineering of the way that “normal” aggregated market risk VaR is calculated. The testing this demands depends on significantly greater volumes of data, increasing the scope for calculation errors. Similar problems arise with many of the other key regulations, such as mandatory central counterparty clearing under EMIR.
Regulatory projects also present a number of headaches for managers and teams applying the classic structured, ‘waterfall’ approach. Often the project has to begin implementing changes before the regulations have actually been clarified. A good example occurred in Basel 2.5, where the inclusion of sovereign positions (excluded from the risk capital calculation under the old rules) was only clarified by the regulators halfway through the process. This resulted in large changes to both the risk capital models and the data required as inputs to the calculations. Project managers who have run regulatory projects before will be acutely aware of these difficulties and will strive to ensure that architectures and designs are flexible enough to cope with last-minute changes. Model parameters often have to be designed to second-guess regulators’ revised demands as market, economic and political conditions change.
Dealing with Curve-balls
The process of getting all the legislation drafted and approved is by no means straightforward – each jurisdiction may have different local laws and languages, which can cause subtle differences in interpretation, often leading to delays and protracted clarification of the final requirements. This issue is amplified for global institutions that have to comply with several local regulators and their legal rulings on each regulation.
More recently, banks have also had to consider changes in regulators’ use of enforcement. In the years since the Credit Crisis, banking regulators have become increasingly assertive and intrusive in their monitoring activities, requiring greater evidence that banks’ internal processes meet regulatory requirements. In the context of Basel III, regulators may increasingly demand evidence that banks meet the requirements of the Pillar II “Use Test”, leading to significant changes to firms’ operating models and capital adequacy report submissions.
When considering the impact of Basel III on their risk and capital processes and systems, banks also need to consider other external economic influences such as the rating agencies. Banks’ credit ratings have come under significant pressure, with many firms experiencing one or more downgrades and the corresponding impact on their funding costs and profitability. In this environment, banks are placing more importance on the information they provide to the rating agencies, driving a requirement for high-quality information on capital adequacy that meets the rating agencies’ needs. This places additional requirements on the capital and risk calculation and reporting systems and processes, which do not align fully with those needed for Basel III. Banks need to consider how best to align their responses to these competing demands, so that they deal effectively with, and maximise re-use across, both.
Rubbish In, Rubbish Out
Data quality challenges need to be addressed in three main areas: market price data, trade and transaction level data, and reference data (particularly for clients).
Historical market data is needed to calculate possible losses over a given period (as used in VaR calculations). In many cases data clean-up programmes are already underway to ensure that data is fit for purpose; however, further clean-ups are likely to be required as the business and capital impacts are reviewed and explanations of unexpected results point to data quality and calibration issues rather than actual structural or capital inadequacies.
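To show why historical data quality matters so much here, a historical-simulation VaR can be sketched in a few lines. This is a hypothetical illustration only, not any bank’s actual methodology: the function name, the confidence level and the ten-day P&L series are illustrative assumptions, and production systems revalue entire portfolios under each historical scenario across far larger data sets. A single miskeyed observation in the history shifts the quantile, which is exactly the calibration risk described above.

```python
import math

def historical_var(daily_pnl, confidence=0.99):
    """Estimate the loss not exceeded at the given confidence level,
    using the empirical quantile of historical daily P&L observations."""
    # Negate P&L so that positive numbers represent losses, then sort
    # ascending; the quantile observation sits near the top of the list.
    losses = sorted(-p for p in daily_pnl)
    idx = math.ceil(confidence * len(losses)) - 1
    return losses[idx]

# Illustrative (hypothetical) ten-day P&L history
pnl = [120, -80, 45, -200, 15, -60, 90, -150, 30, -10]
var_90 = historical_var(pnl, confidence=0.9)  # loss at the 90% quantile
```

Even this toy version makes the dependency plain: the output is just an order statistic of the input history, so the calculation is only as reliable as the data series feeding it.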
In most cases transaction level data is already available in existing front office feeds to risk and finance control systems. However, these may not be sufficiently detailed to provide the required inputs to the new calculations or the data may be available but has never been used for such critical calculations, so may not be reliable.
The third area where data quality issues are likely to occur is reference data. Fortunately, previous regulatory and credit risk programmes have led organisations to adopt better frameworks for linking their client reference data together. These multi-year programmes have often become expensive and require continuous focus and improvement to ensure requirements are being met, leading to further unwelcome costs.
Fielding the Right Team
Another obvious challenge is ensuring that the number and capability of the people engaged are right. In practice, most regulatory projects are looking for the same type of resources at the same time, and market-wide shortages often result.
In some cases, a good solution is to redeploy front office experts onto risk or finance projects – simply employing “regulatory specialists” may appear to be the right choice on the surface, but in practice a detailed knowledge of a firm’s proprietary front office products, pricing and valuation algorithms, accounting practices and data representations is more valuable than being well versed in details of the regulations.
However, due to cost-cutting and the diversion of existing resources onto competing, mandatory regulatory projects, many front office areas are now more sparsely staffed. This creates a risk that delivery deadlines are missed, supplied data feeds fall short of the required standard, or the calculation and transmission processes are insufficiently tested.
There are no easy solutions to these challenges, but a lot can be achieved by ensuring the programme leadership teams have broad experience and calm heads. Ensuring all the teams work together across all affected support, risk, finance and front office functions requires respected leadership, with a strong mandate and a sense of common purpose.
A key to the success of Basel III projects is securing realistic capital impact calculations as early as possible. These allow the business to plan ahead and also flush out inaccuracies early in the process. Whilst inaccurate data and uncertain regulatory requirements can unfortunately weaken this exercise and risk incorrect planning decisions, gaining early sight of these issues helps to prioritise their resolution and ensures awareness of the underlying problems in a timely fashion.
An open dialogue with the regulator to address challenging points up front will also help to prevent misunderstandings, big surprises or unintended consequences when updated regulations are issued. This will help firms decide where to re-focus efforts, where to adopt a ‘wait and see’ approach, and where to move forward on well-founded assumptions.
The industry faces an unprecedented period of uncertainty, volatility and external scrutiny, and the ongoing nature of the regulatory environment we now operate in means that the success of future initiatives will largely depend on the speed with which banks can embed regulatory compliance into the day-to-day functions of the organisation. Traditionally under-funded ‘control functions’ are growing in importance and relevance, and it may be time to establish regulatory compliance and internal control as a key pillar of firms’ business and operating models.
Ted Rees is a partner at Crossbridge, a specialist consultancy focused exclusively on Financial Markets with a client base covering many of the world’s leading banks and investment firms.