Big Data for Financial Risk: Getting your Vs right, and your Cs together

Suranjan Som and David Marriage, IMGROUP
Risk analysis is higher on the agenda than ever before as banks and financial institutions struggle to regain the trust of both shareholders and the public. The economic crisis and a more recent string of high-profile financial scandals have demonstrated the need for more robust internal risk management. This comes at a time when the volume of risk data, and the number of sources both delivering it and asking for it, are growing almost exponentially.
First in the queue asking for the data are the regulators. Their need for greater transparency and accuracy means that they no longer want reports; they want raw data. Financial institutions therefore need to be able to analyse their raw data at the same level of granularity as the regulators will, and to identify any issues before the regulators do. For the first time, financial institutions have to deal with Big Data.
Getting your Vs right
Big Data is often described using three Vs: Velocity, Variety and Volume. From IMGROUP’s perspective that is not enough. Whilst Velocity, Variety and Volume go some way to describing the tidal wave of data that banks are trying to deal with, two other Vs – Validity and Value – are just as important.
Data validity underpins any successful Big Data strategy. Capturing data accurately, at the finest grain, at source, and continuously monitoring data quality means that issues can be identified as data enters the system. Inaccurate data at the source invalidates even the best-laid Big Data plans.
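To make this concrete, the sketch below shows the kind of at-source validation rule that flags issues the moment a record arrives. It is a minimal illustration in Python; the field names and currency list are hypothetical, not a real bank's schema.

```python
from datetime import datetime

# Hypothetical at-source validation for incoming trade records.
# Field names and the currency list are illustrative, not a real schema.
KNOWN_CURRENCIES = {"GBP", "USD", "EUR", "JPY", "CHF"}
REQUIRED_FIELDS = ("trade_id", "notional", "currency", "trade_date")

def validate_trade(record):
    """Return a list of data quality issues; an empty list means the record passed."""
    issues = []
    for field in REQUIRED_FIELDS:
        if record.get(field) in (None, ""):
            issues.append(f"missing field: {field}")
    try:
        if float(record.get("notional", 0)) <= 0:
            issues.append("notional must be positive")
    except (TypeError, ValueError):
        issues.append("notional is not numeric")
    currency = record.get("currency")
    if currency and currency not in KNOWN_CURRENCIES:
        issues.append(f"unknown currency: {currency}")
    try:
        datetime.fromisoformat(record.get("trade_date", ""))
    except (TypeError, ValueError):
        issues.append("trade_date is not a valid ISO date")
    return issues

# A bad record is flagged on arrival, before it can propagate downstream.
bad = {"trade_id": "T-1001", "notional": "-5000000",
       "currency": "GBX", "trade_date": "2013-02-30"}
for issue in validate_trade(bad):
    print(issue)
```

The point is not the specific rules but where they run: at the point of capture, so downstream analysis never has to second-guess the data.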
Keeping value top of mind helps to ensure that data quality processes focus on data points that can be used to make a decision, analyse an issue or theory, or cover a regulatory requirement. If a data point is not used for any of those things, it is worthless.
By ensuring that the data collected at source is both valid and valuable, useful and complete reference data can then be applied successfully. This means that any regulatory, performance or analysis question asked by any stakeholder can be answered quickly and reliably.
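As an illustration of the reference-data step, the following sketch (again Python, with hypothetical identifiers and attributes) enriches a validated trade with counterparty reference data so it can be sliced along whatever dimension a stakeholder asks about.

```python
# Hypothetical counterparty reference data; in practice this would come from a
# golden-source reference data service, not an in-memory dictionary.
COUNTERPARTIES = {
    "CPTY-42": {"name": "Example Bank plc", "sector": "Credit institution", "country": "GB"},
}

def enrich(trade):
    """Attach reference data so a trade can be analysed by counterparty, sector, country, etc."""
    ref = COUNTERPARTIES.get(trade.get("counterparty_id"), {})
    return {**trade, **{f"counterparty_{key}": value for key, value in ref.items()}}

trade = {"trade_id": "T-1001", "counterparty_id": "CPTY-42", "notional": 5_000_000}
enriched = enrich(trade)
# The enriched record can now answer questions (e.g. exposure by sector) at the
# same level of granularity a regulator would apply to the raw data.
print(enriched["counterparty_sector"])
```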
And your Cs together
Unfortunately, financial institutions' infrastructures have evolved over time into complex silos, making holistic data management intrinsically difficult. For too long, banks have simply added layer upon layer of functionality and applications to an infrastructure already creaking under the strain. These applications are often designed in-house, and in some cases over half of what exists is redundant. Navigating this web of systems to try to understand where the information is, which parts are important and where it should go is a major headache for CIOs and IT teams across the sector. All too often the IT department itself has only limited knowledge of what exists and who is using what. In such a complex environment, getting anything done quickly or accurately, let alone both together, is very difficult.
Rising above these silos and getting out of the infrastructure weeds will enable a financial institution to look across the entire business's data and understand it. This means that no matter what the regulator asks, the business is in a much better position to deal with the unknown. Demand for real-time risk analytics is growing quickly. It was once seen as virtually impossible due to the huge volumes of data and the multiplicity of sources, but in reality many of those issues can now be dealt with effectively.
The challenges that financial institutions face when looking to implement a Big Data strategy run throughout the institution, cutting across different levels of the hierarchy.
At the organisational level, the challenge is getting the appropriate executive buy-in to drive these programmes through in a holistic way. Getting the CFO, COO and CTO (or CIO) working together is normally an effective way of driving a strong Big Data strategy forward, and they will also need the remit from the CEO to do what is necessary to make it happen. A change of ethos is also required: currently, businesses tend to pick up the latest regulatory requirement and start there; instead they need to think longer term. By getting C-level executives together and thinking strategically rather than tactically, financial institutions can lead the way in information management.
At the business level, a symptom of the siloed nature of financial institutions is that people often work in the same siloed way. Being able to step above the silos and see how everything relates is inherently difficult in a bank.
At the technology level, the silos are also evident. Teams tend to focus only on the technical issues within their own department, so someone is needed who can look at the technology needs of the wider group. An accurate view across the bank is not possible unless there is a system in place that can connect the silos, together with analytical tools that give the business a view across all of them.
Next steps
We would advise financial institutions to take the following steps:
- Audit the current position: understand where the institution really is from an information management perspective, and look at what is possible (even with the tools already at its disposal).
- Create a vision: look at what other companies, both within the financial services industry and outside it, are doing in the Big Data space. (The pharmaceutical industry, for example, has long faced levels of regulation similar to those banks now face.)
- Analyse the gap: by investigating the difference between where the institution currently is and where it is aiming to be, it is possible to produce a set of recommendations and a roadmap for the next two to five years.
- Start closing the gap: make sure that any projects support the vision, so that any money spent contributes to the overall goal that has been set.
Currently, financial institutions are looking at the onslaught of regulations, the need for better management information and decision support, financial reporting and the management of risk individually. However, MiFID, Basel III, COREP and FINREP all overlap massively in what they require: an effective reporting and analytical capability. By capturing the right information at the right level of granularity, financial institutions can deal with each problem as part of an overall programme that adds value to the business. Each new regulation, question and data requirement becomes easier to deal with, because all the right information has been captured at the right level of granularity. This is the strategic approach to Big Data, and it is the only one that really makes sense.
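To illustrate the pay-off of capturing data once at the right grain, the minimal sketch below (hypothetical fields again) rolls the same granular positions up into two different views. A new regulatory template then becomes a new aggregation over data already held, not a new data-sourcing exercise.

```python
from collections import defaultdict

# A single store of granular positions (fields are illustrative) can serve
# several regulatory and management views without re-sourcing the data.
positions = [
    {"desk": "Rates",  "asset_class": "IR", "country": "GB", "exposure": 120.0},
    {"desk": "Rates",  "asset_class": "IR", "country": "DE", "exposure": 80.0},
    {"desk": "Credit", "asset_class": "CR", "country": "GB", "exposure": 45.0},
]

def aggregate(records, key):
    """Roll granular records up to whatever grain a given report requires."""
    totals = defaultdict(float)
    for record in records:
        totals[record[key]] += record["exposure"]
    return dict(totals)

# The same granular data answers a regulatory-style question and a management one.
print(aggregate(positions, "asset_class"))  # e.g. a COREP-style breakdown (illustrative)
print(aggregate(positions, "desk"))         # e.g. a management information view
```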
