By Graeme Dillane, manager, financial services, InterSystems
Traditional database architectures are coming up short in today’s data-driven world. Part of the problem is that they struggle to handle the tsunami of data coming into the organisation. Data volumes are growing rapidly: analyst firm IDC recently projected that by 2025 the global ‘datasphere’ will have grown to a staggering 163 zettabytes of data generated per year, ten times the data generated in 2016.
Scoping out the Challenge
Financial services organisations often find it especially difficult to get a handle on the vast volumes of data they have at their disposal – and they therefore struggle to use it to get a clear picture of how best to address specific organisational or operational challenges.
Most of these businesses are facing a complex raft of different issues. Many have had the same legacy systems in place for decades and they are often reliant on these systems for their day-to-day operations. These legacy systems are often not fully integrated with the rest of the organisation. As a result, many of their applications run in siloed environments.
That, in itself, is a significant challenge. Added to it (and partly because of it), financial services organisations are often doing very little with the vast volumes of data they have access to. They frequently have no means of analysing that data, particularly when it is unstructured, let alone analysing it in real time. This has been a problem for many years across multiple financial services sectors, but with important new regulations like the Markets in Financial Instruments Directive (MiFID II) and the Fundamental Review of the Trading Book (FRTB) coming on stream, it is becoming ever more urgent.
To meet these and other industry regulations, financial services organisations often need to provide regulators with information drawn from right across these silos, analysed in a real-time environment. Typically, they lack the technological capability to do this today.
Regulation impacting the financial services sector is nothing new, of course. Since the banking crash of 2008, there has been a plethora of new legislation. In the past, financial services organisations have tended to address these regulations in a piecemeal fashion by putting in place new siloed applications to meet the specific needs of each new ruling. Up to now, this approach has worked after a fashion but many organisations have merely done the minimum they needed to do to meet each successive regulation.
The latest round of regulations is raising the stakes, though, effectively demanding that these organisations break down their data silos, better integrate their data enterprise-wide, and analyse it in real time in the context of new event and transactional data. In line with this, financial services organisations increasingly understand the scale of the problem and are actively seeking out solutions. The precise solution chosen will, of course, differ depending on the specific financial services sector and the status of the business.
The more data that organisations are storing on legacy solutions, the more they are going to require an updated data platform that can handle real-time analytics to meet the pressing regulatory requirements they face. Even organisations that have fewer legacy systems are still likely to require solutions that deliver enhanced interoperability to help provide a real-time view across the business.
Finding a Solution
The above highlights the broad-brush requirements that financial services organisations should be looking for. But at a more granular level, they need to think through the step-by-step processes required to meet these regulations. To comply with FRTB, for instance, organisations will typically need to bring information in from multiple applications, run reporting on that data in real time, and generate output in a format that meets the regulator’s precise requirements. That is just one example, but in general terms there will be a host of complementary processes that organisations will need to implement to help support compliance with regulatory requirements.
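As a loose illustration of that first step – aggregating extracts from multiple applications into a single regulator-shaped report – the following Python sketch uses entirely hypothetical source feeds and field names (`risk_system`, `trade_capture`, `desk`, `notional`); a real FRTB submission is of course far richer than this:

```python
from datetime import date

# Hypothetical per-system extracts: the field names and figures are
# illustrative assumptions, not a real FRTB feed specification.
risk_system = [{"desk": "rates", "trade_id": "T1", "notional": 5_000_000}]
trade_capture = [{"desk": "fx", "trade_id": "T2", "notional": 2_000_000}]

def build_report(*sources, as_of=None):
    """Merge extracts from several applications into one report shape:
    one row per desk with aggregated notional."""
    totals = {}
    for source in sources:
        for row in source:
            totals[row["desk"]] = totals.get(row["desk"], 0) + row["notional"]
    return {
        "as_of": (as_of or date.today()).isoformat(),
        "rows": [{"desk": d, "total_notional": n}
                 for d, n in sorted(totals.items())],
    }

report = build_report(risk_system, trade_capture, as_of=date(2018, 1, 3))
```

The point of the sketch is the shape of the process – many sources in, one consolidated, consistently formatted output – rather than any particular field layout.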
Organisations need to seek out a data platform that can ingest data from real-time activity, transactional activity and document databases. From there, the platform needs to take on data of different types, from different environments and of different ages, then normalise it and make sense of it. Interoperability is key, as is granular, role-based security. Any chosen solution needs to be able to ‘touch’ those disparate databases and silos, bring information back and then make sense of it in real time.
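To make the normalisation idea concrete, here is a minimal Python sketch that maps two hypothetical record shapes – a legacy batch row and a modern API event, with invented field names – onto one canonical schema:

```python
from datetime import datetime, timezone

# Two hypothetical silos describing the same kind of fact in different
# shapes; every field name here is an illustrative assumption.
legacy_row = {"ACCT_NO": "00123", "BAL": "1050.25", "TS": "03/01/2018"}
api_event = {"account": "123", "balance": 2000.5,
             "timestamp": "2018-01-03T09:30:00+00:00"}

def normalise_legacy(row):
    # Fixed-format batch extract: zero-padded IDs, strings for numbers,
    # day/month/year dates.
    return {
        "account_id": row["ACCT_NO"].lstrip("0"),
        "balance": float(row["BAL"]),
        "observed_at": datetime.strptime(row["TS"], "%d/%m/%Y")
                               .replace(tzinfo=timezone.utc),
    }

def normalise_event(evt):
    # Modern event feed: already typed, ISO-8601 timestamps.
    return {
        "account_id": evt["account"],
        "balance": float(evt["balance"]),
        "observed_at": datetime.fromisoformat(evt["timestamp"]),
    }

records = [normalise_legacy(legacy_row), normalise_event(api_event)]
```

Once both shapes land in the same canonical schema, downstream analytics no longer need to care which silo a record came from.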
Data platforms also need to be agile. As businesses move systems and applications into the cloud, they are starting to use software to ‘containerise’ their applications and modules. Once containers have been set up in the cloud, they are reusable by other applications within the suite.
It is crucial too that the chosen platform can perform analytic queries – or ask questions – of the data that the organisation holds even if that data is in large data sets and stored in different data and application silos. This capability is critical for complying with regulatory requirements, and answering unplanned ad hoc questions from industry regulators, for example.
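A rough Python sketch of such a cross-silo query, using two SQLite databases purely as stand-ins for separate application silos (a real platform would federate live systems rather than local databases):

```python
import sqlite3

# Two databases stand in for separate application silos; in practice
# these would be independent systems reached through the platform's
# interoperability layer.
conn = sqlite3.connect(":memory:")                      # silo 1: trades
conn.execute("ATTACH DATABASE ':memory:' AS refdata")   # silo 2: reference data

conn.execute("CREATE TABLE trades (instrument TEXT, qty INTEGER)")
conn.executemany("INSERT INTO trades VALUES (?, ?)",
                 [("XYZ", 100), ("XYZ", -40), ("ABC", 25)])

conn.execute("CREATE TABLE refdata.instruments (instrument TEXT, sector TEXT)")
conn.executemany("INSERT INTO refdata.instruments VALUES (?, ?)",
                 [("XYZ", "banking"), ("ABC", "energy")])

# An ad hoc question spanning both silos: net position per sector.
rows = conn.execute("""
    SELECT i.sector, SUM(t.qty)
    FROM trades t JOIN refdata.instruments i USING (instrument)
    GROUP BY i.sector ORDER BY i.sector
""").fetchall()
```

The regulator’s question arrives after the fact, yet it can still be answered because the query layer can reach both stores at once.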
Of course, the power of analytics can take businesses far beyond regulatory compliance. The ability of platforms to provide a panoramic view of disparate data in a secure, role-based manner can also serve a variety of other business requirements – for example, calculating real-time position values used in program trading with millisecond performance to meet strict SLAs.
The ability to process transactions at scale in real time and simultaneously run analytics using transactional (real-time) data and large sets of non-real-time data (e.g. historical and reference data) is a critical capability for various business requirements – for example, powering mission-critical trading platforms that cannot slow down or drop trades, even as volumes spike. This kind of capability has the potential to bring significant benefits to many financial services businesses today.
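That combined transactional-plus-analytic workload can be sketched in miniature. In the toy Python class below (all names and figures invented), each trade is applied as it arrives, while a net-position analytic spanning real-time and historical data stays queryable throughout, with no separate batch step:

```python
from collections import defaultdict

# Hypothetical historical reference data, e.g. yesterday's closing positions.
historical_positions = {"XYZ": 500, "ABC": -200}

class PositionKeeper:
    """Toy hybrid workload: incoming trades are applied on the
    transactional path, and the analytic view (net position =
    history + today's flow) is available at any moment."""

    def __init__(self, history):
        self.history = dict(history)
        self.intraday = defaultdict(int)

    def on_trade(self, instrument, qty):
        # Transactional path: apply each trade as it arrives.
        self.intraday[instrument] += qty

    def net_position(self, instrument):
        # Analytic path: combine real-time and historical data.
        return self.history.get(instrument, 0) + self.intraday[instrument]

book = PositionKeeper(historical_positions)
for instrument, qty in [("XYZ", 100), ("XYZ", -30), ("ABC", 50)]:
    book.on_trade(instrument, qty)
```

In a production platform the same idea plays out at far greater scale and with durability guarantees, but the essential property is identical: writes never wait for analytics, and analytics never wait for a batch load.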
Across the financial services sector, though, it is the onward march of regulation that is acting as a key disruptor. Businesses are being driven to innovate and adopt the latest platforms, spanning data management, interoperability, transaction processing, and analytics, by an urgent need to comply.