BANKING ON TRADE STORES: MINIMISING RISK AND MAINTAINING COMPLIANCE

By Adrian Carr, Senior Vice President, MarkLogic


Falling foul of the regulators is an expensive business. The FCA levied eye-watering fines just shy of £1 billion on banks in 2015 – not to mention the cost of possible litigation, the time spent addressing the underlying problems, or the work of salvaging a bank’s reputation.

And if there’s one thing you can bank on, it’s that more fines are on the cards as regulatory requirements – like MiFID II and BCBS 239 – demand even tighter controls and more granular levels of reporting. But the pressure arising from compliance and the changing regulatory framework can be reduced. By getting their data in order now, banks can be fully prepared.

There is a growing realisation in the financial services industry that one answer to this lies in the deployment of a Trade Store (or Data Hub) – a virtual filing cabinet that holds a single source of truth for all trade events within a bank. With this Trade Store approach – effectively a trade lifecycle management system – every single ‘event’ that happens to a trade or asset is recorded, allowing a full audit trail for auditors and regulators.

Over the last few years, I have been involved in many Trade Store discussions and deployments. The common denominator has been the constraints imposed by technologies conceived in a different era – technologies that have become cumbersome, expensive and slow to maintain.

For valid historic reasons, banks frequently have a multitude of individual product lines and trading systems feeding into specific information silos, which keep data separate and prevent organisations from achieving that single source of truth. The flurry of M&A activity hasn’t helped, creating even more trading-system fiefdoms and silos. New regulatory pressures dictate that these now have to be cobbled together somehow to provide an integrated, consolidated view for reporting purposes.

Some banks have tried – and failed – to use their legacy relational databases to build a Trade Store. The changing nature, variety and complexity of trading data does not lend itself to the rigidity of a schema-based relational model. Each separate trading system brings a new schema, requiring complex interfaces to reconcile the disparate fields. If anything changes – which of course it does – at a minimum everything needs to be tested or, more frequently, re-designed. An additional constraint with relational databases is the need to know, while still at the design stage, what queries you will run in the future. Our customers find that relational databases are simply not agile enough to integrate mission-critical data across many silos in a timely and cost-effective manner.

Some of the world’s largest investment banks have solved the conundrum by building a Trade Store on an Enterprise NoSQL database platform.

A NoSQL database is flexible, schema-agnostic and specifically designed for rapidly changing, multi-structured, complex data applications – like a Trade Store. Choosing the right NoSQL database is important, though: open source variants do not have all the enterprise-grade features required, such as support for ACID transactions, government-grade security, high availability, elasticity and scalability, and disaster recovery. An Enterprise NoSQL database marries the two – NoSQL flexibility and enterprise-grade robustness – so financial institutions can benefit from a flexible, agile and scalable platform while knowing their data will be secure, never lost and always available.
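To illustrate the schema-agnostic point, here is a minimal sketch in plain Python (the field names and trading systems are hypothetical, and this is not MarkLogic’s API): two silos emit trade events with completely different shapes, yet a document-oriented store can ingest and query both as-is, with no schema reconciliation layer.

```python
# Hypothetical events from two silos - note the different, unreconciled shapes.
trade_events = [
    # Equities system: flat record
    {"source": "equities", "trade_id": "EQ-1001", "instrument": "VOD.L",
     "event": "NEW", "qty": 5000, "price": 72.15},
    # Derivatives system: different fields, nested legs - no shared schema
    {"source": "derivatives", "trade_id": "DR-2002", "event": "AMEND",
     "legs": [{"pay": "FIXED", "rate": 0.021},
              {"receive": "FLOAT", "index": "SONIA"}]},
]

def find_events(store, **criteria):
    """Query across heterogeneous documents by whatever fields they share."""
    return [doc for doc in store
            if all(doc.get(k) == v for k, v in criteria.items())]

# One query spans both silos without any up-front schema design.
amendments = find_events(trade_events, event="AMEND")
print([d["trade_id"] for d in amendments])  # ['DR-2002']
```

In a relational design, the nested derivatives legs alone would force a new table and new interfaces; here, a new source simply adds documents.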

In addition to finding a database with both enterprise and NoSQL traits, it is worth finding one that embodies innovations ahead of the market. Combining technical innovation with banking insight, one leading investment bank already working with the MarkLogic database to deploy a Trade Store recognised the future importance of determining what was known at any particular point in time. As a result of those discussions, a new and increasingly important feature called Bitemporal data management was developed. MarkLogic’s Bitemporal capability allows banks to minimise risk through “tech time travel” – time-stamping and rewinding documents to see data as it was at any point in the past, and to identify changes, without having to reload data backups. This is critical for maintaining and demonstrating compliance with, for example, Dodd-Frank data analysis processes.
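The bitemporal idea can be sketched in a few lines of Python (a hypothetical ledger, not MarkLogic’s implementation): each version of a trade record carries two timelines – when the fact was true (valid time) and when the system recorded it (system time) – and corrections append new versions rather than overwriting old ones, so the bank can always answer “what did we believe on that date?”.

```python
from datetime import date

# Hypothetical bitemporal ledger: corrections are appended, never overwritten.
versions = [
    {"trade_id": "T1", "notional": 10_000_000,
     "valid_from": date(2016, 1, 4), "system_from": date(2016, 1, 4)},
    # Back-dated correction recorded a month later: notional was actually 12m
    {"trade_id": "T1", "notional": 12_000_000,
     "valid_from": date(2016, 1, 4), "system_from": date(2016, 2, 1)},
]

def as_known_on(versions, trade_id, system_date):
    """'Tech time travel': what did the bank believe about this trade on a
    given date, ignoring anything recorded afterwards?"""
    known = [v for v in versions
             if v["trade_id"] == trade_id and v["system_from"] <= system_date]
    return max(known, key=lambda v: v["system_from"]) if known else None

print(as_known_on(versions, "T1", date(2016, 1, 15))["notional"])  # 10000000
print(as_known_on(versions, "T1", date(2016, 2, 15))["notional"])  # 12000000
```

Because nothing is deleted, the same query reproduces exactly what a trader, auditor or regulator would have seen on any historical date – no backup restores required.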

By adopting a ‘Regulatory Book Of Record’ approach, not only are banks able to avoid the prospect of hefty fines, but they can also reduce costs because there is no longer a need to develop and maintain multiple different systems. In the new Enterprise NoSQL world, the system is built on a commodity scale-out architecture. The result is a lower cost per trade.

To cut costs and introduce efficiencies, a leading investment bank – booking an average of 100,000 derivatives trades a day, resulting in about 32 million live deals – reengineered its disparate legacy derivatives trading architecture. The bank deployed MarkLogic’s Enterprise NoSQL database, giving it a single unified view of derivatives trades. It replaced 20 Sybase databases with a single database, making trade information retrievable as well as actionable in real-time. As well as enhancing compliance reporting, it has dramatically reduced maintenance costs and the bank can now develop and deploy new software – and therefore launch new products in response to the market – much more quickly.

Another global investment bank built a Trade Store on the MarkLogic database in just six months. It acts as a single source of truth for all trade events and related data, and ensures data consistency. More than 30 trading systems were connected, bringing vast amounts of unstructured and structured data into one central repository and making it accessible to dozens of downstream line-of-business systems. This allows the bank to handle various reporting requirements, including regulatory reporting, and helps protect against regulatory fines.

Today, the same Trade Store takes inputs from between 20 and 30 trade sources and three reference data sources, unifying and storing more than 25 TB of unstructured and structured data in a single searchable repository. It currently holds information on over a billion trades and ingests, in near real time, up to 2 million trade events and reference data records per day from upstream production systems, complete with validity and duplication checks and version management. Importantly, the system can also be quickly adapted, extended and enhanced to meet changing business and regulatory requirements without redesigning schemas or ETL (Extract, Transform and Load) processes.
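The ingest-time checks described above can be sketched as follows (an illustrative Python model with hypothetical field names, not the bank’s actual pipeline): each incoming event is screened for duplicates, then appended as a new version, preserving the full audit trail.

```python
# Hypothetical append-only store: trade_id -> list of event versions.
store = {}

def ingest(event):
    """Duplication check plus version management on each incoming event."""
    versions = store.setdefault(event["trade_id"], [])
    if any(v["event_id"] == event["event_id"] for v in versions):
        return "duplicate"                    # drop upstream resends
    event["version"] = len(versions) + 1      # next version number
    versions.append(event)                    # append, never overwrite
    return "stored"

print(ingest({"trade_id": "T1", "event_id": "e1", "status": "NEW"}))      # stored
print(ingest({"trade_id": "T1", "event_id": "e1", "status": "NEW"}))      # duplicate
print(ingest({"trade_id": "T1", "event_id": "e2", "status": "AMENDED"}))  # stored
print(store["T1"][-1]["version"])  # 2
```

Running duplication and versioning at ingest – rather than reconciling downstream – is what keeps the repository a trustworthy single source of truth even at millions of events per day.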

Looking ahead, the idea of a Trade Store on steroids used to evaluate aggregated risks – for example for BCBS 239 – is being mooted by some banks. If they know where they stand with regard to a risk-adjusted position at any one time, they can work out how much capital they have available to trade with as well.

As BIS highlighted in its recent report on the adoption progress of the BCBS 239 risk data governance principles, ‘Banks should critically examine their data architecture and data adaptability capabilities’. It will be interesting to observe which banks take a strategic approach to regulatory reporting and align to the European Banking Authority’s public ambition to have a daily electronic regulatory submissions pipeline by 2020.

The countdown has begun. Banks need to act now to get their digital filing in order to avoid falling victim to massive – and wholly avoidable – regulatory fines.
