
Why the cloud is key for optimal market and reference data management

By Mark Hermeling, CTO, Alveo

Suboptimal market and reference data management has long caused operational and compliance complications that have impeded the growth of financial institutions. Today, however, change is in the air and a radical transformation of market data management has begun.

Not as straightforward as thought

Market and reference data provide the information required to complete and settle financial transactions. At its most basic level, reference data includes the key standard identifiers for a transaction: the underlying security, the buyer, the broker and the price. This data is surprisingly complex, as it can involve thousands of attribute groups and data attributes for each product.
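To make this concrete, the sketch below shows a heavily simplified reference data record with the identifiers mentioned above. The field names and the LEI values are hypothetical; a real record would carry thousands of attributes across many attribute groups.

```python
from dataclasses import dataclass

# Hypothetical, heavily simplified reference data record.
# Real securities carry thousands of attributes per product.
@dataclass
class ReferenceDataRecord:
    isin: str        # standard security identifier (ISO 6166)
    buyer_lei: str   # Legal Entity Identifier of the buyer (ISO 17442)
    broker_lei: str  # Legal Entity Identifier of the broker
    price: float     # transaction price
    currency: str    # price currency (ISO 4217)

# Example record; the LEIs here are illustrative placeholders.
trade = ReferenceDataRecord(
    isin="US0378331005",
    buyer_lei="529900EXAMPLELEI0001",
    broker_lei="549300EXAMPLELEI0002",
    price=173.25,
    currency="USD",
)
print(trade.isin, trade.price, trade.currency)
```

Even this toy record shows why standardisation matters: every identifier must resolve consistently across every system that touches the trade.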

Reference data is not an area where mistakes can be tolerated. It is imperative to the smooth functioning of business operations for the financial services industry, its service providers and its regulatory agencies alike. The industry has, therefore, been pursuing a policy of standardising the reference data that defines and describes trade transactions. However, this has not been as straightforward as first thought.

Spaghetti architecture

Much of this has been down to the current IT landscape of financial services firms. Historically, firms have followed a convoluted approach to provisioning market data to their users and business applications, often creating separate silos based on product lines, customer segments or geographies, or reflecting past mergers and acquisitions. Over time, this has evolved into a costly and unmanageable 'spaghetti' architecture, with a multitude of disparate sources, databases and flows.

As external reporting requirements increase and the existing set-up becomes an ever greater brake on innovation and new product development, there is now a need for one single version of the truth. However, internal consumers of data often source, manage and store reference data locally for their department or within their own specialist applications. This can lead to the creation of many copies of the same data, which complicates data aggregation for any data analysis in risk and finance or for external reporting. Yet, because of the fragmented nature of the application landscape, standardising reference data can be a significant exercise.

Do it once, do it right

In the current climate, financial services firms face a squeeze on their margins. In many ways, they are being punished for having been early adopters of the previous wave of automation: they are often mired in constraints imposed by legacy infrastructure. This is stifling their ability to take advantage of the opportunities in new data analytics technologies and to meet new regulatory requirements.

Any financial services organisation embarking on a market data transformation journey should seize the opportunity to do it once and do it right: focusing on delivering reusable assets that generate recurring value and on building a sustainable, cost-effective solution. The good news is that a complete rip-and-replace is rarely needed. Financial organisations should, however, formulate a comprehensive market data transformation plan, one that also fosters collaborative, sustainable relationships among market data vendors, IT and business units.

Transformation should be driven by configuration-driven products, as these can be aligned much more closely with business requirements. This applies both to the initial transformation and shift to the cloud and to supporting change afterwards. Configurable products lead to faster turnaround on business decisions around new data sets and process changes.
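As a minimal sketch of what "configuration-driven" can mean in practice, the example below declares a new data set as configuration rather than code, with a simple validation step. All field names, vendor names and check names are hypothetical illustrations, not a real product's schema.

```python
# Illustrative only: onboarding a new data set by declaring it in
# configuration rather than writing new integration code.
# Every field and value below is a hypothetical example.
new_dataset_config = {
    "name": "corporate_bonds_emea",
    "source": "vendor_feed_a",          # hypothetical vendor feed
    "schedule": "daily_06:00_utc",
    "quality_checks": ["completeness", "staleness", "price_tolerance"],
    "consumers": ["risk", "finance", "regulatory_reporting"],
}

def validate_config(cfg: dict) -> list:
    """Return a list of problems; an empty list means the config is acceptable."""
    required = {"name", "source", "schedule", "quality_checks", "consumers"}
    return ["missing field: %s" % f for f in sorted(required - cfg.keys())]

print(validate_config(new_dataset_config))
```

The point of the pattern is turnaround time: adding a data set becomes a reviewed configuration change rather than a development project.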

A transformation has begun

An industry-wide transformation has begun and, for many financial services firms, shifting their data management to the cloud has been key. The cloud not only reduces infrastructure and maintenance costs but also offers an element of future-proofing through greater scalability and elasticity. It also helps reduce market data costs through right-sized infrastructure and centralised licensing. This is achieved by combining cloud infrastructure with vendor-managed data services that provide a 'one-stop shop' for the end-to-end provision of market and reference data.

Often, firms suffer from far too much local automation, meaning they have trouble seeing the forest for the trees. Now is the time for them to lift their heads. Good data management makes all business applications work better; it is an energy shot to business processes. Plus, enhancing data quality control and governance helps meet regulatory requirements, where fines for the mismanagement of data can be increasingly crippling. It also allows the organisation to understand its own inner workings, helping to lift the veil of confusion. Improved transparency of data demand and usage enables better controls to monitor real-time usage and cost across all data sources, categories and user groups.

Lift and shift

There is no doubt that organisations are increasingly taking a holistic approach to market data, where users of the data are at the forefront of any design. And for good reason. Moving to the cloud drives an organisation-wide standardisation of data charging and consumption. Plus, an improved data lineage ensures that source data and any transformation in the data’s lifecycle can be clearly captured. This transparency not only helps firms optimise their data assets but reduces the cost of change.
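To illustrate the data lineage idea above, here is a minimal sketch in which a value carries its source and a record of every transformation applied to it. The class, field names and transformation steps are hypothetical, not a description of any particular product.

```python
from dataclasses import dataclass, field

# Minimal lineage sketch (hypothetical names): each value keeps its
# source and an append-only trail of the transformations applied.
@dataclass
class TrackedValue:
    value: float
    source: str
    lineage: list = field(default_factory=list)

    def apply(self, step: str, fn):
        """Apply a transformation and record it in the lineage trail."""
        self.lineage.append(step)
        self.value = fn(self.value)
        return self

# A price arrives in pence from a (hypothetical) vendor feed and is
# normalised before reporting; every step is captured.
price = TrackedValue(value=100.0, source="vendor_feed_a")
price.apply("convert GBp to GBP", lambda v: v / 100)
price.apply("apply FX rate GBP->USD 1.25", lambda v: v * 1.25)

print(price.value)    # 1.25
print(price.lineage)  # the full trail from source to reported figure
```

With such a trail, any reported figure can be traced back to its source, which is what makes change (a new vendor, a new conversion rule) cheap to audit.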

Cloud-native data aggregation and data quality management enable financial organisations to access trusted data easily while maximising their data ROI. However, it is important not to rush into any transformation blindly, but to opt for a cloud-agnostic solution with a 'lift and shift' mentality to avoid future lock-in to specific data and cloud providers. The use of open-source technologies is critical here to avoid becoming overly dependent on a single cloud infrastructure provider. This approach will put data operations on a future-proof footing and materially reduce the cost of change, as new business requirements and reporting obligations are a given.
