
Managing Reference Data Risk – A Best-Practice Approach


By Derek Kemp, EVP and Head of EMEA Region, iGATE
With an array of new regulations and a challenging economic environment, financial institutions are under immense pressure to improve data management processes and mitigate risk. The poor quality of data, substantial redundancy, and duplication of effort in the area of reference data management continue to create major problems for financial institutions globally.

As a first step, firms need to understand the gap between their current reference data management processes and newer best-practice approaches.

The business impact of poor data quality

In the rush to improve efficiency with straight-through processing (STP), have firms failed to pay sufficient attention to the risks associated with poor data quality?

Based on current market trends, this certainly seems to be the case.

Duplication of reference data results in unnecessary costs: All large financial firms access data from a variety of sources, and disparate, siloed data systems and operations are the norm. To add to the complexity, organizations typically source reference data from a range of internal and external providers based on the consumption requirements of different departments. This siloed style of functioning, however, makes it difficult for one department to access data that another department may already have purchased. The result is duplicated reference data purchases and unnecessary costs.

Increased operational risk: Inconsistencies or inaccuracies in reference data cause exceptions in the trade lifecycle, leading to increased operational risk, lost revenue opportunities, and financial liabilities. This becomes even more significant during periods of market volatility, when firms are forced into expensive and time-consuming manual trade duplication and reconciliation processes.
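To make the risk concrete, here is a minimal Python sketch, with an invented security and field values, of how a disagreement between two internal reference data stores surfaces as a trade exception:

```python
# Minimal sketch (hypothetical data and fields): flag trade exceptions
# caused by inconsistent reference data across two internal systems.

# The same security as seen by the trading and settlement systems.
trading_refdata = {
    "US0378331005": {"currency": "USD", "settlement_days": 2},
}
settlement_refdata = {
    "US0378331005": {"currency": "USD", "settlement_days": 1},  # stale entry
}

def check_trade(isin):
    """Return a list of reference data mismatches for a given ISIN."""
    trade_view = trading_refdata.get(isin, {})
    settle_view = settlement_refdata.get(isin, {})
    return [
        f"{isin}: '{field}' differs "
        f"({trade_view.get(field)!r} vs {settle_view.get(field)!r})"
        for field in trade_view.keys() | settle_view.keys()
        if trade_view.get(field) != settle_view.get(field)
    ]

for exception in check_trade("US0378331005"):
    print(exception)  # the settlement_days mismatch would break settlement
```

Every such exception translates directly into manual investigation time, which is exactly the cost that consistent reference data avoids.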

Consolidating market data management and counterparty systems

The reference data management problem is shared by both the buy-side and the sell-side. Each has to worry about securities reference data to settle trades reliably and to provide fair valuations. Sell-side firms are now beginning to consider market data and counterparty data as part of the larger reference data problem.

Trading firms have long been concerned with sourcing accurate market data in real time with minimal latency to feed algorithmic trading engines. Only the largest Tier 1 firms can afford the luxury of storing their own tick histories; for the majority, cost-effective industry utility services such as Tickdata and Reuters DataScope Tick History provide clean tick history on demand. These can be used for proof of best execution and for algorithmic back-testing.

Firms should closely examine whether the additional costs of the more complex platforms and the cost of the actual tick data storage provide sufficient benefits when compared with less expensive securities reference data management platforms and tick data history utilities.

Many firms have invested heavily in counterparty data management. However, through a spate of recent acquisitions, organizational, geographic, and functional silos have developed, resulting in multiple databases in different formats. There is an undoubted benefit to integrating counterparty data in one place. Now that industry utilities exist from which clean data can be reliably sourced, the data itself is no longer a proprietary advantage; the advantage stems from its uniform usage throughout the firm.

Market data and counterparty data should be acquired as a utility and mapped to reference data through cross-referencing. But a robust reference data management platform must underpin such efforts if they are to succeed.
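As a rough illustration of such cross-referencing, the Python sketch below (the identifier schemes and mappings are invented for the example) resolves vendor-specific identifiers to a single internal master ID, so that utility-sourced market and counterparty data can be joined to the firm's reference data:

```python
# Illustrative cross-reference table (all identifiers are examples):
# vendor-specific identifiers map onto one internal master ID.
XREF = {
    ("RIC", "AAPL.O"): "SEC-000001",
    ("ISIN", "US0378331005"): "SEC-000001",
    ("LEI", "XXXXXXXXXXXXXXXXXXXX"): "CPTY-000042",  # placeholder LEI
}

def to_master_id(scheme, identifier):
    """Resolve a vendor identifier to the internal master ID, if known."""
    return XREF.get((scheme, identifier))

# A utility-sourced price tick keyed by RIC joins cleanly to reference data.
tick = {"ric": "AAPL.O", "price": 189.25}
print(to_master_id("RIC", tick["ric"]))  # -> SEC-000001
```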

Managing multiple risk points for efficient reference data management

Best-practice reference data management may involve outsourcing, but only when certain risk points in the process have been considered:

Risk-point 1: Purchase: In the case of reference data, it is common to buy multiple sets from multiple vendors, according to the needs and preferences of individual employees and teams within the organization. This results in organizations spending money on duplicate, non-optimized data. Best-practice reference data management circumvents purchase point risks by using advanced tools to track and analyze what data sets are being purchased and where they are being used, tracking the data path from contributor to consumer, and then monitoring by user and data element. Smart companies also turn to trusted independent partners to help them determine what data sources are right for the organization and its employees.
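One way to picture such purchase tracking, in a hypothetical Python sketch (the vendors, data sets, and departments are invented), is to group purchases by data set and flag any data set bought by more than one department:

```python
# Sketch of purchase-point monitoring: group data purchases by data set
# to expose duplicate spend across departments.
from collections import defaultdict

# Each record: (vendor, data_set, purchasing_department) -- all hypothetical.
purchases = [
    ("VendorA", "equity_security_master", "Equities Front Office"),
    ("VendorB", "equity_security_master", "Risk"),  # overlapping content
    ("VendorA", "counterparty_hierarchy", "Credit"),
]

by_data_set = defaultdict(list)
for vendor, data_set, department in purchases:
    by_data_set[data_set].append((vendor, department))

for data_set, buyers in by_data_set.items():
    if len(buyers) > 1:
        print(f"Possible duplicate spend on '{data_set}': {buyers}")
```

A real implementation would track usage down to individual users and data elements, but the principle of a single firm-wide view of purchases is the same.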

Risk-point 2: Cleansing: Reference data is generally not optimized at the time of purchase, so it needs to be cleansed to identify inconsistencies and faults. If reference data is held in multiple silos rather than centrally, it is likely to be cleansed multiple times. There is also the increasing impact of corporate actions on reference data to consider. Some corporate actions, such as splits and dividends, are relatively easy to handle. Others, like complex rights issues, require labor-intensive intervention from a specialist. As a best practice, use automated tools that cleanse data by monitoring incoming feeds and checking for expected relationships and patterns. When exceptions occur, manual intervention may be required, but smart companies use skilled staff in low-cost offshore locations to do this.
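A minimal sketch of such rule-based cleansing, assuming illustrative fields and validation rules, might look like the following: incoming records are checked against expected patterns, and any failures are routed to an exception queue for manual review.

```python
# Sketch of rule-based cleansing (fields and rules are illustrative):
# each rule encodes an expected pattern for one reference data field.
import re

RULES = {
    "isin": lambda v: bool(re.fullmatch(r"[A-Z]{2}[A-Z0-9]{9}[0-9]", v or "")),
    "currency": lambda v: v in {"USD", "EUR", "GBP", "JPY"},
    "coupon": lambda v: v is None or 0 <= v <= 25,  # sanity bound, in percent
}

def cleanse(record):
    """Return the names of any rules the record violates."""
    return [name for name, check in RULES.items() if not check(record.get(name))]

exceptions = cleanse({"isin": "US0378331005", "currency": "US$", "coupon": 3.5})
if exceptions:
    print("Route to manual review:", exceptions)  # -> ['currency']
```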

Risk-point 3: Distribution: Once reference data has been bought and cleansed, it needs to be fed to the individual systems that consume it. That requires an in-depth understanding of which data sets and fields each consuming system requires, when the data is needed, and in what format it is expected. As a best practice, use ETL (extraction, transformation, and loading) tools to subset and reformat the data contained in the Golden Copy and send it to each consuming system in a form it can understand and use. When a request for a new feed is submitted, skilled personnel should be on hand to decide which fields from which data sets are required to build it.
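The sketch below illustrates the distribution step (the feed specifications and Golden Copy fields are hypothetical): each consuming system declares the fields and format it needs, and the feed is cut from the Golden Copy accordingly.

```python
# Sketch of Golden Copy distribution: cut per-system feeds from one
# central record set (all field names and specs are hypothetical).
import csv, io, json

GOLDEN_COPY = [
    {"master_id": "SEC-000001", "isin": "US0378331005",
     "name": "Apple Inc", "currency": "USD", "settlement_days": 2},
]

FEED_SPECS = {
    "settlement_system": {"fields": ["isin", "settlement_days"], "format": "csv"},
    "risk_engine": {"fields": ["master_id", "currency"], "format": "json"},
}

def build_feed(system):
    """Subset and reformat the Golden Copy for one consuming system."""
    spec = FEED_SPECS[system]
    rows = [{f: rec[f] for f in spec["fields"]} for rec in GOLDEN_COPY]
    if spec["format"] == "json":
        return json.dumps(rows)
    buffer = io.StringIO()
    writer = csv.DictWriter(buffer, fieldnames=spec["fields"])
    writer.writeheader()
    writer.writerows(rows)
    return buffer.getvalue()

print(build_feed("settlement_system"))  # CSV with only the fields it needs
```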

Leveraging third-party specialist organizations to manage commoditized reference data

Many firms have now realized that there is little competitive advantage to be gained from managing publicly available, highly commoditized reference data in-house. Increasingly, firms are turning to third-party specialist organizations, not only to manage reference data on their behalf, but also to re-architect data systems in such a way that they are outsourcing-ready.

Using a combination of best-of-breed tools and skilled resources, the third-party specialist will normalize, cleanse, validate, and aggregate multi-source content into a single, standardized Golden Copy of the reference data, which is fed back to the client as a managed service. There has never been a better time for financial firms to take this route. The risks associated with reference data are now an immediate concern and require immediate action. Best-practice reference data management is critical to current performance and a prerequisite for further growth and efficiency.
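As an illustration of how such a Golden Copy can be assembled, the sketch below (the vendor names and precedence order are assumptions) merges multi-source records for one security into a single golden record using a per-field source-precedence rule:

```python
# Sketch of golden-copy construction: merge multi-source records field
# by field, taking each value from the most trusted source that has it.
SOURCE_PRECEDENCE = ["VendorA", "VendorB", "InternalOps"]  # most trusted first

def build_golden_record(records_by_source):
    """Merge one security's multi-source records into a golden record."""
    golden = {}
    all_fields = {f for rec in records_by_source.values() for f in rec}
    for field in all_fields:
        for source in SOURCE_PRECEDENCE:
            value = records_by_source.get(source, {}).get(field)
            if value is not None:
                golden[field] = value
                break
    return golden

print(build_golden_record({
    "VendorA": {"isin": "US0378331005", "currency": "USD"},
    "VendorB": {"isin": "US0378331005", "name": "Apple Inc", "currency": None},
}))  # -> one record combining the best-available value for each field
```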
