

Reconfiguring old models of data management could help the financial services sector meet evolving global regulatory requirements. The trick is to shift emphasis away from building huge data repositories and towards developing a data supply chain that gets the right data to the right place at the right time. By Richard Petti, CEO, Asset Control

The critical shift in perspective is to recognize that change is a constant and to move away from a monolithic, hard-wired data warehouse model towards a dynamic data supply chain. In markets requiring constant business innovation, products, processes and organizations must become more agile in how they support the business and meet regulators’ needs. Financial data models need to be dynamic, adjusting quickly to capture new products created to solve client needs in new ways, and the same data must flow quickly through the middle and back office to minimize the risk of exception bottlenecks or reconciliation errors.

Richard Petti

Increasingly, proactive organizations are deploying strategies that regard data management as a dynamic logistics activity. The most effective have placed a data management platform at the center of the complex multi-source, multi-system distribution process – taking inputs from vendor feeds and departmental sources, testing them for quality and routing them through the platform to downstream systems and users.  As data flows through the system, the platform provides the framework for auditing activity and monitoring performance against critical SLAs.
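The platform-centered flow described above can be sketched in a few lines of Python. This is a hypothetical illustration only: the function names (`validate`, `route`), the consumer names and the field subscriptions are invented for the example, not drawn from any real product.

```python
from datetime import datetime, timezone

AUDIT_LOG = []          # append-only record of every routing decision
DOWNSTREAM = {          # consumer -> the fields it subscribes to
    "risk_engine": {"isin", "price", "currency"},
    "back_office": {"isin", "price", "settlement_date"},
}

def validate(record):
    """Basic quality gate: required identifier present and price positive."""
    return "isin" in record and record.get("price", 0) > 0

def route(record, source):
    """Test one inbound record for quality, then fan it out downstream,
    logging every delivery attempt for audit and SLA monitoring."""
    ok = validate(record)
    for consumer, fields in DOWNSTREAM.items():
        payload = {k: v for k, v in record.items() if k in fields}
        AUDIT_LOG.append({
            "ts": datetime.now(timezone.utc).isoformat(),
            "source": source,
            "consumer": consumer,
            "delivered": ok,
            "payload": payload if ok else None,
        })
    return ok
```

The key point of the sketch is that every record passes through one quality gate and one audit trail on its way to each downstream system, rather than taking a separate, unmonitored path to each.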

Such systems simplify the technical challenges significantly. Because they eliminate potentially hundreds of point-to-point connections, they make the administration, control and delivery of reference, market and risk data much more manageable. Moreover, workflows become more efficient, enabling organizations to save time and money. Crucially, the centralized approach, built around the effective development of a data supply chain, is helping companies mitigate risk and meet the growing demands of regulatory compliance.
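The arithmetic behind the "hundreds of connections" claim is simple: with point-to-point integration every source feeds every consumer, so the number of links is the product of the two counts, whereas a central platform needs only one link per system. The figures below are illustrative, not taken from the article.

```python
def point_to_point_links(sources: int, consumers: int) -> int:
    # Every source feeds every consumer directly.
    return sources * consumers

def hub_links(sources: int, consumers: int) -> int:
    # Each system connects exactly once, to the central platform.
    return sources + consumers

# e.g. 20 vendor/departmental feeds and 15 downstream systems:
# 20 * 15 = 300 point-to-point connections, versus 20 + 15 = 35 via a hub.
```

Growth is the real issue: adding one more downstream system adds one hub link, but twenty new point-to-point links.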

So how can organizations achieve this?

The first step in the process is to ask the right questions to identify and address each organization’s specific data management challenges. Once the challenges are understood, the critical internal SLAs become clear and the organization gains a picture of the workflows needed to get the right data to the right people at the right time. Working with a specialist data management team, an organization can also draw on best practice in formulating the appropriate workflows and addressing these issues.

The next step is to implement a robust system, supported by a dedicated data management team, that puts those workflows in place, compels the organization to address its procedural concerns and allows compliance and reporting to become more highly automated.

The Principles for effective risk data aggregation and risk reporting, known as BCBS 239 within the Basel III framework, require banks to impose strong data governance over the organization, assembly and production of risk information. The principles, like Dodd-Frank in the US, begin with traditional notions of soundness: risk reporting should be transparent, and the sourcing, validation, cleansing and delivery of data should be tightly controlled and auditable. But the new regulatory model also makes timeliness and adaptability fundamental requirements. This is a significant change from Basel II, which addressed the formulation of risk models in detail but, in retrospect, failed to identify the need for accurate data. Without it, models and analysis tended to underestimate both the frequency of major portfolio losses and the resulting capital requirements.

The data supply chain approach shifts the focus from the accumulation of data to its delivery. Every activity becomes focused on ensuring the right package of data is delivered to the customer at the right time – everything works backwards from that primary objective. This is a challenge to incumbent models, which largely focus on aggregating and organizing huge volumes of data into a monolithic fixed schema.

In this new approach, the core components of data management – capture, validation and delivery – remain constant. But the process begins from the end-user’s perspective, with Chief Data Officers considering two key questions: 1. who am I delivering this data to? and 2. under what Service-Level Agreement (SLA)? By adopting an SLA-led approach and focusing on the end-game of delivery, it becomes much easier to work backwards and align performance (and costs) with business needs. With the overarching SLA as the start-point, data management becomes a logistics exercise whose primary objective is to get the right data to the right people in time to meet their local SLAs – in effect, a data supply chain. The new approach also makes changes to data requirements much easier to absorb: bypassing the need to change a data schema and incorporating the change directly into the rules that govern the data package provides an agile and transparent mechanism for on-the-move data changes.
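The SLA-led, rule-governed delivery described above can be made concrete with a small sketch. All names here (`SLA_RULES`, `meets_sla`, the consumer names and field sets) are invented for illustration; the point is that each consumer's "data package" is defined by a rule, so absorbing a new requirement means editing the rule rather than migrating a fixed schema.

```python
from datetime import time

# Hypothetical per-consumer SLA rules: a delivery deadline and the field set
# that defines that consumer's data package.
SLA_RULES = {
    "risk_engine":  {"deadline": time(7, 0),  "fields": {"isin", "price", "var"}},
    "front_office": {"deadline": time(6, 30), "fields": {"isin", "price"}},
}

def meets_sla(consumer, delivered_at, package):
    """Was the right package delivered to this consumer on time?"""
    rule = SLA_RULES[consumer]
    on_time = delivered_at <= rule["deadline"]
    complete = rule["fields"] <= package.keys()  # all required fields present
    return on_time and complete

# Absorbing a new data requirement is a one-line rule change,
# not a schema migration:
SLA_RULES["risk_engine"]["fields"].add("liquidity_score")
```

Working backwards from `meets_sla` is what turns data management into a logistics exercise: every upstream capture and validation step inherits its own deadline from the downstream SLA it must feed.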

With the right rules and workflow in place, not only are challenges resolved and risk mitigated; the organization is also placed in the best possible position to adopt an open and transparent enterprise-wide data management strategy, one that spans the entire data supply chain to deliver users the data they want, when and how they want it.

Although the industry may have survived the consequences of the regulatory and information failures that characterized the financial crisis, organizations cannot afford to be complacent. A reliance on inefficient legacy models will no longer suffice.

To progress, Chief Risk Officers and Chief Data Officers must drive the reconfiguration of financial data management – and establish it as a logistical exercise.