Martijn Groot, VP Product Strategy at Asset Control
In a bid to drive down costs and improve business agility in the face of fast-evolving digital competition, many financial services organisations are rationalising IT infrastructure. Yet the focus is as much external as internal, with banks looking to facilitate partnerships with the burgeoning fintech ecosystem as well as enhance direct customer interaction.
IT is, therefore, now also tasked with automating an end-to-end supply chain involving different service providers – and that means finding a way to expose data to these third parties easily yet securely. API teams are focused on integrating and decommissioning legacy applications, and Data Stewards are tasked with eradicating line-of-business duplication; yet information consumers across the business are still constrained by both data silos and proprietary data discovery methods, while the creation of links to external providers is fraught with risk. Where is the data governance? Where is the ability to manage permissions, comply with data privacy laws, adhere to content license agreements or safeguard commercially sensitive information?
Martijn Groot, VP Product Management, Asset Control outlines the importance of a mature data governance model that empowers both internal and external business users through secure, managed self-service access to trusted, cross-functional information resources.
IT rationalisation has become a major focus for financial services firms over the past couple of years – from Deutsche Bank’s Strategy 2020 which includes modernising outdated and fragmented IT architecture, including the reduction of operating systems, hardware and software applications, to HSBC’s Simplify the Bank plan which includes an architecture-led strategy to halve the number of applications across the whole group over a 10-year period.
This emphasis on streamlining complex infrastructure is being driven by the new competitive and regulatory landscape. It has become very clear over the past decade that continuing with line of business data silos has become a significant risk, not only given the cost of regulatory compliance, with its demands for cross-sectional reporting, but also the implications for speed of business change.
As a result, a key part of this rationalisation process has been an investment in APIs to enable interoperability between applications and, hopefully, support the eradication of duplicate applications. However, while many organisations have appointed Data Stewards with a remit to determine data and application requirements across specific business functions, the siloed mentality remains due to a lack of data governance maturity. From cost reduction to business agility, the realisation of any successful application rationalisation or data supply chain improvement project will require significantly improved models for data governance.
Consistent Data Model
At the same time, of course, the business focus is turning increasingly outward, as organisations recognise the importance of the new financial ecosystem. IT is not only tasked with rationalisation but also moving away from individual process automation to automating an end-to-end supply chain involving different service providers.
With a need to expose data to the new fintech partners, as well as customers, many banks are putting in place their own API marketplaces through which they expose their data to selected third parties. While such changes in the retail market are being driven in the EU by the revised Payment Services Directive (PSD2), corporate products in cash, foreign exchange, liquidity and finance data will also demand new APIs.
Given this demand for openness both internally and externally, a common, cross-application taxonomy of products and services and a uniform data dictionary are clearly important. Without them, services could still be added on top of existing infrastructures, but the integration would be brittle and error-prone, falling short of the quality levels and interaction speeds clients expect.
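To make the idea concrete, a shared data dictionary can be as simple as a machine-readable mapping from field names to agreed definitions and types, checked at every integration point. The field names, definitions and validation logic below are purely illustrative assumptions, not drawn from any specific vendor or standard:

```python
# Minimal sketch of a shared, cross-application data dictionary.
# Field names and definitions are hypothetical examples.
DATA_DICTIONARY = {
    "instrument_id": {"type": str,   "definition": "Unique internal identifier for a financial instrument"},
    "currency":      {"type": str,   "definition": "ISO 4217 currency code, e.g. 'EUR'"},
    "close_price":   {"type": float, "definition": "End-of-day closing price in the quoted currency"},
}

def validate_record(record: dict) -> list:
    """Return a list of problems found when checking a record against the dictionary."""
    problems = []
    for field, spec in DATA_DICTIONARY.items():
        if field not in record:
            problems.append(f"missing field: {field}")
        elif not isinstance(record[field], spec["type"]):
            problems.append(f"wrong type for {field}: expected {spec['type'].__name__}")
    return problems
```

Because every application validates against the same dictionary, a record that passes in one system means the same thing in every other, which is what makes the downstream integrations robust rather than brittle.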
But this model has to go further: consolidating data from different sources, mastering it and subjecting it to different quality controls and creating a common data model is a great start. But how is that data being consumed? Requiring business users to rely on the IT team to use proprietary APIs to gain access to this data makes for a steep and costly learning curve, undermines data value and compromises both the IT rationalisation vision and the creation of a successful financial ecosystem.
Mobilised and Empowered
It is essential to empower business users to explore and exploit this consistent information resource, not only to meet regulatory demands but to support business change. Replacing proprietary tools for data access and discovery with industry-standard APIs – such as a Representational State Transfer (REST) API – will simplify the integration of standard data discovery tools. In addition, the use of a standard data schema within a datamart will provide a shared understanding of terminology, definitions and values. The combination of a standard data model with a REST API will enable business users to gain access to this golden copy repository in a lightweight fashion – without reliance on the intervention of IT.
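A minimal sketch of what such lightweight REST access could look like, assuming a hypothetical golden-copy endpoint and hypothetical parameter names (`dataset`, `fields`, `as_of`); any real vendor API will differ in its exact surface:

```python
from urllib.parse import urlencode

BASE_URL = "https://datahub.example.com/api/v1"  # hypothetical golden-copy service

def build_query(dataset: str, fields: list, as_of: str) -> str:
    """Compose a self-describing REST query URL; no proprietary client library needed."""
    params = urlencode({"fields": ",".join(fields), "as_of": as_of})
    return f"{BASE_URL}/{dataset}?{params}"

# A business user (or a standard data-discovery tool) would then issue a plain
# HTTP GET against this URL and receive JSON conforming to the shared schema.
url = build_query("eod_prices", ["instrument_id", "close_price"], "2019-06-28")
```

The point of the sketch is that the entire integration burden is a URL and an HTTP GET, which is why standard browser-based and data-discovery tools can consume the same service without bespoke connectors.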
Opening up a single, consistent data source to business users via standardised, self-service technologies is transformative. A simple browser-based interface that enables business users to select the required data on demand, with the addition of formatting and frequency tools, effectively opens up the data asset to drive new value. Data can be accessed, integrated into other systems and/or explored via standard data discovery tools – all without any complex, proprietary Java-based tools.
Obviously this model has to be controlled – from avoiding a data deluge to ensuring confidentiality is maintained, the data cannot be left open to everyone. The ability to manage permissions for service providers, internal users and customers is essential if the organisation is to ensure compliance with data privacy laws, adherence to content license agreements and protection of commercially sensitive information. A REST API should therefore include access controls that prevent data being exposed to users who are not permitted to see it, whether due to license constraints or data sensitivity.
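One way to picture such control is a dataset-level entitlements check enforced at the API layer before any query is served. The roles, dataset names and license reasoning below are illustrative assumptions only:

```python
# Sketch of dataset-level entitlements enforced at the API layer.
# Roles, dataset names and license constraints are hypothetical examples.
ENTITLEMENTS = {
    "internal_analyst": {"eod_prices", "reference_data"},
    "fintech_partner":  {"reference_data"},  # license terms exclude priced data
}

def authorise(role: str, dataset: str) -> bool:
    """Grant access only if the role is explicitly entitled to the dataset."""
    return dataset in ENTITLEMENTS.get(role, set())
```

Defaulting to denial for unknown roles or unlisted datasets keeps license constraints and data sensitivity rules intact even as new partners and datasets are added.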
With the right security measures in place, information that would have taken business users weeks to access while waiting for IT can now be discovered and reported on in days. Given the increasing need for reports – both regulatory and data discovery to support business change – this self-service access to trusted, standardised data is key. In addition to reducing the cost of business change, the use of a REST API also enables a simple, lightweight integration that reduces infrastructure costs – and avoids the need for expensive and highly trained experts.
The regulatory reporting requirements that have evolved over the past decade may have put the spotlight on the endemic, silo-based infrastructure model, but it has become very clear to the financial services industry that if operational costs are to be reduced, IT rationalisation is an imperative. At the same time, an integrated financial ecosystem is becoming vital in both retail and corporate markets. Without a mature data governance model that leverages new enablers, including APIs and standard data dictionaries, organisations will struggle to realise both rationalisation and extension goals.
To realise the new vision of agile, simplified financial services business models that are competitive in new digital markets, organisations need not only to create a centralised data source but also to explore new standardised technologies to mobilise data and empower users throughout the business and beyond.