
How application integration creates value from legacy data

By Renat Zubairov, CEO and co-founder, elastic.io 

Organisations in the financial sector already know that knowledge empowers better, more informed business decisions.

Much of the necessary knowledge – data – already resides or is being collected inside the company’s own infrastructure of applications and databases. The value of this mountain of data, however, relies on it being accessible and structured for analysis.

With a strong focus on digital transformation in the financial industry, these factors of accessibility and structure are often hindered by a disconnect between old (legacy) and new (digital-first) systems as organisations seek to modify and migrate.

The question is: how can companies access and create value from data collected and stored in legacy systems as they move towards digital transformation?

The operational importance of legacy data

Renat Zubairov

Critical investments, policy management and trading strategies can be de-risked with the right data and analysis to determine the best course of action. Banks need historical data for modelling customer profiles to inform product offerings. Regulatory challenges from GDPR to MiFID II, PSD2 and the US Foreign Account Tax Compliance Act (FATCA) – to name just a few – have prompted operational changes and require more detailed document management to prove due diligence.

As such, the global finance industry has been amongst the early adopters of the concept of ‘big data’; the collection, processing and analysis of structured data generated through business process applications and unstructured data, such as emails, images, documents, presentations and webpages.

Although the industry has embraced the concept of ‘big data’, an estimated 97% of the data produced and collected is never put to use and therefore has no value to the organisation.

Largely, that is owing to legacy systems’ incompatibility with new, digital-first technologies.

Unlocking the value of legacy data

Whether the result of a history of mergers, acquisitions or organic growth, the typical IT infrastructure of a financial organisation was already a complex patchwork of systems that digital transformation is now further complicating.

In most financial organisations, data more than two to three years old is likely to be stored in outdated hardware and software that doesn't fit the newly digitally transformed enterprise. Often it will be in formats that can't easily be accessed and interrogated by modern Machine Learning (ML) and data analytics techniques.
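To make this concrete, consider a minimal sketch of the kind of translation such legacy formats require. The fixed-width record layout below is entirely hypothetical, invented for illustration rather than drawn from any real system, but it shows why such data cannot be fed directly into modern analytics tooling without a parsing step:

```python
# Hypothetical example: slicing a fixed-width legacy banking record
# into structured, typed fields that analytics tools can consume.
# Field names and positions are illustrative only.

LAYOUT = [                     # (field name, start, end) offsets
    ("account_id", 0, 10),
    ("trade_date", 10, 18),    # YYYYMMDD
    ("amount",     18, 30),    # integer with two implied decimal places
    ("currency",   30, 33),
]

def parse_record(line: str) -> dict:
    """Convert one fixed-width line into a dict of named fields."""
    row = {name: line[start:end].strip() for name, start, end in LAYOUT}
    row["amount"] = int(row["amount"]) / 100  # restore implied decimals
    return row

record = "ACC000004220210115000000123456USD"
print(parse_record(record))
```

Only once records have been lifted into a structure like this can ML and analytics pipelines interrogate them alongside data from digital-first systems.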

That doesn’t mean, though, that the system is obsolete. It still has value as a repository of legacy data. From trading decisions and risk analysis to compliance and customer service, day-to-day operations in the finance industry require access to high volumes of data to create algorithmic models of market trends and behaviours. Models then need to be tested, again using the historical perspective that legacy data delivers.

However, information with enormous strategic business value is often locked, without access or structure, in unconnected databases, incompatible data formats and outdated storage.

Maximising data access with integration

Systems that are capable of being connected into the new, digitally transformed structure are worth retaining. But maximising value from the legacy data they hold relies on integration to overcome fragmentation often found between old and new systems.

Integration reduces the complexity of IT architectures to create more straightforward access and structure, regardless of which system the data resides in. The integration layer effectively connects disparate applications, processes and services onto one common platform.
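One common way to realise such a layer is an adapter pattern: each source system gets a connector that translates its native format into one canonical schema, so consumers only ever see a single structure. The sketch below is a simplified illustration under that assumption; the connector names, sample data and schema are all hypothetical:

```python
# Illustrative integration layer: every connector adapts one system's
# native output to a shared canonical schema {'id', 'balance'}.
# All class names and data here are invented for demonstration.

from abc import ABC, abstractmethod

class Connector(ABC):
    """Common interface every source system plugs into."""
    @abstractmethod
    def fetch(self) -> list[dict]:
        """Return records in the canonical {'id', 'balance'} schema."""

class LegacyMainframeConnector(Connector):
    def fetch(self) -> list[dict]:
        # Pretend these rows came from a fixed-width batch export,
        # with balances stored as integer pence.
        raw = [("C-001", "250050"), ("C-002", "990000")]
        return [{"id": i, "balance": int(b) / 100} for i, b in raw]

class ModernApiConnector(Connector):
    def fetch(self) -> list[dict]:
        # Pretend this came back as JSON from a REST endpoint.
        raw = [{"customerId": "C-003", "balancePence": 120000}]
        return [{"id": r["customerId"], "balance": r["balancePence"] / 100}
                for r in raw]

def unified_view(connectors: list[Connector]) -> list[dict]:
    """The 'common platform': one merged, consistently shaped dataset."""
    return [row for c in connectors for row in c.fetch()]

rows = unified_view([LegacyMainframeConnector(), ModernApiConnector()])
print(rows)
```

Downstream applications depend only on the `Connector` interface, so adding or retiring a system means writing or deleting one adapter rather than rewiring every consumer.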

In fact, integration is increasingly prioritised as a top three requirement in enterprise technology investments generally. That means integration across new systems as well as with legacy systems to ensure that common fields and processes are linked and data is consistent and accessible in any location, database or application.

Whether achieved using a series of coded interfaces or deployed as a simple integration-platform-as-a-service (iPaaS) layer, interoperability is the critical factor in harnessing the latent value of legacy data systems.

Once data sources are integrated into a unified source, the data can be structured to support analysis for business operations and decision making. Without integration, the value of legacy data will remain largely inaccessible.
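The payoff of that unified, consistently shaped source is that routine analysis becomes trivial. A toy illustration, using invented figures, shows records that might have originated in a legacy batch export and a modern trading API being aggregated with a few lines of standard code:

```python
# Toy example: once records from different systems share one schema,
# aggregation for decision making is straightforward. Data is invented.

from collections import defaultdict

unified = [
    {"desk": "FX", "pnl": 1200.0},    # e.g. from a legacy batch export
    {"desk": "FX", "pnl": -300.0},    # e.g. from a modern trading API
    {"desk": "Rates", "pnl": 450.0},
]

totals: dict[str, float] = defaultdict(float)
for row in unified:
    totals[row["desk"]] += row["pnl"]

print(dict(totals))  # {'FX': 900.0, 'Rates': 450.0}
```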

Digital-first financial organisations

The financial sector is traditionally an innovator in technology. It has long recognised the value that IT can deliver in competitive advantage, whether that is through faster trading processes, more accurate risk management or better customer services.

But investment in piecemeal technologies without emphasis on the importance of interoperability means that data becomes trapped in legacy systems and disconnected silos, never delivering its potential value.

With an integration layer, legacy functionality and data can be made accessible and structured for analysis by digitally-led applications, unlocking the value of big data for financial institutions.