Technology
How application integration creates value from legacy data
By Renat Zubairov, CEO and co-founder, elastic.io
Organisations in the financial sector already know that knowledge empowers better, more informed business decisions.
Much of the necessary knowledge – data – already resides or is being collected inside the company’s own infrastructure of applications and databases. The value of this mountain of data, however, relies on it being accessible and structured for analysis.
With a strong focus on digital transformation in the financial industry, these factors of accessibility and structure are often hindered by a disconnect between old (legacy) and new (digital-first) systems as organisations seek to modify and migrate.
The question is: how can companies access and create value from data collected and stored in legacy systems as they move towards digital transformation?
The operational importance of legacy data
Critical investments, policy management and trading strategies can be de-risked with the right data and analysis to determine the best course of action. Banks need historical data for modelling customer profiles to inform product offerings. Regulatory challenges from GDPR to MiFID II, PSD2 and the US Foreign Account Tax Compliance Act (FATCA) – to name just a few – have prompted operational changes and require more detailed document management to prove due diligence.
As such, the global finance industry has been amongst the early adopters of the concept of ‘big data’; the collection, processing and analysis of structured data generated through business process applications and unstructured data, such as emails, images, documents, presentations and webpages.
Although the industry has embraced the concept of ‘big data’, an estimated 97% of the data produced and collected is never put to use and therefore has no value to the organisation.
Largely, that is owing to legacy systems’ incompatibility with new, digital-first technologies.
Unlocking the value of legacy data
Whether the result of a history of mergers, acquisitions or organic growth, the typical IT infrastructure of a financial organisation was already a complex patchwork of systems, and digital transformation is now adding further complexity.
In most financial organisations, data prior to the last two to three years is likely to be stored in outdated hardware and software that doesn’t fit the new digitally transformed enterprise. Often it will be in formats that can’t easily be accessed and interrogated by modern Machine Learning (ML) and data analytics techniques.
That doesn’t mean, though, that the system is obsolete. It still has value as a repository of legacy data. From trading decisions and risk analysis to compliance and customer service, day-to-day operations in the finance industry require access to high volumes of data to create algorithmic models of market trends and behaviours. Models then need to be tested, again using the historical perspective that legacy data delivers.
However, information with enormous strategic business value is often locked, without access or structure, in unconnected databases, incompatible data formats and outdated storage.
Maximising data access with integration
Systems that are capable of being connected into the new, digitally transformed structure are worth retaining. But maximising value from the legacy data they hold relies on integration to overcome fragmentation often found between old and new systems.
Integration minimises the complexity of IT architectures to create more straightforward access and structure, regardless of which system the data resides in. The integration layer effectively acts to connect disparate applications, processes and services onto one common platform.
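To make the idea concrete, the sketch below shows, in minimal Python, what an integration layer does at its core: mapping records from two hypothetical sources – a legacy mainframe export and a modern CRM – onto one common schema so the data can be analysed together. All field names and formats here are illustrative assumptions, not any real system or the elastic.io platform itself.

```python
# Illustrative sketch: normalising records from two hypothetical legacy
# sources into one common schema. Field names and formats are assumptions.

def from_mainframe(record: dict) -> dict:
    """Map a legacy mainframe-style record onto the common schema."""
    return {
        "customer_id": record["CUST_NO"].strip(),
        "balance": float(record["BAL_AMT"]) / 100,  # legacy value stored in pence
        "source": "mainframe",
    }

def from_crm(record: dict) -> dict:
    """Map a modern CRM record onto the same common schema."""
    return {
        "customer_id": record["customerId"],
        "balance": record["accountBalance"],
        "source": "crm",
    }

def unify(mainframe_rows: list, crm_rows: list) -> list:
    """Merge both sources into one list ready for analysis."""
    return [from_mainframe(r) for r in mainframe_rows] + \
           [from_crm(r) for r in crm_rows]

unified = unify(
    [{"CUST_NO": "  00123 ", "BAL_AMT": "150000"}],
    [{"customerId": "00456", "accountBalance": 2500.0}],
)
print(unified[0])  # {'customer_id': '00123', 'balance': 1500.0, 'source': 'mainframe'}
```

In a production iPaaS deployment these per-source mappings would be configured as connectors rather than hand-coded, but the principle is the same: each system's format is translated once, at the integration layer, instead of in every consuming application.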
In fact, integration is increasingly prioritised as a top three requirement in enterprise technology investments generally. That means integration across new systems as well as with legacy systems to ensure that common fields and processes are linked and data is consistent and accessible in any location, database or application.
Whether achieved using a series of coded interfaces or deployed as a simple integration-platform-as-a-service (iPaaS) layer, interoperability is the critical factor in harnessing the latent value of legacy data systems.
Once data sources are integrated into a unified source, the data can be structured to support analysis for business operations and decision making. Without integration, the value of legacy data will remain largely inaccessible.
Digital-first financial organisations
The financial sector is traditionally an innovator in technology. It has long recognised the value that IT can deliver in competitive advantage, whether that is through faster trading processes, more accurate risk management or better customer services.
But investment in piecemeal technologies without emphasis on the importance of interoperability means that data becomes trapped in legacy systems and disconnected silos, never delivering its potential value.
With an integration layer, legacy functionality and data can be made accessible and structured for analysis by digitally-led applications, unlocking the value of big data for financial institutions.