
Poor Data Quality Is Giving Financial Institutions the Stick

Published by Jessica Weisman-Pitts

Posted on July 29, 2022


By Marc Binck, Head of Cloud Services, Gresham Technologies

There has been an increased regulatory focus on poor-quality data and reporting errors in recent years, yet it has not resulted in the real reduction regulators had hoped for. Given the ever-growing complexity of the modern data-led economy, error rates are higher than ever, and financial institutions are under pressure to clean house as regulators sharpen the teeth of their compliance edicts.

Two methods can compel financial institutions to get their regulatory compliance right: the carrot and the stick, the reward-and-punishment model familiar to all parents and pet owners.

When we think about the stick in regulatory finance, it typically comes in the form of fines imposed by National Competent Authorities (NCAs). Under MiFID II, fines are used to financially penalise firms for poor data quality and regulatory reporting errors. These fines more than quadrupled in value in 2020, reaching an aggregate €8.4 million (across 613 sanctions and measures), compared with just €1.8 million (371 sanctions and measures) the year prior. Yet while fines are increasing, data integrity and reliability have not improved in tandem.

The stick is never the sole answer to improving a market; there is a strong role for the carrot (reward) in helping financial institutions fix their challenging compliance regimes. The carrot offers a softer route to regulatory compliance: it raises a firm's awareness of the benefits of strong data integrity, such as lower costs, higher efficiency and enhanced business offerings, all of which lead to higher profit margins. Given the volatility of global trade, energy, commodities, and labour, shaving inefficiencies off the margins has never been more appealing.

These two strategies must be used together: the threat of fines alongside the benefits of strong data integrity is what regulators must leverage to help financial institutions get to the forefront of regulatory matters, hand in hand with improving their operations and service.

Growth and acquisition led to data chaos. Technology shows a way out

Smarter software engineering that streamlines and connects solutions and data sources within financial institutions is the lifeline these firms are looking for.

Some of the largest financial institutions, including UBS, BlackRock and BNP Paribas, rely on real-time data integrity and control solutions to keep their part of the global economy ticking. Financial services firms manage enormous volumes of data every day, which is why agile technology that scales is so critical. Many of these firms use Gresham's services today, specifically for flexible and scalable solutions that ensure their data is secure, compliant, auditable, and immediately useful to business users.
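To make "data integrity and control" concrete, the sketch below shows the kind of record-level reconciliation such platforms automate: matching two feeds of trades by ID and flagging breaks. It is a toy illustration only, not Gresham's product; the trade IDs, field names and tolerance are invented for the example.

```python
# Toy illustration of record-level reconciliation: match two trade feeds
# by trade ID and report breaks. All data and field names are invented;
# production platforms add volume, auditability and workflow on top.

internal_ledger = {
    "T-1001": {"quantity": 500, "price": 101.25},
    "T-1002": {"quantity": 250, "price": 99.80},
    "T-1003": {"quantity": 100, "price": 100.00},
}

counterparty_feed = {
    "T-1001": {"quantity": 500, "price": 101.25},   # clean match
    "T-1002": {"quantity": 250, "price": 99.85},    # price break
    "T-1004": {"quantity": 75,  "price": 98.50},    # missing internally
}

def reconcile(ours, theirs, price_tolerance=0.001):
    """Return (matched, breaks); breaks lists unmatched or mismatched trades."""
    matched, breaks = [], []
    for trade_id in sorted(ours.keys() | theirs.keys()):
        a, b = ours.get(trade_id), theirs.get(trade_id)
        if a is None or b is None:
            breaks.append((trade_id, "missing on one side"))
        elif a["quantity"] != b["quantity"]:
            breaks.append((trade_id, "quantity mismatch"))
        elif abs(a["price"] - b["price"]) > price_tolerance:
            breaks.append((trade_id, f"price break: {a['price']} vs {b['price']}"))
        else:
            matched.append(trade_id)
    return matched, breaks

matched, breaks = reconcile(internal_ledger, counterparty_feed)
print(f"{len(matched)} matched, {len(breaks)} breaks")
for trade_id, reason in breaks:
    print(f"  {trade_id}: {reason}")
```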

Large institutions require efficient technology with minimal configuration complexity and little overhead to win the data-complexity battle. Gresham itself is a microcosm of the challenges large organisations grow into: through various acquisitions over the years, it accumulated a handful of technology stacks that eventually needed standardising to increase productivity and serve customers better. We had become a complex landscape, and it made sense to integrate our teams more closely so we could deliver stronger customer solutions for increasingly demanding and highly regulated financial markets.

Standardising Gresham's continuous integration / continuous deployment (CI/CD) platform was the long-term solution for enhancing delivery and reducing overheads, and the team saw immediate improvements, including faster test runs and lower execution costs.

Financial institutions can invest in modern software architectures, high-performance processing technologies, AI and robotics, on-premise or in public clouds, but without a handle on software engineering and delivery, errors, inefficiencies, and vulnerabilities will creep in from the ground up. CI/CD should be a priority for any institution seeking agility and control over its service delivery.

Our own 'aha' moment came when partnering with CI/CD provider CircleCI. We wanted to trigger jobs from different software workflow definitions and easily script build launches with all the options we needed. Picking the right platforms and solutions makes managing the complexity of our own solution, and ultimately enabling our customers to manage their data complexities, simple. We build across Linux, Mac, and Windows in an agile environment because we have put processes in place to remove technical debt and de-risk how we build and maintain services.
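As a rough sketch of what scripting build launches can look like against CircleCI's public v2 REST API: the project slug, branch, and pipeline parameter names below are placeholders for illustration, not Gresham's actual configuration.

```python
# Minimal sketch: trigger a CircleCI pipeline with custom parameters via
# the v2 REST API. The project slug, branch, and parameter names are
# hypothetical; real values depend on your .circleci/config.yml.
import os

import requests

CIRCLE_API = "https://circleci.com/api/v2"
PROJECT_SLUG = "gh/example-org/example-repo"  # placeholder project


def trigger_pipeline(branch: str, parameters: dict) -> str:
    """Launch a pipeline on `branch` with `parameters`; return its ID."""
    resp = requests.post(
        f"{CIRCLE_API}/project/{PROJECT_SLUG}/pipeline",
        headers={"Circle-Token": os.environ["CIRCLECI_TOKEN"]},
        json={"branch": branch, "parameters": parameters},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["id"]


if __name__ == "__main__":
    # Example: launch only the Windows build with extended tests enabled.
    pipeline_id = trigger_pipeline(
        branch="main",
        parameters={"target_os": "windows", "run_extended_tests": True},
    )
    print(f"Triggered pipeline {pipeline_id}")
```

Driving the pipeline through parameters like these is one way a single workflow definition can serve several build variants, which is the kind of flexibility described above.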

All institutions must find their own way of getting to grips with their IT and data environments. The security of data and business applications, the drag of legacy technologies, and the work of streamlining and debugging software all contribute to slow growth and increased regulatory risk. De-risking technology provides a way to the 'carrot', and out from under the threat of the stick.
