Martijn Groot, VP Product Strategy at Asset Control
In a bid to drive down costs and improve business agility in the face of fast-evolving digital competition, many financial services organisations are rationalising IT infrastructure. Yet the focus is as much external as internal, with banks looking to facilitate partnerships with the burgeoning fintech ecosystem as well as enhance direct customer interaction.
IT is, therefore, now also tasked with automating an end-to-end supply chain involving different service providers – and that means finding a way to expose data to these third parties easily yet securely. API teams are focused on integrating and decommissioning legacy applications, and Data Stewards are tasked with eradicating line-of-business duplication; yet information consumers across the business are still constrained by both data silos and proprietary data discovery methods, while the creation of links to external providers is fraught with risk. Where is the data governance? The ability to manage permissions, comply with data privacy laws, adhere to content license agreements or safeguard commercially sensitive information?
Martijn Groot, VP Product Management, Asset Control outlines the importance of a mature data governance model that empowers both internal and external business users through secure, managed self-service access to trusted, cross-functional information resources.
IT rationalisation has become a major focus for financial services firms over the past couple of years – from Deutsche Bank’s Strategy 2020 which includes modernising outdated and fragmented IT architecture, including the reduction of operating systems, hardware and software applications, to HSBC’s Simplify the Bank plan which includes an architecture-led strategy to halve the number of applications across the whole group over a 10-year period.
This emphasis on streamlining complex infrastructure is being driven by the new competitive and regulatory landscape. It has become very clear over the past decade that continuing with line of business data silos has become a significant risk, not only given the cost of regulatory compliance, with its demands for cross-sectional reporting, but also the implications for speed of business change.
As a result, a key part of this rationalisation process has been investment in APIs to enable interoperability between applications and, ideally, support the eradication of duplicate applications. However, while many organisations have appointed Data Stewards with a remit to determine data and application requirements across specific business functions, the siloed mentality remains due to a lack of data governance maturity. From cost reduction to business agility, the success of any application rationalisation or data supply chain improvement project will require significantly improved models for data governance.
Consistent Data Model
At the same time, of course, the business focus is turning increasingly outward, as organisations recognise the importance of the new financial ecosystem. IT is not only tasked with rationalisation but also moving away from individual process automation to automating an end-to-end supply chain involving different service providers.
With a need to expose data to the new fintech partners, as well as customers, many banks are putting in place their own API marketplaces through which they expose their data to selected third parties. While such changes in the retail market are being driven in the EU by the revised Payment Services Directive (PSD2), corporate products in cash, foreign exchange, liquidity and finance data will also demand new APIs.
Given this demand for openness both internally and externally, a common, cross-application taxonomy of products and services and a uniform data dictionary are clearly important. Without these, services could still be added on top of existing infrastructure, but the integration would be brittle and error-prone, falling short of the quality levels and interaction speeds clients expect.
But this model has to go further. Consolidating data from different sources, mastering it, subjecting it to quality controls and creating a common data model is a great start – but how is that data being consumed? Requiring business users to rely on the IT team to use proprietary APIs to gain access to this data makes for a steep and costly learning curve, undermines data value and compromises both the IT rationalisation vision and the creation of a successful financial ecosystem.
Mobilised and Empowered
It is essential to empower business users to explore and exploit this consistent information resource, not only to meet regulatory demands but to support business change. Replacing proprietary tools for data access and discovery with industry-standard APIs – such as a Representational State Transfer (REST) API – will simplify the integration of standard data discovery tools. In addition, the use of a standard data schema within a datamart will provide a shared understanding of terminology, definitions and values. The combination of a standard data model with a REST API will enable business users to gain access to this golden copy repository in a lightweight fashion – without reliance on the intervention of IT.
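As a minimal sketch of what such lightweight, self-service access might look like, the snippet below constructs a REST query against a hypothetical golden-copy datamart endpoint and parses a response shaped by a shared schema. The URL, dataset name and field names are illustrative assumptions, not any real product's API.

```python
import json
from urllib.parse import urlencode

# Hypothetical golden-copy datamart endpoint (illustrative only)
BASE_URL = "https://datamart.example.bank/api/v1"

def build_query(dataset, fields, filters):
    """Construct a REST query URL selecting specific fields of a
    dataset, filtered on attributes from the shared data schema."""
    params = {"fields": ",".join(fields), **filters}
    return f"{BASE_URL}/{dataset}?{urlencode(params)}"

# A business user selects instrument reference data on demand,
# without IT writing bespoke integration code.
url = build_query("instruments",
                  ["isin", "issuer", "currency"],
                  {"currency": "EUR"})

# A response following the shared schema might look like this:
sample_response = json.loads(
    '{"items": [{"isin": "DE0001102580", '
    '"issuer": "Bund", "currency": "EUR"}]}'
)
```

Because both the query parameters and the response shape follow a common data dictionary, any standard data discovery tool that speaks HTTP and JSON can consume the same resource.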
Opening up a single, consistent data source to business users via standardised, self-service technologies is transformative. A simple browser-based interface that enables business users to select the required data on demand, with the addition of formatting and frequency tools, effectively opens up the data asset to drive new value. Data can be accessed, integrated into other systems and/or explored via standard data discovery tools – all without any complex proprietary Java-based tools.
Obviously this model has to be controlled – from avoiding a data deluge to ensuring confidentiality is maintained, the data cannot be left open to everyone. The ability to manage permissions for service providers, internal users and customers is essential if the organisation is to ensure compliance with data privacy laws, adherence to content license agreements and protection of commercially sensitive information. A REST API should include the ability to restrict access to specific data, to avoid exposing data to users who are not permitted to see it due to license constraints or data sensitivity.
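A permissioning layer of the kind described could, in its simplest form, map roles to entitled datasets and withhold sensitive or license-restricted content from external parties. The roles, dataset names and rules below are hypothetical assumptions for illustration only, not a real entitlement system.

```python
# Hypothetical entitlements: which datasets each role may query
ENTITLEMENTS = {
    "internal_analyst": {"prices", "reference", "positions"},
    "fintech_partner": {"reference"},      # licensed content excluded
    "customer": {"own_accounts"},
}

# Commercially sensitive datasets, never exposed externally
SENSITIVE = {"positions"}

def authorise(role, dataset):
    """Return True only if the role is entitled to the dataset and
    the dataset is not sensitivity-restricted for external parties."""
    allowed = ENTITLEMENTS.get(role, set())
    if dataset in SENSITIVE and role != "internal_analyst":
        return False
    return dataset in allowed
```

In practice such checks would sit in the API gateway in front of the datamart, so that every request – internal, partner or customer – is filtered against the same policy before any data leaves the golden copy.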
With the right security measures in place, information that would have taken business users weeks to access while waiting for IT can now be discovered and reported on in days. Given the increasing need for reports – both regulatory and data discovery to support business change – this self-service access to trusted, standardised data is key. In addition to reducing the cost of business change, the use of the REST API also enables a simple, lightweight integration that reduces infrastructure costs – and avoids the need for expensive and highly trained experts.
The regulatory reporting requirements that have evolved over the past decade may have put the spotlight on the endemic, silo-based infrastructure model, but it has become very clear to the financial services industry that if operational costs are to be reduced, IT rationalisation is an imperative. At the same time, an integrated financial ecosystem is becoming vital in both retail and corporate markets. Without a mature data governance model that leverages new enablers, including APIs and standard data dictionaries, organisations will struggle to realise both rationalisation and extension goals.
To realise the new vision of agile, simplified financial services business models that are competitive in new digital markets, organisations need to not only create a centralised data source but also explore new standardised technologies to mobilise data and empower users throughout the business and beyond.
TCI: A time of critical importance
Fabrice Desnos, Head of Northern Europe Region at Euler Hermes, the world’s leading trade credit insurer, outlines the importance of less-publicised measures for the journey ahead.
After months of lockdown, Europe is shifting towards rebuilding economies and resuming trade. Amongst the multibillion-euro stimulus packages provided by governments to businesses to help them resume their engines of growth, the cooperation between the state and private sector trade credit insurance underwriters has perhaps missed the headlines. However, this cooperation will be vital when navigating the uncertain road ahead.
Covid-19 has created a global economic crisis of unprecedented scale and speed. Consequently, we’re experiencing unprecedented levels of support from national governments. Far-reaching fiscal intervention, job retention and business interruption loan schemes are providing a lifeline for businesses that have suffered reductions in turnovers to support national lockdowns.
However, it’s becoming clear the worst is still to come. The unintended consequence of government support measures is to delay the inevitable fallout in trade and commerce. Euler Hermes is already seeing an increase in claims for late payments and expects this trend to accelerate as government support measures are progressively removed.
The Covid-19 crisis will have long-lasting and sometimes irreversible effects on a number of sectors. It has accelerated transformations that were already underway and has radically changed the landscape for a number of businesses. This means we are seeing a growing number of “zombie” companies, currently on life support, but whose business models are no longer adapted to the post-crisis world. All of these factors add up to what is best described as a corporate insolvency “time bomb”.
The effects of the crisis are already visible. In the second quarter of 2020, 147 large companies (those with a turnover above €50 million) failed, up from 77 in the first quarter and compared with 163 for the whole of the first half of 2019. Retail, services, energy and automotive were the most impacted sectors this year, with the hotspots in retail and services in Western Europe and North America, energy in North America, and automotive in Western Europe.
We expect this trend to accelerate and predict a +35% rise in corporate insolvencies globally by the end of 2021. European economies will be among the hardest hit. For example, Spain (+41%) and Italy (+27%) will see the most significant increases – alongside the UK (+43%), which will also feel the impact of Brexit – compared to France (+25%) or Germany (+12%).
Companies are restarting trade, often providing open credit to their clients. However, there can be no credit if there is no confidence. It is increasingly difficult for companies to distinguish the clients that will emerge from the crisis from those that won’t, and to know whether or when they will be paid. In the immediate post-lockdown period, without visibility and confidence, the risk was that inter-company credit could evaporate, placing an additional liquidity strain on the companies that depend on it. This, in turn, would significantly put at risk the speed and extent of the economic recovery.
In recent months, Euler Hermes has co-operated with government agencies, trade associations and private sector trade credit insurance underwriters to create state support for intercompany trade, notably in France, Germany, Belgium, Denmark, the Netherlands and the UK. All with the same goal: to allow companies to trade with each other in confidence.
By providing additional reinsurance capacity to the trade credit insurers, governments help them continue to provide cover to their clients at pre-crisis levels.
The beneficiaries are the thousands of businesses – clients of credit insurers and their buyers – that depend upon intercompany trade as a source of financing. Over 70% of Euler Hermes policyholders are SMEs, which are the lifeblood of our economies and major providers of jobs. These agreements are not without costs or constraints for the insurers, but the industry has chosen to place the interests of its clients and of the economy ahead of other considerations, mindful of the important role credit insurance and inter-company trade will play in the recovery.
Taking the UK as an example, trade credit insurers provide cover for more than £171 billion of intercompany transactions, covering 13,000 suppliers and 650,000 buyers. The government has put in place a temporary scheme of £10 billion to enable trade credit insurers, including Euler Hermes, to continue supporting businesses at risk due to the impact of coronavirus. This landmark agreement represents an important alliance between the public and private sectors to support trade and prevent the domino effect that payment defaults can create within critical supply chains.
But, as with all of the other government support measures, these schemes will not exist in the long term. It is already time for credit insurers and their clients to plan ahead, and prepare for a new normal in which the level and cost of credit risk will be heightened and where identifying the right counterparts, diversifying and insuring credit risk will be of paramount importance for businesses.
Trade credit insurance plays an understated role in the economy but is critical to its health. In normal circumstances, it tends to go unnoticed because it is doing its job. Government support schemes helped maintain confidence between companies and their customers in the immediate aftermath of the crisis.
However, as government support measures are progressively removed, this crisis will have a lasting impact: accelerating transformations, driving an increasing number of company restructurings and, in all likelihood, raising the level of credit risk. To succeed in the post-crisis environment, businesses have to move fast from resilience to adaptation. They have to adopt bold measures to protect themselves against future crises (or another wave of this pandemic), minimise risk, and drive future growth. By maintaining trust to trade, with or without government support, credit insurance will have an increasing role to play in this.
What Does the FinCEN File Leak Tell Us?
By Ted Sausen, Subject Matter Expert, NICE Actimize
On September 20, 2020, just four days after the Financial Crimes Enforcement Network (FinCEN) issued a much-anticipated Advance Notice of Proposed Rulemaking, the financial industry was shaken, and banks’ stock prices saw significant declines when the markets opened on Monday. So what caused this? BuzzFeed News, in cooperation with the International Consortium of Investigative Journalists (ICIJ), released what is now being tagged the FinCEN Files. These files and summarized reports describe over 200,000 transactions, totalling over US$2 trillion, that were reported to FinCEN as suspicious in nature between 1999 and 2017. BuzzFeed obtained over 2,100 Suspicious Activity Reports (SARs) and over 2,600 confidential documents financial institutions had filed with FinCEN over that span of time.
Similar leaks have occurred previously, such as the Panama Papers in 2016, in which over 11 million documents belonging to a Panamanian law firm, containing personal financial information on over 200,000 entities, were released. This was followed a year and a half later by the Paradise Papers in 2017, a leak that contained even more documents and named more than 120,000 persons and entities. Three factors make the FinCEN Files leak significantly different from those. First, these are highly confidential documents leaked from a government agency. Secondly, they weren’t leaked from a single source: the documents came from nearly 90 financial institutions facilitating financial transactions in more than 150 countries. Lastly, while some high-profile names were released, the focus of this leak centered more on the transactions themselves and the financial institutions involved, not necessarily the names of individuals.
FinCEN Files and the Impact
What does this mean for the financial institutions? As mentioned above, many experienced a negative impact to their stock prices. The next biggest impact is reputational. Leaders of the highlighted institutions do not enjoy having potential shortcomings in their operations exposed, nor do customers appreciate seeing the institution that manages their funds portrayed adversely in the media.
Where did the financial institutions go wrong? Based on the information available, it is actually hard to say where they went wrong, or even ‘if’ they went wrong. Financial institutions are obligated to monitor transactional activity, both inbound and outbound, for suspicious or unusual behavior, especially activity that could relate to illicit money laundering. If such behavior is identified, the financial institution is required to complete a Suspicious Activity Report, or SAR, and file it with FinCEN. The SAR contains all relevant information such as the parties involved, transaction(s), account(s), and details describing why the activity is deemed suspicious. In some cases, financial institutions will file a SAR even where there is no direct suspicion, if no logical explanation for the activity can be found.
So what deems certain activities suspicious, and how do financial institutions detect them? Most financial institutions have sophisticated solutions in place that monitor transactions over a period of time, determine typical behavioral patterns for each client, and compare that client to their peers. If any activity falls disproportionately beyond those norms, the financial institution is notified and an investigation is conducted. Because this detection incorporates multiple transactions and compares them to historical “norms”, it is very difficult to stop a transaction related to money laundering in real time. It is not uncommon for a transaction or series of transactions to occur and only later be identified as suspicious, with a SAR filed after the transaction has been completed.
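The history-based monitoring described above can be illustrated with a minimal sketch: a z-score test that flags a transaction amount falling far outside a client's historical norm. The threshold, sample data and function names are illustrative assumptions, not any vendor's actual detection logic, which would combine many more signals including peer-group comparison.

```python
from statistics import mean, stdev

def flag_unusual(history, new_amount, z_threshold=3.0):
    """Flag a transaction whose amount deviates from the client's
    historical norm by more than z_threshold standard deviations."""
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        # No historical variation: anything different is unusual
        return new_amount != mu
    return abs(new_amount - mu) / sigma > z_threshold

# A client's recent transaction amounts (illustrative)
history = [120, 95, 130, 110, 105, 98, 125, 115]

flag_unusual(history, 5000)   # far beyond the client's norm: flagged
flag_unusual(history, 118)    # consistent with past behavior: not flagged
```

Note that the test needs a window of past transactions to establish the norm, which is exactly why, as described above, such detection tends to run after the fact rather than blocking a payment in flight.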
FinCEN Files: Who’s at Fault?
Going back to my original question: was there any wrongdoing? In this case, the institutions were doing exactly what they were required to do. When suspicion was identified, SARs were filed. There are two things that are important to note. Suspicion does not equate to guilt, and individual financial institutions have a very limited view of the overall flow of funds. They have visibility of where funds are coming from, or where they are going to; however, they don’t have an overall picture of the original source or the final destination. The area where financial institutions may be at fault is where multiple suspicions or probable guilt are found, but they fail to take appropriate action. According to BuzzFeed News, instances of transactions to or from sanctioned parties occurred, and known suspicious activity was allowed to continue after it was discovered.
How do we do better? First and foremost, FinCEN needs to identify the source of the leak and fix it immediately. This is very sensitive data. Even within a financial institution, this information is only exposed to individuals with a high-level clearance on a need-to-know basis. This leak may result in relationship strains with some of the banks’ customers. Some people already have a fear of being watched or tracked, and the public revelation that all these reports are being filed by financial institutions with the federal government won’t make that any better – especially if their financial institution was highlighted as one of those filing the most reports. Next, there has been more discussion around real-time AML. Many experts are still working on defining what that truly means, especially when some activities involve multiple transactions over a period of time; however, there is definitely a place for holding certain money laundering transactions in real time.
Lastly, the ability to share information between financial institutions more easily would go a long way in fighting financial crime overall. For those of you who are AML professionals, you may be thinking we already have such a mechanism in place with Section 314(b). However, the feedback I have received is that it does not do an adequate job: it is voluntary, and getting responses to requests can be a challenge. Financial institutions need a consortium through which to communicate effectively with each other, exchanging the critical data needed to see the complete picture of financial transactions and all associated activities. That, combined with some type of feedback loop from law enforcement indicating which SARs are “useful” versus which are either “inadequate” or “unnecessary”, would allow institutions to focus on the cases where criminal activity is really occurring.
We will continue to post updates as we learn more.
How can financial services firms keep pace with escalating requirements?
By Tim FitzGerald, UK Banking & Financial Services Sales Manager, InterSystems
Financial services firms are currently coming up against a number of critical challenges, ranging from market volatility, most recently influenced by COVID-19, to the introduction of regulations, such as the Payment Services Directive (PSD2) and Fundamental Review of the Trading Book (FRTB). However, these issues are being compounded as many financial institutions find it increasingly difficult to get a handle on the vast volumes of data that they have at their disposal. This is no surprise given that IDC has projected that by 2025, the global “datasphere” will have grown to a staggering 175 zettabytes of data – more than five times the amount of data generated in 2018.

As an industry that has typically only invested in new technology when regulations deem it necessary, many traditional banks are now operating using legacy systems and applications that haven’t been designed or built to interoperate. Consequently, banks are struggling to leverage data to achieve business goals and to gain a clear picture of their organisation and processes in order to comply with regulatory requirements.

These challenges have been more prevalent during the pandemic as financial services firms were forced to adapt their operations to radical changes in customer behaviour and increased demand for digital services – all while working largely remotely themselves.
As more stringent regulations come into play and financial services firms look to keep pace with escalating requirements from regulators, consumer demand for more online services, and the ever-evolving nature of the industry and world at large, it’s vital they do two things. Firstly, they must begin to invest in the technology and processes that will allow them to more easily manage the data that traditional banks have been collecting and storing for upwards of 50 years. Secondly, they must innovate. For many, the COVID-19 pandemic will have been a catalyst for both actions. However, the hard work has only just begun.
Traditionally, due to tight budgets and no overarching regulatory imperative to change, financial institutions haven’t done enough to address their overreliance on disconnected legacy systems. Even when faced with the new wave of regulation that was implemented in the wake of the 2008 banking crash, financial services organisations generally only had to invest in different applications on an ad hoc basis to meet each individual regulation. However, as new regulations require the analysis of larger data sets within smaller processing windows, breaking down any and all data siloes is essential and this will require financial institutions that are still reliant on legacy systems to implement new technologies to meet the regulatory stipulations.
With this in mind, solutions which offer high-quality data analytics and enhanced integration will be key to the success of financial institutions and crucial to eliminating data silos. This will enable organisations to achieve faster and more accurate analysis of real-time and historical data, no matter where within the business that data resides, within smaller processing windows to keep pace with regulatory requirements, while also benefiting from low infrastructure costs.
This technology will also play a huge part in helping financial institutions scale their online operations to meet demand from customers for digital services. During the pandemic, PNC Bank reported that it saw online sales jump from 25% to 75%. Therefore, having data platforms that are able to handle surges in online activity is becoming increasingly important.
Real-time analysis of data
While the precise solution financial services institutions need will differ by organisation, broadly speaking, the more data they are storing on legacy solutions, the more they will require an updated data platform that can handle real-time analytics. Even organisations that have fewer legacy systems are still likely to require solutions that deliver enhanced interoperability to help provide a real-time view across the business and enable them to meet the pressing regulatory requirements they face. Let’s also not lose sight of the fact that moving transactional data to a data warehouse, data lake, or any other silo will never deliver real-time analytics; it is therefore completely inappropriate for businesses to make risk decisions on such data while believing it to be real-time.
As such, financial services firms require a data platform that can ingest real-time transactional data, as well as from a variety of other sources of historical and reference data, normalise it, and make sense of it. The ability to process transactions at scale in real-time and simultaneously run analytics using transactional real-time data and large sets of non-real-time data, such as reference data, is a crucial capability for various business requirements. For example, powering mission-critical trading platforms that cannot slow down or drop trades, even as volumes spike.
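As an illustration of the ingest, normalise and enrich pattern described above, the sketch below maps a raw trade message onto a common model and joins it with non-real-time reference data so a single record can feed both transactional processing and analytics. All identifiers, field names and the in-memory reference table are hypothetical assumptions for illustration; a production platform would perform this against streaming infrastructure at far greater scale.

```python
# Hypothetical static reference data keyed by instrument identifier
REFERENCE = {
    "DE0001102580": {"asset_class": "govt_bond", "currency": "EUR"},
    "US0378331005": {"asset_class": "equity", "currency": "USD"},
}

def normalise(raw_trade):
    """Map a raw, loosely-typed trade message onto the platform's
    common model: cleaned identifiers and proper numeric types."""
    return {
        "isin": raw_trade["id"].strip().upper(),
        "quantity": int(raw_trade["qty"]),
        "price": float(raw_trade["px"]),
    }

def enrich(trade):
    """Join the live trade with reference data, producing one record
    usable by both the trading platform and real-time analytics."""
    ref = REFERENCE.get(trade["isin"], {})
    return {**trade, **ref}

# A raw message arrives with inconsistent casing and string fields
record = enrich(normalise({"id": "us0378331005 ",
                           "qty": "100", "px": "172.5"}))
```

The key design point is that normalisation happens once, on ingest, so every downstream consumer – risk, regulatory reporting, analytics – works from the same consistent record rather than re-interpreting raw feeds.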
Not only will having access to real-time data enable financial institutions to meet evolving regulatory requirements, but it will also allow them to make faster and more accurate decisions for their organisation and customers. With many financial services firms operating on a global basis, this is vital to help them keep up not only with evolving regulations but also changing circumstances in different markets in light of the pandemic. This data can also help them understand how to become more agile, help their employees become productive while working remotely, and how to build up operational resilience. These insights will also be vital as financial institutions need to consider the likelihood of subsequent waves of the virus, allowing them to gain a better understanding of what has and hasn’t worked for their business so far.
The financial services sector is fast-paced and ever-changing. With the launch of more digital-only banks, traditional institutions need to innovate to avoid being left behind, and COVID-19 has only highlighted this further. With more than a third (35%) of customers increasing their use of online banking during this period, it is those banks and financial services firms with a solid online offering that have been best placed to answer this demand.

As financial institutions cater to changing customer requirements, both now and in the future, implementing new technology that provides access to data in real time will help them to uncover the fresh insights needed to develop new and transformative products and services for their customers. In turn, this will enable them to realise new revenue streams and potentially capture a bigger slice of the market. For instance, access to data will help banks better understand the needs of their customers during periods of upheaval, as well as under normal circumstances, allowing them to target customers with the specific services they may need in each of these periods – not only helping customers through difficult times but also ensuring the growth of the business. As financial institutions look not only to keep pace with but also to gain an advantage over their competitors, using data to fuel excellent customer experiences will be essential to success.
With the current economic uncertainty and market volatility, it’s critical that financial services firms are able to meet the changing requirements coming from all angles. With COVID-19 likely to be the biggest catalyst for financial institutions to digitally transform, they will be better able to cater to rapidly evolving landscapes and prepare for continued periods of remote working. As they look to achieve this, replacing legacy systems with innovative and agile technology solutions will be crucial to ensure they can gain the accurate and complete view of their enterprise data they need to comply with new and changing regulations, and better meet the needs of consumers in an increasingly digital landscape, whether they are located in an office or working remotely.