Craig Talbot, Global Head Trading & Connectivity at Hatstand
Recent years have seen unprecedented changes to the technical infrastructure of financial institutions. Many of these changes have been driven by regulatory mandates drawn up in response to the financial crisis of 2007/8. As the Global Systemically Important Banks (G-SIBs) battle to comply with the January 2016 deadline of the Basel Committee standard BCBS239, it is interesting to reflect on how many of them have had the time or the appetite to review whether the early implementations and changes made to their systems are fit for business.
With falling revenues, fewer clients and changes in market behaviour, those banks that have regularly reviewed their business strategy, with mindful consideration of their accumulated technology debt, will be ahead of the competition once the regulatory dust settles.
After the Global Financial Crisis of 2007/8 it became apparent that the risk exposure data that banks relied on to make decisions was compromised; the data was inaccurate. The banks did not have the technology in place to aggregate and report such data quickly. Each bank subsequently reviewed how data was collected, managed and reported. This was followed by a series of internal projects to aggregate the data, fill in the gaps and report the risk exposure.
There is no doubt that huge business and technical efforts have been made to strengthen the banks’ risk data aggregation capabilities and internal risk reporting practices. However, in January 2015 the Basel Committee on Banking Supervision published its progress report showing that nearly half of the 31 G-SIBs said they would not be fully compliant by the January 2016 deadline, being unable to comply with at least one of the 14 Principles of BCBS239.
Furthermore, the banks have reported their reliance in part on manual workarounds to achieve their existing level of compliance. In the early days of aggregating risk data, it is understandable why manual workarounds or bespoke conversions were implemented. The lack of a company-wide governance policy for trading systems in general has resulted in a diverse landscape of gateway connections for trading and market data, protocols, vendors and standards across different departments linked to trading activities or asset classes.
Given the diverse, disparate and decentralised starting point from which banks have been building their compliant risk systems, it is unsurprising that they have been accumulating considerable technology debt. Technology debt in this context is the accumulation of outdated systems still in use or where the original reasons for implementation have changed and the product is no longer fit for purpose. It could be software code itself or the inappropriate or outdated use of workarounds implemented around and including trading technology applications.
Repayment of this debt can be the overall cost of replacing outdated or inappropriately used systems, the price of which would increase the longer it is left. It can also be the direct financial losses incurred as a result of system mistakes, trading and reporting errors, as well as fines. Furthermore, debt can be paid in client perception and reputational loss as well as lost opportunities due to excessive rigidity.
Tactical mitigation and manual workarounds cannot always be avoided when core components are deeply embedded and cannot easily be changed. Technical managers are aware of the limitations of each workaround but there must be a clear way to quantify each and every one when communicating to the business sponsors. Part of the business development strategy must include future operational risk visibility incorporating the accumulation of technology and process debt. This debt could be very costly and damaging to the business in the future.
Revolution not Evolution
The post 2007/8 financial crisis regulatory changes have applied a form of evolutionary pressure on the banks. Their response has been to make small changes to the organism with adaptations of the existing technical infrastructure in order to survive in the new environment.
Like evolution, the regulatory changes are indiscriminate. This will result ultimately in some banks evolving into dominant species. Other banks may become extinct – at least in some areas of business. Some banks, perhaps the ones with smaller and simpler systems, may realise that building workarounds or bespoke patches for old legacy systems in small evolutionary steps would ultimately accumulate an unacceptably high level of technology debt.
A strategic vision is needed to take fewer and bolder revolutionary steps that accumulate less technology debt to leave systems robust, flexible and reliable. The systems are then better able to cope with any future regulatory, business and technical pressures.
Realising where in the complex trading infrastructure the revolutionary change points are is a challenge. Focusing on reducing the system complexity and increasing centralisation is key to uncovering them.
Much effort has been made to identify the many touch points within trading systems from where to collect the relevant pre-trade and post-trade risk data. This can come from multiple single-market trading servers and their risk control modules. Data can also come in different formats and from decentralised and regional locations. The pre-trade risk modules that feed their data into aggregated databases sit side-by-side or in-line with the order-entry application server, meaning each one is blissfully unaware of the trading activities of the other markets. Some vendor or internally-built trading systems can support a truly globalised real-time order book. However, implementing such an infrastructure in the past would have been unnecessarily expensive, added unacceptable latency and would have lacked the governance practices to control it.
When reviewing risk architecture for compliance, the focus has been on data collection and reporting. A tactical rather than strategic approach was initially taken, doing what was required to tick the regulatory boxes. As compliance education within financial institutions matures, a need has arisen to review the technology that controls risk at the execution gateways. It is important to gauge how fit for purpose it is now and in the future and how much of this technology is good enough, and will not result in unacceptable levels of technology debt.
None of the 14 principles of BCBS239 cover automatically adjusting trading risk monitoring tools in real time. They do focus on risk reporting. As more time is spent looking at risk systems and exposure, the next logical step would be to automate risk sentinel parameters. These risk sentinels are the trading risk modules that have the power to stop trades if limit thresholds have been breached.
With more centralisation and a desire for simplification, banks will look to implement ‘smart’ risk sentinels to replace the outdated, siloed and localised ‘dumb’ risk sentinels that only know their market in isolation and can only take modified limit instructions from human-controlled administration screens. While some trading risk modules have open Application Programming Interfaces (APIs) or can read global order book databases, there are historical reasons why the technology has not matured or been implemented. Reasons can include cost, practical re-architecting, governance and latency issues.
With the wealth of risk data now available and better governance across trading systems, the ability to automate trade risk modules in real-time across multiple exchanges and currencies is now possible. The alternative is to keep relying on high touch and slow human interventions.
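As a minimal sketch of what such a ‘smart’ sentinel might do (the class, limit figures and currency keys below are all hypothetical, not taken from any vendor system), the idea is to hold one global view of net exposure per currency across all markets and reject any order that would breach a centrally administered limit, in contrast to siloed modules that only see their own market:

```python
from dataclasses import dataclass, field

@dataclass
class RiskSentinel:
    """Illustrative centralised risk sentinel: tracks net exposure
    per currency across all markets and blocks orders that would
    breach a centrally administered limit."""
    limits: dict                                   # currency -> max net exposure
    exposure: dict = field(default_factory=dict)   # running net exposure

    def check_order(self, currency: str, notional: float) -> bool:
        """Accept the order only if the projected exposure stays in limit."""
        projected = self.exposure.get(currency, 0.0) + notional
        if abs(projected) > self.limits.get(currency, 0.0):
            return False                           # breach: order rejected
        self.exposure[currency] = projected        # accepted: update global view
        return True

sentinel = RiskSentinel(limits={"USD": 1_000_000})
assert sentinel.check_order("USD", 600_000)       # accepted
assert not sentinel.check_order("USD", 500_000)   # would breach the 1m limit
```

Because every market routes through the same exposure view, the second order is rejected even though each order individually sits under the limit, which is precisely what a siloed per-market module cannot do.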
There is no doubt that once the regulatory dust settles, the sheer volume of data available to analysts and sponsors will be unrecognisable in comparison with five years ago. Just how useful this data is in helping each business make informed decisions will depend on the method adopted when building the data management systems. Those that adopted the revolutionary or centralised methodology will be much better placed than those burdened with the evolutionary decentralised approach.
Creating a robust and centralised data aggregation facility is essential to identify risk exposure. However, it can also be used to gather trade lifecycle data, client connection and trading behaviour data in addition to risk data. Together this can all help to develop a clear strategy.
Robust quality assurance processes in data collection and centralised governance on standardisation and data protocols will help to reveal previous risk patterns and allow early alerts to be flagged when fledgling patterns emerge. This will provide business decision makers with a new tool to review and assess the trading practices of each client and desk.
Changing market conditions have also brought to the surface the need to know at a granular level who is trading what and how often and, equally importantly, how much each type of instrument costs to service in respect of market fees together with database and infrastructure hosting. Banks will already have done some divesting to drive down capital costs. A clear picture of what each client does, how they trade and how profitable each market, segment and instrument is could result in ‘trimming off the fat.’ Decisions on which clients to drop, and which instruments, markets or segments to drop or invest in, would rely in part on the business intelligence collected.
A strong, symbiotic relationship between business and technical strategies is key when embarking on a programme to identify revolutionary change points in what could already be a highly developed and evolved technical architecture. A strong understanding of the already accumulated technology debt is needed together with a clear appreciation of how this will cripple further progress and apply excessive rigidity to the systems. Principle 6 of BCBS239 states that risk systems should be adaptable. Manual workarounds built on decentralised systems cannot truly be described as flexible and adaptive.
Nobody knows when the tide of regulatory pressures will ease but it will be the banks with the clearest, most progressive strategies built on the solid foundations of simplified, centralised systems and governance with minimal technology debt that will be ahead of the competition.
Why insurance needs Tesla’s autopilot too
By Christian Wiens, CEO of Getsafe
Digitization is the industrial revolution of the 21st century. What does this mean for a data-driven industry like insurance? The answer is simple: turn everything on its head and reinvent yourself under high pressure. The future of insurance is digital.
“Hello Timo, nice to see you. I’ll be glad to help you.” Carla records claims 24 hours a day, seven days a week and takes less than two minutes to evaluate and process them. Carla works for a digital insurer and is a chatbot by profession. While she is answering Timo, she contacts the bank in the background, which pays Timo back his money – the same day. This is not a dream, but already reality.
In the digital age, intelligent machines are the new workers on the assembly line, and data is the new raw material. This applies to almost all industries, and in particular to the insurance world, as insurance is based on mathematical models and probability calculations – in short, on data. The more data on which the calculations are based, the easier it is to derive and price risk profiles. Data therefore changes the core of the product “insurance” in three essential areas: the offer phase, the claims process and the long-term customer relationship.
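To make the link between data and pricing concrete, here is a deliberately simplified sketch of expected-loss premium pricing, with entirely hypothetical figures; real actuarial models are far richer, but better data is what sharpens the claim-probability and average-claim estimates this calculation rests on:

```python
def annual_premium(claim_prob: float, avg_claim: float,
                   loading: float = 0.25) -> float:
    """Pure premium is expected loss (claim probability x average claim);
    a loading is added on top for costs and profit margin."""
    pure_premium = claim_prob * avg_claim
    return round(pure_premium * (1 + loading), 2)

# Hypothetical: a 2% annual claim probability and a 3,000 average claim
# give a pure premium of 60, and 75 after a 25% loading.
assert annual_premium(0.02, 3000.0) == 75.0
```

The better the data, the tighter the estimate of `claim_prob` for an individual customer, which is exactly what allows tailored rather than standardised pricing.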
In the offer phase, we will experience long-term personalized product bundles that fit customer needs much better – away from standardized and inflexible policies. If the insurer can better assess the needs of the customer on the basis of their history or behaviour, it is in a position to put together tailor-made insurance packages.
For example, it would be conceivable to automatically adjust the insurance cover as soon as the customer’s life changes, for example if the customer gets married, buys a car or a property or travels abroad.
Customer experience in the event of a claim will also change dramatically. Fraud is still the biggest problem in the system, with 2 percent of the customer base causing 40 percent of the system’s inefficiency. According to estimates by the Association of British Insurers (ABI), an insurance fraud is detected every minute, amounting to economic losses of £3bn every year. A further £2bn of fraud a year is estimated to go undetected.
But what if insurers are better able to assess customers on the basis of data and know which customers they can trust – and which not? Credible customers could then benefit from immediate payment of the loss incurred, while the few “black sheep” would not even be accepted as customers or would be checked more closely in the event of a claim being reported.
The computer does not act unchecked, but within parameters defined by humans. This is comparable to processes in the manufacturing industry: here, too, people define the exact parameters that are to be checked, and the controls are implemented by machines that are significantly less prone to errors. The situation is similar when it comes to insurance fraud: people make value judgements and specify which indicators can point to a case of fraud. They retain sovereignty over the entire process. The smart algorithm, on the other hand, is only the tool for evaluating and linking the many individual data points. Smart algorithms will reduce employees’ workload, but will not replace them.
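A minimal sketch of this division of labour, with hypothetical indicator names, weights and threshold: humans define the indicators and the review threshold; the algorithm only aggregates them and routes the claim.

```python
# Human-defined fraud indicators and weights (all values hypothetical).
INDICATORS = {
    "claim_soon_after_policy_start": 0.4,
    "claim_amount_near_cover_limit": 0.3,
    "prior_rejected_claims":         0.3,
}
REVIEW_THRESHOLD = 0.5  # above this score, route to a human investigator

def triage(claim_flags: set) -> str:
    """Sum the weights of the indicators raised on this claim and
    route it either to fast-track payment or to manual review."""
    score = sum(w for name, w in INDICATORS.items() if name in claim_flags)
    return "manual_review" if score >= REVIEW_THRESHOLD else "fast_track"

assert triage({"prior_rejected_claims"}) == "fast_track"
assert triage({"claim_soon_after_policy_start",
               "claim_amount_near_cover_limit"}) == "manual_review"
```

Note that the value judgements (which indicators matter, how much, and where the threshold sits) all live in the human-edited configuration at the top; the code itself makes no judgement, it only aggregates.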
Finally, digitization will also change the long-term relationship between insurer and insured. Tomorrow’s insurance will not only settle claims, it could even prevent them arising. A better database will not only make it possible to calculate the probability and amount of loss more precisely, it will also make it easier to calculate the risk of loss. Digital systems and sensors can also help prevent possible claims. Telematic tariffs in motor vehicle insurance are already moving in this direction by promoting a prudent driving style.
Sensors on washing machines and industrial plants or intelligent smoke detectors are one thing – monitoring people in the health sector is another. Some health insurers reward sport activities, for example, if the customer can prove this with smart fitness watches. It remains to be seen to what extent customers are willing to exchange this personal data for premium refunds. In the long term, the legislator will also be asked to take action to ensure that the solidarity principle is not undermined.
However, the danger of increasing surveillance is offset by a clear increase in customer service, individualised services and flexibility for the customer: digital insurers rely on customers’ self-determination and a positive insurance experience in an industry that sometimes appears immobile and opaque.
Digitalisation has reached the insurance industry, but has not yet shaken its foundations. That will change: tomorrow’s insurance will have little in common with today’s structures and processes. Just as the autopilot came to Tesla, it will come to insurance. Not all companies will be able to master this switch to become digital insurers.
How ISO 20022 migration is changing the landscape in payments
By Paul Thomalla, Global Head of Payments at Finastra
The ISO 20022 standard is a catalyst for change in digitalisation and payments. The current edition of the standard was published in May 2013, and it’s been clear since then that the standard represents the future of payments messaging. This is due to the rich information, process automation and interoperability it enables. What started off in the Automated Clearing House world with the Single Euro Payments Area is increasingly becoming the de facto standard for instant payments and for high-value payments worldwide. In fact, we estimate that all major payment systems and currencies will have moved over to ISO 20022 by the end of 2023.
Banks, meanwhile, will be able to get closer to their customers and offer better services. As this happens, the nature of the entire payments supply chain will change: there will be no one owner. Instead, consumers, corporates, banks, software vendors, fintechs and other stakeholders will all play a part.
Migration to ISO 20022 is moving at pace with one of two adoption models being taken. In the first approach, a ‘like-for-like’ migration occurs, which means data fields and messages are gradually moved over in compliance with the new ISO 20022 standard. However, the bank and client aren’t reaping the potential of the new standard as no further action has been taken. ‘Going native’ is the second approach. This allows extensive data sharing between banks and corporates unlocking a range of benefits including deeper insights into customers and partners, better accounting and financial data and more efficient payment processing. Data-rich messages can provide corporates with all the information they need to automatically reconcile transactions the moment they happen.
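To illustrate the ‘data-rich message’ point, the sketch below parses a heavily simplified pacs.008-style (FI-to-FI customer credit transfer) fragment and pulls out the structured remittance reference a corporate could match against an open invoice; the fragment is illustrative only and omits most mandatory elements of a real message:

```python
import xml.etree.ElementTree as ET

# Heavily simplified pacs.008-style fragment (illustrative only).
SAMPLE = """\
<Document xmlns="urn:iso:std:iso:20022:tech:xsd:pacs.008.001.08">
  <FIToFICstmrCdtTrf>
    <CdtTrfTxInf>
      <IntrBkSttlmAmt Ccy="EUR">250.00</IntrBkSttlmAmt>
      <RmtInf><Ustrd>INVOICE-2021-0042</Ustrd></RmtInf>
    </CdtTrfTxInf>
  </FIToFICstmrCdtTrf>
</Document>"""

NS = {"p": "urn:iso:std:iso:20022:tech:xsd:pacs.008.001.08"}

def extract_remittance(xml_text: str) -> dict:
    """Pull the settlement amount, currency and structured remittance
    reference out of a credit transfer message for auto-reconciliation."""
    root = ET.fromstring(xml_text)
    amt = root.find(".//p:IntrBkSttlmAmt", NS)
    ref = root.find(".//p:RmtInf/p:Ustrd", NS)
    return {"currency": amt.attrib["Ccy"],
            "amount": amt.text,
            "reference": ref.text}

details = extract_remittance(SAMPLE)
assert details["reference"] == "INVOICE-2021-0042"
```

Because the remittance reference arrives as a structured field rather than free text in a legacy MT message, matching it against an open invoice in the corporate’s ledger can happen the moment the payment lands.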
Banks deciding which way to move forward must remember that corporates have been waiting eight years for this new ISO 20022 functionality and if their bank is not able to deliver the promised benefits, they could decide to take their business elsewhere.
Planning the migration process
Deciding which approach to take is the first step in the migration process for banks. The main transition models being deployed are the ‘like-for-like’ translation model and, for an ‘ISO-native’ approach, either the complete overhaul model or the hybrid model.
The translation model approach translates incoming MX messages to the SWIFT MT format and vice versa for outgoing messages. This model is less disruptive and has a lower upfront cost. However, it involves high dependence on third parties, resulting in less interoperability with fintechs and no new customer insight. The complete overhaul model allows organisations to execute a wholesale architecture transformation. This approach gives access to rich data across the business, including new insights into the market and customers. One negative aspect of this approach is that it is disruptive and requires a large upfront investment. Finally, the hybrid model works well for global banks where translation is needed across the board. This approach offers flexibility and the ability to localise the strategic response, but it adds a level of complexity for users. The leading model is unclear, but banks must remember to align their payments operations with their chosen model.
That’s not to say that the adoption of ISO 20022 will be plain sailing. One challenge is that the standard describes an asynchronous messaging process. For banks which currently rely on return messages to confirm the successful completion of a payment transaction, this will cause significant upheaval, and is a change that underscores the need for everyone in the payments ecosystem to get ISO 20022 migration right. Banks will need to overhaul their business processes and operations to adapt to asynchronous messaging. This will in turn require new systems, such as Confirmation of Payee and Request to Pay.
The new format requires a fundamental change to the payments world, so the decision on which transition model best suits their needs isn’t to be taken lightly. Internal and external considerations will help banks determine next steps to successfully implementing ISO 20022. Internally, banks must ensure they have the right people to deliver this transformation, have processes in place to easily review and adapt back office functions and have the correct technology required for the migration. Our approach at Finastra has been to build a payments hub that is ISO 20022 native from the start – ready for widespread adoption across the industry. Banks must also look at external factors like customer impact, market share, competitors and regulatory constraints.
Benefits across the payments value chain
The adoption of ISO 20022 allows for additional, enriched data to be transferred within the payment instruction. The new format has more granular and better organised data elements as well as a consistent data dictionary across the payments chain to speed processing and improve compliance. This prevents misinterpretation and expensive manual interventions. All of this will facilitate improved processing and allow all agents in the payment to make more informed compliance decisions.
In the short term, including additional party and remittance information will help reconcile transactions. For example, QR codes are being used more widely on invoices, clearly identifying the beneficiary and facilitating automation in the back office. Looking at the medium term, institutions will be able to limit the resources they have to dedicate to exception handling and one-off investigations due to missing information or unstructured input that cannot be easily integrated into automated workflows. And finally, in the long term, properly structured and consistently applied data will support better regulatory compliance practices and financial crime monitoring.
The rewards of ISO 20022 make any temporary disruption more than worth it. We’re excited to enter a new era of payments messaging that will drive collaboration, innovation and efficiency through interlinked partner ecosystems.
Agile thinking in times of uncertainty
By Caryn Skinner, Co-Director of Sharpstone Skinner
“Several times lately, I have finished my work, closed the laptop and sat staring out of the window of my spare room office worrying that I don’t have the answers. That my team are looking to me for guidance about the future…and I simply don’t know.” Paul Jackson-Cole, Executive Director of Engagement, Parkinson’s UK
A genuine, honest reflection from an impressive and successful leader. He has gravitas, is trusted and a great coach to his senior reports. He is also highly intuitive, with an innate ability to be a pioneering visionary who can then work with others to ground that vision into reality. And yet, he is stuck. He still has his instincts, yet with the world in flux, he is finding it hard to convince his team to go with him because they need more tangible evidence to ground his ideas.
Gut-feel judgement is part of agile thinking which is a crucial leadership skill. In the financial world you may have finely honed other types of thinking as you need to show evidence, use data and put forward your thoughts in a rational way.
Agile thinking has five main features:
Systems thinking – investigating an issue from a broad perspective to understand the interdependencies
Possibility thinking – to be open-minded and generate a wide range of possibilities, the classic brainstorm
Logical analysis – to reach valid conclusions using clear, rational logic
Evidence-based thinking – identify core issues by analysing evidence from relevant resources
Gut-feel judgement – relying on your gut instincts to provide valuable input for decisions
Richard Branson says, “I rely far more on gut instinct than researching huge amounts of statistics”, and he’s not done too badly.
Mr Branson may make you shudder though, as it is quite an extreme view. Most of us use all or a few of these combined. Yet in this world of unknowns, your instincts may need to be more finely tuned. It isn’t easy to find evidence and interdependencies if we have never been in this situation before. Rational logic needs something tangible to test it against, and the world feels nebulous at the moment. Being open-minded looks like a good option, yet can get stifled because the possibilities are almost endless.
Here are some ways to tap into and use your gut-feel judgement:
- Know that your instincts are not woolly ideas but based on your years of experience. The thought has come from somewhere: an experience you have had, something you have read, a conversation you had with a colleague.
- Feed and grow your instincts. The more exposure you have to your market the harder your instincts will work. Keep getting out and about, visit your people, talk to them, learn from them about the front-line challenges and successes.
- See your business through the eyes of your customer or client. Why do they like doing business with you, what would they like you to do better, and does your business align with their needs?
- Make your own observations about what’s next for your business rather than staring at spreadsheets of cold data. I heard about a trader who regularly walks the shops to see what’s selling and what isn’t; it informed her instinct about where the next investments might be.
- Keep in touch with the world around you, and tune into what’s coming over the horizon. A client of ours was in marketing for a bank; he regularly spoke to his teenage nieces and nephews about how they communicated, how many digital “languages” they spoke and which social platform they used for what. They were his future customers, and the conversations fuelled his instincts in discussions with the senior team around the bank going online and changing the way it communicated with customers.
- Trust your gut then test it against other types of thinking to ground it and help you sell it in. Others may not get your vision so painting the picture for them with more solid evidence will make your job easier.
It is an exciting area of leadership and one that, perhaps, has been overlooked in a world that can access evidence, stats and data at the swipe of a screen.
Next time you find yourself staring out of your home office window, let your thoughts wander and don’t evaluate or crush any ideas that come to you; it might be that your gut is trying to tell you something.