By Cliff Moyce, DataArt Global Head of Financial Services Practice
Unlike their challenger bank siblings and fintech cousins, incumbent banks have one particular problem to solve if they are to remain viable and competitive: cost-to-income ratios of (typically) around 60%. Compare that figure with fintech firms engaged in ‘unbundling the bank’, which can operate at ratios as low as 20% even when fully operational.
A large part of the difference in costs is the cost of operating and supporting legacy system architectures. Other factors include the cost of branch networks and the (over)staffing implications of functionally divided organisations. High IT infrastructure costs in large banks arise from significant duplication and hidden redundancy; poor integration; high complexity; poor systems documentation and knowledge; a lack of agility, flexibility and adaptability; old-fashioned interfaces and reporting capabilities; difficulty integrating with newer models such as cloud computing and mobile devices; systems that are hard to monitor, control and recover; and susceptibility to security problems.
Getting old and new applications, systems and data sources to work seamlessly can be difficult, verging on impossible. This lack of agility means that legacy systems in their existing configuration can be barriers to improved customer service, satisfaction and retention. In regulated sectors they can also be a barrier to achieving statutory compliance. Pressure to replace these systems can be intensified by new competitors who are able to deploy more modern technologies from day one.
One radical approach to solving the infrastructure issue is to design and implement a new, more modern architecture using a clean-slate, blueprint-driven approach. Amusing analogies have often been used to encourage audiences to take such an approach, including that of legacy infrastructures resembling an unplanned house that has been extended many times. But how easy is it to design and implement a new IT architecture in a large, mature organisation with an extensive IT systems estate?
Rather than the unplanned house analogy, a better analogy might be a ship at sea involved in a battle. Imagine if you were the captain of such a ship and someone came onto the bridge to suggest that everyone stop taking action to evade the enemy and instead draw up a new design for the ship that would make evasion easier once implemented. You might be forced to be uncharacteristically impolite for a moment before getting back to the job at hand.
The temptation to start again is enormous, but big-bang approaches to legacy IT systems replacement can be naive, expensive and fraught with risk. At some point, many large organisations have attempted the enterprise-wide re-design approach to resolving their legacy systems problems. Yet so many initiatives have been abandoned when the scale of the challenge, or the impossibility of delivering against a moving target, becomes clear. Time has a nasty habit of refusing to stand still while you draw up your new blueprint. Re-designing an entire architecture is not a trivial undertaking, and building or buying and implementing replacement systems will take a long time. Long before a new architecture could ever be implemented, the organisation will have launched new products and services; changed existing business processes; experienced changes to regulations; witnessed the birth of a disruptive technology; encountered new competitors; and exited one business sector and entered others.
All of these things conspire to make the redesign invalid even before it goes live. If you are lucky, you may realise the futility of the approach before too much money has been spent. Furthermore, the major projects required to achieve the transformation are exactly the sort that suffer notoriously high failure rates. A 2005 KPMG report showed that in just a twelve-month period 49% of organisations had suffered a recent project failure, with IBM later reporting in 2008 that only 40% of projects met their schedule, budget and quality goals. And as recently as 2012, a McKinsey and Company report identified that 17% of large IT projects fail so badly as to threaten the very existence of the company.
So if wholesale blueprinting and re-engineering is impractical, what options are left? Luckily, there are practical and cost-effective approaches that can mitigate many of the problems with legacy systems while obviating the immediate need to replace them (though eventual replacement should remain an objective). Two viable alternatives are service-oriented architecture (SOA) and web services. Used in combination, they offer an effective solution to the legacy systems problem.
SOA refers to an architectural pattern in which application components talk to each other via interfaces. Rather than replacing multiple legacy systems, it provides a messaging layer between components that allows them to co-operate at a level you would expect if everything had been designed at the same time and was running on much newer technologies. These components include not only applications and databases, but also the different layers within applications. For example, multiple presentation layers talk to SOA and SOA talks to multiple business logic layers – and thus an individual presentation layer that previously could not talk easily (if at all) to the business logic layer of another application can now do so.
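To make the pattern concrete, here is a minimal sketch in Python. All names are hypothetical illustrations rather than any particular bank's systems: a shared service contract lets any presentation layer consume a wrapped legacy component without ever touching its proprietary API.

```python
from abc import ABC, abstractmethod

class CustomerService(ABC):
    """Shared service contract used across the architecture (hypothetical)."""

    @abstractmethod
    def get_customer(self, customer_id: str) -> dict:
        ...

class LegacyCoreBankingAdapter(CustomerService):
    """Wraps a legacy core-banking module behind the common interface,
    so callers never touch its proprietary API directly."""

    def __init__(self, legacy_client):
        # legacy_client stands in for e.g. a mainframe gateway (hypothetical)
        self._legacy = legacy_client

    def get_customer(self, customer_id: str) -> dict:
        # Translate the legacy record format into the shared message shape.
        record = self._legacy.fetch_cust_rec(customer_id)  # hypothetical legacy call
        return {"id": record["CUST_ID"], "name": record["CUST_NM"]}

def render_profile(service: CustomerService, customer_id: str) -> str:
    # Any presentation layer can consume any implementation of the contract.
    customer = service.get_customer(customer_id)
    return f"Customer {customer['id']}: {customer['name']}"
```

The point of the sketch is the seam: new components and old ones meet at the interface, not at each other's internals.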
Web services aim to deliver everything over web protocols, so that every service can talk to every other service using standard web communications (WSDL, XML, SOAP, etc.). Rather than relying on proprietary APIs to allow architectural components to communicate, SOA achieved through web services provides a truly open, interoperable environment for co-operation between components.
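As a brief illustration of how a WSDL-described SOAP service can be consumed generically, the sketch below uses the open-source Python library zeep; the endpoint URL and operation name are assumptions for illustration, not a real service.

```python
from zeep import Client  # third-party SOAP client: pip install zeep

# The WSDL document describes the service's operations and message types,
# so the client can be generated dynamically at runtime.
client = Client("https://bank.example.com/services/accounts?wsdl")  # hypothetical endpoint

# Call an operation exactly as declared in the WSDL - no proprietary API needed.
balance = client.service.GetAccountBalance(accountId="12345678")  # hypothetical operation
print(balance)
```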
The improvements that can be achieved in an existing legacy systems architecture using SOA through web services can be immense, with no need for major, high-risk replacement projects and significant re-engineering. Instead, organisations can focus on improving cost efficiency by removing duplication and redundancy through a process of continuous improvement, knowing that their major operations and support issues have been addressed by SOA and web services. Another benefit is that the operations of the organisation can start to be viewed as a collection of components that can be configured quickly to provide new services, even though the components were not built with the new service in mind. This principle is known as the composable enterprise.
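A toy sketch of that composability, continuing the hypothetical Python examples above: a new peer-to-peer transfer product is assembled from two existing components that were never designed with it in mind.

```python
class PaymentsService:
    """Existing payments component (hypothetical stub)."""
    def send(self, from_id: str, to_id: str, amount: float) -> bool:
        return True  # stubbed success for illustration

class NotificationService:
    """Existing notifications component (hypothetical stub)."""
    def notify(self, customer_id: str, message: str) -> None:
        print(f"notify {customer_id}: {message}")

class PeerToPeerTransfer:
    """A new product composed from components built for other purposes."""
    def __init__(self, payments: PaymentsService, notifications: NotificationService):
        self._payments = payments
        self._notifications = notifications

    def transfer(self, from_id: str, to_id: str, amount: float) -> None:
        if self._payments.send(from_id, to_id, amount):
            self._notifications.notify(to_id, f"You received {amount:.2f}")

PeerToPeerTransfer(PaymentsService(), NotificationService()).transfer("A", "B", 25.0)
```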
But addressing the issue of legacy systems in a way that makes good sense is not just an IT issue; it is also a people issue. It requires people to resist their natural inclination to get rid of old things and build new things on the mistaken assumption that new is always better than old. It requires people to resist the temptation to launch ‘big deal projects’, for all of the reasons that people launch big deal projects – from a genuine belief that they are required (or the only way), to self-promotion, and everything in between. It requires people to take a genuinely objective view of the business case for change, while operating in a subjective environment. It requires people to prioritise customer service over the compulsion to tidy up internally. And it requires the default method of change to be continuous improvement rather than step-change projects – which can be counter-intuitive in cultures where many employees have the words ‘project’ or ‘programme’ in their job titles.
So, to summarise: of course legacy enterprise IT architectures can feel like barriers to efficiency, agility and customer satisfaction, and making even the smallest change can often feel like it takes too long and costs too much money. The overwhelming temptation to throw the legacy architecture away and start again is understandable, but succumbing to that temptation can be a mistake. Luckily, we now have technical tools and approaches available to effect radical improvements without incurring the expense, effort and risk of major replacement projects. But using these tools requires a change of mindset and approach that may be counter-cultural in some organisations. It can mean a move away from step-change and ‘long-march’ projects, and a move towards continuous improvement. Education and engagement will be one of the keys to making it happen.
*Previously published in Issue 3
Iron Mountain releases 7 steps to ensure digitisation delivers long-term benefits
Iron Mountain has released practical guidance to help businesses future-proof their digital journeys. The guidance accompanies new research that found that 57% of European enterprises plan to revert new digital processes back to manual solutions post-pandemic.
The research revealed that 93% of respondents have accelerated digitisation during COVID-19 and 86% believe this gives them a competitive edge. However, the majority (57%) fear these changes will be short-lived and their companies will revert to original means of access post-pandemic.
“With 80% still reliant on physical data to do their job, now is a critical time to implement more robust, digital methods of accessing physical storage,” said Stuart Bernard, VP of Digital Solutions at Iron Mountain. “Doing so can enhance efficiency and deliver ROI by unlocking new value in stored data through the use of technology to mine, review and extract insight.”
When COVID-19 hit, companies had to think fast and adapt. Digital solutions were often adopted as off-the-shelf quick fixes – rarely the most economical or effective. But they are delivering benefits: those surveyed reported productivity gains (27%), time savings (20%), enhanced data quality (13%) and cost reductions (12%).
So what now?
The Iron Mountain study includes guidance for how to turn quick fixes into sustained, long-term solutions. The seven steps are designed to help businesses future-proof their digital journeys and maximise value from physical storage:
1) Gather insights: The COVID-19 pandemic allowed organisations to test and learn. Companies should ensure these insights are fed into developing more robust solutions.
2) Use governance as intelligence: Information governance and compliance are fundamental to data handling. But frameworks aren’t just a set of rules; they hold valuable insights that can be turned into actionable intelligence. Explore your framework to extract learnings.
3) Understand your risk profile: A key early step is to analyse where you are most vulnerable. With data in motion and people working remotely, which records are at risk? What could be moved into the cloud? Are your vendors resilient?
4) Focus where you will achieve greatest impact: To prioritise successfully, you need to know where you will achieve the largest impact. This involves looking beyond initial set-up costs towards the holistic benefits of digitisation, including reducing time spent on manual scanning, and the risk of compliance violations.
5) Reach out and collaborate: We are all in this together. Your IT, security, compliance and facility management teams are all facing the same challenges. Ensure you collaborate across functions to develop robust, integrated solutions.
6) Find a provider who can relate to your digital journey: For companies that still rely heavily on analogue solutions, digitisation can be daunting and risky. It pays to find a vendor who has been on the same journey, understands your paper processes and can guide you through the digital world.
7) Prioritise and evolve communication and training programmes: To reap the full rewards from any digitisation initiative, thorough and continuous communication and training are critical. Encouragingly, our survey found that 81% of data handlers have received training to work digitally, which is an excellent step in the right direction – but to truly succeed, consider teams beyond data handling.
The research was commissioned by Iron Mountain in collaboration with Censuswide. It surveyed 1,000 data handlers across the EMEA region. It found that the departments that have digitised most due to COVID-19 include IT support (40%), customer relationship management (36%), and team resource planning (34%).
3D Secure: Why are fraudsters still slipping through the net?
By Tim Ayling, VP EMEA, buguroo
There is a constant tension between keeping online payments secure, and offering an easy and frictionless user experience. Digital transformation – especially accelerated by the global pandemic – leaves consumers expecting online services to be seamless. Customers are even liable to abandon a process altogether if they encounter a hurdle.
Financial regulation and security protocols exist to help ensure that a balance is maintained between offering customers this frictionless experience, and keeping them and their funds safe from fraud attacks.
What is 3D Secure?
3D Secure is one such protocol. This payer authentication system is designed to keep card-not-present (CNP) ecommerce payments secure against online fraud. The card issuer uses 3D Secure when a card is used to pay for something online, authenticating the customer’s identity based on personal identifiers, such as the three-digit CVV code on the back of a card, as well as the device they’re using to make the payment and their geolocation or IP address.
3D Secure is important because, although transactions can be accepted or denied based on the level of risk, it is not always as clear-cut as ‘risky’ or ‘not risky’. A small number of transactions will have an undetermined or questionable level of risk attached to them – for example, when a legitimate customer appears to be using a new device to buy goods online, or appears to be attempting the transaction from an unusual location. In these instances, 3D Secure provides step-up authentication, such as asking for a one-time password (OTP).
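A highly simplified sketch of that three-way decision in Python; the signals, weights and thresholds are illustrative assumptions, not part of the 3D Secure specification.

```python
from dataclasses import dataclass

@dataclass
class Transaction:
    known_device: bool
    usual_location: bool
    amount: float

def assess(tx: Transaction) -> str:
    """Toy three-way risk decision: approve, decline, or step up."""
    score = 0
    if not tx.known_device:
        score += 40
    if not tx.usual_location:
        score += 30
    if tx.amount > 500:  # illustrative threshold
        score += 20

    if score >= 70:
        return "decline"
    if score >= 30:
        return "step-up"  # e.g. challenge the payer with a one-time password
    return "approve"

print(assess(Transaction(known_device=False, usual_location=True, amount=120.0)))
# -> "step-up": questionable risk triggers additional authentication
```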
Getting the right balance
3D Secure is a helpful protocol for card issuers: it allows banks to comply with the Strong Customer Authentication requirements of the EU’s PSD2 regulation, and it increases security for transactions with a higher level of risk – thereby better distinguishing genuine cardholders from fraudsters.
This means that the customers themselves are better protected against fraud, and the extra security helps preserve their trust in the bank to be able to keep their money safe. At the same time, the number of legitimate customers who have their transactions denied is minimised, improving the customer’s online experience.
So why are fraudsters still slipping through the net?
Fraudsters are used to adapting to security protocols designed to stop them, and 3D Secure is no exception. The step-up authentication required by 3D Secure for a questionable transaction often takes the form of an OTP, a password or a secret answer known only to the bank and the customer. However, fraudsters have devised various ways to steal this information.
The most common way to steal passwords is through phishing attacks, where fraudsters pretend to be legitimate brands, such as banks themselves, in order to dupe customers into giving away sensitive information. Fraudsters can even replace the pop-up windows that appear to legitimate customers in the case of stepped-up authentication with their own browser windows disguised as the bank’s. Unwitting customers then enter the password or OTP and effectively hand it straight over to the fraudsters.
Even when an OTP is sent directly to a customer’s phone, fraudsters have found a way to intercept this information. They do this through something called a ‘SIM swap scam’, where they impersonate their victim and manage to get the legitimate cardholder’s number switched onto a different SIM card that they own, thereby receiving the genuine OTP in the cardholder’s place.
This is a particular issue for card issuers given the liability shift attached to using 3D Secure. When a transaction is authenticated using 3D Secure, liability moves to the card issuer, not the vendor or retailer. If money leaves a customer’s account and the transaction was verified by 3D Secure, but the customer says they did not authorise it, the card provider becomes liable for any refunds.
How AI and Behavioural Biometrics can be used to plug the gap
Banks need to find a way to accurately block fraudsters while allowing genuine customers to complete online payments. AI can be used alongside behavioural biometrics as an additional layer of security to cover the gaps in security through continuous authentication of the customer.
Behavioural biometrics can collect and analyse data from thousands of parameters around user behaviour such as their typing speed and dynamics, or the trajectory on which they move the mouse, throughout the entire online session. AI processes are used to dynamically compare this analysis against the user’s usual online profile to identify even the smallest of anomalies, as well as against profiles of known fraudsters and typical fraudster behaviour. AI then delivers a risk score based on this information to banks in real time, enabling them to root out and block the fraudulent transactions.
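As a simplified illustration of the principle (not any vendor’s actual model), the Python sketch below scores a session’s typing cadence against a stored user profile; the data and threshold are invented for illustration.

```python
import statistics

def typing_anomaly_score(profile_intervals: list[float],
                         session_intervals: list[float]) -> float:
    """Toy anomaly score: how far the session's mean inter-keystroke
    interval sits from the user's historical profile, in standard deviations."""
    mu = statistics.mean(profile_intervals)
    sigma = statistics.stdev(profile_intervals)
    session_mu = statistics.mean(session_intervals)
    return abs(session_mu - mu) / sigma

# Historical profile: inter-keystroke intervals in milliseconds (hypothetical data).
profile = [110, 120, 115, 105, 118, 112, 121, 109]
session = [210, 195, 205, 220]  # markedly slower typing in this session

score = typing_anomaly_score(profile, session)
if score > 3.0:  # illustrative threshold
    print(f"anomaly score {score:.1f}: flag session for review")
```

A production system would combine thousands of such signals, but the core idea is the same: score the live behaviour against the known profile, continuously and invisibly.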
As this authentication occurs invisibly, the AI technology can recognise if the customer is who they say they are – and that it isn’t a fraudster trying to input a genuine OTP they have managed to steal through phishing or SIM swapping – without adding any additional friction.
Card issuers cannot decline all questionable transactions without losing customers, while approving them without additional checks poses security risks that can result in financial losses as well as lost customer trust. Behavioural biometrics is a foundational technology that can work alongside 3D Secure to keep customers’ online payments safe from fraud while maintaining a frictionless experience and minimising the risk of chargeback liability for banks.
Track and Trace and Other Lost Data
By Ian Smith, General Manager and Finance Director at Invu
You, like me, were probably amazed by the now infamous loss of over 16,000 positive test results in the track and trace system due to an Excel spreadsheet error.
You, like me, probably wondered how the Government could get something so important so wrong.
But perhaps we should ask: are we standing in a greenhouse, launching stones?
Data risks from software
Today we are spoilt with software offerings that help us with both our personal and our work lives.
Microsoft Excel is a powerful application that now offers many functions which once required moderately complex macro writing, seducing all of us into submitting ever more data for it to analyse. In finance, we tend to solve all the problems our applications cannot address using Excel.
In finance, we also know the risks of formula errors, and if we have relied on it enough, we will have our own war stories to go with these risks. Yet, we often continue to use the tool for operations that make those folks with an information technology background shake their heads.
These Excel files may nowadays reside on a local file server or on one of the many cloud file-sharing services (such as the big three – Dropbox, Google Drive and Microsoft OneDrive – or other less well-known applications). Many of us use these in multiple ways.
Beyond finance and Excel, there are now many applications that we run our data through and leave data stored in the form of documents, comments and notes.
The long-standing example is email. Today we receive many documents via email, with content in the body often providing context. Email systems then become the store for that data. While this works from a personal point of view, for a business working at scale the information stored this way can be lost to the rest of the business – just like data falling off a spreadsheet when there are not enough rows to capture the results.
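The track and trace loss reportedly came from the legacy .xls format silently truncating results at its 65,536-row limit. As a minimal defensive sketch in Python (the file name is hypothetical), a pipeline can at least refuse to proceed when a row count sits suspiciously at a format limit:

```python
import pandas as pd

XLS_ROW_LIMIT = 65_536  # legacy .xls worksheet limit; .xlsx allows 1,048,576

def load_results(csv_path: str) -> pd.DataFrame:
    """Load raw results and refuse to proceed if the row count looks truncated."""
    df = pd.read_csv(csv_path)
    if len(df) >= XLS_ROW_LIMIT - 1:
        # Suspiciously close to a format limit: data may have been cut off upstream.
        raise ValueError(
            f"{csv_path}: {len(df)} rows is at the .xls limit; "
            "possible silent truncation earlier in the pipeline"
        )
    return df

results = load_results("daily_test_results.csv")  # hypothetical file
print(f"loaded {len(results)} rows")
```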
More recently, we have seen easy-to-consume applications develop in many areas, such as chat and productivity. Take, for example, task management apps – my own preference being Monday.com (I am sparing you the long list). The result of the task, and how we got there in the form of attachments or comments, is often stored in the application. Each application we touch encourages us to leave a bit of data behind in its store.
Many of these applications have personal uses, and an initial personal dalliance is often what sparks the motivation to apply them to a business purpose. Just like the “Track and Trace System”, they can often find themselves being used in an environment where the scale of the operation overwhelms their intended use.
In our business lives, combining applications in this way – liberally sprinkling our data across multiple systems, often stored in documents (be they Microsoft Word files, emails, scans, or comments and notes) – puts us on the pathway to trouble.
Imagine how Matt Hancock felt explaining to Parliament that the world-class track and trace system depended on a spreadsheet.
Can you imagine a similar situation in your business life? Say, for example, that documents or data in some form were lost because of the use of disparate systems and/or applications that were not really designed for the task you assigned to them.
Who would be your Parliament?
Now that you can see yourself in the greenhouse, you may not want to reach for that metaphorical stone.
If these observations create some concerns for you, you may want to consider the information management strategy at your business. You have a strategy, even if it is not addressed specifically in documents, plans or thought processes.
These steps may help you figure out where you are and where you want to go.
- Assess your current environment.
Are you a centraliser, with all your information collected in one place? Or is your data spread across multiple stores, as identified above? Are you storing your key business information on paper, digitally, or in a mix of both?
- Assess your current processes.
Do your processes run on a limited number of software applications? Or do you let staff pick their own tools to get things done? The answer is often a mix of both, with staff bridging the gaps in those applications using tools like MS Excel. A key question is how the data in email, particularly the attachments, is made available to the business.
- Design a pathway for change and implement it.
Start with the end in mind. I suggest the goal is to enable the right people to have the right access to the information they require to do their job in real-time. I believe the way to effectively do this is to go digital. The fork in the road is then whether to centralise your information store or adopt a decentralised approach.
My own preferred route is to centralise using document management software that enables all your documents to be stored in one place. Applications like email can be integrated with it, significantly reducing the workload required to file and store the data. The data can then be used in business applications via workflows. Thinking these workflows through will help you assess the gaps between your key business applications and consider whether tools like Excel are being stretched too far.
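As a toy illustration of that kind of integration – not a real document management product – the Python sketch below files inbound email attachments into a single central store under a workflow tag, so they stop living solely in individual inboxes.

```python
import email
import pathlib

ARCHIVE = pathlib.Path("central_store")  # the single place where everything lives

def file_attachments(raw_message: bytes, tag: str) -> list[pathlib.Path]:
    """Extract attachments from an email and file them centrally under a workflow tag."""
    msg = email.message_from_bytes(raw_message)
    target = ARCHIVE / tag
    target.mkdir(parents=True, exist_ok=True)
    saved = []
    for part in msg.walk():
        filename = part.get_filename()
        if filename:  # only message parts that are actual attachments
            path = target / filename
            path.write_bytes(part.get_payload(decode=True))
            saved.append(path)
    return saved
```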