By Alix Melchy, Jumio VP of AI
The application of emerging technologies such as AI, cloud, blockchain and IoT in financial services has altered the traditional operating models of financial institutions, the competitive dynamics of the industry, the role of people in those institutions and the landscape of the financial system as a whole. In fact, AI is positioned as an essential investment, with the World Economic Forum arguing that it is set to become central to the fabric of financial institutions.
While the adoption of AI in financial services may be in its infancy, the use cases are ever growing. From recommending loan and credit offerings to detecting fraud, 94% of financial services firms in European and Middle Eastern markets believe that AI will disrupt their business. The direction of travel is clear, and it is essential that companies invest now. If done too hastily, however, the process can be marred by pitfalls.
Despite the transformative promise of AI and machine learning algorithms, we have seen their application come under scrutiny in other industries. Take the UK A-Level exam grading debacle that dominated headlines back in August. Exam grades of students living in certain UK postcodes were disproportionately and negatively impacted, while other students saw their results inflated. The cause was an algorithm, implemented by Ofqual, designed to predict grades using historical data, including grades obtained in exams in previous years.
The incident raises the question as to what would happen if the algorithm used in this instance was applied to a financial decision. The same biases could negatively impact the way millions of consumers and businesses borrow, save and manage their money.
It is therefore imperative that financial institutions learn from this scenario, ensuring that when implemented in financial decision-making, AI is nothing short of a success.
AI is no fairy godmother
While many tout the game-changing effects of the looming AI revolution, it’s fundamentally important to understand that AI is not magic. Instead, we need to set reasonable expectations for AI so as not to paint an unrealistic picture of its power.
To start out on the right track, businesses must first define and align on the task they want the algorithm to perform before it can be developed and implemented. Articulating the problem to be solved is the prerequisite for a solid framework for developing and evaluating your algorithms.
Removing bias in AI
AI is the tool, not the hand that wields it or the eye that guides it. It is a type of learning system that requires data, training, integration and course correction. Just as we would train a young engineer to use a tool correctly, we train AI systems to become expert learning systems through data, processes and people.
Therefore, to solve a problem using AI, the task must be expressed in a form a machine can understand, and the machine must be supplied with the data it needs to learn to generate the predictions that accomplish its objective. Without strong and relevant data underpinning an AI model, it will never produce strong and relevant results.
To design a fair algorithm, the key is to collect enough data that the algorithm can be trained to represent an entire community. While it is possible to buy datasets to speed up the process, it is essential that any purchased data meets your required criteria rather than simply being large. For the financial services sector, well-chosen data enables employees to treat customers fairly. Combined with appropriate modelling and processes, it also allows them to maintain transparency and accountability in their decision-making, avoiding legal claims and regulatory fines that can cause deep reputational damage.
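To make the representativeness check concrete, here is a minimal Python sketch. It is illustrative only: the function, field names and tolerance are assumptions for the example, not anything described in the article. It compares the share of each demographic group in a training set against its expected share of the wider population and flags under-represented groups:

```python
from collections import Counter

def representation_gaps(records, attribute, population_shares, tolerance=0.05):
    """Flag groups whose share of the training data falls short of their
    share of the population the model is meant to serve.

    records: list of dicts, each carrying the demographic `attribute`.
    population_shares: dict mapping group -> expected share (0..1).
    """
    counts = Counter(r[attribute] for r in records)
    total = sum(counts.values())
    gaps = {}
    for group, expected in population_shares.items():
        observed = counts.get(group, 0) / total if total else 0.0
        if observed < expected - tolerance:
            gaps[group] = {"observed": round(observed, 3), "expected": expected}
    return gaps

# Toy example: the training data skews heavily toward one age band.
training = [{"age_band": "25-40"}] * 80 + [{"age_band": "65+"}] * 20
population = {"25-40": 0.5, "65+": 0.5}
print(representation_gaps(training, "age_band", population))
# only "65+" is flagged as under-represented
```

A check like this is only a starting point; representativeness must also be assessed per feature and per outcome, not just on headline demographics.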
Building back better
As the Ofqual issue revealed, a preliminary, small-scale test of an algorithm is an essential step before applying it to a real-world scenario. A pilot phase helps a business refine the design, identify unnecessary costs and time expenditures, and better understand the data. Because this was not done sufficiently in the Ofqual case, the algorithm simply did not answer the problem it was trying to solve.
Championing ethical AI
More than ever, companies are realising one simple truth: failing to operationalise data and AI ethics is a threat to the bottom line. Missing the mark can expose companies to reputational, regulatory and legal risks. Here are some key areas that businesses should consider when leveraging AI models:
- Usage consent: make sure that all the data you are using has been acquired with the proper consent
- Diversity and representativeness: AI practitioners should consider how diverse their programming teams are and whether those teams undertake relevant anti-bias and discrimination training. Drawing on the perspectives of individuals of different genders, backgrounds and faiths increases the likelihood that decisions made on purchasing and operating AI solutions are inclusive rather than biased
- Transparency and trust building: accurate and robust record keeping is important to ensure that those impacted by a model understand how it works
The ways AI can be utilised in the financial services industry are growing rapidly. One example is document-centric identity proofing, whereby an identification document, such as a passport, is matched with a selfie of the user to link real and digital identities. This will be an essential area of focus for financial services companies as they look to confirm that users are who they claim to be while the physical branch diminishes. When analysing whether a person matches the picture on their documentation, for example, a biased AI model can completely undermine the decision.
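As a purely illustrative sketch of that matching step (the embeddings, threshold and function names are invented for the example; production identity-proofing systems are far more sophisticated), the document-to-selfie comparison is commonly framed as a similarity check between face embeddings:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def is_same_person(doc_embedding, selfie_embedding, threshold=0.8):
    """Compare a face embedding extracted from the ID document photo with one
    extracted from the selfie. The threshold (0.8 here is arbitrary) should be
    calibrated so that error rates are consistent across demographic groups:
    exactly where a biased model would otherwise undermine the decision.
    """
    return cosine_similarity(doc_embedding, selfie_embedding) >= threshold

# Toy 2-D embeddings stand in for real face-recognition model output.
print(is_same_person([1.0, 0.0], [0.9, 0.1]))  # near-identical vectors: True
print(is_same_person([1.0, 0.0], [0.0, 1.0]))  # orthogonal vectors: False
```

The demographic-bias concern lives in the embedding model and the threshold: if either performs worse for some groups, those customers will be falsely rejected more often.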
However, it’s reassuring to see that the 2020 Gartner Market Guide for Identity Proofing & Affirmation predicts that by 2022, 95% of RFPs will have introduced clear requirements around minimising demographic bias. This shows that organisations are becoming more aware of the detrimental impact demographic bias in identity-proofing processes could have on their brand, as well as the legal consequences they risk facing.
In turn, there is a real opportunity to leverage AI solutions to provide the best service, but financial institutions must ensure that they are doing so in an ethical, accurate, and representative way.
The Coming AI Revolution
By H.P Bunaes, CEO and founder of AI Powered Banking.
There is a revolution in AI coming and it’s going to render legacy data and model governance practices obsolete.
The revolution will manifest in three ways:
- Automated machine learning platforms like DataRobot, H2O.ai, Dataiku, and rapidminer are making data scientists more productive. A lot more productive. One company told me that they were seeing 7x as many models from their data science group shortly after the implementation of a leading autoML platform. The increase in model output will quickly reveal bottlenecks in model validation, production implementation, and model operation and management.
- The increasing popularity of tools aimed at “citizen data scientists” (data-literate subject matter experts in the business who lack formal data science training but nevertheless know a good model and a good use case when they see it) will turn a large percentage of technically savvy business people into model developers. Models developed by citizen data scientists will quickly dwarf the volume created by formal data science organizations, adding further strain on existing procedures and revealing gaps in governance.
- Availability of nearly unlimited on-demand capacity for both data storage and computing power from cloud providers will lead to the proliferation of sophisticated predictive models that can learn from broad swaths of data: structured (your existing databases, for example), semi-structured (your documents) and even unstructured (such as images), sniffing out the data that is relevant to any one particular prediction or population. Demand for more, and different kinds of, data for modeling, and the need to integrate model results into downstream dataflows and IT applications, will make data platforms and data flows significantly more complex, harder to manage and more prone to points of failure.
What this all adds up to is an explosion in the volume of predictive models and of the data in motion in your organization. Where there were no models, there will suddenly be many. Where there was one model, you may find there are now hundreds. And the pipes providing data into and delivering results out of these models are going to proliferate. Operational and reputational risk from model failure will rise significantly as companies outgrow their existing data and model governance frameworks and legacy procedures.
Making this worse, many banks are starting from a weak position. The demand for more and better models (descriptive and predictive) has already led to a thicket of overlapping, partially inconsistent data flows to a multitude of models. Model outputs themselves have become part of the data flow to downstream data marts, BI, apps and even to other models as inputs. It is the rare organization that knows where all that data is coming from, where it is going, how it is being used, and can identify the potential impacts of changes to data and to the models that consume it.
Certainly there has been much improvement in recent years in data governance at most large organizations. Data quality, data standards, data integration, and data accessibility on robust platforms (increasingly cloud based) have all gotten better. And most organizations now have robust model risk management practices in place, to test and validate models before they go into production use.
But these worlds are about to collide. Data and analytics, once distinct and manageable separately are going to become inextricably intertwined. As brilliantly explained in a paper by several smart people at Google (“The Hidden Technical Debt in Machine Learning Systems”), we will rapidly reach the point where “changing anything changes everything.”
Take a simple example: what differentiates data on a client from a CRM system from data on a client created by a predictive model? The answer: nothing. Yet today they are managed by different groups. The former is typically owned by Data Governance, usually led by the Chief Data Officer; the latter is usually the province of Model Risk Management, often found in the Corporate Risk Management organization.
But when model outputs become inputs to reports, to business processes, to critical operational or client facing systems, or to other models, they need to be governed just like any other data.
The perfect illustration of this challenge is change management. Often you will find data change management in the chief data officer’s organization and model change management in the model risk organization. But changes in the data can, and often do, affect models in sometimes unpredictable ways. And changes to models can alter outputs, with major impacts on downstream consumers of those results if they are not prepared for the coming changes.
Managing them separately and distinctly will therefore no longer be sufficient. How to tackle this?
- First and foremost, you must have a complete catalog of all models including metadata describing model inputs and their source and model outputs along with their destination and uses. There are a number of solutions now coming on the market for this purpose including Verta.ai, ModelOp, and Algorithmia.
- Second, data management needs to expand to include not only source data but also all the results (predictions, descriptions) produced by models.
- Third, model management also needs to expand its remit: not only testing and validating models prior to implementation, but also monitoring model performance and managing model changes after the fact.
- Fourth there must be formal procedures for keeping model management and data management mutually informed and closely coordinated. Data cannot change without assessing model impact, and models cannot change without assessing data impact.
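The first and fourth points above can be sketched together: a catalog that records each model's input sources and output destinations makes the "data cannot change without assessing model impact" check a simple graph walk. This is an illustrative Python sketch under assumed names, not a description of Verta.ai, ModelOp or Algorithmia:

```python
from dataclasses import dataclass

@dataclass
class ModelRecord:
    name: str
    inputs: list   # upstream data sources the model consumes
    outputs: list  # destinations its results feed (reports, apps, other models)

class ModelCatalog:
    def __init__(self):
        self.models = {}

    def register(self, record):
        self.models[record.name] = record

    def impacted_by_data_change(self, source):
        """Every model affected by a change to `source`: direct consumers,
        plus any model that consumes a direct consumer's outputs, and so on."""
        directly = [m for m in self.models.values() if source in m.inputs]
        affected = {m.name for m in directly}
        frontier = [out for m in directly for out in m.outputs]
        while frontier:
            out = frontier.pop()
            for m in self.models.values():
                if out in m.inputs and m.name not in affected:
                    affected.add(m.name)
                    frontier.extend(m.outputs)
        return sorted(affected)

# A churn model reads CRM data; a pricing model consumes the churn score.
catalog = ModelCatalog()
catalog.register(ModelRecord("churn_model", ["crm.clients"], ["churn_score"]))
catalog.register(ModelRecord("pricing_model", ["churn_score"], ["pricing_report"]))
print(catalog.impacted_by_data_change("crm.clients"))  # both models are affected
```

This is exactly the "model outputs are data too" point: because the churn score is catalogued as both an output and an input, a change to the CRM source correctly surfaces the pricing model as at risk.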
Organizationally, it may be infeasible to combine legacy organizations across traditional lines of responsibility. And it may be better to leverage existing expertise across model management, data engineering, data management, and IT. But a new partnership model, new tools, and new procedures will be needed.
The explosion in AI is upon us. To use AI safely and effectively you need to get your data and analytics house in order and make sure the right mechanisms are in place to keep it so. Regulators have taken note of the risks of poorly managed AI, and it is only a matter of time before they dictate minimum standards. Combining, or at least tightly coupling, data and model governance is where to start.
How financial services organisations are using data to underpin future growth
By John O’Keeffe, Director of Looker EMEA at Google Cloud
In addition to the turmoil caused by the COVID-19 pandemic, a significant decline in venture capital investment has left many financial services organisations feeling deflated, with others struggling to survive. According to figures from trade body Innovate Finance, investment in UK fintech organisations fell 30% in Q2 of this year, with smaller challenger firms and start-ups hit hardest by the current economic turmoil.
As a result, both challenger banks and more established players have had to pivot their strategies in order to maintain relevance and market share. Nonetheless, the outlook for fintech in the UK and further afield looks promising. The reality of spending much of our time at home, out of reach of brick-and-mortar services, means that many of us are becoming even more accustomed to digital banking, for example. Recent analysis of finance application usage from Adjust found that during the COVID-19 pandemic, average sessions in investment apps surged 88% globally, while payment and banking app sessions increased by 49% and 26% respectively.
However, the fact remains that investment in the sector is currently hard to come by. To help regain momentum, a review into the UK’s fintech industry was launched to identify opportunities to support growth across the industry. Data has played, and will continue to play, a key role in this push for innovation, helping organisations spot gaps in the market, predict customer behaviours and ensure that the decisions they make are based on real insights. At such a critical time, a data-led approach will help organisations ascertain exactly what is required to accelerate change and ensure the sustainability of the industry.
The financial services industry is a data-rich environment, giving organisations a potential goldmine of customer interactions, product performance and market trends. However, the difficulty often lies in bringing this into a coherent whole, and extracting the business insights required for long-term success. This is as much about strategy and accessibility as it is about technology. Fostering a true “data culture” where employees across the business, whether data experts or not, can access real-time intelligence that informs their day-to-day decision making in a positive way, is crucial. This may mean tweaking your onboarding and training programmes, identifying data evangelists that can catalyse others, or simply making data engaging and relatable for those who are new to the practice.
For many organisations, data is often stored within traditional business intelligence tools, third-party SQL clients or even just a simple spreadsheet, meaning that valuable data insights are siloed and often hindered by a bottleneck between a stretched analytics team and the rest of the business. There is also the all-important General Data Protection Regulation (GDPR) to consider, so data governance and having a clear view of where data is being housed, and for what purpose, is particularly pivotal.
With this in mind, it is crucial to have a “single source of truth” to bring various data streams together and enable real-time, self-serve insights to your whole employee base. As an example of this in practice, data is a great way to understand your existing clients more intimately and nip any problems in the bud early. By building a custom data dashboard incorporating, for example, number of support tickets issued, change in ticket sentiment and number of days to renewal, you can build up an accurate picture of account health and how this has changed over time. In combination with real-time metrics on which products and features are being used and how, sales teams can have more meaningful and accurate conversations with their customers, converting at-risk accounts into potential growth opportunities.
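As an illustrative sketch only (the signals are the ones named above, but the weights and formula are invented for the example, not any product's scoring method), such an account-health metric might combine the dashboard inputs like this:

```python
def account_health(tickets_open, sentiment_delta, days_to_renewal):
    """Combine the example dashboard signals into a single 0-100 health score.

    tickets_open: current number of open support tickets.
    sentiment_delta: change in ticket sentiment (negative = worsening).
    days_to_renewal: days until the account's contract renews.
    """
    score = 100
    score -= min(tickets_open * 5, 40)       # open tickets drag health down, capped
    score -= max(-sentiment_delta, 0) * 20   # penalise worsening sentiment only
    if days_to_renewal < 30:
        score -= 10                          # imminent renewal raises churn risk
    return max(score, 0)

# Two open tickets, sentiment worsening by 0.5, renewal in 20 days.
print(account_health(2, -0.5, 20))   # 70: an account worth a proactive call
print(account_health(0, 0.2, 90))    # 100: healthy account
```

The value of a single score is that sales teams can rank accounts at a glance; the detail behind each deduction stays one click away on the dashboard.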
Given the dip in VC investment mentioned earlier, it is more important than ever for startups and scale-ups to do more with less and set a strategic roadmap that supports rapid growth. By using data to measure and action customer feedback, these organisations can be more agile in taking new products to market and making sure these are useful and address specific pain points.
Whether a fintech scale-up or an established name, it has never been more important to shift your operations to a more data-led strategy. With an uncertain outlook ahead for business across all sectors, making data the “single source of truth” can help to navigate market trends, identify new growth opportunities and simply make an organisation’s decision-making smarter and more efficient. Through data-driven innovation and growth, one of Britain’s most valuable industries can continue to thrive in the future.
The Bank of England partners with Appvia to assist in the design, construction and assurance of a new cloud environment
The Bank of England has appointed self-service cloud-native delivery platform Appvia to support the creation of a new cloud environment.
The announcement follows a public procurement process which commenced in January 2020. The Bank of England will work with Appvia on the design, construction and assurance of a modern, fit-for-purpose cloud environment.
During the two-year partnership, Appvia will support development and project teams within the Bank in testing and deploying code in cloud environments; work with security teams to integrate the cloud into existing operational and security processes; and implement information governance compliance so staff can collaborate safely and securely.
Oliver Tweedie, Head of Digital Platforms at the Bank of England, said, “We have selected Appvia as our Cloud Delivery Partner to help us realise the Bank’s cloud ambitions and unlock the potential of the Cloud. Appvia come with a great pedigree and a wealth of experience delivering Cloud services within government. Working in collaboration with Bank Technology teams, Appvia will help us shape and build the future of Cloud services across our organisation – a key part of our Technology strategy.”
Jon Shanks, CEO and Co-Founder of Appvia, said, “This is an exciting opportunity to work with the Bank as it undergoes a step-change in its approach to the cloud. Harnessing innovative cloud solutions, such as containers and Kubernetes is a real business enabler for the Bank to streamline the software development lifecycle, ways of working and cloud operating model. We look forward to working with all stakeholders at the Bank of England to support its digital transformation journey.”
Appvia, which counts the Home Office among its major clients, is a self-service platform that enables organisations to scale their infrastructure quickly, securely and easily using services such as Kubernetes. In September, Appvia launched the world’s first developer-centric tool to enable teams to predict and control cloud costs.