
The Value of Explainable AI (XAI) in Financial Services 


By Alexei Markovits, AI Team Manager, Element AI

The world around us is constantly changing due to ground-breaking advances in artificial intelligence (AI). AI systems are being used to buy and sell millions of financial instruments, assess insurance claims, assign credit scores and optimise investment portfolios. Along with these advancements, we also need a framework for understanding how AI arrives at its findings and suggestions, in order to build the trust needed to use these systems to their full potential.

The processes behind how modern AI works aren't always obvious. Many of today's advanced machine learning algorithms that power AI systems are inspired by the processes of the human brain, but, unlike humans, they cannot readily explain their actions or reasoning.

With this in mind, an entire research field is now working towards describing the rationale behind AI decision-making: Explainable AI (XAI). While modern AI systems demonstrate performance and capabilities far beyond previous technologies, practicality and legal compliance present hurdles to successful implementation.

For organisations looking to utilise AI effectively, XAI will be a key deciding factor due to its ability to help foster innovation, enable compliance with regulations, optimise model performance, and enhance competitive advantage.

The value of explainable AI in financial services 

Explainability techniques are becoming especially valuable in financial services. Many service providers and consultants will already be aware of the low signal-to-noise ratio typical of financial data, which in turn demands a strong feedback loop between user and machine.


AI solutions designed without human feedback capabilities risk never being adopted, as users will fall back on favoured traditional approaches that rely on domain expertise and years of experience. AI-powered products that are not auditable will struggle to enter the market at all, as they will face regulatory hurdles.

Market forecasting and investment management 

Time series forecasting methods have grown in prominence across financial services. They are useful for predicting asset returns, econometric data, market volatility and bid-ask spreads, but are limited by their dependence on historical values. Because they can miss disparate, meaningful information about the current day, using time series alone to predict the most likely value of a stock or the level of market volatility is very challenging.

By complementing such models with explainability methods, users can understand the key signals the model uses in its prediction, and interpret the output based on their own complementary view of the market. This then enables synergy between finance specialists’ domain expertise and the big data crunching abilities of modern AI.
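As a concrete illustration of this idea, the sketch below pairs a simple forecaster with a model-agnostic explanation step. The model, lag features and synthetic data are illustrative assumptions rather than any particular production pipeline; the point is that permutation importance surfaces which signals the forecast actually relies on, so a domain expert can sanity-check them against their own view of the market.

```python
# A minimal sketch: a time-series forecaster plus an explainability step.
# Features, model choice and data are illustrative assumptions.
import numpy as np
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.inspection import permutation_importance

rng = np.random.default_rng(0)

# Synthetic daily returns plus a hypothetical exogenous signal.
n = 500
returns = pd.Series(rng.normal(0, 0.01, n), name="return")
features = pd.DataFrame({
    "lag_1": returns.shift(1),                      # yesterday's return
    "lag_5": returns.shift(5),                      # return one week ago
    "vol_20d": returns.rolling(20).std().shift(1),  # trailing volatility
    "rate_spread": rng.normal(0, 1, n),             # hypothetical macro signal
}).dropna()
target = returns.loc[features.index]

# Fit a simple forecaster on the lagged features.
model = GradientBoostingRegressor(random_state=0)
model.fit(features, target)

# Model-agnostic explanation: which signals does the forecast depend on?
result = permutation_importance(model, features, target,
                                n_repeats=20, random_state=0)
for name, score in sorted(zip(features.columns, result.importances_mean),
                          key=lambda x: -x[1]):
    print(f"{name:12s} importance: {score:.5f}")
```

A user who disagrees with the weight the model places on, say, trailing volatility can feed that judgement back into feature selection or model constraints, which is exactly the user-machine feedback loop described above.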

Explainability techniques also enable human-in-the-loop AI solutions for portfolio selection. An investor might choose not to pick the suggested portfolio with the highest reward if the associated risk appears too high. On the other hand, a system that provides a detailed explanation of the risks, such as whether they are uncorrelated with the wider market, is a powerful addition to investment planning tools.
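One simple form such an explanation can take is a risk decomposition of the suggested portfolio. The sketch below, with hypothetical asset names, weights and covariances, shows each asset's contribution to total portfolio volatility, giving the investor a concrete view of where the risk comes from before they accept or reject the suggestion.

```python
# A minimal sketch of explaining a candidate portfolio's risk by
# decomposing it into per-asset risk contributions. Asset names, weights
# and the covariance matrix are illustrative assumptions.
import numpy as np

assets = ["equities", "gov_bonds", "commodities", "fx_carry"]
weights = np.array([0.45, 0.30, 0.15, 0.10])

# Hypothetical annualised covariance matrix of asset returns.
cov = np.array([
    [0.0400, 0.0020, 0.0100, 0.0050],
    [0.0020, 0.0025, 0.0005, 0.0010],
    [0.0100, 0.0005, 0.0625, 0.0040],
    [0.0050, 0.0010, 0.0040, 0.0100],
])

port_vol = np.sqrt(weights @ cov @ weights)

# Each asset's contribution to total portfolio volatility; the
# percentages sum to 100%.
marginal = cov @ weights / port_vol
contribution = weights * marginal

print(f"portfolio volatility: {port_vol:.2%}")
for name, c in zip(assets, contribution):
    print(f"{name:12s} risk contribution: {c / port_vol:6.1%}")
```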

Credit scoring 

Assigning or denying credit to an applicant is a consequential decision that is highly regulated to ensure fairness. The success of AI applications in this field is dependent on the ability to provide a detailed explanation of final recommendations.

Beyond compliance, XAI delivers value to clients and financial institutions in different ways. Clients can receive explanations that give them the information they need to improve their credit profile, while service providers can better understand predicted client churn and adapt their services.
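A minimal sketch of what such a per-applicant explanation might look like is shown below. The features, synthetic data and model are illustrative assumptions; the technique here is simple linear attribution from a logistic regression, where each feature's contribution to the applicant's log-odds of default can be read back as a "reason code".

```python
# A minimal sketch of per-applicant credit explanations ("reason codes").
# Features, data and the model are illustrative assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
feature_names = ["utilisation", "missed_payments", "account_age_years", "income"]

# Synthetic applicant data and a synthetic default label.
X = np.column_stack([
    rng.uniform(0, 1, 1000),           # credit utilisation
    rng.poisson(0.5, 1000),            # missed payments
    rng.uniform(0, 20, 1000),          # account age in years
    rng.normal(40_000, 12_000, 1000),  # income
])
y = (rng.random(1000) <
     1 / (1 + np.exp(-(2 * X[:, 0] + X[:, 1] - 0.1 * X[:, 2])))).astype(int)

scaler = StandardScaler().fit(X)
model = LogisticRegression().fit(scaler.transform(X), y)

def explain(applicant):
    """Each feature's contribution to this applicant's log-odds of default."""
    z = scaler.transform(applicant.reshape(1, -1))[0]
    contributions = model.coef_[0] * z
    order = np.argsort(-np.abs(contributions))
    return [(feature_names[i], contributions[i]) for i in order]

applicant = X[0]
proba = model.predict_proba(scaler.transform(applicant.reshape(1, -1)))[0, 1]
print(f"predicted default probability: {proba:.1%}")
for name, c in explain(applicant):
    direction = "raises" if c > 0 else "lowers"
    print(f"{name:18s} {direction} default risk (contribution {c:+.2f})")
```

Sorting contributions by magnitude is what lets the institution tell an applicant which factors most hurt their score and, conversely, what they could change to improve it.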

Through the use of XAI, credit scoring models can also help reduce risk. For example, an XAI model might explain why a particular pool of assets has the best distribution to minimise the risk of a covered bond.

Designing for explainability 

As AI solutions evolve past proof-of-concept to deployment at scale, it is pivotal to prioritise explainability, both to power human-AI collaboration and to satisfy audit, regulatory and adoption needs. A user-centric approach, combined with the need for transparency across AI systems, naturally requires explainability to be part of that cycle, from the initial steps of building a solution through to system integration and use.
