

Algorithmic bias — time to redress the balance


By Jean Van Vuuren, Associate Vice President – EMEA Commercial, Hyland

In a world of instant gratification, consumers no longer expect to have to wait. So, when they make an application for a loan or mortgage, they’re looking for a swift decision.

Across the financial services sector, artificial intelligence (AI) is playing an increasingly important role in automating processes, making them faster and less resource-intensive. However, while no one doubts the ability of automation to improve efficiency, vigilance is needed to prevent unintended bias, and some are now questioning its capacity for fairness. Can we be confident that the algorithms being deployed are free of bias against certain groups, bias that could lead to their financial exclusion?

Lack of awareness is no excuse

The UK Centre for Data Ethics and Innovation Barometer has previously identified the potential of such bias as “the biggest risk arising from the use of data-driven technology”, with the organisation now working with “partners to facilitate responsible data sharing, better public sector AI and data use, and laying the foundations for a strong AI assurance ecosystem”. Meanwhile, the European Banking Authority (EBA), Bank of England and Financial Conduct Authority are sufficiently concerned to have begun looking at the potential social impact and how new technologies could negatively affect lending decisions.

It’s a challenge that’s unlikely to go away. No one is sure of its scale, though a first-of-its-kind pilot study by the NHS into how health and care services are allocated and delivered may shed some light on the numbers affected by algorithmic bias.

And if there is a very real issue, what can be done to redress the balance?

The answer lies in ‘algorithmovigilance’ – the systematic monitoring by financial institutions of their algorithms to ensure that the plethora of automated processes, from credit-reference checks to fraud and anti-money-laundering (AML) screening, do not, unconsciously or otherwise, discriminate against certain individuals or groups.

By being algorithmovigilant, organisations are better equipped to recognise and root out the human biases that can all too readily become part of the AI process and lead to unfair decision making.
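As an illustration of what one such vigilance check might look like in practice, the sketch below computes the "four-fifths" disparate impact ratio, a widely used red-flag test that compares approval rates across applicant groups. The group names, decision data and threshold are all hypothetical, not drawn from any real institution.

```python
# A minimal sketch of one algorithmovigilance check: the "four-fifths"
# disparate impact ratio, comparing loan-approval rates across groups.
# All group names and figures below are hypothetical illustrations.

def approval_rate(decisions):
    """Fraction of applications approved (decisions are True/False)."""
    return sum(decisions) / len(decisions)

def disparate_impact_ratio(group_a, group_b):
    """Ratio of the lower approval rate to the higher one.
    Values below ~0.8 are a common red flag for adverse impact."""
    ra, rb = approval_rate(group_a), approval_rate(group_b)
    return min(ra, rb) / max(ra, rb)

# Hypothetical loan decisions for two applicant groups
group_a = [True, True, True, False, True]    # 80% approved
group_b = [True, False, False, True, False]  # 40% approved

ratio = disparate_impact_ratio(group_a, group_b)
print(f"Disparate impact ratio: {ratio:.2f}")
if ratio < 0.8:
    print("Potential adverse impact - flag for review")
```

A real deployment would run such checks routinely across every protected characteristic and every automated decision point, not on a single pair of groups.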

Data validation

How does this bias arise in the first place? It is usually the result of systems being built on imperfect datasets that are incomplete, incorrect or out of date, to which programmers, managers and other stakeholders add their own assumptions and prejudices. Out of this process emerge algorithms that skew assessments.
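To make the idea of imperfect datasets concrete, the sketch below flags the three failure modes mentioned above: incomplete, incorrect and out-of-date records. The field names, thresholds and sample record are hypothetical, chosen purely for illustration.

```python
# A sketch of basic dataset validation before feeding applicant records
# into a credit model: flagging incomplete, implausible or stale data.
# Field names and thresholds are hypothetical assumptions.

from datetime import date

REQUIRED_FIELDS = {"income", "postcode", "last_updated"}
MAX_AGE_DAYS = 365  # assumed staleness threshold

def validate_record(record, today=date(2022, 1, 1)):
    """Return a list of data-quality issues for one applicant record."""
    issues = []
    present = {k for k, v in record.items() if v is not None}
    missing = REQUIRED_FIELDS - present
    if missing:
        issues.append(f"missing fields: {sorted(missing)}")
    income = record.get("income")
    if income is not None and income < 0:
        issues.append("implausible negative income")
    updated = record.get("last_updated")
    if updated is not None and (today - updated).days > MAX_AGE_DAYS:
        issues.append("record out of date")
    return issues

# A hypothetical record exhibiting all three problems
record = {"income": -5000, "postcode": None, "last_updated": date(2020, 6, 1)}
print(validate_record(record))
```

Checks like these catch only the mechanical defects in a dataset; the human assumptions layered on top still need the governance and training discussed below.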

It should go without saying that AI platforms must be designed to meet all legal, social and ethical standards. This means firms need to make algorithmovigilance a priority, both to steer clear of legal and regulatory risks and to avoid long-term reputational damage.

Removing this kind of bias is paramount, given that trust in so many organisations has been eroded. Consumers are actively searching out companies that do ‘the right thing’ and deliver on their promises. What they want and need is transparency, not to fall prey to decisions that can’t be justified or challenged because they’ve been made by an algorithm that no one understands. The computer may say ‘no’, but that doesn’t mean it’s right.

Levelling the playing field

If they are to play their part in creating a level playing field, senior industry leaders must ensure that algorithmovigilance is embedded in their corporate and governance processes, with staff trained to be alert to unintended bias.

Continuous monitoring is therefore required, with algorithms adjusted as market and social conditions change. Only then can organisations truly say that their application procedures are as unbiased as they can be.

Setting up a team of subject experts to form a centre of excellence, to help ensure consistency of approach throughout the organisation, is a good starting point, as is the regular monitoring of customer data to ensure it is as complete and accurate as possible.

Organisations should also look to actively work with regulators to keep up with best practice.

Algorithmovigilance touches on our fundamental relationship with technology, a tool that should serve us, as long as the processes are managed appropriately. This is an industry-wide challenge, to which no one has all the answers, and it’s down to individual institutions and their technology partners to build the strategies, platforms and insights, fit for tomorrow’s marketplace.
