    Technology

    It’s (still) Not Fair

    Published by Gbaf News

    Posted on March 6, 2018


    Last updated: January 21, 2026


Colin Gray, Principal Consultant, SAS UK & Ireland

    As we progress into 2018, the General Data Protection Regulation (GDPR) is looming on most organisations’ radar. The requirements are becoming clearer, but there is still some ambiguity about precisely what needs doing. There is, however, no question that algorithms are an area of concern, and in particular, whether their effects are clearly understood — and more importantly, are fair.

    In this article, I’m following up on a previous blog where I discussed the fairness of algorithms used by businesses to help them make decisions. I described an example of an algorithm using age, income and number of dependants to determine whether to offer a loan to an individual. In this blog, I want to discuss the nature of risk and how algorithms use an assessment of risk to determine the price individuals pay.
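The loan-decision example from that earlier post can be sketched as a simple rule set. The thresholds and the per-dependant cost below are illustrative assumptions, not any real lender's criteria:

```python
# Hypothetical sketch of a rule-based loan decision using age, income and
# number of dependants. All thresholds are made-up illustrative values.

def offer_loan(age: int, income: float, dependants: int) -> bool:
    """Return True if the applicant qualifies under this toy rule set."""
    if age < 21:                                # assumed minimum-age rule
        return False
    disposable = income - 5_000 * dependants    # assumed cost per dependant
    return disposable >= 15_000                 # assumed affordability floor

print(offer_loan(age=35, income=40_000, dependants=2))  # True
print(offer_loan(age=19, income=40_000, dependants=0))  # False
```

Even a toy rule like this shows why fairness questions arise: the age cut-off and the dependant penalty encode judgements about who represents acceptable risk.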

    The case of motor insurance

    Let’s use motor insurance as an example: around 5%–10% of policyholders will claim each year, which means that 90%–95% won’t. The claims are spread across all policyholders. We know that certain drivers pay more than others: young drivers, drivers with previous claims, those with expensive cars. Actuaries identify the factors most strongly correlated with accidents and set the premiums accordingly. Given that most people won’t have an accident in any given year, is it fair to charge different premiums?
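The actuarial logic here can be sketched as expected loss plus a loading. The claim probabilities, claim cost and loading below are illustrative assumptions, not market figures:

```python
# Toy illustration of risk-based pricing: premium = claim probability x
# expected claim cost, grossed up by a loading for expenses and profit.
# All numbers are assumed for illustration.

def risk_premium(p_claim: float, avg_claim_cost: float, loading: float = 0.25) -> float:
    """Expected annual loss for this risk, plus a proportional loading."""
    return p_claim * avg_claim_cost * (1 + loading)

# A lower-risk driver (5% claim rate) vs a higher-risk driver (10% claim
# rate), assuming an average claim of 6,000:
print(round(risk_premium(0.05, 6_000)))  # 375
print(round(risk_premium(0.10, 6_000)))  # 750
```

Doubling the claim probability doubles the premium, which is exactly the differential pricing the fairness question is about.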

    You could argue that some of the factors used are inherent, such as age, while others are more choice-based: where you live, how fast you drive, the cost of your car. What would it mean if the regulations said that everyone had to be charged the same premium?

    • Young drivers → lower premium
    • More expensive cars → lower premium
    • More reckless driving → lower premium

    In practice, therefore, drivers wouldn’t be paying for the risk they represent. There would be subsidies across these groups, which as a society, we may or may not be happy with.
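The size of that cross-subsidy is easy to make concrete. The group shares and claim rates below are illustrative assumptions:

```python
# Sketch of the cross-subsidy a flat premium creates between risk groups.
# Group shares, claim probabilities and claim cost are assumed values.

groups = {                       # group: (share of drivers, claim probability)
    "low risk":  (0.80, 0.05),
    "high risk": (0.20, 0.20),
}
AVG_CLAIM = 6_000                # assumed average claim cost

# A flat premium that just covers total expected losses across everyone:
flat = sum(share * p * AVG_CLAIM for share, p in groups.values())

for name, (share, p) in groups.items():
    fair = p * AVG_CLAIM         # risk-based expected cost for this group
    print(f"{name}: fair={fair:.0f}, flat={flat:.0f}, gap={fair - flat:+.0f}")
```

Under these assumptions the flat premium is 480: low-risk drivers overpay their expected cost of 300, while high-risk drivers pay far less than their expected cost of 1,200. That gap is the subsidy the text describes.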

    However, I think we should be more concerned about second-order effects, that is, drivers are not being charged — some might call it penalised — for their risky behaviour or risky characteristics. This could encourage some drivers to take greater risks, knowing that the cost would be shared with others, and they would not personally face any financial penalty. There is very little question that the number of accidents and ultimately deaths would go up. This is presumably not something that we would want.

    Algorithms as a force for good

    I’m not suggesting that the regime above is what GDPR proposes; far from it. I am just noting that algorithms (in this case pricing models) are used to make sound financial decisions that benefit society. Removing the ability to distinguish levels of risk in the interests of ‘fairness’ may have unforeseen consequences.

    We’ve only considered the insurance industry, but the same could be applied to other industries like credit lending. A single price regime would probably increase the number of business failures and individual defaults, because it could encourage high-risk individuals and businesses to apply for credit.

    Algorithms aren’t exact — they can’t (yet) distinguish completely between those who will have an accident and those who won’t, and I’m not sure we’d want them to anyway, as this could create a class of uninsurable drivers. They do, however, help to spread risk across multiple groups in a relatively equitable manner. They can be used to penalise high-risk behaviours that are detrimental to society and, hopefully, reduce those behaviours. Equally, like any tool, they can be misused to exclude sections of society and prevent people from fully participating.

    As analysts, data scientists and statisticians, we need to know the difference — and we need to ask the right questions to make sure that organisations are doing the right thing.

