
Biometrics and data protection in financial services

By Emma Erskine-Fox, associate at UK law firm TLT LLP

When discussing biometrics with others, I find that two television programmes are inevitably mentioned: BBC drama “The Capture” and Charlie Brooker’s dystopian “Black Mirror”. But biometrics are no longer the stuff of futuristic television. They increasingly form part of everyday life, from unlocking our phones with our fingerprints or faces, to iris recognition in airport security, to voice recognition when we talk to Alexa and Siri.


For the financial services sector, biometrics form a key part of upcoming regulatory requirements. The introduction of strong customer authentication (SCA) requirements in the Second Payment Services Directive (PSD2) puts biometrics front and centre in authenticating customer identity. When the SCA requirements come into force, payment service providers will need to authenticate customer identity using two or more of the following elements: knowledge (something only the user knows, such as a password or PIN); possession (something only the user possesses, such as a card reader); and inherence (something only the user is, i.e. a piece of biometric data).
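The two-of-three rule described above can be expressed as a simple check. The sketch below is illustrative only (the names and structure are assumptions, not a PSD2-mandated implementation); the point is that the elements presented must come from at least two *independent* categories, so two passwords would not satisfy SCA.

```python
from enum import Enum

class Factor(Enum):
    KNOWLEDGE = "knowledge"    # something only the user knows (password, PIN)
    POSSESSION = "possession"  # something only the user possesses (card reader)
    INHERENCE = "inherence"    # something the user is (biometric data)

def satisfies_sca(presented: set) -> bool:
    """SCA requires elements from two or more independent categories.

    `presented` is the set of *categories* the user has authenticated with,
    so two factors of the same kind (e.g. two passwords) count only once.
    """
    return len(presented) >= 2

# A fingerprint alone is one category; fingerprint plus PIN spans two.
assert not satisfies_sca({Factor.INHERENCE})
assert satisfies_sca({Factor.INHERENCE, Factor.KNOWLEDGE})
```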

The advantages of biometrics, both for businesses and users, are clear (and we’ll touch on some of these below). However, no conversation about biometrics would be complete without digging into the challenges posed by the General Data Protection Regulation 2016 (GDPR). Biometric data is a “special category” of personal data under the GDPR, meaning it is afforded higher levels of protection. Financial services organisations need to be keenly aware of the GDPR implications of processing biometric data to avoid weighty fines and reputational damage.

What exactly are “biometrics”? 

The mention of “biometrics” immediately brings to mind dusting for fingerprints and scanning faces in crowds. Facial and fingerprint recognition are certainly prime examples of biometric technology at work, but the concept of “biometrics” extends much further than this.

The GDPR definition of “biometric data” refers to both “physical and physiological characteristics” (encompassing the traditional examples of fingerprints and facial images, as well as, for example, iris and retina scanning, palm veins, voice recognition and DNA) and “behavioural characteristics”. The GDPR does not define this concept further, but the European Banking Authority’s (EBA’s) opinion on SCA, released in June 2019, gives an indication of how broadly it may be construed. When examining what would constitute “inherence”, the EBA refers to “behavioural biometrics” as including behavioural processes created by the body. In a non-exhaustive list of characteristics that may fall within the concept of “inherence”, the EBA identifies (among others) heart rate, keystroke dynamics (the way a user types) and even the angle at which a user holds their device.

Biometrics use cases in financial services 

SCA is the obvious example of where biometrics is already coming into play in banking and financial services. But the potential of biometrics in this arena is vast. In a world where security is key, the value of using a part of yourself as your password cannot be overstated. After all, you can’t forget or lose your fingerprint. NatWest became the first bank, in October 2019, to issue a biometric credit card, using fingerprint recognition to authenticate identity and allow payments to be made. China has taken this a step further with “Smile-to-Pay”, which allows users to pay for goods simply by (you guessed it) smiling at a point-of-sale machine.

Biometrics also lend themselves easily to fraud detection and prevention. Take keystroke patterns; if my bank can detect that I always pause for a microsecond before the asterisk in my online banking password to find the right key, any failure to do so can trigger further authentication methods to make sure that it’s not a more adept, yet fraudulent, typist trying to access my account.
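The keystroke-pattern idea can be made concrete with a toy anomaly check. This is a minimal sketch under assumed names and a made-up tolerance, not a real fraud-detection algorithm: it compares the inter-key timing intervals observed at login against the user's stored profile, and a large average deviation (such as a missing habitual pause) triggers step-up authentication.

```python
def keystroke_anomaly(profile_ms, observed_ms, tolerance_ms=40):
    """Return True if the observed inter-key timings (in milliseconds)
    deviate from the user's stored profile by more than the tolerance,
    on average. A True result would trigger further authentication."""
    deviations = [abs(p - o) for p, o in zip(profile_ms, observed_ms)]
    return sum(deviations) / len(deviations) > tolerance_ms

# Stored profile: this user habitually pauses ~250ms before one key.
profile = [90, 110, 250, 95]

assert not keystroke_anomaly(profile, [85, 120, 240, 100])  # close match
assert keystroke_anomaly(profile, [80, 90, 95, 90])         # pause missing -> step up
```

Real systems use far richer statistical models, but the principle is the same: the behavioural pattern, not just the password itself, is part of the credential.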

There’s a space for biometrics in customer service, too. Customers increasingly expect a smoother, more technology-enabled service from the organisations they engage with. It’s easy to imagine voice recognition being used on customer service lines both to identify the customer without having to ask for authentication information, and potentially to inform how that customer is dealt with based on the customer’s tone and perceived mood.

Privacy challenges of biometrics

Despite the clear advantages of biometrics, organisations need to exercise caution when deploying biometric technology into their businesses. As mentioned above, biometric data is a “special category” of personal data within the GDPR definition, which means that it must be handled even more carefully than “standard” personal data. Just some of the privacy implications of using biometric technologies are as follows:

  • Transparency: Any processing of personal data requires organisations to inform individuals how their personal data is being processed, in a “clear” and “easily accessible” form. Biometric technologies often don’t have a typical user interface that would generally be used to enable access to a privacy notice; for example, voice recognition on a customer service line. Businesses will need to think about how they give customers access to appropriate privacy information within those constraints and how they can make sure that users understand the information provided to them. Particularly when biometrics are combined with other privacy-intrusive technologies, such as AI, the standard for demonstrating that comprehensive information has been provided is likely to be high. 
  • Lawful basis: Businesses need to identify an appropriate lawful basis for the processing of biometric data. As biometric data is special category personal data, a processing condition will also be required. For some use cases, this will be straightforward; where there is a legal obligation to process this type of data, such as in the SCA example, organisations will have a clear basis for that processing. In other cases, this may be trickier to demonstrate. Generally, the business will need to be able to demonstrate that the processing of biometric data is “necessary” for a particular purpose, and it could be debated whether using biometric data will ever be “necessary”, where there are usually other ways to achieve the same means. Consent to use biometric data may well be required and where this is the case, businesses need to be careful to provide users with appropriate choice and not to make the provision of a service conditional on consenting to the use of biometric data. 
  • Accuracy and bias: The risks of bias in facial recognition technology are well-documented, but all processing of biometric data is subject to obligations to ensure that the data, including decisions generated using that data, are accurate. Businesses will need to think about what processes can be put in place to allow the accuracy of data to be challenged (for example, if an individual is incorrectly identified as a fraudster through keystroke patterns) and should continually test and audit biometric technologies to ensure inaccurate decisions are not consistently being made. 
  • Automated decision-making: Wholly automated decisions that have a legal or significant effect on an individual cannot be based on biometric data except in very limited circumstances or with the individual’s consent. Businesses should build mechanisms into their biometric technologies to ensure that there is a human review of a decision; for example, using the technology to flag suspicious activity which is then reviewed manually to determine if action needs to be taken, rather than freezing accounts immediately. 
  • Security: One of the key advantages of biometrics is simultaneously one of the key challenges. Whilst you cannot forget your biometric data, you also cannot change it, unlike a password or a payment card. Once compromised, there is little a user can do to regain protection and control over that data. Security measures therefore need to be of the highest standards, and businesses need to ensure that third party technology providers involved in the processing of biometric data implement and monitor equivalent security measures.
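The human-review pattern suggested under “Automated decision-making” above can be sketched as a triage step. Everything here (names, the score, the threshold) is illustrative, not a prescribed design: the point is that a low-confidence biometric match routes the case to a person for the final decision, rather than the system freezing the account automatically.

```python
from dataclasses import dataclass

@dataclass
class Transaction:
    account: str
    amount: float
    biometric_match: float  # similarity score from the biometric check, 0.0-1.0

REVIEW_THRESHOLD = 0.6  # illustrative cut-off

def triage(tx: Transaction) -> str:
    """Route low-confidence matches to a human reviewer rather than
    taking a wholly automated adverse action (e.g. freezing the account)."""
    if tx.biometric_match < REVIEW_THRESHOLD:
        return "flag_for_manual_review"  # a person makes the final call
    return "approve"

assert triage(Transaction("A1", 50.0, 0.95)) == "approve"
assert triage(Transaction("A1", 50.0, 0.30)) == "flag_for_manual_review"
```

Keeping a human in the loop in this way is one route to avoiding the GDPR restrictions on solely automated decisions with legal or significant effects.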

Addressing the challenges 

A ‘privacy by design’ approach is key when designing and implementing biometric technology solutions. Data protection impact assessments (DPIAs) are mandatory for “high-risk” processing, particularly using new technologies. A DPIA will be indispensable not just to demonstrate compliance but to help businesses flush out where the key risks lie and determine and implement solutions to mitigate those risks.

Whatever the scenario, the processing of biometric data will always need to be proportionate, fair and justified. Businesses should think about the purposes they intend to achieve: can those purposes be achieved using less intrusive means? If the answer is “yes”, it will be a challenge to demonstrate that using biometric data to achieve those purposes is proportionate. Ethical considerations should also be taken into account throughout the design and implementation process to ensure compliance with the overarching GDPR requirement that processing be “fair”.

The processing of biometric data will not always be at odds with the privacy legal framework, but a failure to consider the GDPR implications can land businesses in hot water. Thinking through the privacy risks from the outset can help organisations to design effective biometric solutions that respect individuals’ privacy and comply with the legislative requirements in place.