

Biometrics Is the Fraud-Fighting Superhero Banks Deserve

By Sanjay Gupta, SVP and Managing Director, HooYu at Mitek Systems

You may have recently received a call from a breathless "IRS employee." The caller claims you owe back taxes and that if you don't pay soon, the IRS will come after you. The caller is, of course, a fraudster; the IRS only initiates contact with taxpayers by mail. Unfortunately, plenty of people proceed to give these fraudsters personal information. Some even transfer the requested dollar amount.

Even if one stops short of wiring money to the fraudster’s account, the damage is already done if the criminal acquires personally identifiable information (PII). Today’s criminals are becoming more adept at collecting PII from unwitting strangers to commit identity theft and eventually crimes such as credit card, loan, or phone or utilities fraud.

Attempts such as phony IRS calls are particularly effective because they target individuals during moments of duress: “Oh, no! I owe how much in back taxes?” Someone in a state of panic is more likely to hand over the proverbial keys to their account. They’re focused on finding a way out of a bad situation and allow security practices to fall by the wayside.

Fraudsters can then use that PII to mimic a real person and open a credit card account or take over an existing account. After a few months of regular payments, they max out the line of credit and ditch the account. Now, there are two victims of identity theft: the individual whose identity was stolen and the bank, which is owed a debt it cannot collect.

Financial services firms are incentivized to protect both themselves and their customers from fraud. But criminals are becoming more sophisticated in their attack methods. Increasingly, today's fraudsters use tactics like the phony IRS call to create synthetic identities. By combining made-up information, stolen or slightly modified real PII, and information purchased on the dark web, fraudulent actors can create legitimate-looking online identities. Unfortunately for banks trying to protect customers and their own lines of credit, up to 95% of synthetic identities beat the systems designed to stop fraud.

Financial services firms must rethink their approach to fraud detection and leverage technology that can stop tactics like synthetic identity fraud in their tracks.

Right now, banks often rely solely on static PII, such as Social Security numbers, at account origination. Or they lean on credit bureau information for fraud detection. Most have enabled two-factor authentication for logging into digital platforms. But the two factors are typically a static password and a numerical code sent by text. These approaches will not work when criminals have that information readily available.

Instead, financial services firms should employ solutions that incorporate AI-driven verification technology like biometrics to continuously authenticate their customers' identities and catch fraud as it happens, if not before. This might be as simple as two-factor authentication that requires a customer to take a selfie, along with inputting their password, to log into their banking app. Liveness detection technology can also analyze voice cues in real time to verify that an actual human is logging into an app or using an ATM. Fraudsters would need to obtain PII and look and speak exactly like the person they're mimicking to breach that sort of biometric firewall.
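The selfie-plus-password flow described above can be sketched in a few lines. This is a minimal illustration, not any vendor's actual API: the face-match score is assumed to come from an external biometric service, and the `FACE_MATCH_THRESHOLD` value is a made-up example that a real system would tune against its false-accept and false-reject rates.

```python
import hashlib
import hmac

# Assumed cutoff for the biometric factor; purely illustrative.
FACE_MATCH_THRESHOLD = 0.90

def verify_password(stored_hash: str, salt: bytes, attempt: str) -> bool:
    """First factor (knowledge): constant-time check of a salted password hash."""
    attempt_hash = hashlib.pbkdf2_hmac(
        "sha256", attempt.encode(), salt, 100_000
    ).hex()
    return hmac.compare_digest(stored_hash, attempt_hash)

def authenticate(stored_hash: str, salt: bytes,
                 password: str, face_match_score: float) -> bool:
    """Both factors must pass: the password AND the selfie match.

    face_match_score is assumed to be produced by an external
    face-matching service comparing the login selfie against the
    customer's enrolled reference image.
    """
    return (verify_password(stored_hash, salt, password)
            and face_match_score >= FACE_MATCH_THRESHOLD)

# Enrollment: hash the password once and store the hash plus salt.
salt = b"demo-salt"
stored = hashlib.pbkdf2_hmac("sha256", b"hunter2", salt, 100_000).hex()

print(authenticate(stored, salt, "hunter2", 0.97))  # both factors pass
print(authenticate(stored, salt, "hunter2", 0.40))  # selfie mismatch
print(authenticate(stored, salt, "wrong!!", 0.97))  # wrong password
```

The key design point is the conjunction: a stolen password alone, or a lookalike photo alone, fails the login, which is exactly why a biometric second factor resists the PII-theft attacks described above.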

Other biometric-based approaches analyze behavior, such as how a person types, what words and phrases they use often in correspondence and their internet activity to develop a unique digital fingerprint. Even if a phony IRS call extracts PII, the criminal with that information has no way of replicating typing patterns or someone’s unique lexicon.

Whether through behavioral analytics or liveness detection, financial services firms should consider employing biometrics to fight fraud and reduce account takeover, because facial patterns are infinitely more complex than six-digit passcodes. Criminals are too sophisticated, and PII is too readily available, for yesterday's fraud-detection models to withstand modern fraudsters' attempts. Moving forward, banks must focus on getting their technology right. They must proactively future-proof their identity verification processes against modern fraudsters. It's the only way to protect themselves and their customers.
