By James Fisher, SVP of data firm Qlik
By now we were all supposed to be more connected; instead, we are becoming more fragmented and siloed. Social media "likes" polarize us, as algorithms favor inflammatory content that evokes stronger reactions and keeps us hooked longer. We have seen fragmentation in local laws, regulations and privacy rules. In the private sector, business schools, strategy heads and activist investors preach divesting anything that is not a core competency – but in a fragmented world, with digital giants lurking around the corner, do we need to think differently?
For regulations, business models and data – which are increasingly the same thing – we can turn a fragmenting landscape into an opportunity. But analysis alone is not enough. We need synthesis AND analysis to connect distributed data to the analytic supply chain, with catalogues as the connective tissue. The technology is there today, but it must be paired with the right processes and people. Synthesis and analysis are critical to making use of pervasive data and to facilitating the evolution towards what we call "laying the data mosaic." Below is a curated subset of the top 10 trends we see as most important in the coming year; a full version will be published in January 2020.
- Big Data is Just Data. Next up – “Wide Data”
Big Data is a relative term, and a moving target. One way to define it: data is "big" when it exceeds what you can handle with your current technology. If you need to replace, or significantly invest in, extra infrastructure to cope with data volumes, you have a big data challenge. With infinitely scalable cloud storage, that shortcoming is gone. It is easier now than ever to do in-database indexing and analytics, and we have tools to ensure data can be moved to the right place. The mysticism around big data is gone – the consolidation and rapid demise of Hadoop distributors in 2019 is a signal of this shift.
The next focus area will be highly distributed, or "wide," data. Data formats are becoming more varied and fragmented, and as a result the number of database types suited to different flavors of data has more than doubled – from 162 in 2013 to 342 in 2019.* Combinations of wide data "eat big data for breakfast," and the companies that can synthesize these fragmented and varied data sources will gain an advantage.
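Synthesizing "wide" data in practice often means joining records that arrive in different shapes and formats. A minimal sketch, using hypothetical customer records split across a CSV source and a JSON source, might look like this:

```python
import csv
import io
import json

# Hypothetical snippets standing in for two differently shaped sources.
csv_src = io.StringIO("customer_id,region\n42,EMEA\n7,APAC\n")
json_src = '[{"customer_id": 42, "spend": 1800}, {"customer_id": 7, "spend": 950}]'

# Index the CSV rows by primary key, then fold in the JSON records.
by_id = {int(row["customer_id"]): row for row in csv.DictReader(csv_src)}
for rec in json.loads(json_src):
    by_id.setdefault(rec["customer_id"], {}).update(rec)

merged = sorted(by_id.values(), key=lambda r: r["customer_id"])
print(merged)
```

Real wide-data estates span far more formats (documents, graphs, time series), but the principle is the same: agree on a shared key and synthesize the fragments into one view.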
- DataOps + Analytic Self-Service Brings Data Agility Throughout the Organization
Self-service analytics has been on the agenda for a long time and, enabled by "modern BI" technology, has brought answers closer to business users. The same agility has not reached the data management side – until now. "DataOps" has come onto the scene as an automated, process-oriented methodology aimed at improving the quality, and reducing the cycle time, of data management for analytics. It focuses on continuous delivery, leveraging on-demand IT resources and automating the testing and deployment of data. Technologies like real-time data integration, change data capture (CDC) and streaming data pipelines are the enablers. Through DataOps, 80% of core data can be delivered to business users in a systematic way, with standalone self-service data preparation needed in fewer situations. With DataOps on the operational side and analytic self-service on the business side, fluidity is achieved across the whole information value chain, connecting synthesis with analysis.
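To make the CDC idea concrete: production CDC tools read the database's transaction log, but the essence is emitting change events rather than re-copying whole datasets. A minimal sketch, diffing two hypothetical snapshots keyed by primary key, shows the event stream a pipeline would carry downstream:

```python
def diff_snapshots(old, new):
    """Compare two snapshots (dicts keyed by primary key) and emit
    (action, key, row) change events -- the core idea behind CDC."""
    events = []
    for key, row in new.items():
        if key not in old:
            events.append(("insert", key, row))
        elif old[key] != row:
            events.append(("update", key, row))
    for key in old:
        if key not in new:
            events.append(("delete", key, None))
    return events

yesterday = {1: {"name": "Alice", "limit": 500}, 2: {"name": "Bob", "limit": 300}}
today = {1: {"name": "Alice", "limit": 750}, 3: {"name": "Cara", "limit": 200}}
print(diff_snapshots(yesterday, today))
# → [('update', 1, {'name': 'Alice', 'limit': 750}),
#    ('insert', 3, {'name': 'Cara', 'limit': 200}),
#    ('delete', 2, None)]
```

Streaming these events continuously, instead of diffing nightly snapshots, is what gives DataOps its real-time character.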
- Active Metadata Catalogues – the Connective Tissue for Data and Analytics
Demand for data catalogues is soaring as organizations continue to struggle to find, inventory and synthesize vastly distributed and diverse data assets. In 2020, we'll see more AI-infused metadata catalogues that shift this gargantuan task from manual and passive to active and adaptive. They will provide the connective tissue and governance for the agility that DataOps and self-service analytics deliver. Active metadata catalogues also enable information personalization, an essential component for generating relevant insights and tailoring content. For this to happen, a catalogue needs to work not just "inside" one analytical tool, but across the fragmented estate of tools that most organizations have.
- Data Literacy as a Service
Connecting synthesis and analysis into an inclusive system will help drive data usage, but no data and analytics technology or process in the world can function if people aren't on board – and dropping tools on users and hoping for the best is no longer enough. A critical step in overcoming the industry-standard analytics adoption rate of roughly 35% is helping people become confident in reading, working with, analyzing and communicating with data. In 2020, companies expect data literacy to scale and want to partner with vendors on that journey. This is achieved through a combined software, education and support partnership – delivered as a service – with outcomes in mind. The goal could be to drive adoption towards 100%, to combine DataOps with self-service analytics, or to make data part of every decision. To be effective, an organization needs to self-diagnose where it is and where it wants to go, and then work out symbiotically how those outcomes can be achieved.
- “Shazaming” Data, and Computer/Human Interactions
Data analysis at vast scale has now reached a tipping point, bringing landmark achievements. We all know Shazam, the famous music service that identifies a song from a recorded snippet. More recently, the same idea has expanded to other use cases, such as shopping for clothes by analyzing a photo, or identifying plants and animals. In 2020, we'll see more enterprise use cases for "shazaming" data – for example, pointing at a data source and getting telemetry such as where it comes from, who is using it, what its quality is, and how much of it has changed today. Algorithms will help analytic systems fingerprint the data, find anomalies and insights, and suggest new data that should be analyzed alongside it. This will make data and analytics leaner and enable us to consume the right data at the right time.
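A minimal sketch of the fingerprinting idea, assuming a column of values as input: profile a few telemetry metrics, plus a content hash that changes whenever the underlying data changes, so a catalogue can detect drift without re-reading everything.

```python
import hashlib

def fingerprint(column):
    """Profile a column: row count, null rate, distinct values, and a
    short content hash that changes whenever the data changes."""
    non_null = [v for v in column if v is not None]
    digest = hashlib.sha256(
        "|".join(sorted(str(v) for v in column)).encode()
    ).hexdigest()[:12]
    return {
        "rows": len(column),
        "null_rate": column.count(None) / len(column),
        "distinct": len(set(non_null)),
        "fingerprint": digest,
    }

telemetry = fingerprint([10, 12, 12, None, 15])
print(telemetry)  # rows: 5, null_rate: 0.2, distinct: 3, plus the hash
```

Comparing today's fingerprint against yesterday's answers "how much of this data has changed?" without shipping the data anywhere.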
We will see this combined with breakthroughs in how we interact with data – going beyond search, dashboards and visualization. Increasingly, we'll be able to interact sensorially, through movements and expressions, and even with the mind. Facebook's recent acquisition of CTRL-labs, maker of a mind-reading wristband, and Elon Musk's Neuralink project are early signals of what's to come. In 2020, some of these breakthrough innovations will begin to change how we experience interacting with data. This holds great benefits for all, but it can also be used for ill and must be applied responsibly.
Turn fragmentation to your advantage by connecting synthesis and analysis to form a dynamic system. DataOps and self-service will be the process and method; data literacy and ethics will guide people to do the right thing; and innovative technologies powered by AI will enhance and accelerate data use throughout the entire chain. These trends form the tiles for laying a data mosaic in a complex, fragmented world – making the use of data pervasive across the enterprise and ushering in the next phase of success in the digital age.
Entersekt provides clarity on Secure Remote Commerce authentication techniques for financial institutions
New whitepaper from Mercator available: Revisiting Authentication in the Age of SRC and EMV 3-D Secure
Is it time for a new authentication strategy in light of international mandates for Secure Remote Commerce (SRC) and EMV 3-D Secure? This is the question posed to financial institutions (FIs) in a new Mercator Advisory Group whitepaper entitled Revisiting Authentication in the Age of SRC and EMV 3-D Secure.
The paper, licensed by Entersekt for public distribution, delves into the role SRC and EMV 3-D Secure will play in the European Union’s Strong Customer Authentication (SCA) requirements under the revised Payment Services Directive (PSD2). It finds that now would be the ideal time for FIs to rethink customer authentication strategies, particularly with the deadline for full SCA compliance approaching on the 1st of January 2021.
“Consumers face an increasingly complex authentication landscape, which can vary greatly depending on the communication channels they use,” said Frans Labuschagne, UK&I country manager at Entersekt. “Multiple authentication techniques create unwanted friction and uncertainty. This paper gives actionable advice to FIs that need to keep security top of mind while also providing a good user experience.”
All card issuers competing for top-of-wallet status will find useful insights in this whitepaper, which states: "Since it is well recognised that convenience is critical to consumer adoption, it is time for financial institutions to rein in the multiplicity of authentication methods they use to identify account holders and even employees."
Some of the key findings include:
- The lack of an integrated solution results in an inconsistent user interface.
- Inconsistency not only detracts from a customer’s experience but is likely to disrupt any cross-channel implementation plans an organisation might have.
- A customer who is presented with the same authentication technique for every interaction becomes more familiar with that technique.
- The authentication technique should be implemented on a smartphone – a device that 89% of UK residents aged 16 to 75 already own.
- Consumers increasingly trust smartphone-based biometrics and are growing accustomed to using smart speakers for a range of use cases.
To download the whitepaper in full, please visit: https://www.entersekt.com/resources/white-papers/revisiting-authentication-src-3ds
This is a Sponsored Feature
Using AI to identify public sector fraud
When it comes to audits in the public sector, both accountability and transparency are essential. Not only is the public sector under increasing scrutiny to provide assurance that finances are being managed appropriately, but it is also vital to be able to give early warnings of financial pressures or failures. Right now, given the huge value of funds flowing from the public purse into the hands of individuals and companies due to COVID measures, renewed focus on audit is essential to ensure that these funds are used for the purposes intended by parliament.
As Rachel Kirkham, former Head of Data Analytics Research at the UK National Audit Office and now Director of AI Solutions at MindBridge, discusses, introducing AI to identify and rectify potential problems before they become an issue is a key way for public sector organisations and bodies to ensure public funds are being administered efficiently, effectively and economically.
The National Crime Agency has warned repeatedly that criminals are seeking to capitalise on the Covid crisis and the latest warnings suggest that coronavirus-related fraud could end up costing the taxpayer £4bn. From the rise in company registrations associated with Bounce Back loan fraud, to job retention scheme (furlough) misuse, what plans are in place for government departments to identify the scale of fraud and error and then recoup lost funds?
There is no doubt that the speed with which these schemes were deployed, when the public sector was also dealing with a fundamental shift in service delivery, created both opportunities for fraud and risk of systematic error. But six months on, while the pandemic is still creating economic challenges, the peak of the financial crisis has passed. Ongoing financial support for businesses and individuals remains important and it is now essential to learn lessons in order to both target fraudulent activity and, critically, minimise the potential loss of public funds in the future.
Timing is everything. Government has an opportunity to review the last 6 months’ performance and strengthen internal controls to ensure that further use of public funds is appropriate. Technology should play a critical role in detecting and preventing future fraud and error.
If the public sector is to move beyond the current estimates of fraudulent activity and gain real insight into both the true level of fraud and the primary areas to address, an intelligent, data-led approach will be critical. The use of Artificial Intelligence (AI) in public sector IT systems can be used to detect errors, fraud or mismanagement of funds, and enable the process changes required to prevent further issues.
HMRC is leading the way, using its extensive experience in identifying and tackling tax fraud to address the misuse of furlough – an approach that has led to many companies making use of the amnesty to repay erroneous claims. Other public sector bodies, especially smaller local authorities, are less likely to have the skills or resources in place to undertake the required analysis. If public money is to be both recouped and safeguarded in the future, it is likely that a central government initiative will be required.
Data resources are key; the government holds a vast amount of data that could be used, although this will require cross-government collaboration and co-operation. It is possible that the delivery speed of COVID-19 responses will have led to data collection gaps – an issue that will need rapid exploration and resolution. It should be a priority to take stock of existing data holdings to identify any gaps and, at the same time, use Machine Learning to identify anomalies that could reveal either fraud or systematic error.
In addition to identifying fraud, this insight can also feed back into claims processes providing public sector bodies with a chance to move away from retrospective review towards the use of predictive analytics to improve control. With an understanding of the key indicators of fraud, the application process can automatically raise an alert when a claim looks unusual, minimising the risk of such claims being processed.
While many public sector bodies may still feel overwhelmed, it is essential to take these steps quickly. Even at a time of crisis, good processes are important – failing to learn from the mistakes of the past few months will simply compound the problem and lead to greater misuse of public funds. The public sector, businesses, and individuals need to learn how to operate in this environment, and that requires the right people to spend time looking at the data, identifying problems and putting in place new controls. With an AI-led approach, these individuals will learn lessons about what worked and what didn’t work in this unprecedented release of public funds. And they will gain invaluable insight into the identification of fraud – something that will provide on-going benefit for all public sector bodies.
Why dependency on SMS OTPs should not be the universal solution
By Chris Stephens, Head of Banking Solutions at Callsign
In our day-to-day lives, SMS one-time passwords (OTPs) have unintentionally become the default authentication factor for high-risk and confidential transactions online. Banks, telcos and businesses opt for this method because SMS OTPs are relatively quick and simple to put in place. In our digital age, the solution works for the majority of users, who more often than not own a mobile phone and are familiar with the user experience. As a result, companies use SMS OTPs to authenticate both their customers and their employees.
When looking into SMS OTPs, businesses should consider the bigger picture – how time- and cost-efficient the solution is as a whole – taking into account elements that may have been neglected in the past, such as hidden fees and security vulnerabilities. There are also other options better suited to different business needs: the European Banking Authority (EBA) has already recognised other forms, such as the secure binding of a device as a possession factor and the use of behavioural biometrics as an inherence factor. Earlier this year, Google officially began moving away from SMS OTP-based authentication, while in the UK both the Financial Conduct Authority (FCA) and UK Finance have recommended that banks reduce their dependence on it in the longer term. In the past, by contrast, financial institutions chose SMS OTPs because doing so saved time in becoming compliant with the PSD2 Strong Customer Authentication (SCA) regulation.
It is common knowledge that SMS OTPs are not without their flaws, and with the extended SCA deadline for e-commerce less than a year away (September 2021), is now the best time for the industry to look elsewhere for more intelligent approaches to authentication?
SMS as the go-to solution
Fraudsters are sophisticated criminals who attack the weakest points in a system. They have observed that banks and businesses rely heavily on SMS OTPs for two-factor authentication (2FA), which is why they continue to abuse and weaken existing systems and exploit these solutions for their own benefit. A common technique is the SIM swap: the fraudster gathers personal information about the victim and then contacts the target's mobile operator, claiming the phone has been lost or stolen. With lockdown rules constantly changing, not all customers can easily visit stores right now, so operators depend on mobile authentication channels that are more susceptible to this type of manipulation.
SIM-swap fraud is easy to execute. As soon as the fraudster has duped the mobile operator, a number transfer is authorised and activated on a new SIM card, granting the cybercriminal access to the victim's number – and consequently to all one-time passwords and authentication codes sent to it. In March 2020, Europol warned that SIM-swap scams are a growing problem across Europe, following an investigation that resulted in the arrest of 12 suspects associated with the theft of more than €3 million ($3.3 million).
However, consumers and businesses need to be aware that SIM-swap fraud is not the only method cybercriminals are deploying to intercept OTPs from their victims during the pandemic and beyond.
Spotting a scam
SIM-swap attacks are not the only method scammers are using; there is also a growing number of cases that exploit malware and remote-access applications to steal SMS OTPs. Fraudsters socially engineer individuals into downloading remote-access or hidden surveillance apps that grant access to the victim's device without the attacker ever touching it. The cybercriminals can then read the victim's messages directly, or secretly forward all texts and calls to another device. As with SIM-swap attacks, the victim's personal messages, including OTPs, are intercepted – but this time the fraudster also has direct access to the target's device.
Several different parties are involved in delivering an OTP, and at each stage of the process there is an opportunity for fraudsters to capture messages. There is also the potential for mass compromise through hidden vulnerabilities in the SS7 network, and the broader attack surface to consider. With all this in mind, banks need a good overview of all data sub-processors so they can adopt the most suitable security controls, such as multi-factor authentication (MFA), audit logs and dashboards.
Watch out for hidden costs
It comes as no surprise that intercepted OTPs result in fraud losses, which quickly mount as hidden fees go unnoticed over time. Beyond the upfront costs of SMS OTPs, such as the cost per text, there are several hidden costs that are difficult to budget for and avoid. They are typically the domino effect of the issues described above, forcing businesses into a reactive mode that is tricky to handle.
For example, where drop-offs occur in an authentication journey – including when SMS texts are not received – financial institutions need to be ready to manage an influx of calls to their customer service helplines, and the associated fees. Worse for the bank, the customer may simply use another card to make the payment: customers tend to abandon a card when they are fed up with a journey that involves too much unnecessary friction. These abandonments reduce interchange revenue for banks and can even shrink the customer base for merchants.
Evaluating the user experience
Whilst most consumers own a mobile phone, SMS is not a reliable solution for everybody. For instance, SMS OTPs are not accessible to those living in remote or low-coverage locations, who may struggle to receive them. The overall experience is also cumbersome: it takes roughly 30 seconds of transaction time for the text to be delivered, compared with the almost instantaneous experience of alternative authentication approaches such as biometrics.
In this digital age, businesses are constantly adapting to accommodate different generations, including digital-native Gen Z – so mobile use will only increase and, with it, the volume of transactions on these devices. This goes hand in hand with the ever-changing needs and expectations of customers, who now look for hyper-personalised online experiences as the norm. Yes, SMS OTPs are mobile-first, but they still require the user to switch to another app to view the SMS before completing the transaction – an interruption to the e-commerce journey that can be annoying and, after a friction-filled experience, may lead the user to abandon the transaction altogether. With this and the security implications above in mind, the EBA recommends that banks adopt other options.
Benefits of behavioural biometrics
Every person has their own unique behaviour and habits when swiping across a screen, which can be tracked by analysing the data signals captured from hardware sensors as the user engages with the device. These signals are the raw material for derived features such as finger movement, hand orientation and wrist strength. Artificial intelligence and machine learning can analyse this information to build a personalised profile of the user's swipe behaviour, taking only milliseconds to confirm whether the customer is who they say they are. The bank can then seamlessly apply appropriate security actions and stop fraudsters in their path before they can even begin using a target's device.
Behavioural biometrics is ideal for positively identifying an individual and is also effective at identifying bad actors – including when cybercriminals use technologies such as bots or remote-access Trojan (RAT) software to control transactional flows without the user being aware. This approach to biometrics works on both high- and low-end devices and helps protect potential victims against both blind attacks (where the fraudster has never observed how the user swipes their phone) and over-the-shoulder attacks (where the fraudster has been able to observe the victim's swipe movements). Both forms of attack can be detected by dedicated algorithms with an accuracy rate of 98%; by layering in device intelligence and locational habits, it is the most accurate and robust identification method currently available on the market. By preventing criminal access even when the attacker has observed the user's behaviour, it offers a level of security to businesses and banks that traditional methods, such as a PIN or password, cannot.
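A toy sketch can make the swipe-profile idea concrete. The feature extraction below (average speed, path length, straightness) and the tolerance check are illustrative inventions, not Callsign's algorithm; production systems use far richer sensor features and learned models.

```python
import math

def swipe_features(points):
    """Reduce a swipe path [(x, y, t), ...] to a small feature vector:
    average speed, total path length, and straightness (0..1)."""
    length = sum(
        math.dist(points[i][:2], points[i + 1][:2]) for i in range(len(points) - 1)
    )
    duration = (points[-1][2] - points[0][2]) or 1e-9
    direct = math.dist(points[0][:2], points[-1][:2])
    return (length / duration, length, direct / (length or 1e-9))

def matches_profile(features, profile, tolerance=0.25):
    """Accept only if every feature is within a relative tolerance
    of the enrolled profile."""
    return all(abs(f - p) <= tolerance * abs(p) for f, p in zip(features, profile))

# Hypothetical enrolment swipe versus a later attempt by the same user.
enrolled = swipe_features([(0, 0, 0.0), (50, 5, 0.1), (100, 10, 0.2)])
attempt = swipe_features([(0, 0, 0.0), (48, 6, 0.11), (102, 9, 0.21)])
print(matches_profile(attempt, enrolled))  # → True
```

A slow, meandering swipe by someone else produces a very different feature vector and is rejected, which is the intuition behind stopping even an over-the-shoulder attacker who has merely watched the gesture.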
In order for organisations to maintain a competitive edge and successfully navigate the pandemic, they will need to deliver hyper-personalised journeys that meet consumers' expectations. Consumers are increasingly drawn to banks and services that offer a secure, bespoke experience meeting their daily needs during and beyond the pandemic.
Therefore, a holistic approach to security empowers businesses to take back control of their fraud and authentication management. Unfortunately, single point solutions, like SMS OTPs, do not allow businesses to scale or provide enough flexibility to meet these requirements. By adopting a strategic, and intelligence-based, approach financial institutions and organisations will be able to upgrade security measures and enhance the user experience – whilst keeping IT spend low.