By Jason Monger, Senior Systems Engineer, Financial Services, Nimble Storage
Fraud is on the rise. In the first half of 2016 alone, there were more than a million cases of fraud in Britain, amounting to £399.5 million. This poses a massive threat to the financial services industry, often seriously undermining confidence in the affected institutions.
While new digital approaches to financial services have surely changed the game for committing fraud, they have also provided a great opportunity to combat it. By harnessing the massive amount of financial data generated every day, financial services companies can get better at spotting fraud and eliminating false positives.
Need for speed
Many banks and financial institutions now routinely analyse aspects of their customers’ behaviour to help determine whether transactions are fraudulent. From account balances, location, employment details and spending patterns to the speed at which a customer swipes their credit card, behavioural analysis can determine whether a card is being used by its owner.
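A minimal sketch of the kind of behavioural scoring described above. The field names, thresholds and weights here are entirely hypothetical, not any bank's actual rules: each signal that deviates from the customer's usual behaviour adds to a risk score, and transactions above a threshold are flagged for review.

```python
def risk_score(txn: dict, profile: dict) -> int:
    """Score a transaction against the customer's usual behaviour."""
    score = 0
    # Unusually large amount relative to the customer's typical spend
    if txn["amount"] > 3 * profile["avg_amount"]:
        score += 2
    # Transaction from a country the customer has never transacted in
    if txn["country"] not in profile["usual_countries"]:
        score += 3
    # Merchant category the customer has never purchased from
    if txn["merchant_category"] not in profile["usual_categories"]:
        score += 1
    return score

def is_suspicious(txn: dict, profile: dict, threshold: int = 4) -> bool:
    """Flag a transaction whose combined deviations cross the threshold."""
    return risk_score(txn, profile) >= threshold
```

Real systems weight far more signals and learn the weights statistically, but the shape of the decision is the same: compare each transaction to a per-customer baseline and act on the aggregate deviation.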
However, this analysis is only effective if the information can be analysed fast enough to make decisions in time to prevent, or at least reduce the impact of, the fraud. Identifying a month later that a transaction was fraudulent does little more for the bank than help verify a customer complaint.
It is for this reason that financial services organisations have always been at the forefront of the big data analytics revolution. It is now, and has been for a long time, the cornerstone of a successful firm.
But with ever-growing data sets, organisations face increasingly complex challenges around both data storage and ensuring the latency of data so they can analyse it in real-time to prevent fraud.
Closing the app data gap
This process of behavioural analysis is only as fast as the slowest component in an organisation’s data centre.
Financial services institutions often have hundreds of terabytes – and in some cases even petabytes – of market databases. When analysing this data, organisations often face performance issues because the applications running the analysis are not able to access the relevant data fast enough. This creates an app data gap.
To understand the issue that the app data gap poses, think of an application suffering the same performance problems you experience when software on your computer stutters, or struggles to open a document from your hard drive or your organisation’s server. In the same way, the application cannot access the data fast enough to run at speed.
In the case of monitoring for fraud, slow data delivery to analytics applications reduces the speed of the analysis – and even short delays can mean missing the opportunity to block fraudulent transactions. It is therefore crucial that financial services institutions remove the barriers to data velocity to improve the speed of their big data analytics.
Barriers to data velocity
Storage is often assumed to be the cause of application breakdowns, but Nimble Storage’s analysis of 7,500 companies found that in 54 per cent of cases the problems arise from interoperability issues, misconfiguration, or failure to follow best practice – factors unrelated to storage.
One underlying reason for this is that the majority of data centre components are designed independently. As such, even ‘best of breed’ components may hinder the interoperability of overall infrastructure.
Even buying all components from one IT vendor’s portfolio cannot protect against this challenge, with so many large companies’ solutions being made up of smaller acquired businesses’ products.
Optimisation through machine learning and predictive analytics
To remove the barriers to data velocity across many organisations’ increasingly complex infrastructure, companies should look to deploy solutions incorporating machine learning and predictive analytics to address interoperability or capacity issues before they create an app data gap.
One advantage of adopting such solutions is that IT teams can analyse the performance metrics gathered from a large volume of high performing environments to create a baseline. This then helps them identify poor performance early, reducing the impact on the application.
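The baselining approach described above can be sketched simply. This is an illustrative example with made-up latency figures, not any vendor's implementation: derive a baseline from metrics gathered across healthy environments, then flag an environment that drifts well outside the fleet norm before the slowdown becomes an app data gap.

```python
from statistics import mean, stdev

def build_baseline(fleet_latencies_ms: list) -> tuple:
    """Baseline = mean and standard deviation of a metric across
    a fleet of known-healthy environments."""
    return mean(fleet_latencies_ms), stdev(fleet_latencies_ms)

def deviates(observed_ms: float, baseline: tuple, n_sigma: float = 3.0) -> bool:
    """Flag an environment whose latency sits n_sigma above the fleet norm."""
    mu, sigma = baseline
    return observed_ms > mu + n_sigma * sigma
```

The same pattern generalises to any metric the sensors collect – IOPS, queue depth, CPU wait – with the baseline recomputed as new fleet data arrives.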
By using sensors to monitor activity of multiple elements across the infrastructure at the time of an event, IT teams can also identify cause and effect relationships. This can help them prevent problems arising from interoperability issues between the different releases for different components by comparing results to those of other environments, and provide smart recommendations on how to avoid conflicts.
Using machine learning to evolve software releases also enables teams to optimise availability and performance from correlations across the stack.
Time is money
With the cost of fraud high and rising, financial services companies need to be reflecting on how they can increase data velocity for data analysis, to ensure that their infrastructure isn’t hindering the powerful analytics applications that identify fraudulent activity.
It’s essential that they look at their entire infrastructure stack to eliminate the diverse and complex operations that can slow down data delivery, and, in turn, analysis. Because when it comes to fraud, time truly can cost money.
Track and Trace and Other Lost Data
By Ian Smith, General Manager and Finance Director at Invu
You, like me, were probably amazed by the now infamous loss of over 16,000 positive test results in the track and trace system due to an Excel spreadsheet error.
You, like me, probably wondered how the Government could get something so important so wrong?
But perhaps we should ask: are we standing in a greenhouse throwing stones?
Data risks from software
Today we are spoilt with software offerings that help us with both our personal and our work lives.
Microsoft Excel is a powerful application and now offers many functions that previously required moderately complex macro writing, seducing all of us into submitting more data for it to analyse. In finance, we tend to solve all those problems our applications cannot address using Excel.
In finance, we also know the risks of formula errors, and those of us who have relied on Excel long enough will have our own war stories to go with those risks. Yet we often continue to use the tool for operations that make folks with an information technology background shake their heads.
These Excel files may nowadays find themselves resident on a local file server or on one of the many file-sharing services in the cloud (such as the big three – Dropbox, Google Drive and Microsoft OneDrive – or other less well-known file-sharing applications). Many of us use these in multiple ways.
Beyond finance and Excel, there are now many applications that we run our data through and leave data stored in the form of documents, comments and notes.
The long-standing example is email. We receive many documents via email, with content in the body often providing context. Email systems then become the store for that data. While this works from a personal point of view, for a business working at scale the information stored this way can be lost to the rest of the business – just like data falling off a spreadsheet when there are not enough rows to capture the results.
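The spreadsheet failure mentioned above is easy to guard against in code. The widely reported cause of the Track and Trace loss was data exceeding the legacy .xls worksheet limit of 65,536 rows (the modern .xlsx format allows 1,048,576); this hypothetical pre-export check is a sketch of the guard that would have surfaced the problem instead of silently dropping rows:

```python
# Worksheet row limits defined by the Excel file formats themselves.
XLS_MAX_ROWS = 65_536
XLSX_MAX_ROWS = 1_048_576

def check_fits(n_rows: int, fmt: str = "xls") -> None:
    """Raise an error instead of letting an export silently truncate
    rows that exceed the target format's worksheet limit."""
    limit = XLS_MAX_ROWS if fmt == "xls" else XLSX_MAX_ROWS
    if n_rows > limit:
        raise ValueError(
            f"{n_rows} rows will not fit in a .{fmt} worksheet "
            f"(limit {limit}); the excess would be lost."
        )
```

A loud failure at export time turns a silent data loss into an ordinary bug report.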
More recently, we have seen easy-to-consume applications develop in many areas such as chat and productivity. Take for example task management apps, my own preference being Monday.com (I am sparing you the long list of these). The result of a task, and how we got there, in the form of attachments or comments, is often stored in the application. Each application we touch encourages us to leave a bit of data behind in its store.
Many of these applications can have a personal use and an initial personal dalliance is what sparks up the motivation to apply the application to a business purpose. Just like the “Track and Trace System”, they can often find themselves being used in an environment where the scale of the operation overwhelms their intended use.
In our business lives, combining the use of applications in this way by liberally sprinkling our data across multiple systems often stored in documents (be they Microsoft Word, email, scans or comments and notes) puts us on the pathway to trouble.
Imagine how Matt Hancock felt explaining to Parliament that the world-class track and trace system depended on a spreadsheet.
Can you imagine a similar situation in your business life? Say, for example, that documents or data in some form were lost because of the use of disparate systems and/or applications that were not really designed for the task you assigned to them.
Who would be your Parliament?
Now you can see yourself in the greenhouse, you may not want to reach for that metaphorical stone.
If these observations create some concerns for you, you may want to consider the information management strategy at your business. You have a strategy, even if it is not addressed specifically in documents, plans or thought processes.
These steps may help figure out where you are and where you want to go.
- Assess your current environment.
Are you a centraliser, with all the information collected in one place? Or is your data spread across multiple stores, as identified above? Are you storing your key business information on paper, digitally, or a mix of both?
- Assess your current processes.
Do your processes run on a limited number of software applications? Or do you enable staff to pick their own tools to get things done? The answer is often a mix of both, with staff bridging the gaps in those applications using tools like MS Excel. A key area to think about is how the data in email, particularly the attachments, is made available to the business.
- Design a pathway for change and implement it.
Start with the end in mind. I suggest the goal is to enable the right people to have the right access to the information they require to do their job in real-time. I believe the way to effectively do this is to go digital. The fork in the road is then whether to centralise your information store or adopt a decentralised approach.
My own preferred route is to centralise, using document management software that enables all your documents to be stored in one place. Applications like email can be integrated with it, significantly reducing the workload required to file and store the data. The data can then be used in business applications via workflows. Thinking these workflows through will help you assess the gaps between your key business applications and consider whether tools like Excel are being stretched too far.
NICE Unveils ENLIGHTEN Fraud Prevention Powered by AI and Voice Biometrics to Empower Contact Centers in Safeguarding Consumers
Using AI-enabled interpretive and predictive models and advanced voice biometrics, the new solution continuously scans millions of calls to proactively identify fraudulent behavior and protect brand reputation
NICE (Nasdaq: NICE) today unveiled ENLIGHTEN Fraud Prevention, an innovative new solution for automatic and continuous fraudster detection and exposure. Bringing together NICE ENLIGHTEN’s comprehensive Customer Engagement AI platform with the company’s voice biometrics capabilities, the solution continuously scans millions of calls to accurately pinpoint suspicious behavior and uncover previously unidentified fraudsters. Adopting a proactive approach, NICE ENLIGHTEN Fraud Prevention significantly reduces fraud losses and handling time while protecting consumers and improving their experience.
“Contact center fraud is growing in frequency, breadth and sophistication,” observes Dan Miller, Lead Analyst at Opus Research. “NICE ENLIGHTEN Fraud Prevention stands out as an integrated, pre-emptive AI-based Fraud Prevention solution that actively prevents malicious activities with minimum additional effort from customers.”
Unlike most technologies that focus on a single call, NICE ENLIGHTEN Fraud Prevention includes powerful AI interpretive and predictive models that scan millions of voice interactions over time to detect abnormal, risky behavior including requests to change addresses or authentication methods without relying on agents to manually capture dispositions. NICE’s Proactive Fraudster Exposure voice biometrics capability included within the solution is then used to expose perpetrators and create a ranked and prioritized list of suspected fraudsters. Importantly, the solution is self-training, constantly learning from identified behaviors, continuously updating its AI models and thus consistently improving results. With this novel solution, organizations can protect customers from account takeover and prevent exposure of personally identifiable information, reduce fraud losses, optimize fraud analyst team efficiency and safeguard brand loyalty.
“We are proud to bring yet another market-first offering with NICE ENLIGHTEN Fraud Prevention,” Barry Cooper, President, NICE Enterprise Group, said. “NICE ENLIGHTEN is NICE’s AI platform with models specific to the Customer Engagement domain. A number of solutions across our portfolio are being infused with AI from NICE ENLIGHTEN including our Proactive Fraudster Exposure solution. NICE ENLIGHTEN Fraud Prevention ensures that fraudsters are rapidly and proactively stopped in their tracks so organizations can protect their customers and their brand. We believe that by bringing AI to Fraud Prevention we provide organizations with the agility that makes it even more difficult for the fraudsters to win.”
Financial Services Sector Leads in Fixing Application Flaws, Lags in Time to Remediate
Veracode, the largest global provider of application security testing (AST) solutions, today released findings revealing that the financial services industry has the best flaw fix rate across six industries and leads a majority of industries in uncovering flaws within open source components. Fixing open source flaws is critical because the attack surface of applications is much larger than developers expect when open source libraries are included indirectly.
The findings came as a result of Veracode’s State of Software Security Volume 11, which analysed 130,000 applications from 2,500 companies. The research found that financial services organizations have the smallest proportion of applications with flaws and the second-lowest prevalence of severe flaws, behind the manufacturing sector. The sector also has the highest fix rate of any industry, fixing 75% of flaws. Still, the research found that financial services firms require about six and a half months to resolve half of the flaws they find, making them slower than other industries to remediate.
“Financial services firms have a median time to remediation of more than six months, despite having a high fix rate compared to other sectors,” said Chris Wysopal, Chief Technology Officer at Veracode. “However, developers in the financial services industry are often limited by the nature of the environments they are working in, as applications tend to be older, have a medium flaw density, and aren’t consistently following DevSecOps practices compared to other industries. With some additional training and sticking to best practices, they can quickly remediate issues and start to reduce security debt.”
Financial Services Specific Findings
Veracode’s research found compelling evidence that certain developer behaviours associated with DevSecOps yield substantial benefits to software security. The findings detail that financial services firms:
- Are a leading industry when it comes to fixing flaws in their open source software and establishing strong scan cadences.
- Are middle-of-the-road for scanning frequency and integrating security testing, and are unlikely to be using dynamic analysis (DAST) scanning technology to uncover vulnerabilities.
- Outperform averages across all industries in dealing with issues related to cryptography, input validation, Cross-Site Scripting, and credentials management – all things related to protecting users of financial applications.