Anand Vyas, Head of Banking, Financial Services and Insurance at SQS
Technology does more than add value to the insurance sector: it underpins its very growth and evolution. In the last few years alone, the use of mobile devices, GPS, social media and CCTV footage has hugely changed the way claims are processed and policies assessed. The analysis and value of “big data” gleaned through customer interactions has become more important than ever, as insurers look to maximise efficiencies and profits whilst keeping customers happy.
The impact of technology
With e-commerce giants impacting the way consumers shop for insurance, one of the biggest trends has been the adoption of multiple channels by insurers to market and sell their policies. Technology now allows insurers to move from the traditional broker scenario towards a direct-to-market approach – cutting out the middleman and going straight to the customer.
Underpinned by mobile and internet-based offerings, this model is only set to accelerate, with 45 per cent of C-level executives within the insurance industry expecting “distribution destruction”, where customers buy direct and even form groups to negotiate bulk purchases. Favoured by new entrants, this model has the potential to shake up the market further by bringing down premiums and improving claims processing, as well as disruptively changing what customers expect from their experience.
When it comes to policy underwriting, firms are now able to transform customer data into actionable insights to make more informed individual risk assessments, rather than relying on responses to standard questions. This is demonstrated with the advent of black box car insurance, where premiums are based on the quality and quantum of driving by the policy holder.
This level of “big data” collection and analysis has become possible only through advances in software and hardware and is fast becoming integral to increasing revenues and improving the customer experience.
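To make the black box example concrete, here is a minimal sketch of usage-based pricing. The scoring weights, thresholds and discount factors are invented purely for illustration and bear no relation to any real insurer’s model:

```python
# Illustrative sketch of usage-based ("black box") premium pricing.
# All weights and thresholds here are invented for the example; real
# insurers build far richer models from their telematics data.

def driving_score(miles_per_month, harsh_brakes_per_100mi, night_miles_ratio):
    """Return a 0-100 score; higher means lower-risk driving."""
    score = 100.0
    score -= min(miles_per_month / 100.0, 20)       # quantum of driving
    score -= min(harsh_brakes_per_100mi * 5.0, 40)  # quality of driving
    score -= min(night_miles_ratio * 30.0, 20)      # riskier night driving
    return max(score, 0.0)

def monthly_premium(base_premium, score):
    """Scale the base premium: a perfect score earns a 25% discount,
    a zero score a 50% loading."""
    factor = 1.5 - 0.75 * (score / 100.0)
    return round(base_premium * factor, 2)

premium = monthly_premium(80.0, driving_score(800, 2.0, 0.1))
```

A careful, low-mileage driver sees the premium fall below the base rate; a heavy or erratic driver sees it loaded upwards.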
Add to this constantly changing industry regulations and high customer expectations, and insurers need to stay on their toes when it comes to technology as an enabler, making it a central and successful part of their operation. How well the technology performs for both staff and customers is vital for future reputation and growth, as insurers vie for business amidst an online price and policy war.
Technology is no longer a nice-to-have but a differentiator – keeping up with the pace of change and future-proofing the technology is key to making it work. With ageing legacy systems rife across the insurance industry, modernisation is a necessity to ensure they are fit for purpose. Whatever upgrade approach is taken – be it extending existing systems or a full transformation project involving customisation of an off-the-shelf system – the same core best practice guidelines apply, to ensure the technology is ultimately meeting the business need.
- Understand demand and business requirements
By having a grasp on current systems and the demands placed on them now and in the future, insurers can assess whether a complete transformation is needed or if an existing system can be updated. It also gives insurers the opportunity to assess the efficiency of business processes and update them if necessary. Can your CRM system provide a 360-degree view of customers? Can your fraud management system flag transactions based on a set of business rules? Can your website be optimised for mobile devices? Failure to understand this could result in gaps being uncovered during the implementation and testing stages, causing lengthy and costly delays.
- Create a strategic roadmap and business case
Be realistic about the limitations, complexities, effort and associated costs of the project. Underestimation here can be the downfall of any transformation. Involvement of the quality assurance team can result in more realistic timescales, tailored to meet project and quality objectives. This can also help to determine whether the current systems would support the new demands placed on them without having to invest in new infrastructure.
- Analyse applications and business capabilities
Extensive analysis of current and potential applications, along with the examination of business and technical capabilities is vital in arriving at the best solution. Integration with existing solutions must be assessed and a detailed quality management and test strategy drawn up to ensure a smooth transition.
- Adopt a risk-based testing approach to implementation
To overcome the complexities of an upgrade or overhaul and ensure a reasonable ROI, legacy transformation projects need to adopt a risk-based testing approach. Business priorities identified earlier in the process need to be the primary drivers for testing, with skilled quality assurance teams needed to ensure effective implementation.
- Take an agile approach
An agile approach enables early and iterative validation of the business requirements through business acceptance and retrospective sessions. Taking an agile, iterative approach is key to ensuring that any changes take place smoothly and successfully. It also ensures that testing becomes integral to the process, as the project is broken down into bite-sized chunks and quality assessed at each stage.
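The risk-based testing step above can be sketched as a simple prioritisation exercise: score each test area by business impact and likelihood of failure, then spend the test budget from the top down. The areas and scores below are invented for the illustration:

```python
# Hedged sketch: risk-based test prioritisation. Each area is scored
# by business impact and likelihood of failure (both 1-5); risk is
# their product. Area names and scores are purely illustrative.

test_areas = [
    {"area": "policy underwriting", "impact": 5, "likelihood": 4},
    {"area": "claims processing",   "impact": 5, "likelihood": 3},
    {"area": "customer portal",     "impact": 3, "likelihood": 4},
    {"area": "internal reporting",  "impact": 2, "likelihood": 2},
]

def prioritise(areas):
    """Return areas ordered by descending risk score."""
    return sorted(areas, key=lambda a: a["impact"] * a["likelihood"],
                  reverse=True)

for a in prioritise(test_areas):
    print(f'{a["area"]}: risk {a["impact"] * a["likelihood"]}')
```

The highest-risk areas are tested first and most deeply, so if time or budget runs out, what remains untested is the least business-critical.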
Technology will continue to evolve, so it is imperative that insurers don’t stand still and have solid, robust procedures in place to deal with the next trend. Whilst the technologies will change, the processes that underpin their success will stand the test of time. By putting a quality assurance framework at their heart, the industry will be ready and well placed to take advantage of the next big change to come its way.
Track and Trace and Other Lost Data
By Ian Smith, General Manager and Finance Director at Invu
You, like me, were probably amazed by the now infamous loss of over 16,000 positive test results in the track and trace system due to an Excel spreadsheet error.
You, like me, probably wondered how the Government could get something so important so wrong.
But perhaps we should ask: are we standing in a greenhouse launching stones?
Data risks from software
Today we are spoilt with software offerings that help us with both our personal and our work lives.
Microsoft Excel is a powerful application and now offers many functions that once required moderately complex macro writing, seducing all of us into submitting more data for it to analyse. In finance, we tend to solve all those problems our applications cannot address using Excel.
In finance, we also know the risks of formula errors, and if we have relied on it enough, we will have our own war stories to go with these risks. Yet, we often continue to use the tool for operations that make those folks with an information technology background shake their heads.
These Excel files nowadays may find themselves resident on a local file server or one of the many file servers in the cloud (like those from the big three – Dropbox, Google Drive and Microsoft OneDrive – or other less well-known file-sharing applications). Many of us use these in multiple ways.
Beyond finance and Excel, there are now many applications that we run our data through and leave data stored in the form of documents, comments and notes.
The long-standing example is email. We today receive many documents via email, with content in the body often providing context. Email systems then become the store for that data. While this works from a personal point of view, for a business working at scale, the information stored this way can be lost to the rest of the business. Just like data falling off a spreadsheet when there are not enough rows to capture the results.
More recently, we have seen easy-to-consume applications develop in many areas like chat and productivity. Take for example task management apps, my own preference being Monday.com (I am sparing you the long list of these). The result of the task and how we got there, in the form of attachments or comments, is often stored in the application. Each application we touch encourages us to leave a bit of data behind in its store.
Many of these applications can have a personal use, and an initial personal dalliance is what sparks the motivation to apply the application to a business purpose. Just like the “Track and Trace System”, they can often find themselves being used in an environment where the scale of the operation overwhelms their intended use.
In our business lives, combining the use of applications in this way by liberally sprinkling our data across multiple systems often stored in documents (be they Microsoft Word, email, scans or comments and notes) puts us on the pathway to trouble.
Imagine how Matt Hancock felt explaining to Parliament that the world-class track and trace system depended on a spreadsheet.
Can you imagine a similar situation in your business life? Say, for example, that documents or data in some form were lost because of the use of disparate systems and/or applications that were not really designed for the task you assigned to them.
Who would be your Parliament?
Now that you can see yourself in the greenhouse, you may not want to reach for that metaphorical stone.
If these observations create some concerns for you, you may want to consider the information management strategy at your business. You have a strategy, even if it is not addressed specifically in documents, plans or thought processes.
These steps may help figure out where you are and where you want to go.
- Assess your current environment.
Are you a centraliser, with all the information collected in one place? Or is all your data spread across multiple stores, as identified above? Are you storing your key business information on paper documents, digitally, or a mix of both?
- Assess your current processes.
Do your processes run on a limited number of software applications? Or do you enable staff to pick their own tools to get things done? The answer is often a mix of both, where staff bridge the gaps in those applications using tools like MS Excel. A key question to consider is how the data in email, particularly the attachments, is made available to the business.
- Design a pathway for change and implement it.
Start with the end in mind. I suggest the goal is to enable the right people to have the right access to the information they require to do their job in real-time. I believe the way to effectively do this is to go digital. The fork in the road is then whether to centralise your information store or adopt a decentralised approach.
My own preferred route is to centralise using document management software that enables all your documents to be stored in one place. Applications like email can be integrated with it, significantly reducing the workload required to file and store the data. The data can then be used in business applications through workflows. Thinking these workflows through will help you assess the gaps between your key business applications and consider whether tools like Excel are being stretched too far.
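As a sketch of the centralised approach, a store that files content from any source (email, scans, exports) under common metadata might look like the following; the class, field names and tags are invented for this example and are not any particular product’s API:

```python
# Illustrative sketch: filing documents from multiple sources into one
# central, searchable store. Metadata fields are invented for the example.

from datetime import date

class DocumentStore:
    def __init__(self):
        self._docs = []

    def file(self, filename, source, tags, received=None):
        """Index a document with enough metadata to find it later."""
        self._docs.append({
            "filename": filename,
            "source": source,              # e.g. "email", "scan"
            "tags": set(tags),
            "received": received or date.today(),
        })

    def find(self, tag):
        """The same search works whatever route the document arrived by."""
        return [d["filename"] for d in self._docs if tag in d["tags"]]

store = DocumentStore()
store.file("invoice-0042.pdf", source="email",
           tags=["invoice", "supplier-acme"])
store.file("invoice-0042-scan.pdf", source="scan",
           tags=["invoice", "supplier-acme"])
print(store.find("invoice"))
```

The point of the sketch is the single index: once email attachments and scans land in the same store with shared tags, nothing is lost to the rest of the business inside one person’s inbox.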
NICE Unveils ENLIGHTEN Fraud Prevention Powered by AI and Voice Biometrics to Empower Contact Centers in Safeguarding Consumers
Using AI-enabled interpretive and predictive models and advanced voice biometrics, the new solution continuously scans millions of calls to proactively identify fraudulent behavior and protect brand reputation
NICE (Nasdaq: NICE) today unveiled ENLIGHTEN Fraud Prevention, an innovative new solution for automatic and continuous fraudster detection and exposure. Bringing together NICE ENLIGHTEN’s comprehensive Customer Engagement AI platform with the company’s voice biometrics capabilities, the solution continuously scans millions of calls to accurately pinpoint suspicious behavior and uncover previously unidentified fraudsters. Adopting a proactive approach, NICE ENLIGHTEN Fraud Prevention significantly reduces fraud losses and handling time while protecting consumers and improving their experience.
“Contact center fraud is growing in frequency, breadth and sophistication,” observes Dan Miller, Lead Analyst at Opus Research. “NICE ENLIGHTEN Fraud Prevention stands out as an integrated, pre-emptive AI-based Fraud Prevention solution that actively prevents malicious activities with minimum additional effort from customers.”
Unlike most technologies that focus on a single call, NICE ENLIGHTEN Fraud Prevention includes powerful AI interpretive and predictive models that scan millions of voice interactions over time to detect abnormal, risky behavior, including requests to change addresses or authentication methods, without relying on agents to manually capture dispositions. NICE’s Proactive Fraudster Exposure voice biometrics capability included within the solution is then used to expose perpetrators and create a ranked and prioritized list of suspected fraudsters. Importantly, the solution is self-training, constantly learning from identified behaviors, continuously updating its AI models and thus consistently improving results. With this novel solution, organizations can protect customers from account takeover and prevent exposure of personally identifiable information, reduce fraud losses, optimize fraud analyst team efficiency and safeguard brand loyalty.
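Purely as an illustration of the general idea – not NICE’s actual models, which are proprietary – ranking callers by risky behaviours accumulated across many calls could be sketched as:

```python
# Illustrative sketch only: ranking callers by risky behaviour totalled
# across many calls. This is NOT NICE's proprietary model; the behaviour
# names and weights are invented for the example.

from collections import defaultdict

RISK_WEIGHTS = {
    "address_change": 3,
    "auth_method_change": 4,
    "failed_verification": 2,
}

def rank_suspects(calls):
    """calls: iterable of (caller_id, behaviour) events.
    Returns caller ids ordered by descending total risk score."""
    scores = defaultdict(int)
    for caller_id, behaviour in calls:
        scores[caller_id] += RISK_WEIGHTS.get(behaviour, 0)
    return sorted(scores, key=scores.get, reverse=True)

events = [
    ("caller-17", "address_change"),
    ("caller-17", "auth_method_change"),
    ("caller-03", "failed_verification"),
]
print(rank_suspects(events))  # caller-17 ranks above caller-03
```

Because the scores accumulate over every interaction rather than a single call, a caller whose individual calls each look innocuous can still rise to the top of the suspect list.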
“We are proud to bring yet another market-first offering with NICE ENLIGHTEN Fraud Prevention,” Barry Cooper, President, NICE Enterprise Group, said. “NICE ENLIGHTEN is NICE’s AI platform with models specific to the Customer Engagement domain. A number of solutions across our portfolio are being infused with AI from NICE ENLIGHTEN including our Proactive Fraudster Exposure solution. NICE ENLIGHTEN Fraud Prevention ensures that fraudsters are rapidly and proactively stopped in their tracks so organizations can protect their customers and their brand. We believe that by bringing AI to Fraud Prevention we provide organizations with the agility that makes it even more difficult for the fraudsters to win.”
Financial Services Sector Leads in Fixing Application Flaws, Lags in Time to Remediate
Veracode, the largest global provider of application security testing (AST) solutions, today released findings revealing that the financial services industry has the best flaw fix rate across six industries and leads a majority of industries in uncovering flaws within open source components. Fixing open source flaws is critical because the attack surface of applications is much larger than developers expect when open source libraries are included indirectly.
The findings came as a result of Veracode’s State of Software Security Volume 11, which analysed 130,000 applications from 2,500 companies. The research found that financial services organizations have the smallest proportion of applications with flaws and the second-lowest prevalence of severe flaws, behind the manufacturing sector. The sector also has the highest fix rate of all industries, fixing 75% of flaws. Still, the research found that financial services firms require about six and a half months to resolve half of the flaws they find, making them slower to remediate than other industries.
“Financial services firms have a median time to remediation of more than six months, despite having a high fix rate compared to other sectors,” said Chris Wysopal, Chief Technology Officer at Veracode. “However, developers in the financial services industry are often limited by the nature of the environments they are working in, as applications tend to be older, have a medium flaw density, and aren’t consistently following DevSecOps practices compared to other industries. With some additional training and sticking to best practices, they can quickly remediate issues and start to reduce security debt.”
Financial Services Specific Findings
Veracode’s research found compelling evidence that certain developer behaviours associated with DevSecOps yield substantial benefits to software security. The findings detail that financial services firms:
- Are a leading industry when it comes to fixing flaws in their open source software and establishing strong scan cadences.
- Are middle-of-the-road for scanning frequency and integrating security testing, and are not likely to be using dynamic analysis (DAST) scanning technology to uncover vulnerabilities.
- Outperform averages across all industries in dealing with issues related to cryptography, input validation, Cross-Site Scripting, and credentials management – all things related to protecting users of financial applications.