Anand Vyas, Head of Banking, Financial Services and Insurance at SQS
Technology does more than add value to the insurance sector: it underpins its very growth and evolution. In the last few years alone, the use of mobile devices, GPS, social media and CCTV footage has hugely changed the way claims are processed and policies assessed. The analysis and value of “big data” gleaned through customer interactions has become more important than ever, as insurers look to maximise efficiencies and profits whilst keeping customers happy.
The impact of technology
With e-commerce giants changing the way consumers shop for insurance, one of the biggest trends has been the adoption of multiple channels by insurers to market and sell their policies. Technology now allows insurers to move from the traditional broker scenario towards a direct-to-market approach – cutting out the middleman and going straight to the customer.
Underpinned by mobile and internet-based offerings, this model is only set to accelerate, with 45 per cent of C-level executives within the insurance industry expecting “distribution destruction”, where customers buy direct and even form groups to negotiate bulk purchases. Favoured by new entrants, this model has the potential to shake up the market further by bringing down premiums, improving claims processing and fundamentally changing what customers expect from the experience.
When it comes to policy underwriting, firms are now able to transform customer data into actionable insights and make more informed individual risk assessments, rather than relying on responses to standard questions. This is demonstrated by the advent of black box car insurance, where premiums are based on the quality and quantity of driving by the policyholder.
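The black box model above can be sketched in code. This is a minimal, illustrative example only: the base premium, weightings and input fields are all assumptions, not any insurer's actual rating logic.

```python
# Hypothetical telematics-based premium adjustment: the premium reflects both
# the quantity (mileage) and quality (driving behaviour) of driving.
# BASE_PREMIUM, the weights and the thresholds are illustrative assumptions.

BASE_PREMIUM = 600.0  # assumed annual base premium


def premium(miles_driven, harsh_brakes_per_100mi, night_driving_share):
    """Scale the base premium by simple usage and behaviour factors."""
    usage_factor = min(miles_driven / 8000.0, 1.5)          # more miles, more exposure
    behaviour_factor = 1.0 + 0.05 * harsh_brakes_per_100mi  # penalise harsh braking
    night_factor = 1.0 + 0.3 * night_driving_share          # night driving assumed riskier
    return round(BASE_PREMIUM * usage_factor * behaviour_factor * night_factor, 2)
```

A careful low-mileage driver would pay well below the base premium here, while a heavy night driver with frequent harsh braking would pay substantially more – the individualised assessment the article describes.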
This level of “big data” collection and analysis has become possible only through advances in software and hardware and is fast becoming integral to increasing revenues and improving the customer experience.
Add to this constantly changing industry regulations and high customer expectations, and insurers need to stay on their toes when it comes to technology as an enabler, making it a central and successful part of their operation. How well the technology performs for both staff and customers is vital for future reputation and growth, as insurers vie for business amidst an online price and policy war.
Technology is no longer a nice-to-have but a differentiator – keeping up with the pace of change and future-proofing the technology is key to making it work. With ageing legacy systems rife across the insurance industry, modernisation is a necessity to ensure they are fit for purpose. Whatever upgrade approach is taken – be it extending existing systems or a full transformation project involving customisation of an off-the-shelf system – the same core best practice guidelines apply, ensuring the technology ultimately meets the business need.
- Understand demand and business requirements
By having a grasp on current systems and the demands placed on them now and in the future, insurers can assess whether a complete transformation is needed or whether an existing system can be updated. It also gives insurers the opportunity to assess the efficiency of business processes and update them if necessary. Can your CRM system provide a 360-degree view of customers? Can your fraud management system flag transactions based on a set of business rules? Can your website be optimised for mobile devices? Failure to understand this could result in gaps being uncovered during the implementation and testing stages, causing lengthy and costly delays.
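The fraud management question above – flagging transactions against a set of business rules – can be sketched as a simple rule engine. The rule names, thresholds and claim fields here are illustrative assumptions, not a real insurer's rulebook.

```python
# Hypothetical rule-based fraud screen: each rule is a predicate over a claim
# record, and a claim is flagged with the names of every rule it triggers.
# All rule names and thresholds are illustrative assumptions.

RULES = {
    "high_value": lambda c: c["amount"] > 10_000,         # unusually large claim
    "new_policy": lambda c: c["policy_age_days"] < 30,    # claim soon after inception
    "repeat_claimant": lambda c: c["claims_last_year"] >= 3,  # frequent claimer
}


def flag_claim(claim):
    """Return the names of any business rules the claim triggers."""
    return [name for name, rule in RULES.items() if rule(claim)]
```

Keeping the rules in a plain data structure like this means the business can review and update them without touching the flagging logic – one way of checking, during requirements analysis, whether a system can support rule changes at the pace the business demands.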
- Create a strategic roadmap and business case
Be realistic about the limitations, complexities, effort and associated costs of the project. Underestimation here can be the downfall of any transformation. Involving the quality assurance team can result in more realistic timescales, tailored to meet project and quality objectives. This can also help to determine whether the current systems would support the new demands placed on them without having to invest in new infrastructure.
- Analyse applications and business capabilities
Extensive analysis of current and potential applications, along with the examination of business and technical capabilities is vital in arriving at the best solution. Integration with existing solutions must be assessed and a detailed quality management and test strategy drawn up to ensure a smooth transition.
- Adopt a risk-based testing approach to implementation
To overcome the complexities of an upgrade or overhaul and ensure reasonable ROI, legacy transformation projects need to adopt a risk-based testing approach. The business priorities identified earlier in the process need to be the primary drivers for testing, with skilled quality assurance teams needed to ensure effective implementation.
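Risk-based testing is usually driven by a simple scoring model: each area under test is rated for business impact and likelihood of failure, and testing effort flows to the highest products first. The sketch below assumes 1–5 scores and example test names; both are illustrative, not a prescribed scale.

```python
# Minimal sketch of risk-based test prioritisation: each test case carries an
# assumed business-impact and failure-likelihood score (1-5, illustrative),
# and tests run in descending order of impact x likelihood, so the riskiest
# business areas are validated first.


def prioritise(test_cases):
    """Sort test cases by risk score (impact x likelihood), highest first."""
    return sorted(test_cases, key=lambda t: t["impact"] * t["likelihood"], reverse=True)


suite = [
    {"name": "policy_quote", "impact": 5, "likelihood": 4},
    {"name": "report_export", "impact": 2, "likelihood": 2},
    {"name": "claims_payment", "impact": 5, "likelihood": 5},
]
# claims_payment (score 25) runs before policy_quote (20), then report_export (4)
```

Under time pressure, the tail of this ordered list is what gets cut – which is exactly how business priorities, rather than convenience, end up driving the test plan.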
- Take an agile approach
Taking an agile, iterative approach enables early and repeated validation of the business requirements through business acceptance and retrospective sessions, and is key to ensuring that any changes take place smoothly and successfully. It also makes testing integral to the process: the project is broken down into bite-sized chunks and quality is assessed at each stage.
Technology will continue to evolve, so it is imperative that insurers don’t stand still and that they have solid, robust procedures in place to deal with the next trend. Whilst the technologies will change, the processes that underpin their success will stand the test of time; by putting a quality assurance framework at the heart of those processes, the industry will be ready and well placed to take advantage of the next big change to come its way.