Technology

WHY BIG DATA IS DRIVING THE INSURANCE SECTOR TO THE CLOUD

Bob Welton, Regional Director Northern Europe, NTT Europe.

With IDC forecasting that the big data market will expand at a compound annual growth rate of more than 31 per cent to be worth nearly $24 billion in 2016, businesses need to understand that unlocking the true potential of big data requires a cloud strategy first.

It is clear that a growing number of business leaders are turning to the cloud to gain a competitive edge for their organisations and to find new, more efficient ways of working. In 2013, a study by KPMG revealed that the use of cloud dominated boardroom planning, with 42 per cent of UK organisations revealing that at least one-fifth of their total IT spend in the next 12 months would focus on cloud services. However, while the business benefits of moving to the cloud may be clear for many, insurers are finding the challenge of marrying data, security and the cloud a tricky one.

The most recent wave of automation and new technologies has significantly enhanced operational efficiencies in the insurance sector, increasing revenue opportunities and improving the customer experience. A recent PwC report highlighted that 49% of insurers think those who can unlock big data will gain a competitive advantage, but a strategy for understanding how best to adapt is required. With so much data accumulating alongside growing compliance and regulatory demands, the challenge facing insurers is how to use big data now to get ahead tomorrow.

Big data in insurance

Across the ages, insurers have used data in all forms to profile customers and calculate risk. Today, insurers use analytics to build an accurate profile of their customer base, and the increasing availability of customer data is driving the view that insight and personalisation are key drivers for the industry. To achieve a personalised approach, insurers need to take data from a wide variety of sources and apply appropriate analytics to unlock key information. However, this information is a moveable feast of timely updates, and it’s this inconsistency, along with data overload, quality issues and data protection laws, that provides the real headache for those driving the sector forward.

Like the wider business community, insurers are beginning to adopt new computing models to embrace big data infrastructure. The cloud provides insurers with new platforms to manage rapidly growing data sets and the compute power to process them into tangible, insightful information. Similarly, cloud computing can add great value to how data is stored, with Gartner predicting a third of all content will be in the cloud in just four years. Without access to cloud computing, detailed analysis of data would remain only a dream for most organisations, as the high-powered computing systems required are extremely costly to run. Coupled with big data, the cloud creates powerful value and insight that helps businesses innovate. This symbiotic approach to big data and the cloud is one recipe that will ensure insurers can build momentum in their space and stay ahead of their competitors.

An agile approach

It’s clear from our recent research that insurers have made a shift in outlook, altered relationships with their IT providers and are ready to invest in the cloud to get ahead of their competitors. This united approach will eliminate poor customer service and enable organisations to unify their applications so mistakes are not made. For example, a customer whose quote has lapsed should not be targeted as new business; instead, they should be approached with their existing records intact and offered a quote that is relevant to them. These solutions need to marry old, existing legacy systems with new applications, and feed data through the business in a variety of different ways. Insurers that can do this stand to benefit from greater agility, being able to launch new applications more quickly than competitors and reap the benefits of higher productivity.

For now, the proprietary, actionable ideas and insights generated through analytics are the most valuable data of all and the investment in understanding big data will be critical. The next level will clearly involve insurers having the confidence to move systems into the cloud, and drive innovation and better ways of understanding their customers. Failing to act now will lead to missed customer opportunities and organisations being left in the dark while their competitors reap the benefits.

To find out more on the relationship between big data and insurers, download the NTT Communications whitepaper here.

Technology

Does your institution have operational resilience? Testing cyber resilience may be a good way to find out

By Callum Roxan, Head of Threat Intelligence, F-Secure

If ever 2020 had a lesson, it was that no organization can possibly prepare for every conceivable outcome. Yet building one particular skill will make any crisis easier to handle: operational resilience.

Many financial institutions have already devoted resources to building operational resilience. Unfortunately, it often takes what Miles Celic, Chief Executive Officer of TheCityUK, calls a “near death” experience for this conversion to occur. “Recent years have seen a number of cases of loss of reputation, reduced enterprise value and senior executive casualties from operational incidents that have been badly handled,” he wrote.

But it need not take a disaster to learn this vital lesson.

“Operational resilience means not only planning around specific, identified risks,” Charlotte Gerken, the executive director of the Bank of England, said in a 2017 speech on operational resilience. “We want firms to plan on the assumption that any part of their infrastructure could be impacted, whatever the reason.” Gerken noted that firms that had successfully achieved a level of resilience that survives a crisis had established the necessary mechanisms to bring the business together to respond where and when risks materialised, no matter why or how.

We’ll talk about the bit we know best here: by testing for cyber resilience, a company can do more than prepare for the worst sort of attacks it may face. This process can help any business get a clearer view of how it operates, and how well it is prepared for all kinds of surprises.

Assumptions and the mechanisms they should produce are the best way to prepare for the unknown. But, as the boxer Mike Tyson once said, “Everyone has a plan until they get punched in the mouth.” The aim of cyber resilience is to build an effective security posture that survives that first punch, and the several that are likely to follow. So how can an institution be confident that they’ve achieved genuine operational resilience?

This requires an organization to honestly assess itself against the maxim inscribed at the Temple of Apollo at Delphi: “Know thyself.” And when it comes to cyber security, there is a way for an organization to test just how thoroughly it comprehends its own strengths and weaknesses.

The Bank of England was the first central bank to help develop a framework for institutions to test the integrity of their systems. CBEST is made up of controlled, bespoke, intelligence-led cyber security tests that replicate the behaviours of real-world threat actors, and the tests often have unforeseen or secondary benefits. Gerken notes that the “firms that did best in the testing tended to be those that really understood their organisations. They understood their own needs, strengths and weaknesses, and reflected this in the way they built resilience.”

In short, testing cyber resilience can provide clear insight into an institution’s operational resilience in general.

Gaining that specific knowledge without a “near-death” experience is obviously a significant win for any establishment. And testing for operational resilience throughout the industry can provide some reminders of the steps every organization should take so that testing provides unique insights about their institution, and not just a checklist of cyber defence basics.

The IIF/McKinsey Cyber Resilience Survey of the financial services industry released in March last year provided six sets of immediate actions that institutions could take to improve their cyber security posture. The toplines of these recommendations were:

  1. Do the basics, patch your vulnerabilities.
  2. Review your cloud architecture and security capabilities.
  3. Reduce your supply chain risk.
  4. Practice your incident response and recovery capabilities.
  5. Set aside a specific cyber security budget and prioritise it.
  6. Build a skilled talent pool and optimize resources through automation.

But let’s be honest: if simply reading a solid list of recommendations created cyber resilience, cyber criminals would be out of business. Unfortunately, cyber crime as a business is booming, and threat actors targeting essential financial institutions through cyber attacks are likely earning billions in the trillion-dollar industry of financial crime.

A list can’t reveal an institution’s unique weaknesses: those security failings and chokepoints that could shutter operations, not just during a successful cyber attack but during various other crises that challenge their operations. And the failings that lead to flaws in an institution’s cyber defence likely reverberate throughout the organization as liabilities that other crises would expose.

The best way to get a sense of operational resilience will always be to simulate the worst that attackers can summon. That’s why the time to test yourself is now, before someone else does.


Technology

Thomson Reuters to stress AI, machine learning in a post-pandemic world

By Kenneth Li and Nick Zieminski

NEW YORK (Reuters) – Thomson Reuters Corp will streamline technology, close offices and rely more on machines to prepare for a post-pandemic world, the news and information group said on Tuesday, as it reported higher sales and operating profit.

The Toronto-headquartered company will spend $500 million to $600 million over two years to burnish its technology credentials, investing in AI and machine learning to get data faster to professional customers increasingly working from home during the coronavirus crisis.

It will transition from a content provider to a content-driven technology company, and from a holding company to an operational structure.

Thomson Reuters’ New York- and Toronto-listed shares each gained more than 8%.

It aims to cut annual operating expenses by $600 million through eliminating duplicate functions, modernizing and consolidating technology, as well as through attrition and shrinking its real estate footprint. Layoffs are not a focus of the cost cuts and there are no current plans to divest assets as part of this plan, the company said.

“We look at the changing behaviors as a result of COVID … on professionals working from home working remotely being much more reliant on 24-7, digital always-on, sort of real-time always available information, served through software and powered by AI and ML (machine learning),” Chief Executive Steve Hasker said in an interview.

Sales growth is forecast to accelerate in each of the next three years compared with 1.3% reported sales growth for 2020, the company said in its earnings release.

Thomson Reuters, which owns Reuters News, said revenues rose 2% to $1.62 billion, while its operating profit jumped more than 300% to $956 million, reflecting the sale of an investment and other items.

Its three main divisions, Legal Professionals, Tax & Accounting Professionals, and Corporates, all showed higher organic quarterly sales and adjusted profit. As part of the two-year change program, the corporate, legal and tax side will operate more as one customer-facing entity.

Adjusted earnings per share of 54 cents were ahead of the 46 cents expected, based on data from Refinitiv.

The company raised its annual dividend by 10 cents to $1.62 per share.

The Reuters News business showed lower revenue in the fourth quarter. In January, Stephen J. Adler, Reuters’ editor-in-chief for the past decade, said he would retire in April from the world’s largest international news provider.

Thomson Reuters also said its stake in The London Stock Exchange is now worth about $11.2 billion.

The LSE last month completed its $27-billion takeover of data and analytics business Refinitiv, 45%-owned by Thomson Reuters.

(Reporting by Ken Li, writing by Nick Zieminski in New York, editing by Louise Heavens and Jane Merriman)

Technology

Putting data protection back on the financial agenda

By Wim Stoop, CDP Customer and Product Director, Cloudera

Despite the wave of changes that Brexit has brought financial organisations, from the end of ‘passporting’ to uncertainty over the longer-term equivalence rules, one thing has remained constant: data privacy regulations still place a core responsibility on firms to protect sensitive data and mitigate data breaches. From PSD2 to GDPR, financial institutions need to ensure they are still processing and transferring data in accordance with the industry’s stringent rules and regulations. If not, they risk fines of up to £17.5 million or 4% of their company’s annual global turnover, whichever is higher.
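
To put that ceiling in concrete terms, the short sketch below works out the theoretical maximum penalty for a given turnover, assuming the higher of the two thresholds applies, as it does under the UK GDPR's upper tier. It is an illustration of the arithmetic only, not legal guidance.

    # Maximum UK GDPR-style penalty: the greater of a fixed cap (£17.5 million)
    # or 4% of annual global turnover. Illustrative only.
    FIXED_CAP_GBP = 17_500_000
    TURNOVER_RATE = 0.04

    def max_fine(annual_global_turnover_gbp: float) -> float:
        """Return the theoretical maximum fine for a given annual global turnover."""
        return max(FIXED_CAP_GBP, TURNOVER_RATE * annual_global_turnover_gbp)

    # A firm with £2 billion in global turnover faces a ceiling of £80 million,
    # well above the £17.5 million fixed cap.
    print(f"£{max_fine(2_000_000_000):,.0f}")  # £80,000,000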

As the stakes get higher, the amount of data which financial enterprises have to deal with is on the rise too. In fact, research by IDC estimated that businesses created and captured 6.4 zettabytes of new data last year alone. This increase in data production has been linked to the pandemic and the move to remote working. Replacing face-to-face interactions with online communications has meant that financial businesses suddenly had to cope with a larger amount of data flowing through their networks. In addition, employees working from home are increasingly doing so on potentially unsecured devices, outside of the corporate network, risking exposure and data breaches, according to numerous cybersecurity reports.

With an extensive stock of sensitive customer data and so many regulations to keep on top of, remaining compliant can feel overwhelming for financial organisations. However, this shouldn’t be the case. Today we often see businesses trying to retrofit data protection strategies, or take a reactive approach to external forces. Instead, they should be taking a proactive stance on data management. In doing so, security becomes a natural side-effect and financial companies can operate with the assurance that no matter what new regulations come into play, they are compliant. The question is, how to achieve this?

Taking a proactive approach to data privacy

To remain compliant, financial institutions need to get on top of their data. When data sits in silos on legacy systems, it is inaccessible and it becomes a challenge to identify what is sensitive and what isn’t. Poorly managed data can’t be protected, and the risk of data breaches increases. By contrast, when data is properly controlled and stored, it becomes easy to apply data security rules.

From customer names and contact details to transaction records and PINs, financial organisations hold a lot of personal and financial data on customers. However, the trick is understanding that different data holds varying degrees of sensitivity and thus needs to be managed accordingly. For instance, a customer’s bank account details are more sensitive than their basic personal data, such as name and address, which are usually publicly accessible. By proactively identifying, prioritising and classifying data by its degree of sensitivity, financial companies can apply whatever data protection rules are necessary, such as restricting certain users from accessing highly confidential information.
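
As a rough illustration of that idea, the sketch below tags a handful of hypothetical fields with a sensitivity tier and checks a user’s role against it before allowing access. The field names, tiers and roles are assumptions for illustration; a real deployment would normally enforce this through the platform’s governance and policy tooling rather than hand-written checks.

    from enum import IntEnum

    class Sensitivity(IntEnum):
        PUBLIC = 0        # e.g. name and address drawn from public records
        PERSONAL = 1      # contact details, dates of birth
        CONFIDENTIAL = 2  # account numbers, transaction records, PINs

    # Classification recorded when the data is catalogued (hypothetical fields).
    FIELD_SENSITIVITY = {
        "customer_name": Sensitivity.PUBLIC,
        "email": Sensitivity.PERSONAL,
        "account_number": Sensitivity.CONFIDENTIAL,
        "pin_hash": Sensitivity.CONFIDENTIAL,
    }

    # The highest tier each role is cleared to read (hypothetical roles).
    ROLE_CLEARANCE = {
        "marketing_analyst": Sensitivity.PERSONAL,
        "fraud_investigator": Sensitivity.CONFIDENTIAL,
    }

    def can_read(role: str, field: str) -> bool:
        """True if the role's clearance covers the field's sensitivity tier."""
        return ROLE_CLEARANCE.get(role, Sensitivity.PUBLIC) >= FIELD_SENSITIVITY[field]

    assert can_read("fraud_investigator", "account_number")
    assert not can_read("marketing_analyst", "account_number")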

Yet many financial businesses still treat this identification process as a reactive measure. The challenge in proactive data management lies in an organisation’s ability to eliminate the friction it has in tracking, identifying and classifying information, as opposed to doing so retrospectively. After all, data classification plays a vital role in ensuring data protection is upheld.

A proactive approach is integral to effective data management and governance. The first step in achieving this approach involves creating a data marketplace, or a curated, secured and governed data repository. Having something like a data marketplace in place means that as soon as data enters an organisation, enterprises can determine its degree of sensitivity, how it should be managed, and which analytics need to be run, to extract the most value out of the data.
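
To make that ingest step concrete, here is a minimal sketch of how a dataset entering such a marketplace might be tagged on arrival with a sensitivity level, a handling policy and the analytics to run against it. The column names, policies and analytics are illustrative assumptions, not features of any particular product.

    from dataclasses import dataclass, field

    @dataclass
    class MarketplaceRecord:
        dataset: str
        sensitivity: str          # "public", "personal" or "confidential"
        handling_policy: str      # e.g. "mask-before-share", "encrypt-at-rest"
        analytics: list = field(default_factory=list)

    def classify_on_ingest(dataset: str, columns: set) -> MarketplaceRecord:
        """Decide sensitivity, handling and analytics as soon as data arrives."""
        if columns & {"account_number", "pin", "card_number"}:
            return MarketplaceRecord(dataset, "confidential", "mask-before-share",
                                     ["fraud-scoring"])
        if columns & {"email", "phone", "date_of_birth"}:
            return MarketplaceRecord(dataset, "personal", "encrypt-at-rest",
                                     ["churn-model"])
        return MarketplaceRecord(dataset, "public", "standard", ["reporting"])

    record = classify_on_ingest("card_transactions", {"account_number", "amount"})
    print(record)  # tagged as confidential and masked before sharing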

Once these steps are taken, compliance and data privacy happen almost naturally and become ingrained in the business. When companies are aware of every single piece of data in their possession, they can know exactly how it’s being protected. Such a robust strategy ensures that institutions meet the high standards of trust that their customers have bestowed upon them in protecting their personal data. And, with this level of control, enterprises can avoid data lockout, reduce friction for employees, and optimise the value they unlock from their data. At the same time, they can have the peace of mind that they are compliant and protected.

A business-ready solution for data protection

With so many rules and regulations to keep track of, data protection shouldn’t be another worry to add to the list. Financial companies can maximise the efficacy of their existing security and governance strategies by applying them to all datasets across the enterprise, whether on-premises, in the cloud, or a combination of the two. In particular, as a scalable and low-cost option, organisations are increasingly turning to the cloud for their data management needs. It’s expected that over half (51%) of business data will be stored in the cloud by 2024.

This is where an enterprise data cloud (EDC) really shows its worth, allowing financial companies to keep their data protected, compliant, and successfully governed. Simply put, an EDC is a hybrid and multi-cloud platform that harnesses analytics at every stage of the data lifecycle. It enables organisations to extract the true value of their data while still providing a consistent layer of security.

An EDC gives financial businesses a single source of truth, built on technology that operates on any cloud environment and right through to the edge. Armed with an EDC, companies have complete visibility over their data, no matter where it resides in the enterprise or the data lifecycle, easing the task of managing and protecting data. On top of this, an EDC supports a variety of data functions, including the data marketplace, and works to provide control, visibility and examination over data. With all these aspects working together, financial institutions can ensure that all data which passes through their infrastructure and into the data marketplace is efficiently governed and protected.

Bringing technology, people, and process together

Technological solutions, like an EDC, work at their maximum potential when they are in harmony with people and process. But the triad has been thrown off balance by the rise of remote working and the reduction in staff numbers. While all businesses recognise that sensitive data needs to be encrypted and access should be restricted, this has been a difficult feat as employees work from home and use devices outside the traditional network security perimeter. In fact, nearly half (48%) of employees are less likely to follow safe data practices when working from home, which greatly increases the risk of data breaches.

In addition, with almost a fifth (18%) of the UK workforce on furlough and team numbers shrinking, companies don’t have the same manpower to validate both the systems being used and the data being run through them to ensure that they are compliant. Within the office environment, organisations were able to create ‘islands of perfect governance’, with all departments aware of the applications used to manage data, guaranteeing higher levels of compliance. However, these safety nets have collapsed during home working and it has become more difficult to ensure the security and privacy of data within an enterprise.

What’s needed here is an overarching framework that provides a standard for data governance. This is enabled by having in place the right technology solution, a proactive approach to data management, and people within the business supporting it from the bottom up, forming a triad that works in harmony. Such a framework also enables enterprises to assess what they need to do to create internal data protection rules that ensure compliance, and allows employees to self-check their data security protocols, eliminating any uncertainty about protecting sensitive data.

It is important to remember that the right technology alone won’t make people compliant – whether they are working in an office or remotely. Rather, as pointed out above, it is technology, people, and process, working in sync, that will ensure that regulations are adhered to and data is managed and protected.

Long-lasting success with data protection

With data volumes growing and remote working creating security vulnerabilities, financial businesses need to get on top of their data from the get-go. By proactively identifying sensitive data, accurately securing it, and delivering trusted data to end-users, the right data can be put into the hands of the right people.

Creating a watertight data privacy strategy requires financial organisations to deliver a uniform approach to data management and protection across departments to ensure compliance. In addition, harnessing technology, such as an EDC, will provide visibility and control over sensitive data, enabling financial institutions to unlock real-time insights from their data while still providing a consistent layer of security. With technology, people and process in harmony, enterprises can operate with the confidence that their data is being managed successfully and they are compliant with both existing and new regulations.
