
A risk not worth taking: how banks can avoid millions in regulatory fines

By Barry Cashman, Regional Vice President UK&I, Veritas Technologies

Banks are increasingly under the regulatory microscope. The criminal case brought against NatWest for allegedly failing to comply with money laundering rules is a case in point. If found guilty, the high street bank could face ‘unlimited’ fines.

Another area where banks can expect continued severe scrutiny is data management and protection. The rise of online and mobile banking services has seen banks entrusted with an increasing amount of highly sensitive personal customer data and, with the accelerated shift to remote working during the height of the COVID-19 pandemic, this data has become more dispersed than ever before. As a result, banks have had to rapidly extend their IT estates with complex combinations of cloud, virtual and on-premises infrastructure that can become increasingly fragmented and harder to manage. Veritas research found that most banks are currently struggling with this – 63% are suffering from a transformation gap, where their security measures lag behind their complex IT infrastructures, meaning they have less visibility and control of their data than ever before.

If banks continue on this trajectory, they risk leaving themselves exposed to a triple-threat of becoming victim to cybercrime, facing hefty fines for regulatory non-compliance and eroding consumer trust. The truth is, cybercriminals have already been taking advantage of this ‘gap’ – in the first half of 2020 alone, SonicWall reported a 20% hike in ransomware attacks.

A game of trust

When customers choose a bank to do business with, they hand over vast amounts of highly sensitive personal information which they expect to be treated with the utmost care and protection. If this data falls into the wrong hands, it could damage livelihoods beyond repair. Ultimately, this boils down to one word: trust. It’s a concept the industry relies upon to attract and retain customers.

But building an industry on collecting and using highly sensitive customer data is a double-edged sword – while banks can draw on a vast pool of valuable customer data to offer personalised services and explore new revenue streams, it also makes them a very attractive target for cybercriminals. In fact, research conducted by the Ponemon Institute reported that an incredible 70% of financial services companies in the UK suffered cyberattacks in 2020 alone. In addition, the Financial Conduct Authority’s requirement for more transparency into operational and security incidents revealed that major banks have suffered at least one outage a month in recent years. With a recent history plagued by cyber threats and outages, trust between customers and banks is fragile at the best of times. Just one more data breach or outage could bring the proverbial house of cards tumbling to the ground.

Confronting some harsh truths

The honest truth is that many banks are not managing their data as well as they could be and are at huge risk of failing compliance checks.

Given the rising threat of ransomware, now is the most crucial time to be testing and perfecting recovery plans. Yet Veritas research found that an enormous 46% of banks have either never tested their disaster recovery plans against a ransomware attack or have not tested them in over 90 days. And despite nearly two-thirds (63%) of banks admitting to falling victim to a ransomware attack at some point in their history, more than one in 10 (14%) believe it would take them over a month to recover, if they are able to recover at all.

These figures demonstrate that banks are failing to prepare for when the inevitable ransomware attack strikes and could be doing much more to protect their most valuable digital assets. In fact, half (50%) of the banks surveyed have admitted to paying a ransom to recover customer data.

Taking away the risk factor

In a world where banks have had to rapidly accelerate their digital transformation plans and fundamentally shift the way in which they operate in the height of a global pandemic, how can they ensure their data protection strategies measure up?

The answer cannot be to simply simplify their IT infrastructure: as the volume of data banks store continues to rise, banks have to accept that there is always going to be complexity in the IT environment. But there is a way to use tools to abstract much of that complexity away. By standardising the systems that manage data across their enterprise, banks can start extracting value from their data.

Before jumping into any course of action, though, it’s essential for banks to understand what data they have, its value, where it needs to sit, who should access it and how long it needs to be held for. This data visibility doesn’t need to be just a defence measure; gaining a better understanding of the data they hold can help banks identify trends and insights that enable better customer experiences or open doors to new revenue streams. Without a full view of this data, businesses are blind to their own potential.

Once they have visibility into their business-critical data, they need to ensure that business continuity and disaster recovery processes are optimised to protect it. In the event of a ransomware attack, an encrypted backup is the only line of defence. But it’s important to remember that there is no backup plan in place until it’s been tried and tested.

Testing disaster recovery plans helps reveal cracks and vulnerabilities that businesses would otherwise never discover. Are backups sufficiently isolated to stop an infection from spreading, are there enough copies of valuable data, and are those copies being retained for long enough? Only regular fire drills and tests can answer these questions conclusively. Testing could be something as simple as staff checking that a backup site will go live should the main application fail, or performing a single file recovery and checking that the recovered copy matches the original. What’s important is that these tests are regular, repeatable and a core part of a business’ backup strategy.
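The single-file recovery drill described above can be automated in a few lines. The sketch below, a minimal illustration rather than any vendor's tooling, restores nothing itself; it simply compares the SHA-256 digest of an original file against its recovered copy, which is a common way to confirm a restore is byte-identical. All function and path names here are illustrative assumptions.

```python
import hashlib
from pathlib import Path


def sha256_of(path: Path) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks
    so large backup files don't have to fit in memory."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()


def verify_restore(original: Path, restored: Path) -> bool:
    """A single-file recovery check: the drill passes only if the
    recovered copy is byte-identical to the original."""
    return sha256_of(original) == sha256_of(restored)
```

Run as part of a scheduled drill, a check like this turns "we have backups" into "we have backups we know we can restore" – and a failed comparison is exactly the crack such fire drills exist to find.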

Whatever the next year holds, banks are going to need to be ready to adapt again and again to keep pace. This means having the tools in place to abstract complexity from their IT environments, with robust disaster recovery plans in place to protect their most valuable digital assets. Despite their best efforts, most companies will fail to stop at least one cyberattack over the course of their lifetime. What distinguishes one victim from another is their ability to resist and bounce back. Data responsibility is the foundation of any organisation’s ransomware defence, while backups are its secret weapon.
