Martin Peck, CEO of Data Continuity Group, considers the challenges of data management for financial organisations and offers some best practice advice to help ease the pain.
Financial institutions come in a variety of guises. From banks and trading organisations to financial advisers and insurance companies, all have very different needs when it comes to data management, yet all are governed by very strict guidelines in terms of what data needs to be retained and for how long.
Corporate governance, regulatory compliance and risk management all place huge demands on financial organisations’ handling of data. Unpredictable regulatory reform is the norm for the financial sector, with new directives issued frequently by the UK and EU. Companies not only need to comply accurately with regulations such as the Data Protection Act, Basel III and Sarbanes-Oxley, but also be able to respond to them very quickly.
Data retention requirements also vary between organisations. Mortgage and pension providers are required to retain data for up to 25 years, and unless otherwise specified by statute, important application system information and data (critically Exchange and email) is retained for six to seven years. This covers most HMRC requirements, which drive a significant proportion of retention policies.
Large financial organisations will undoubtedly have the resources in house to manage these complex requirements, or will have made the decision to outsource responsibility to a data management company. However, for smaller organisations with limited internal resources, data management (or rather the lack of it) can become unwieldy.
Getting your house in order
Effective data management is crucial in dealing with the data demands placed on financial organisations. For any financial organisation the first step is analysing and categorising your data depending on its importance to the business:
- Ultra critical data – could cause significant business interruption if lost
- Critical data – has been created or was needed in the last 90 days
- Legacy data – needs to be protected for compliance
- Duplicate or non-business related data – does not need to be kept
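The four categories above can be sketched as a simple classification rule. This is a minimal illustration only; the field names (`business_critical`, `last_accessed`, `compliance_hold`) and the 90-day threshold are assumptions drawn from the list above, and a real policy engine would be tuned to each organisation's own retention rules.

```python
from datetime import datetime, timedelta

def classify(record):
    """Assign a data tier using the four categories described above.

    `record` is a dict with hypothetical keys:
      business_critical (bool) - loss would cause significant interruption
      last_accessed (datetime) - when the data was last created or needed
      compliance_hold (bool)   - must be protected for compliance
    """
    if record["business_critical"]:
        return "ultra-critical"
    if datetime.now() - record["last_accessed"] <= timedelta(days=90):
        return "critical"
    if record["compliance_hold"]:
        return "legacy"
    # Anything left is duplicate or non-business data: a deletion candidate.
    return "delete"
```

Classifying data this way is what makes the tiered treatment below (daily backup, replication, archiving, deletion) actionable at scale.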
Both ultra-critical and critical data need to be backed up daily so that they remain available in the event of a disaster and are retained for the required retention period. It is also essential to replicate ultra-critical data (make frequent copies and store them in a different environment) to help reduce recovery times.
All other legacy data that hasn’t been accessed in a long time but is still of value to the business should be archived, and any non-business related data deleted to free up server space and help reduce storage costs.
It’s also vital not to underestimate the importance of regular data back-ups; data loss can happen easily and be caused by hardware faults or failure, viruses or malicious hacking, power failures or human error.
Tape versus disk
Disk and magnetic tape are the two main methods for backing up data, and the debate as to which is best has been running for many years. Although considered by many to be outmoded, tape fulfils a real need in the financial services sector to store data that must be retained for longer than 12 months.
Some financial organisations could have multiple terabytes of data to store, spanning years, and in this situation tape is the cheapest and most effective way of storing data that is unlikely to be needed. Annual storage costs tend to be modest whether the retention period is six or 25 years. Tape storage also makes it very easy to catalogue files and keep a record of content, so that data is relatively easy to restore should you need it, and cartridges can then be stored securely off site. However, where there are urgent (i.e. within 24 hours) restore or recovery objectives, disk-based storage systems have significant advantages over tape.
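The cataloguing point is worth making concrete: the value of a tape archive depends on an index that maps each file to the cartridge holding it, so a restore request can be routed straight to the right off-site tape. The sketch below is a toy illustration of that idea; the tape IDs, file paths and structure are all invented for the example.

```python
# Hypothetical catalogue: tape cartridge ID -> files archived on it.
catalogue = {
    "TAPE-2013-041": ["ledger/2013/q1.db", "email/2013/jan.pst"],
    "TAPE-2013-042": ["ledger/2013/q2.db"],
}

def find_tape(path):
    """Return the ID of the tape holding `path`, or None if never archived."""
    for tape_id, files in catalogue.items():
        if path in files:
            return tape_id
    return None
```

In practice this index would live in the backup software's database, but the principle is the same: without it, restoring one file means reading tapes until you find it.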
Backup or archive retention data should be moved to a location off site from the prime or production environment on a frequent basis, to protect against factors such as break-ins (which could result in tapes or disks being stolen), power outages or system failures. As a minimum, the protected copy of the data would usually be moved off site at least weekly, but ideally daily. Typically we would seek to design a data protection system that provides two copies of protected data held for at least one month: one held locally in a staging area or appliance, and one copy off site. Longer retentions are then held off site.
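The two-copy scheme just described can be expressed as a small retention rule: each backup yields a local staging copy kept for roughly a month and an off-site copy kept for the full retention period. This is a minimal sketch under those assumptions; the 30-day local window and the function names are illustrative, not a prescribed design.

```python
from datetime import date, timedelta

# Assumed local staging window, per the "at least one month" guideline above.
LOCAL_RETENTION = timedelta(days=30)

def copies_for(backup_date, offsite_retention_years, today):
    """Return which copies of a given backup should still exist today."""
    age = today - backup_date
    copies = []
    if age <= LOCAL_RETENTION:
        copies.append("local-staging")   # fast restores from the local appliance
    if age <= timedelta(days=365 * offsite_retention_years):
        copies.append("offsite")         # long-term protected copy
    return copies
```

For example, a two-week-old backup with a six-year retention still has both copies; after a month only the off-site copy remains, and after six years neither does.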
The location and the media upon which the protected copy of the data is held will affect the speed of the restore and retrieval back into the local environment. Where application data is involved this creates the added delay and issue of recreating the application system environment first before the data itself can be reloaded.
The changing face of data management for financial services organisations means that many are re-evaluating their information service provision with regard to availability, scalability, security, efficiency and cost. Budget and headcount constraints are often the key determinants of the data protection solution design and its on-going management. There is clearly a trade-off between the underlying technologies used (disk; disk and tape; and now various hybrid forms, such as cloud technologies) and the overall whole-life cost of the solution. Furthermore, many financial institutions have to maintain the highest data protection standards across a wider set of the organisation’s total data holding, but do not have large headcounts within the IT department.
The routine data protection workload is changing fast (particularly with the adoption of virtualisation) and needs to be done outside core working hours. As a result, many financial organisations, especially those with limited internal resources, choose to outsource their data management requirements to a specialist data management company. The added benefit is that this removes large amounts of capital expenditure, as well as significant headcount, overtime and on-call costs.
When evaluating who to outsource data management to, seek a provider experienced in the sector who understands the specific challenges it faces. Also, find out whether they are ISO certified, particularly to ISO 27001 (the Information Security Management System standard), and where their data centres are based, as your data may need to remain in the UK.
Data is a valuable corporate asset for banks and other financial organisations, but it also has the potential to be their greatest liability. Managed properly, it can accelerate growth; managed poorly, it can drive up costs and expose companies and their customers to risk. Data in this sector will continue to grow at an alarming pace, so don’t risk a fine or the loss of valuable data: get a grip on it today.