Written by Huw Price, Managing Director, Grid-Tools
Most IT departments are striving to build more agile, modern, cost-effective environments in the bid to accelerate the delivery of valuable, quality software to the business. However, for banks and financial institutions in particular, legacy systems and continually tightening external compliance pressures are impeding agility when it comes to the provision of new software projects.
One of the key ways to speed the delivery of new projects is to minimise the risk of production defects. Creating efficiencies in test data provision mitigates the risk of delays, rework and spiralling costs, all of which slow time to market.
Testing accounts for approximately 40% of the average software development lifecycle, and as much as 50% of development and testing time is spent manipulating, searching for or manually creating the right data to meet test case requirements. In an ideal world, organisations would standardise and automate the majority of these time-consuming manual processes. Recent research, however, shows that very few companies achieve significant levels of automation and many, especially in the banking and financial services market, are still heavily reliant on manual testing.
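To give a flavour of what automating that manual data creation can look like, the sketch below generates synthetic bank-account records from simple rules. This is a minimal illustration only; the field names, value ranges and rules are hypothetical, not drawn from any particular product.

```python
import random

random.seed(42)  # seeded so the generated test data is reproducible

# Hypothetical generation rules for a banking test case
def make_account(n: int) -> dict:
    return {
        "account_id": f"ACC{n:06d}",  # unique, predictable identifier
        "sort_code": "-".join(str(random.randint(10, 99)) for _ in range(3)),
        "balance": round(random.uniform(0, 10_000), 2),
        "status": random.choice(["active", "dormant", "closed"]),
    }

# One line of code replaces hand-keying 100 records into a test database
accounts = [make_account(i) for i in range(100)]
```

Because the data is generated rather than copied from production, it contains no real customer information and can be recreated on demand for each test cycle.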
The question is, why?
Where financial institutions are concerned, the fact that most are dealing with legacy systems, systems that in some cases are over 40 years old, is seriously hampering their ability to move to a more standardised approach. To add to the issue, often the knowledge around these systems and various nuances about them remains in the heads of individuals. This makes it increasingly difficult to create standard, repeatable testing processes and is one of the reasons that testing is still incredibly reliant on manual intervention.
Banks are also under constant pressure to meet ever-growing and tightening compliance and regulatory requirements. Data protection legislation such as HIPAA, PCI DSS, the EU Data Protection Directive and the UK Data Protection Act means that far more vigilant practices around the use of data must be followed.
Traditionally, most financial institutions used full copies of production databases to provision data for development and testing. This practice, however, is no longer viable given growing legislative requirements, so banks now have to use some form of data masking to solve the problem. Yet for banks with large, complex IT architectures, sensitive data stored across multiple sites and disparate data sources, and a lot of manual processes, data masking is expensive, slow and error prone. With increased pressure on cost control and cost reduction, banks need to find more intelligent ways of finding the right test data, as well as provisioning and creating test data marts.
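As an illustration of the masking idea, one common approach is deterministic masking: the same real value always maps to the same fictitious value, so referential integrity is preserved when a customer appears in several data sources. The sketch below assumes hypothetical field names and a hash-based rule; real masking tools offer far richer, format-preserving rules.

```python
import hashlib

def mask_value(value: str, salt: str = "per-project-secret") -> str:
    """Deterministic masking: identical inputs always yield identical
    masked outputs, preserving joins across disparate data sources."""
    digest = hashlib.sha256((salt + value).encode()).hexdigest()
    return digest[:12]

def mask_record(record: dict, sensitive_fields: set) -> dict:
    # Mask only the fields flagged as sensitive; leave the rest intact
    return {
        key: mask_value(val) if key in sensitive_fields else val
        for key, val in record.items()
    }

customer = {"name": "Jane Doe", "account": "12345678", "balance": "1500.00"}
masked = mask_record(customer, {"name", "account"})
# "balance" survives unchanged; "name" and "account" are consistently masked
```

Note that a hash-based rule like this hides the original values but does not preserve their format (a masked account number is no longer eight digits), which is why production-grade tools use format-preserving techniques instead.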
Implementing an end-to-end Test Data Management (TDM) solution offers total control over data throughout the software development lifecycle (SDLC). Building clear, unambiguous requirements from the outset helps to ensure quality in testing, whilst shortening test cycles by more than 30%. It also cuts defect creation by up to 95%, reducing costly rework. Clarity in requirements also allows testing teams to better understand what needs to be tested, so they can design the perfect, minimum set of tests to cover all of the required functionality.
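One well-known way to approach a "minimum set of tests" is pairwise (all-pairs) test design: every pair of parameter values appears in at least one test case, at a fraction of the full cross-product. The greedy sketch below is a generic illustration of the technique, not a description of any specific tool, and the banking parameters are invented for the example.

```python
from itertools import combinations, product

def pairwise_cases(params: dict) -> list:
    """Greedy all-pairs: repeatedly pick the candidate case that covers
    the most not-yet-covered value pairs, until every pair is covered."""
    names = list(params)
    # Every (param-index, value) pair combination that must appear somewhere
    uncovered = {
        ((i, a), (j, b))
        for i, j in combinations(range(len(names)), 2)
        for a in params[names[i]]
        for b in params[names[j]]
    }
    candidates = list(product(*params.values()))
    chosen = []
    while uncovered:
        def gain(case):
            return sum(
                ((i, case[i]), (j, case[j])) in uncovered
                for i, j in combinations(range(len(case)), 2)
            )
        best = max(candidates, key=gain)
        chosen.append(dict(zip(names, best)))
        uncovered -= {
            ((i, best[i]), (j, best[j]))
            for i, j in combinations(range(len(best)), 2)
        }
    return chosen

# Illustrative parameters for a payments test
params = {
    "channel": ["web", "branch"],
    "currency": ["GBP", "EUR", "USD"],
    "account": ["current", "savings"],
}
cases = pairwise_cases(params)
# The full cross-product would be 12 cases; pairwise needs markedly fewer
```

The saving grows quickly with the number of parameters, which is why combinatorial designs feature in most structured test design methodologies.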
No bank can afford the time, the cost or the risk of employing an army of manual testers anymore. Whether building or testing new software, re-engineering systems or migrating applications, banks and financial services companies need to be able to respond to changing requirements by provisioning fit for purpose test data to the right place at the right time to accelerate and improve test cycles.
Moving to an end-to-end Test Data Management system will deliver significant test process improvements that will enhance the performance and effectiveness of testing, ultimately speeding the delivery of better quality software to the business and at less cost.
If you are interested in finding out more about how financial institutions can get new software development projects to market more quickly, why not download our latest whitepaper, Delivering More Rigorous Testing of Software Systems to Banks and Financial Institutions, available here.
About the Author:
Huw Price, Managing Director, Grid-Tools
With a career spanning nearly 30 years, Huw Price has been the lead technical architect for several US and European software companies and has provided high-level architectural design support to multinational banks, major utility vendors and health care providers. Voted "IT Director of the Year 2010" by QA Guild, Huw has spent years specialising in test automation tools and has launched numerous innovative products which have re-cast the testing model used in the software industry. He speaks at well-known events internationally and his work has been published in Professional Tester, CIO Magazine and other technical publications.
Huw’s newest venture, Grid-Tools, has quickly redefined how large organisations need to approach their testing strategy. With Huw’s visionary approach and leadership, the company has introduced a strong data-centric approach to testing, launching new concepts conceived by Huw such as “Data Objects”, “Data Inheritance” and “A Central Test Data Warehouse”.
Currently working with leading edge testing companies and partners such as HP, Software AG and Bender RBT, Grid-Tools is building on the strategy of improving automation tools and associated structured methodologies.