Written by Huw Price, Managing Director, Grid-Tools
A recently published Fujitsu Financial Services Report (2013) highlighted that more than a quarter (27%) of financial services organisations consider updating their legacy applications to be their main priority over the next few years. Achieving this while keeping software development budgets under control, however, will be a challenge for many financial services organisations.
Financial institutions also face the added pressure of complying with an ever growing and tightening series of regulatory requirements and deadlines. For example, current data protection legislation, such as HIPAA, PCI DSS, the EU Data Protection Directive and the UK Data Protection Act, imposes vigilant practices around the use of data. As a result, banks cannot simply test using any data they want. They need to ensure that sensitive data is masked and that personally identifiable information (PII) is suitably protected – an exercise that can be labour intensive and cost-prohibitive.
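To make the masking requirement concrete, here is a minimal sketch of one common approach: replacing PII fields with deterministic, irreversible tokens before data reaches a test environment. The record structure, field names and salt are illustrative assumptions, not any particular bank's schema or tool.

```python
import hashlib

def mask_pii(record, pii_fields, salt="test-env-salt"):
    """Replace PII fields with deterministic, irreversible tokens.

    Deterministic hashing preserves referential consistency: the same
    customer name always maps to the same token across tables, so joins
    in the test environment still work. Field names are illustrative.
    """
    masked = dict(record)
    for field in pii_fields:
        if field in masked and masked[field] is not None:
            digest = hashlib.sha256((salt + str(masked[field])).encode()).hexdigest()
            masked[field] = f"MASKED_{digest[:12]}"
    return masked

# Example: mask the name and email, leave non-sensitive fields intact.
customer = {"id": 42, "name": "Jane Doe", "email": "jane@example.com", "balance": 1250.50}
safe = mask_pii(customer, ["name", "email"])
```

Because the tokens are derived from a one-way hash, the original values cannot be recovered from the test environment, while identical source values still mask to identical tokens.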
As of 1st February 2014, banks in the EU, EFTA, San Marino and Monaco were required to have migrated their systems to be compliant with SEPA (the Single Euro Payments Area). The final deadline – following a six-month extension in January – for non-SEPA payments is now 1st August 2014. Designed to simplify bank transfers denominated in Euros, SEPA requires all financial institutions that deal within the EU (non-European banks have until October 2016) to ensure they have the standards, procedures and infrastructure in place to comply. In order to meet these deadlines, banks require the ability to test standard message formats, such as FEDWIRE, CHIPS, PAIN, ISO 20022 and PACS. Failure to do so may prevent them from interacting with other EU banks.
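The sketch below illustrates what generating such a test message might look like in miniature. It emits a heavily simplified pain.001-style XML fragment (the ISO 20022 customer credit transfer initiation); real pain.001 messages carry a schema namespace and many more mandatory elements, so treat the element set and function name here as assumptions for illustration only.

```python
import xml.etree.ElementTree as ET

def make_pain001_skeleton(msg_id, amount_eur, debtor, creditor):
    """Build a heavily simplified pain.001-style XML fragment for testing.

    This only shows the general shape a test-data generator might emit;
    it is nowhere near a schema-valid ISO 20022 message.
    """
    root = ET.Element("Document")
    cct = ET.SubElement(root, "CstmrCdtTrfInitn")
    hdr = ET.SubElement(cct, "GrpHdr")
    ET.SubElement(hdr, "MsgId").text = msg_id
    pmt = ET.SubElement(cct, "PmtInf")
    ET.SubElement(pmt, "Dbtr").text = debtor
    ET.SubElement(pmt, "Cdtr").text = creditor
    amt = ET.SubElement(pmt, "InstdAmt", Ccy="EUR")
    amt.text = f"{amount_eur:.2f}"
    return ET.tostring(root, encoding="unicode")

sample = make_pain001_skeleton("TEST-001", 99.5, "Debtor Co", "Creditor Co")
```

Generating messages like this synthetically lets a bank exercise its SEPA processing paths without touching production payment data.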
As banks look to overhaul their systems, new applications will have to be tested and compliant before they go into live production. With increased pressure on cost control and cost reduction, banks need to find more intelligent ways of finding and using test data.
More recently, testing and development budgets have started to rise in line with an organisational and industry-wide focus on greater quality. This is often accompanied by an internal emphasis on cost-efficiency and delivering IT projects on time and on budget. In order to respond to these requirements, IT and development teams need to be able to provision ‘fit for purpose’ test data to the right place at the right time to accelerate and improve test cycles. At the same time, they need to mitigate the risk of delays, rework and spiralling costs, which slow the time to market of any new application development project.
One of the key ways to get new projects to market fast is to minimise the risk of production defects. The main challenges to successful project delivery are: poor or ambiguous requirements, quality introduced too late in the development lifecycle, and delays caused by manually searching for or creating the right test data. More often than not, testing issues and production defects stem from not having the right data in the testing environment in the first place. This is often because the development team does not specify clearly enough at the outset what the testers should be testing for, forcing repeated iterations in which specifications are circled back to and adjusted. When testing teams receive poor requirements, they typically end up testing too much or testing with the wrong data. 56% of defects can be traced back to requirements that were undefined or poorly defined. When requirements are clearly defined from the outset, fewer bugs make it into live production.
Test data management specialist Grid-Tools has a visual flowchart tool, Agile Designer™, which helps organisations build quality into their software from the start by mapping requirements to clear, unambiguous visual flowcharts. Business analysts and project managers who are trying to accurately estimate the cost of a project therefore now have a process for doing so. Once all of the requirements are mapped out in a clear and unambiguous fashion, this creates a firm foundation from which to determine time, complexity and cost estimates. Agile Designer is part of Grid-Tools' suite of test management solutions.
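The general idea behind flowchart-driven testing can be sketched simply: once requirements are an unambiguous directed flow, each distinct path through it corresponds to one logical test case, which makes coverage measurable. The toy payment-approval flow below is an illustration of that general technique, not a representation of how Agile Designer itself models requirements.

```python
def enumerate_paths(graph, start, end):
    """List every acyclic path from start to end in a requirements flow.

    Each path corresponds to one logical test case; covering all paths
    is one way to turn an unambiguous flowchart into a test plan.
    """
    paths, stack = [], [(start, [start])]
    while stack:
        node, path = stack.pop()
        if node == end:
            paths.append(path)
            continue
        for nxt in graph.get(node, []):
            if nxt not in path:  # skip cycles
                stack.append((nxt, path + [nxt]))
    return paths

# Toy flow: start -> validate -> (approve | reject) -> end
flow = {
    "start": ["validate"],
    "validate": ["approve", "reject"],
    "approve": ["end"],
    "reject": ["end"],
}
```

Here the two paths through the flow tell the team exactly how many test cases the requirement implies, which is also what makes early time and cost estimates tractable.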
Today, IT departments need to focus on creating a more agile, automated and standardised testing environment; one that does not rely on manual intervention and where changes can be made more easily without causing huge delays. This demands an end-to-end test data management solution. Implementing an end-to-end Test Data Management (TDM) solution offers financial institutions total control over their data throughout the development life-cycle from requirements and test case design to finding, making and provisioning the ‘right’ data for testing and QA. Adopting an end-to-end TDM solution improves the quality and accuracy of testing and supports agile development where errors are caught early and fixed more quickly, allowing financial institutions to reduce the number of defects found by 95%. This also reduces the test cycles by more than 30% while reducing the testing time spent searching for data by 50%.
Banks looking to update their legacy applications will face challenges around test data, data profiling, data masking and sub-setting as they try to move this data around in order to upgrade their systems. Grid-Tools' Datamaker™ tool facilitates compliance with data protection legislation, whilst significantly reducing infrastructure costs by extracting small, secure, intelligent subsets from production. Datamaker™ also allows financial institutions to import, manage, enhance and generate banking messages (SWIFT, FEDWIRE, CHIPS, ISO 20022, PAIN and PACS). This is particularly useful to banks that have yet to meet SEPA guidelines. However, it is the ability to quickly design, find and make fit for purpose test data, or synthetically create data where none exists, that makes Datamaker™ really stand out from the crowd.
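The essence of sub-setting is that a slice of production must stay referentially consistent: every foreign key in the extract must still point at a row that was also extracted. The sketch below shows that idea on two in-memory "tables"; the table and column names are illustrative assumptions, and this is not how Datamaker™ itself is implemented.

```python
def subset_with_referential_integrity(customers, accounts, wanted_ids):
    """Extract a small, referentially consistent slice of production data.

    Pull only the selected customers, then follow the foreign key so
    every account in the subset still points at a customer that exists.
    """
    cust_subset = [c for c in customers if c["id"] in wanted_ids]
    kept_ids = {c["id"] for c in cust_subset}
    acct_subset = [a for a in accounts if a["customer_id"] in kept_ids]
    return cust_subset, acct_subset

# Example "production" tables and a subset of two customers.
customers = [{"id": 1}, {"id": 2}, {"id": 3}]
accounts = [{"customer_id": 1, "iban": "XX1"}, {"customer_id": 3, "iban": "XX2"}]
cust_sub, acct_sub = subset_with_referential_integrity(customers, accounts, {1, 2})
```

A subset built this way is a fraction of the size of production, which is where the infrastructure savings come from, yet it still behaves like real data under joins.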
To find out more about Grid-Tools, its suite of test data management solutions and how it is helping financial institutions update and bring new applications to market quickly and cost effectively, why not download our latest whitepaper here.