INTEGRATION AND DIVESTMENT: IT’S ALL ABOUT THE DATA!
What are the main issues and challenges involved in integration and divestment programmes? After delivering an integration and divestment project for a major bank, Simon Wong of Xceed Group outlines his key considerations
After many days, nights and meeting hours, I was recently part of a successful team involved in the re-launch of a major bank onto the high street. For me, this was the culmination of 20 months of hard graft on a complex divestment programme, following an equally challenging 18 months on the preceding integration programme. Along the way, I gained insights into the challenges of large-scale programmes and had the opportunity to see things through the lens of both IT and the business. Stepping back, there are a number of important issues and challenges to highlight for firms looking to do the same, and one moment on the project in particular that I’d like to revisit.
The divestment and integration were huge undertakings, both in terms of the scale of the change to hardware and software and in terms of the data. From a hardware and software perspective, the main challenge centres on gaining a clear understanding of what the target solution should be. While this may seem a simple task at first glance, for a financial services organisation IT change is never simple. As business functions grow and develop, as strategic decisions are made, and as technology itself transitions to a legacy state, financial services platforms evolve over time into highly complex organisms.
As a result, the more technical elements of large-scale programmes, whether ‘scale and remediate’ or ‘partition, clone and build’ of the target systems, are highly complex activities that require clear design and careful delivery. Invariably this requires heavy lifting across all IT activity from development to testing, be it system testing, systems integration testing (SIT), non-functional testing and so on. As this portion of the programme involves physical change, it would naturally appear to be the most complicated activity. However, based on my experience, I would tend to disagree.
I remember being in a small meeting room in Moorgate at the outset of the integration programme, debating resources with (among others) the programme’s overarching business test lead, who kept stressing the complexity of the data. “It’s all about the data,” he repeated adamantly. It’s fair to say that, at the time, the majority of the audience were more concerned with the delivery of the infrastructure and code. But on reflection, if I could get in a DeLorean and turn back time, I’d stand up in that meeting and announce: “I’m with the business lead… it’s all about the data!”
Part of the difficulty when it comes to testing data is that it feels somewhat abstract. As such, it is difficult to quantify and qualify compared with more tangible deliverables such as mainframes, servers and code. You can count infrastructure and you can measure code, but it is harder to articulate data as a concept. Regardless, it seems to me that data shapes an organisation: it underpins strategic thinking and decision making. If the IT systems were the organs, data would be the blood pumping through them. So, when you have a large-scale change programme with a heavy bout of data testing, what should you consider?
1. The Requirement. This sounds obvious, but it is in fact much harder to quantify than some realise. ‘How much data do you need?’, ‘How many accounts?’ or ‘How much of this, that or the other?’ are common questions. Test professionals will cite different approaches to pinning down this definition, such as boundary value analysis, but be aware that some functions, such as risk and finance, will invariably require large-volume data sets to validate macro-level objectives such as distributions, strategy analysis and modelling. Always consider macro-level data outcomes and be clear about where sampled data will and will not suffice.
2. Staging the Requirement. Invariably, a data cut or some other data staging activity will be needed to support testing. Care and thought should be given to when the cut is taken and how it is aged, to ensure it can meet the business test outcomes. Consideration should be given to month-end and quarter-end processing, which is quite different from daily processing; the cut should mimic the timings of the event itself. Transactional activity also needs attention so that realistic account behaviour can be synthesised (see the short sketch after this list for one way a cut might be aged and sanity-checked).
3. Quality of the Data. ‘What does good look like?’ is the inevitable question. Cleansing activities need to be completed, and any data manipulation, such as a conversion, needs to be checked and verified thoroughly.
4. Test the Outcome. Data testing needs an environment, so all the hard challenges listed at the outset regarding hardware and software need to be resolved. Stable code-sets and configuration management are absolutely vital as they ensure the foundations are solid for testing to be executed. Issues with code or infrastructure will invariably make triage of data issues extremely difficult.
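To make points 2 and 3 a little more concrete, the sketch below shows one way a staged cut might be aged to a month-end date and run through a handful of basic volume and quality checks before it is loaded into a test environment. It is only illustrative: the column names, file name, volume threshold and month-end date are assumptions of mine rather than details from the programme, and pandas is used simply because it is a common choice for this kind of check.

```python
# Minimal sketch: ageing a staged test-data cut and running basic quality checks.
# Column names (account_id, txn_date, balance), the file name and the thresholds
# below are illustrative assumptions, not details from any real programme.
import pandas as pd

REQUIRED_ACCOUNTS = 100_000              # assumed macro-level volume requirement
MONTH_END = pd.Timestamp("2024-06-28")   # assumed month-end date the test event should mimic


def age_cut(df: pd.DataFrame) -> pd.DataFrame:
    """Shift transaction dates forward so the cut lines up with the month-end event."""
    offset = MONTH_END - df["txn_date"].max()
    aged = df.copy()
    aged["txn_date"] = aged["txn_date"] + offset
    return aged


def quality_checks(df: pd.DataFrame) -> list[str]:
    """Return a list of issues; an empty list means the cut passed these basic checks."""
    issues = []
    if df["account_id"].nunique() < REQUIRED_ACCOUNTS:
        issues.append("volume: fewer distinct accounts than the requirement")
    if df["account_id"].isna().any() or df["balance"].isna().any():
        issues.append("completeness: null account IDs or balances present")
    if df.duplicated(subset=["account_id", "txn_date", "balance"]).any():
        issues.append("duplication: repeated transaction rows in the cut")
    return issues


if __name__ == "__main__":
    cut = pd.read_csv("staged_cut.csv", parse_dates=["txn_date"])  # hypothetical staged cut
    aged = age_cut(cut)
    for issue in quality_checks(aged):
        print("FAILED -", issue)
```

In practice these checks would sit alongside reconciliation back to source and the conversion verification mentioned in point 3, but even a thin layer like this catches obvious volume and completeness gaps before a scarce environment slot is used up.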
When it comes to data in enterprise environments, there is a huge amount you could explore, from mitigating strategies and creative approaches to the use of tools, modelling software and so forth. However, that is for another time. All I can say is that, having been on both sides of the fence, in IT and the business, it is clear that data testing is both absolutely vital and extremely hard. The importance of data and data testing needs to be recognised right at the outset of an integration and divestment programme in order to plan and direct efforts appropriately. It really is all about the data. Now, where’s my DeLorean?