
    INTEGRATION AND DIVESTMENT: IT’S ALL ABOUT THE DATA!

    Published by Gbaf News

    Posted on January 3, 2014

    What are the main issues and challenges involved in integration and divestment programmes? After delivering an integration and divestment project for a major bank, Simon Wong of Xceed Group outlines his key considerations

    Simon Wong, Xceed Group

    After many days, nights and meeting hours, I was recently part of a successful team involved in the re-launch of a major bank onto the high street. For me, this was the culmination of 20 months of hard graft on a complex divestment programme, following what was an equally challenging 18 months on the preceding integration programme. Along the way, I gained insights into large-scale programme challenges and had the opportunity to see things through the lens of both IT and the business. Stepping back, there are a number of important issues and challenges to highlight to firms looking to do the same, and one moment on the project in particular that I’d like to revisit.

    The divestment and integration were huge undertakings, both in terms of scale and change (of hardware and software) as well as data. From a hardware and software perspective, the main challenge centres on gaining a clear understanding of what the target solution should be. While this may seem a simple task at first glance, for a financial services organisation IT change is never simple. As business functions grow and develop, as strategic decisions are made, and as technology itself transitions to a legacy state, financial services platforms evolve over time into highly complex organisms.

    As a result, the more technical elements of large-scale programmes – be it ‘scale and remediate’ or ‘partition, clone and build’ of the target systems – are highly complex activities that require clear design and careful delivery. Invariably this will require heavy lifting across all IT activity from development to testing, be it system testing, SIT, non-functional testing and so on. As this portion of the programme involves physical change, it would naturally appear to be the most complicated activity. However, based on my experience, I would tend to disagree.

    I remember at the outset of the integration programme being in a small meeting room in Moorgate. I was debating resources with (among others) the programme’s overarching business test lead, who espoused the complexities of the data. “It’s all about the data,” he repeated adamantly. It’s fair to say that, at the time, the majority of the audience were more concerned with the delivery of the infrastructure and code. But on reflection, if I could get in a DeLorean and turn back time I’d stand up in that meeting and announce “I’m with the business lead… it’s all about the data!”

    Part of the difficulty when it comes to testing data is that it feels somewhat abstract. As such, it is difficult to quantify and qualify compared with more tangible deliverables such as mainframes, servers and code. You can count infrastructure, and you can measure code, but it is harder to articulate data as a concept. Regardless, it seems to me that data shapes an organisation – it underpins strategic thinking and decision making. If an IT system were the organs, then data would be the blood that pumps through it. So, when you have a large-scale change programme with a heavy bout of data testing, what should you consider?

    1. The Requirement. This sounds obvious, but is in fact much harder to quantify than some realise. ‘How much data do you need?’, ‘How many accounts?’ or ‘How much of this, that or the other?’ are common questions. Test professionals will cite different approaches to pinning down this definition – boundary value analysis, for example – but be aware that some functions, such as risk and finance, will invariably require large-volume data sets to validate macro-level objectives such as distributions, strategy analysis and modelling. Always consider macro-level data outcomes and acknowledge the use of samples (a sketch of this sampling question follows after this list).

    2. Staging the Requirement. Invariably a data cut, or some other data staging activity, will be needed to support testing. Care and thought should be given to when the cut is taken and how it is aged, to ensure it can meet the business test outcomes. Consideration should be given to month-end and quarter-end processing, which is quite different from daily processing; the cut should mimic the timings of the event itself (see the date-shifting sketch after this list). Transactional activity should be given due attention in order to synthesise account behaviour.

    3. Quality of the Data. ‘What does good look like?’ is the invariable question. Cleansing activities need to be carried out, and if there are data manipulation activities such as a conversion, these need to be checked and verified thoroughly (see the reconciliation sketch after this list).

    4. Test the Outcome. Data testing needs an environment, so all the hard challenges listed at the outset regarding hardware and software need to be resolved. Stable code-sets and configuration management are absolutely vital as they ensure the foundations are solid for testing to be executed. Issues with code or infrastructure will invariably make triage of data issues extremely difficult.
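
    Point 1’s question of sample sufficiency can be made concrete. The sketch below is a minimal, hypothetical illustration (not from the original programme, using only the Python standard library and invented numbers): it checks how far samples of different sizes drift from the full population on the macro-level figures a risk or finance function might need to reproduce.

```python
import random
import statistics

# Minimal sketch of the sampling question in point 1: is a sample large
# enough to validate macro-level outcomes (here, a balance distribution)?
# The population, field choices and sizes are all illustrative assumptions.

random.seed(42)

# Stand-in for a full production data cut: one balance per account.
population = [random.lognormvariate(7, 1.2) for _ in range(200_000)]

def summary(balances):
    """Macro-level figures a risk or finance function might validate."""
    ordered = sorted(balances)
    pct = lambda p: ordered[int(p * (len(ordered) - 1))]
    return {
        "mean": statistics.fmean(balances),
        "stdev": statistics.stdev(balances),
        "p50": pct(0.50),
        "p99": pct(0.99),
    }

full = summary(population)
for size in (1_000, 10_000, 50_000):
    sample = summary(random.sample(population, size))
    # Relative drift of each figure against the full population.
    drift = {k: abs(sample[k] - full[k]) / full[k] for k in full}
    print(size, {k: f"{v:.1%}" for k, v in drift.items()})
```

    Tail figures such as the 99th percentile typically drift far more than the mean, which is exactly why macro-level objectives can demand much larger data volumes than functional testing does.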
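
    Point 2’s ‘aging’ of a data cut can be sketched in the same spirit. The snippet below is a hypothetical illustration rather than any actual programme tooling: it shifts every date in a snapshot by a fixed offset so that a cut taken mid-month lines up with the month-end date a rehearsal must mimic, preserving the relative ages between dates. The record layout is invented.

```python
from datetime import date, timedelta

# Minimal sketch of "aging" a data cut (point 2): shift every date in the
# snapshot by one fixed offset so the cut, extracted on an ordinary business
# date, replays as if it had been taken at month end. Fields are invented.

cut_taken_on = date(2013, 11, 14)    # when the snapshot was extracted
rehearsal_date = date(2013, 12, 31)  # month-end event the test must mimic
offset = rehearsal_date - cut_taken_on

accounts = [
    {"account": "A-001", "opened": date(2010, 5, 3), "last_txn": date(2013, 11, 12)},
    {"account": "A-002", "opened": date(2012, 1, 9), "last_txn": date(2013, 11, 14)},
]

def age_record(record: dict, delta: timedelta) -> dict:
    """Shift every date field by the same delta so relative ages survive."""
    return {k: v + delta if isinstance(v, date) else v for k, v in record.items()}

aged = [age_record(record, offset) for record in accounts]
for record in aged:
    print(record)
```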
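
    And for point 3, verification of a conversion usually starts with simple reconciliation. The sketch below is again only an illustration under assumed keys and field names: it compares source and converted extracts by record count, by missing keys, and by a checksum over the fields the conversion must not change.

```python
import hashlib

# Minimal sketch of verifying a conversion (point 3): reconcile source and
# converted extracts on counts, missing keys and per-record checksums.
# Keys, fields and the deliberate corruption below are all illustrative.

source = {
    "A-001": ("GBP", "4200.00", "ACTIVE"),
    "A-002": ("GBP", "133.07", "DORMANT"),
}
converted = {
    "A-001": ("GBP", "4200.00", "ACTIVE"),
    "A-002": ("GBP", "133.70", "DORMANT"),  # transposed digits: should be caught
}

def fingerprint(fields) -> str:
    """Stable checksum over the fields that must survive conversion intact."""
    return hashlib.sha256("|".join(fields).encode()).hexdigest()

missing = sorted(source.keys() - converted.keys())
mismatched = sorted(
    key for key in source.keys() & converted.keys()
    if fingerprint(source[key]) != fingerprint(converted[key])
)
print(f"records: {len(source)} source vs {len(converted)} converted")
print(f"missing after conversion: {missing}")
print(f"checksum mismatches: {mismatched}")  # flags A-002
```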

    When it comes to data in enterprise environments, there is a huge amount you could explore, from mitigation strategies and creative approaches to the use of tools, modelling software and so forth. However, that is for another time. All I can say is that, having been on both sides of the fence, in IT and the business, it is clear that data testing is both absolutely vital and extremely hard. The importance of data and data testing needs to be recognised right at the outset of an integration and divestment programme in order to plan and direct efforts appropriately. It really is all about the data. Now, where’s my DeLorean?
