    Original content: Global Banking and Finance Review - https://www.globalbankingandfinance.com


    Improving Operational Efficiency in the Data Management Process

    Published by Gbaf News

    Posted on June 29, 2018

    11 min read

    Last updated: January 21, 2026


    As the financial services industry starts to harness a raft of new data sources for fast, effective and usable insights, the bottleneck for financial institutions becomes how well they really understand their data management processes.

     How many firms, for example, can answer the following questions: Do we understand the impact of bad data quality? Can we measure this quality, and do we have full oversight over the steps in the data management process? Can we pre-empt data issues? When data issues arise, can we take restorative action quickly and track adjustments along the way without losing oversight of those changes?

Hugo Boer, Senior Product Manager at Asset Control, outlines why transparent, end-to-end financial data management, providing real-time insight into daily data sourcing, mastering and distribution processes, is essential if firms are to improve workflows, increase operational efficiency and unlock the value of all their new data sources.

     Data Scrutiny

New regulatory drivers and business pressures have led to increased scrutiny of the data management process. For example, the ECB’s Targeted Review of Internal Models (TRIM) was introduced to assess whether the internal models used to calculate risk-weighted assets produce reliable and comparable results. The TRIM guide contained a specific Data Quality Framework, treating data accuracy, consistency, completeness, validity, availability and traceability as preconditions for these models.
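As an illustration only, quality dimensions such as completeness and validity can be expressed as simple, measurable checks. The record fields, rules and currency list below are assumptions for the sketch, not taken from TRIM itself.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class PriceRecord:
    # Hypothetical market-data record; field names are illustrative.
    instrument_id: str
    price: Optional[float]
    currency: str

def completeness(records: List[PriceRecord]) -> float:
    """Share of records with a populated price (the completeness dimension)."""
    if not records:
        return 0.0
    return sum(1 for r in records if r.price is not None) / len(records)

def validity_issues(records: List[PriceRecord],
                    known_currencies=frozenset({"USD", "EUR", "GBP"})) -> List[str]:
    """Flag records breaching simple validity rules (illustrative thresholds)."""
    issues = []
    for r in records:
        if r.price is not None and r.price <= 0:
            issues.append(f"{r.instrument_id}: non-positive price")
        if r.currency not in known_currencies:
            issues.append(f"{r.instrument_id}: unknown currency {r.currency}")
    return issues
```

Turning each dimension into a number in this way is what makes "can we measure this quality?" answerable on a daily basis.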

This regulatory focus is, however, just one aspect of a growing recognition among financial institutions of the need for better insight into data management processes. Data management teams face huge business pressure not only to manage increasing numbers of data sources but also to deliver accurate and consistent data sets in ever-decreasing time windows.

Despite overlap between the data used by different departments, many teams, from finance to risk, still operate in functional silos. In an increasingly joined-up and overlapping corporate data environment these dispersed data management activities are inherently inefficient: parallel sourcing teams buy the same data multiple times and duplicate each other's data preparation effort. The result is not only high data sourcing and preparation costs but unnecessary data storage and, critically, unacceptable operational risk.

     Transparent Process

What is required is a single overview of the data management process: the ability to track data collection and verification progress and gain rapid insight into any problems that could affect delivery Service Level Agreements (SLAs). While companies have attempted to deliver point oversight via existing Management Information tools, these have failed to provide an intuitive single view over the entire data management process across the business. Data management teams require transparency across the diverse data silos and deliveries to data consumers, plus insight into the status of every step from data sourcing, cleansing and verification through to delivery to downstream systems. Essentially, they need a single view into the health of corporate data.
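A minimal sketch of such a single status view follows; the stage names, the shape of the status record and the SLA field are illustrative assumptions, not taken from any particular product.

```python
from datetime import datetime

# Assumed pipeline stages, in order, for one data feed.
STAGES = ("sourced", "cleansed", "verified", "delivered")

def feed_status(completed_at: dict, sla_deadline: datetime, now: datetime) -> dict:
    """Summarise a feed's progress and whether its delivery SLA is at risk.

    completed_at maps each finished stage name to its completion time.
    """
    return {
        "completed": [s for s in STAGES if s in completed_at],
        "pending": [s for s in STAGES if s not in completed_at],
        # SLA is breached if the feed has not been delivered by the deadline.
        "sla_breached": "delivered" not in completed_at and now > sla_deadline,
    }
```

Rolling such per-feed summaries up across all sources is one way to get the "single view into the health of corporate data" described above.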

The implications of enhanced data transparency are significant. With a single view of the entire data management process, organisations can not only meet the regulatory requirements associated with increased data scrutiny, including data quality, visibility and completeness, but also begin to drive significant operational change and create a culture of continuous data improvement.

For example, a complete perspective on any overlap in data resources will enable firms to streamline data acquisition, reducing purchase costs as well as data cleansing and delivery costs. It will also overcome the risks associated with a lack of shared data understanding between different areas, which can create significant federation issues affecting both operational performance and regulatory compliance. Simple steps such as applying consistently calibrated rules per data set or asset class, and documenting every change to data cleansing rules, will further reinforce the value of acquired data to the business.
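One way to keep rule changes documented, as suggested above, is an append-only change log attached to the rule store. The registry below is a hypothetical sketch; all names and the rule format are invented for illustration.

```python
from datetime import datetime, timezone

class RuleRegistry:
    """Holds current cleansing rules and an auditable history of every change."""

    def __init__(self):
        self._rules = {}    # rule name -> current definition
        self._history = []  # append-only change log

    def set_rule(self, name: str, definition: str, author: str, comment: str):
        # Record old and new definitions so every adjustment stays traceable.
        self._history.append({
            "rule": name,
            "old": self._rules.get(name),
            "new": definition,
            "author": author,
            "comment": comment,
            "at": datetime.now(timezone.utc).isoformat(),
        })
        self._rules[name] = definition

    def current(self, name: str) -> str:
        return self._rules[name]

    def history(self, name: str):
        return [h for h in self._history if h["rule"] == name]
```

Because the log is append-only, restorative action on a data issue can be taken quickly without losing oversight of the adjustments made along the way.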

     Extended Data Understanding

This transparency into the status of data sourcing, processing and delivery should not be limited to data management experts: visibility of the data supply chain should be shared with everyone in the company, giving end users insight into the quality of the data used for risk, finance, post-trade reporting and so on. Data confidence is a fundamental requirement in post-financial-crisis trading, and providing end users with a simplified view of the acquisition, cleansing and provisioning process for each data source will play a key role in fostering a common, company-wide understanding of the data and how it is used.

For example, it can be valuable to show users that, say, Bloomberg data is the primary source for US corporate bonds, Thomson Reuters data for foreign exchange and Six Financial data for corporate actions; to capture comments from data analysts whenever this hierarchy changes; and to record which data cleansing rules were applied and when manual intervention took place. This transparency supports better data knowledge and confidence and can also resolve some of the data misalignment that has built up over the past couple of decades. With a better understanding of the end-to-end process for each data source, firms can begin to spot trends in the relative quality of different sources per market and asset class. Are there repeat errors in a data source? Is an alternative source already in use elsewhere in the business? Or is it time to onboard a new provider? End-to-end data management visibility will enable firms to drive a culture of continual improvement, addressing data quality issues and seeking out the most effective data sources for the business.
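The source-hierarchy and error-trending ideas above might be sketched as follows. The asset-class mapping mirrors the example in the text, while the priority ordering, error-log shape and threshold are assumptions for illustration.

```python
from collections import Counter
from typing import List, Set

# Assumed per-asset-class source hierarchy (first entry is the primary source).
SOURCE_HIERARCHY = {
    "us_corporate_bonds": ["Bloomberg", "Thomson Reuters"],
    "fx": ["Thomson Reuters", "Bloomberg"],
    "corporate_actions": ["Six Financial"],
}

def primary_source(asset_class: str) -> str:
    """Return the primary vendor for an asset class."""
    return SOURCE_HIERARCHY[asset_class][0]

def repeat_error_sources(error_log: List[dict], threshold: int = 3) -> Set[str]:
    """Sources whose error count reaches the threshold — candidates for review
    or replacement by an alternative provider."""
    counts = Counter(e["source"] for e in error_log)
    return {src for src, n in counts.items() if n >= threshold}
```

Keeping the hierarchy in versioned configuration, with analyst comments attached to each change, is one way to surface the "who is primary for what, and why" questions the article raises.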

     Conclusion

The total cost of end-to-end data management is becoming far more apparent, especially given the growing overlap in data usage across the business and the rise in available new data sources. Add in escalating regulatory expectations for robust processes and the operational risk of siloed data management teams, and the cost of this lack of transparency is impossible to ignore.

To maximise the value of new data sources, financial institutions need to evolve beyond departmental data silos and achieve end-to-end transparency of the data management process. While this will significantly improve the data management operation itself, it is also essential to push data responsibility and knowledge out to end users: data quality is a business issue, and providing data transparency to business teams will be key to creating a strong culture of continuous improvement, leveraging feedback to drive up data quality and confidence across the institution.
