
Data Mastering and Distribution – Putting New Data in the Hands of the Business

Recent changes in business and regulatory reporting requirements have compelled financial institutions to put in place data management models that can manage and respond to real-time data requests from diverse sources. And as the industry now starts to explore the growing range of innovative data sources, from sentiment analysis to satellite imagery, to add depth to traditional data resources, there is a significant opportunity to gain new insights through machine learning, correlation and pattern spotting – insights that can improve time to market and drive down costs.

Yet unlocking this new data diversity requires a fundamental change to traditional data management; financial data models must support not only fast and effective on-boarding of new data sources but also efficient data exploration and easy access and distribution of data across the business.

As Martijn Groot, VP of Product Management at Asset Control, explains, traditional Enterprise Data Management (EDM) models are being replaced by a new generation of data services that address the full information lifecycle, from data sourcing to integration and distribution.

New Data Services

From investment assessment to Know Your Customer (KYC), the way in which financial institutions approach decision making is set to change fundamentally over the next year as organisations begin to on-board and explore the new raft of data sources. While the new data model has been driven by regulatory demands, the sheer depth of information now created and collected globally is extraordinary – and is set to take the industry far beyond the traditional catalogue of price and reference data sources.

From web crawling to mine news and spot corporate events, to sentiment analysis, satellite and geospatial information, traffic and travel patterns, and property listings – the way in which organisations can now analyse investment opportunities, track Politically Exposed Persons and follow company news is being transformed.

No longer will organisations be limited to published financial statements and earnings calls; instead, investment decisions can be based on a much broader and deeper – but potentially also murkier – set of data. For example, the addition of social media sentiment analysis, combined with satellite information tracking car park usage, can deliver a new level of understanding of a supermarket’s performance. Indeed, with transcripts of all earnings calls now available, it is possible to understand who is asking specific questions and how CEOs and CFOs respond – insight which can be tracked and analysed to deliver fast, actionable investment insight. Similarly, with KYC, the ability to rapidly deep-dive into multiple diverse data sources provides a chance to address the escalating overhead associated with customer on-boarding and reduce the cost of doing business.
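
As a rough illustration, the short Python sketch below blends two hypothetical pre-processed feeds, daily social media sentiment and satellite-derived car park occupancy for a single retailer, into a simple composite performance signal. The column names, figures and equal weighting are assumptions for illustration, not a prescribed method.

```python
# A minimal sketch, assuming two hypothetical pre-processed feeds: daily social
# media sentiment scores and satellite-derived car park occupancy for one retailer.
# Column names, values and the 50/50 weighting are illustrative only.
import pandas as pd

dates = pd.date_range("2019-01-01", periods=5, freq="D")

sentiment = pd.DataFrame(
    {"date": dates, "sentiment_score": [0.12, 0.08, -0.05, 0.20, 0.15]}
)
car_park = pd.DataFrame(
    {"date": dates, "occupancy_ratio": [0.61, 0.58, 0.47, 0.72, 0.69]}
)

# Join the two sources on the common date key and build a simple composite
# indicator; real weightings would come from back-testing, not a fixed split.
signal = sentiment.merge(car_park, on="date")
signal["composite_score"] = (
    0.5 * signal["sentiment_score"].rank(pct=True)
    + 0.5 * signal["occupancy_ratio"].rank(pct=True)
)

print(signal[["date", "composite_score"]])
```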

New Mastering Model

The challenge, of course, is to find a way to harness these new data sources; to on-board this new insight in a way that is fast, effective and usable. Where does this leave traditional EDM solutions that have played a vital role in managing traditional data sources? The mastering process must still provide a 360-degree version of the truth that can be used across the organisation, from valuations to risk and financial reporting; the addition of new data sources reinforces the need for excellent structured processes that compare sources to find discrepancies and deliver that golden source. But this process must now also deliver excellent integration – with organisations looking for robust Application Programming Interfaces (APIs) to enable the fast stitching together and exploration of these new data sources.
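
The comparison step at the heart of that mastering process can be sketched in a few lines of Python. The example below reconciles two hypothetical vendor price feeds, flags discrepancies beyond a tolerance and derives a simple golden price; the source names, tolerance and preference rule are illustrative assumptions rather than any specific vendor's logic.

```python
# A minimal sketch of the comparison step in a mastering process, assuming two
# hypothetical vendor price feeds keyed on instrument identifier. The tolerance
# and the "prefer source A, fall back to B" rule are illustrative, not prescriptive.
import pandas as pd

source_a = pd.DataFrame(
    {"isin": ["XS001", "XS002", "XS003"], "price": [101.25, 99.80, 100.10]}
)
source_b = pd.DataFrame(
    {"isin": ["XS001", "XS002", "XS004"], "price": [101.27, 98.90, 100.55]}
)

merged = source_a.merge(source_b, on="isin", how="outer", suffixes=("_a", "_b"))

# Flag prices that disagree by more than a 0.5% tolerance for analyst review.
tolerance = 0.005
merged["discrepancy"] = (
    (merged["price_a"] - merged["price_b"]).abs()
    / merged[["price_a", "price_b"]].mean(axis=1)
) > tolerance

# Build the "golden" price: take source A where available, otherwise source B.
merged["golden_price"] = merged["price_a"].fillna(merged["price_b"])

print(merged[["isin", "golden_price", "discrepancy"]])
```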

In addition to adding new depth to traditional information, these data sources also change the emphasis of the mastering process. Rather than focusing on error detection in order to achieve consistency and accuracy, these sources enable organisations to undertake pattern discovery, leveraging new techniques, including machine learning, to spot new correlations or reveal unusual activity in areas such as market surveillance.
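
A minimal sketch of what such pattern discovery might look like is shown below, using scikit-learn's IsolationForest to flag unusual trades from synthetic features. The feature choices, contamination rate and data are assumptions, and the technique simply stands in for whatever model an organisation actually adopts.

```python
# A minimal sketch of pattern-based surveillance, assuming trade-level features
# have already been engineered; IsolationForest stands in for whatever machine
# learning technique an organisation actually adopts.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)

# Hypothetical features per trade: notional size and deviation from mid price.
normal_trades = rng.normal(loc=[1.0, 0.0], scale=[0.2, 0.05], size=(500, 2))
odd_trades = np.array([[5.0, 0.9], [4.5, -0.8]])  # unusually large, far off mid
trades = np.vstack([normal_trades, odd_trades])

model = IsolationForest(contamination=0.01, random_state=0)
labels = model.fit_predict(trades)  # -1 marks outliers for review

print("Trades flagged for review:", np.where(labels == -1)[0])
```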

Speed is critical; with the vast number of new data sources available, fast, effective data discovery will be essential to drive down the cost of change and give organisations a chance to gain differentiation in time to market. Intelligent data mastering is at the heart of this new model. Combining APIs that enable integration with an easy process for testing and on-boarding new models into production will be essential – and that will require APIs that support popular data science languages, including R and Python. In addition, the use of NoSQL technology, combined with the ability to deploy new models close to the data, will be key to supporting the significant associated data processing demand.
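
In practice, that kind of integration might look something like the Python sketch below, which pulls a newly on-boarded data set from a REST endpoint into a pandas DataFrame for exploration. The URL, token and field names are invented for illustration and do not describe any particular platform's API.

```python
# A minimal sketch of pulling a newly on-boarded data set into a Python workflow
# through a REST API. The endpoint, token and field names are hypothetical; they
# stand in for whatever interface a given data-mastering platform exposes.
import pandas as pd
import requests

BASE_URL = "https://datamaster.example.com/api/v1"  # hypothetical endpoint

response = requests.get(
    f"{BASE_URL}/datasets/footfall-satellite/records",
    headers={"Authorization": "Bearer <api-token>"},
    params={"from": "2019-01-01", "to": "2019-01-31"},
    timeout=30,
)
response.raise_for_status()

# Load the records straight into a DataFrame for exploration alongside
# traditional price and reference data.
records = pd.DataFrame(response.json()["records"])
print(records.head())
```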

Data Distribution

This ability to combine robust data mastering processes with excellent integration will build a new data foundation; it will enable an organisation to pull together these diverse data sets and create new insight based on sentiment from social media, performance indicators derived from satellite imagery, and the traditional measures of price history and published financials.

To maximise the value of these data sources, organisations also need to reconsider access and utilisation. Making these new data sets easily accessible, not only to new algorithms and data scientists but also to end users within risk, investment, operations or compliance will mark a significant step change in data exploitation.

Ensuring the data easily integrates with the languages adopted by data scientists is fundamental; but to deliver the immense potential value to end users, data analysis must evolve beyond the traditional technical requirements of SQL queries. Offering end users self-service access via enterprise search, a browser, Excel and easy-to-understand interaction models, rather than via proprietary APIs and custom symbologies, will open up these new data sources to deliver even greater corporate value.

New Era

These new data sources are radically different to the traditional data resources – and their potential value to an organisation is untapped. Pattern matching in particular can be oriented not only towards improving operations or reducing risk but also towards improved pricing and new revenue opportunities. Matching of data items will not only take place through common keys but also through spotting the same behaviour in hitherto unrelated data or otherwise finding repeating patterns in time, space and across different data sets.
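
As a simple illustration of matching behaviour without a common key, the sketch below scans lagged correlations between two synthetic time series to find where one appears to echo the other. The data, lag window and threshold-free "take the best lag" rule are purely illustrative assumptions.

```python
# A minimal sketch of matching behaviour across two series that share no common
# key, by scanning lagged correlations. The series themselves are synthetic.
import numpy as np

rng = np.random.default_rng(7)
driver = rng.normal(size=200)                                     # e.g. a footfall measure
follower = np.roll(driver, 3) + rng.normal(scale=0.3, size=200)   # echoes it three steps later

def lagged_correlation(x, y, lag):
    """Correlation between x and y shifted by `lag` observations."""
    if lag == 0:
        return np.corrcoef(x, y)[0, 1]
    return np.corrcoef(x[:-lag], y[lag:])[0, 1]

# Scan a window of candidate lags and report the strongest relationship.
scores = {lag: lagged_correlation(driver, follower, lag) for lag in range(10)}
best_lag = max(scores, key=scores.get)
print(f"Strongest correlation at lag {best_lag}: {scores[best_lag]:.2f}")
```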

Especially for active investment management, the use of non-traditional data sources can help firms compete with, and differentiate themselves from, passive investment strategies; while in compliance and risk management, accessing a broader range of sources can help trigger early warnings on suspect transactions, relationships or price movements.

The potential is incredibly exciting – and first mover advantage cannot be overstated. The key for financial institutions over the next year or so is to move beyond traditional EDM models and embrace the new mastering and distribution services that will enable essential exploitation of data across the business.
