The acronym EDM, for Enterprise Data Management, has long been associated with the discipline of sourcing, mastering and distributing data that is widely used across different departments in a financial services firm. This typically includes valuation data, instrument master data and entity data. However, major changes in business requirements, regulatory reporting demands and enabling technologies mean that each of the main steps in the EDM process is being transformed.
With shorter reporting cycle times and a growing demand for ad hoc, intra-day requests for client or instrument set-ups, the challenge for organisations is to ensure Enterprise Data Management (EDM) systems can both respond to and manage those real-time data requests – whether served from the existing security master data or from a real-time data flow provided by a third party. And it is not only the frequency and granularity of data sourcing that are changing; the variety of sources used in an EDM function will increase as well.
Martijn Groot, VP Product Management, Asset Control, outlines how developments in sourcing, mastering and distribution amount to nothing short of the demise of EDM as we know it.
The proactive sourcing imperative
While there can be no question that the financial data landscape has changed dramatically over the past decade in response to the regulatory overhaul, it is the frequency and granularity of data – together with the sheer availability of more data to work with – that are changing the sourcing process.
Over the past few years, organisations have leveraged the increasing power of Enterprise Data Management (EDM) systems to bulk-load and warehouse security master data, typically on a daily basis. But, while effective, such models no longer meet regulatory requirements – for example, the demands set out in the forthcoming MiFID II regulation. Both data and EDM suppliers are therefore evolving from the traditional end-of-day, file-based delivery of reference data towards a more specific model, in which individual data items are sourced on demand via Application Programming Interfaces (APIs).
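The contrast between the two delivery models can be sketched in a few lines. This is a minimal illustration only: the vendor universe below is a hypothetical in-memory stand-in for what would, in practice, be a bulk file feed or a vendor's REST API, and all instrument records are invented for the example.

```python
"""Sketch: end-of-day bulk load vs. on-demand item sourcing.
The vendor interface is a hypothetical stand-in, not any real provider's API."""

# Simulated vendor universe; in practice this sits behind a file feed or REST API.
VENDOR_UNIVERSE = {
    "DE000A1EWWW0": {"name": "adidas AG", "asset_class": "equity"},
    "US0378331005": {"name": "Apple Inc", "asset_class": "equity"},
    "XS0205935470": {"name": "Sample Corp 4.5% 2030", "asset_class": "bond"},
}

def bulk_load():
    """Classic model: pull the whole universe once a day and warehouse it all."""
    return dict(VENDOR_UNIVERSE)  # every record is purchased and stored, used or not

def fetch_on_demand(isin):
    """API model: request a single data item only when the business needs it."""
    record = VENDOR_UNIVERSE.get(isin)
    if record is None:
        raise KeyError(f"ISIN {isin} not known to vendor")
    return record

warehouse = bulk_load()                    # stores the full universe
single = fetch_on_demand("US0378331005")   # touches exactly one record
print(len(warehouse), single["name"])
```

The bulk model pays for and stores everything up front; the on-demand model retrieves exactly one record per business request, which is the shift the API-based delivery described above makes possible.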
On top of this, the increasing availability of content not available via the structured offerings of the enterprise data providers presents an opportunity for data scientists to differentiate and extract new insights. One need only look at alternativedata.org to get an inkling of the richness of raw materials to work with. Examples include web-crawling sources to mine news and spot corporate events, sentiment analysis, satellite and geospatial information, traffic and travel patterns, and property listings. All this data will come with new indices and summary statistics which, when properly accessible, can be analysed and monitored for investment signals and risk management.
But what does this mean for end to end data management processes? How will organisations manage the complex mix of real-time, intra-day sourced data, new content and existing data resources; ensure information is used efficiently and effectively in servicing their business users; and satisfy the regulators?
Effective Data Capture
The key to successfully leveraging real-time data is an approach to creating the security master data source that differs from the classic EDM model. Unlike the daily download and reconciliation, systems must now be able to respond to real-time user requests for information. MiFID II, for example, requires organisations to retrieve a specific set of data including, but not limited to, International Securities Identification Numbers (ISINs) for OTC derivatives, and TOTV and uTOTV flags for new as well as existing OTC instruments.
To support these needs, the ANNA Derivative Service Bureau (DSB) has recently announced a near real-time service, which requires users to supply the product definition and further input attributes to the ANNA DSB via a RESTful API or the FIX protocol. In response, an EDM must capture requests for client or instrument set-up in real time; verify whether each request can be serviced from an existing data set, to prevent an unnecessary and costly hit on an external source; and only go out to ANNA DSB for the additional data when needed.
It is this ability to continuously listen to requests and then screen each one before seeking any additional external data that will be key to managing this real-time data model efficiently and cost-effectively.
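The screen-before-source flow described above can be sketched as a simple service: check the existing security master first, and call the external service only on a miss. This is an illustrative sketch under stated assumptions: the class, the stub standing in for an ANNA DSB look-up, and the placeholder identifier are all invented for the example; a real integration would use the DSB's RESTful API or the FIX protocol.

```python
"""Sketch of the screen-before-source flow: capture each request, check the
existing security master first, and only hit the external source on a miss.
All names and data here are illustrative assumptions."""

class SecurityMasterService:
    def __init__(self, external_fetch):
        self.master = {}                # existing security master data
        self.external_fetch = external_fetch
        self.external_hits = 0          # track costly external calls

    def handle_request(self, identifier):
        # 1. Screen the request against data already mastered.
        if identifier in self.master:
            return self.master[identifier]
        # 2. Only on a miss, go out to the external source...
        record = self.external_fetch(identifier)
        self.external_hits += 1
        # 3. ...and master the result so repeat requests are served locally.
        self.master[identifier] = record
        return record

def stub_dsb_fetch(identifier):
    """Stub standing in for an external look-up (e.g. an ISIN request)."""
    return {"isin": identifier, "source": "external"}

svc = SecurityMasterService(stub_dsb_fetch)
svc.handle_request("EZ00ABCDEFG1")   # miss: one external call
svc.handle_request("EZ00ABCDEFG1")   # hit: served from the master
print(svc.external_hits)
```

The design point is that the external hit counter grows only once per new instrument, however many times the business requests it, which is precisely the cost-avoidance the screening step is meant to deliver.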
Precision Data Sourcing
Data providers' move to APIs, enabling more interactive, continuous and ad hoc requests for individual data items, has in the main been a response to regulatory change. But this real-time model also opens the door to far more precise and relevant data sourcing. With a real-time approach, organisations can actively source the specific data required as and when it is needed – very different from the current model of pre-emptively sourcing any data that may or may not be used and storing it in a warehouse.
The most obvious benefit, of course, is that data is refreshed in real time, based on a trigger from the business, avoiding the risk of warehoused data becoming out of date. In addition, data vendors are now moving away from the previous blunt delivery models, in which data was bundled into, for example, all European equities or all North American corporate bonds. The option now is to cherry-pick just the specific instruments or data elements required – reducing the volume of data both purchased and stored.
Moreover, there is already a vast array of new data sources, including unstructured data, with providers currently looking at how best to monetise this mass of new information. EDMs able to efficiently explore (i.e. master) and exploit (e.g. distribute) these alternative sources, both structured and unstructured, will provide a new depth of insight – enabling organisations to respond to ever shorter reporting cycle times and to spot patterns across the full spectrum of sources.
The financial data model is changing as organisations face up to the opportunities provided by a broadening range of data sources and to regulatory demands for timely, efficient data to support compliance. In both cases, the onus is on organisations to avoid redundant data storage points, to find the connections between different sources through sufficiently powerful mapping and matching logic, and to avoid repurchasing information they already hold.
There is, of course, still a requirement for an effective data warehouse model: a place both to retain security master data that will be used again, minimising expenditure, and to provide a full audit trail of all data activity. But the EDM must also be able to support and manage real-time data requests – checking the security master first and, where the data is not available, immediately sourcing the required information. It must also enable pattern discovery, oriented not only towards improving operations or reducing risk, but also towards improved pricing and new revenue opportunities. Finally, it needs to provide business users with unfettered access to this mastered data.
An EDM that can support real-time data demands enables organisations to take a far more deliberate approach to data sourcing, mastering and distribution: purchasing only the specific data sets required and aiding data discovery across the new world of data sources. And this is just the start: the regulatory demand for a near real-time response to requests from trading and risk systems opens the door to hitherto unachievable operational benefits, from faster on-boarding to new instrument creation, and heralds a new dawn for unlocking data value.