by Patrick Lastennet, Interxion’s Financial Services Director of Marketing and Business Development
After President Obama signed the Dodd-Frank Act into law last year in an effort to avoid another financial crisis like that of 2008, Europe is following suit with the European Market Infrastructure Regulation (EMIR) and MiFID II. While these regulations aim to increase transparency in financial markets and, specifically, over-the-counter (OTC) derivatives, MiFID II is unlikely to take effect until 2014, as it is still in the negotiation and adoption stage. Its pending implementation is nonetheless extremely important: the 'flash crash' of the Dow Jones Industrial Average in 2010 clearly illustrated the high levels of correlation between the performance of financial instruments and the impact of new developments and defaults. This has been the driving force behind the new regulations, and it is forcing firms to manage their risk in real-time.
Banks have traditionally built in-house data centre infrastructure for security, compliance and reliability reasons. Now regulatory compliance is placing significant strain on financial institutions' IT infrastructure, as they must evaluate and cope with exponential increases in real-time data. With MiFID II set to go live in the near future, further real-time requirements will only amplify the need for high-performance analysis of big data sets. In the face of this challenge, many banks are now adopting third-party data centres for non-core business processes.
Tackling Big Data
Together, OTC clearing reform and MiFID II will place a variety of new requirements, including new derivative transparency measures, on financial services companies, increasing the amount of data those companies are responsible for. With these measures in place, both pre- and post-trade transaction information will need to be made available for contracts that are today traded over the counter, and those same contracts will need to be cleared through central counterparties (CCPs).
Adding to this increase in data volumes, risk management requirements soon to be imposed by MiFID II and OTC derivatives regulation are demanding that banks manage and record every transaction in real-time. For complex OTC derivatives products, such risk calculations will drastically increase the requirements for computing power and storage for trading firms and investment banks.
The combination of all of these requirements is creating a challenge that most financial institutions’ legacy data centres are simply not equipped to handle. It puts a significant strain on firms’ technical infrastructure, which often causes extreme latency or even business-crippling downtime. As a result, companies need more powerful systems and ICT infrastructure to be able to sustain such high volumes of data, crunch numbers and perform calculations in real-time. The problem here is that the underlying infrastructure required to support such systems is not readily available or even accessible to most firms.
To cope with such increasing levels of data and keep pace with real-time transaction records, it is in financial firms' best interest to use a virtualised IT environment. With virtualised server farms, companies can pool more computing resources to complete the task without overburdening their IT systems and limited physical capacity. Because additional virtual machines (VMs) can be spun up or down at will in a virtualised environment, firms can adapt to high data volumes far more easily than by continually adding physical boxes.
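The elasticity described above can be illustrated with a simple capacity calculation. This is a hypothetical sketch, not any specific orchestration API: the message rates, per-VM throughput and headroom figure are all illustrative assumptions.

```python
import math

def vms_required(msgs_per_sec: float, vm_capacity_msgs_per_sec: float,
                 headroom: float = 0.2) -> int:
    """Return the number of VMs needed to absorb the current real-time
    message rate, with spare headroom for bursts (illustrative figures)."""
    target = msgs_per_sec * (1 + headroom)
    return max(1, math.ceil(target / vm_capacity_msgs_per_sec))

# A quiet trading period needs only a few VMs; a volatility spike
# scales the virtual server farm up without adding physical boxes.
print(vms_required(50_000, 25_000))    # normal market-data load
print(vms_required(400_000, 25_000))   # spike in market data
```

The same logic run in reverse, spinning VMs back down as volumes fall, is what lets a virtualised estate avoid the over-provisioning that a purely physical build-out requires.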
In order to power a suitable virtual environment, financial institutions need to sustain high-density power throughout their IT facility. The problem, however, is that legacy data centres only have about one kilowatt of power per square metre, whereas a virtual environment required by today’s financial firms under these new regulations would require more than twice that.
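The gap is easy to quantify as back-of-the-envelope arithmetic. The one-kilowatt-per-square-metre figure comes from the text above; the floor area is an illustrative assumption.

```python
def max_it_load_kw(floor_area_m2: float, density_kw_per_m2: float) -> float:
    """Total IT load a data hall can support at a given power density."""
    return floor_area_m2 * density_kw_per_m2

area = 500.0  # hypothetical data hall, in square metres
legacy = max_it_load_kw(area, 1.0)  # legacy: roughly 1 kW per square metre
needed = max_it_load_kw(area, 2.0)  # virtualised estate: more than double
print(legacy, needed)  # the same floor space must deliver twice the power
```

In other words, a legacy facility would have to double its power (and the cooling that goes with it) across the same floor space, which is rarely practical in an existing building.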
Impending Basel III legislation, which forces banks to increase capital and liquidity retention to cover their market positions, is stretching banks' balance sheets, so many are considering externalising data centre operations. For this reason, many firms are realising the benefit of colocating with a third-party data centre provider, like Interxion, that is already equipped for high-density requirements and suited to current and future demands.
Beyond high-density power, colocating with a third-party data centre provider presents many benefits for financial firms looking for ways to cope with steadily rising data volumes. In most financial institutions' IT installations, at least 50 per cent of the footprint, computing power and cost is typically devoted to processing market data, and trading firms' bandwidth requirements are almost entirely used for the same purpose. As a result of this growth in market data volumes, financial firms need a massively scalable infrastructure to keep pace with increasing demands.
Such scalability is readily available in colocation facilities, like Interxion's, that employ a modular approach with easy build-outs, eliminating wasteful over-provisioning and allowing customers within the data centre to scale up or down at will. The availability of scalable resources such as connectivity and power within a colocation facility also helps to minimise costs, since having multiple participants under one roof lets companies leverage economies of scale.
Taking into account such facilities' close proximity to all major liquidity venues, colocating with a third-party data centre provider brings benefits to all stakeholders involved in financial trading, especially in carrier-neutral facilities where all the financial networks are present, ensuring optimised, cost-effective connectivity to every liquidity venue.
For distribution processes, financial institutions typically receive data first and then redistribute it to their own customers down the line. In a colocation facility, however, customers can locate in the same data centre as the distributing firm and cross-connect, enabling instant distribution and significant time savings. Furthermore, sharing a facility with other market participants can help create a hub topology that interconnects within itself, increasing firms' reach within the data centre and reducing the time to market for acquiring new customers.
Increasing Infrastructure Externalisation
With unprecedented levels of real-time data being generated by the need to manage risks associated with increasing market volatility and the immediacy of financial impacts, firms face a real challenge in handling this increase cost-effectively. What's more, the transparency requirements of upcoming OTC clearing reform and MiFID II will only add to it. In preparation for Basel III, turning to a data centre colocation provider and externalising infrastructure to meet these demands and data volumes has therefore become extremely popular.
Indeed, with colocation facilities’ high-density power capabilities, scalability and proximity to major liquidity venues, financial institutions are finding major advantages in moving away from their legacy systems and inadequate infrastructures. While the first wave of financial regulations drove financial institutions’ adoption of colocation data centres for high-frequency trading performance, new requirements for risk management and analytics are showcasing the long-term benefits that financial institutions can gain by locating their infrastructure in external data centres.