Laura Shepard, Director of HPC Marketing, DataDirect Networks
Banks are arguably one of the biggest enterprise consumers of IT across all industries – research from Ovum in June 2013 suggested global IT spending at retail banks alone would reach $132bn by 2015. You would think, therefore, that most would be a step ahead of the game in terms of how they use IT. Whilst that may be true across many aspects of this sector – for example, high-frequency traders using ultra-low-latency networking, or innovations such as ‘back testing’ or time series databases – most are still using last-century infrastructure design. The infrastructure simply hasn’t matched the capability or the scale of the innovative software. Financial firms are three to five years behind academics and research institutes when it comes to state-of-the-art infrastructure design.
Decades old enterprise computing
Quant analysts are constantly evolving and developing algorithms based on historical data to help predict market direction – important for a number of reasons, not least to make money and reduce exposure to risk. Yet it puzzles me why, in so many firms, this core function is still being implemented the way it has always been done – based on practices that were commonplace in enterprise computing several years ago. The latest technologies can increase the performance of these algorithms fivefold. Many of the ‘features’ of enterprise computing – replication, de-duplication and compression – interfere with performance. When performance matters most, only HPC techniques deliver.
Finance houses’ ‘back testing’ capability is also being undermined by enterprise computing from two decades ago. Whilst there is not necessarily anything wrong with this per se, it is limiting their ability to query and build algorithms for ever-expanding data volumes. Increasingly, financial institutions want to run back testing on multi-year and multi-location/market data, to help them more accurately predict market movement globally and reduce their risk exposure.
But today’s financial technology infrastructures are, for the most part, not equipped to deal with double-digit terabytes – and soon hundreds of terabytes – of data. Long term, it becomes impractical from a cost perspective and a ‘real estate’ standpoint to continue using ‘90s and ‘00s enterprise-style computing – these systems require a lot of space.
Today, finance houses want to analyse more data from more sources than just exchanges, and to run simultaneous programmes to build a better understanding of their market. This data takes several forms. Market tick data is fixed format: it moves fast, but its type is predictable. Sentiment analysis adds an additional data dimension – type variation as well as volume. It is an analytics approach in which emerging algorithms search popular social media feeds – Twitter, for example – and other public sources for key words and phrases to spot trends that then inform trading practice. For example, an algorithm that looks for discussions of energy, pipeline failures or oil spills will ‘e-discover’ the mood of the market and predict how it will trade.
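The kind of keyword scan described above can be sketched in a few lines. This is a minimal, illustrative example only – the watchlist phrases, weights and messages are all hypothetical, and real sentiment engines use far richer NLP than literal phrase matching:

```python
import re
from collections import Counter

# Hypothetical watchlist an energy-focused algorithm might track,
# with crude bearish (negative) / bullish (positive) weights.
WATCHLIST = {"pipeline failure": -2, "oil spill": -2, "supply cut": -1,
             "record output": 1, "demand rebound": 2}

def score_feed(messages):
    """Return a rough mood score and per-phrase counts for a batch of posts."""
    counts = Counter()
    for msg in messages:
        text = msg.lower()
        for phrase in WATCHLIST:
            counts[phrase] += len(re.findall(re.escape(phrase), text))
    mood = sum(WATCHLIST[p] * n for p, n in counts.items())
    return mood, counts

mood, counts = score_feed([
    "Major oil spill reported near the terminal",
    "Analysts flag a pipeline failure risk after the storm",
])
print(mood)  # → -4, i.e. bearish energy sentiment
```

A negative score would nudge the strategy towards a bearish view on energy; in practice the signal would be combined with tick data rather than acted on alone.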
With the sheer increase in data volumes and the desire to analyse more data from multiple sources – whether structured in databases or unstructured like social media – it is suddenly no longer cost effective to build enterprise systems that can hold that much data in system memory or storage cache. Particle physicists and life science researchers hit this wall five years ago and turned to non-blocking storage and online analytics to pre-process data before storing it, streaming gigabytes of data a second for months on end. If you were looking for the Higgs boson and someone at the back of the room put his hand up and said, “Sorry, can you run the three-month experiment again? Had a cache overflow and missed the bit in the middle”, you would probably want to review contracts. Financial services may not have the volumes of their academic peers, but they do have the same velocity problem – churning 30TB of market data faster than your competition will bring first-mover advantage to any algorithmic trader.
Embrace HPC techniques
To be competitive, firms need to look beyond client/server and even distributed systems and embrace HPC techniques, which have been tried and tested by the academic research community for at least the last ten years. In addition, technology strategy ambitions should not stop at batching data through cache. Batching is not terribly efficient; SAP HANA and SAS Grid are ushering the merger of online transaction processing and online analytics into the enterprise, and batch processing will become as archaic as mainframes. Slicing data into ten iterations might speed things up, but ten iterations will soon become 100 iterations and so on; it is inefficient to say the least.
Some organisations might look to put flash storage between the storage infrastructure and system memory. Whilst flash will provide better performance in some cases, alone it will neither scale to the terabytes of data finance houses hold nor deliver the bandwidth. Flash remains too expensive to replace spinning-disk data arrays.
There is no middle ground in addressing this data growth and “need to know more” challenge; the successful firms will embrace Big Data. And there is a new way, borrowed from HPC: visionary finance houses are starting to see the benefit of a parallel approach. It started in the systems with grid computing and is now reaching the file system, helping firms analyse more positions faster and develop more effective trading strategies, which they can deploy in less time.
The majority of finance houses running algorithmic back testing do so out of KX or other custom in-memory databases running on ‘90s-style infrastructure, which is like pairing a Ferrari engine with a Mini gearbox. STAC M3 tests showed that flipping the storage to supercomputer class can accelerate the database by 800%.
By using fast, scalable external disk systems with massively parallel access to data, researchers can perform analysis against much larger data sets, delivering more effective models faster. This means analysts can run hundreds of models, or hundreds of iterations of the same model, in the time it used to take to run a few.
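The "hundreds of iterations of the same model" pattern is embarrassingly parallel: every parameter setting reads the same shared history and runs independently. A minimal sketch, using a toy moving-average strategy and Python's standard process pool as stand-ins for a real backtest engine and an HPC cluster (the price series and strategy here are entirely hypothetical):

```python
from concurrent.futures import ProcessPoolExecutor
import statistics

# Stand-in for shared tick history on a parallel file system.
PRICES = [100 + (i % 7) - 3 for i in range(1000)]

def backtest(window):
    """Return (parameter, crude P&L proxy) for one moving-average window."""
    pnl = 0.0
    for i in range(window, len(PRICES) - 1):
        avg = statistics.fmean(PRICES[i - window:i])
        # Toy rule: go long for one step when price dips below its average.
        if PRICES[i] < avg:
            pnl += PRICES[i + 1] - PRICES[i]
    return window, pnl

if __name__ == "__main__":
    # One hundred parameter sweeps run side by side instead of sequentially;
    # each worker reads the same data, so no per-job copy is needed.
    with ProcessPoolExecutor() as pool:
        results = dict(pool.map(backtest, range(5, 105)))
    best = max(results, key=results.get)
    print(f"best window={best}, pnl proxy={results[best]:.1f}")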
It’s not just trading where HPC techniques can be exploited; risk and compliance departments will also want to process more data, get answers more quickly and lower capital reserving. One firm I know has reduced a risk calculation from nine minutes to two minutes using HPC-style data storage – that gets them to the market seven minutes earlier, which makes a big difference when volumes are high.
The UK regulator wants financial institutions to raise another £27 billion in capital reserve. If a firm can demonstrate that its risk control measures are strong enough to pass the stress tests, it will be able to put that capital to work rather than leave it stuck in reserve. This means dramatically increasing the frequency of total-market-exposure assessments from only once or twice a day to multiple intra-day assessments.
Blind Case Study
At one global hedge fund, hundreds of servers capture 3GB of tick data per exchange every day. Tens of quantitative analysts work with that data to create and test equity trading strategies on hundreds more servers. The firm was previously using several large NAS filers that simply could not keep up with the volume of data or the analysts’ data-access performance requirements.
They moved to a similar-capacity DDN SFA system and, with the additional performance delivered, were able to back test three times the number of models, significantly reducing their time to deploy new strategies. The Securities Technology Analysis Center LLC (STAC®) M3 benchmark recently demonstrated that the SFA12K-40 from DDN (in a hybrid configuration of flash and spinning-disk technology) delivered performance more than eight times faster than the traditional storage average and, in some cases, almost twice the performance of flash storage.
“Before we rolled out the DDN systems, our [filer] farm just couldn’t keep up with the growth in the market and our business. With the DDN systems, our traders are getting new strategies into the market faster.” – director of IT, global hedge fund.
Organisations that want to analyse all their exchanges across multiple locations, and to query such ‘phenomena’ as sentiment (Big Data), need to be looking at adopting a parallel-file approach to their storage environment.
Firms building their own tools that can take advantage of a parallel file system will do very well. They can use the parallel file system as the basis to stream data from multiple sources and have many different systems using the same data storage concurrently, so they do not need a dedicated copy of each data set. Parallel file systems enable multiple nodes to hit the same data sources concurrently – teams can hit the tick database from London and Singapore at the same time, and also run the sentiment analysis mentioned earlier against the same data.
Blind Case Study
A large US proprietary trading firm specialising in high frequency trading recognised early that they would need to move away from direct attached and NAS storage if they were going to handle their requirements to share and access petabytes of data across multiple types of high performance trading groups including teams in currencies, derivatives, international equities, technology equities and more.
An extreme need for speed was identified, as the firm’s success depended on being highly competitive in bringing many new strategies to bear quickly and failing out fading or unsuccessful strategies in as close to real time as possible. A parallel architecture was selected to meet two key criteria:
- A global namespace for more efficient data gathering and sharing
- Parallel I/O to remove the time lag of the sequential jobs necessary in NAS architectures.
Several generations of parallel infrastructure were tried, based on storage and parallel file systems provided in one case by a major server vendor and in another by a major storage vendor. Under production conditions, neither was able to meet even the minimum performance requirements based on current and projected needs, which run to petabytes of capacity and GB/s of sustained I/O performance.
Impressed with DDN’s massively parallel architecture and its open-platform approach to parallel file systems – which would allow the firm to change infrastructure if needed without throwing out its entire investment – the company evaluated and then selected DDN SFA with GPFS. In addition to surpassing the performance, availability, scalability and cost requirements, the office of the firm’s CIO also liked DDN’s Python-based infrastructure and features that would support possible future directions, such as embedding applications in the storage itself to cut I/O path process steps by almost half for extreme low latency.
For me, the next big things in Big Data, and the analysis of that data, relate to risk management. The first step is CAT (Consolidated Audit Trail), but there is much more to gain than mere compliance. There are quantifiable goals on the table, such as intra-day risk assessment, capitalisation and liquidity reporting, and predictive modelling. Beyond that, institutions would love to know their market exposure and opportunity at any given time. Knowing the position at any moment in the day will allow a firm to know which way to jump and deliver that elusive first-mover advantage.
Barclays announces new trade finance platform for corporate clients
Barclays Corporate Banking has today announced that it is working with CGI to implement the CGI Trade360 platform. This new platform will provide an industry-leading, end-to-end global trade finance solution for Barclays clients in the UK and around the world.
With the CGI Trade360 platform, Barclays will provide clients with greater connectivity and visibility into their supply chains, allowing them to optimise working capital efficiency, funding and risk mitigation. By utilising cloud-based functionality for corporate banking clients, Barclays will also be able to offer a leading client user experience through easy access and real-time integration to essential information, combined with the latest trade solutions as the industry-wide shift to digitisation continues to accelerate.
This move underpins Barclays’ commitment to supporting the trade and working capital needs of its clients and reinforces a commitment to innovation that has been central to the bank for more than 300 years.
James Binns, Global Head of Trade & Working Capital at Barclays, said: “We are delighted to announce our move to the CGI Trade360 platform and to have started the implementation process. We have a longstanding partnership with CGI, and the CGI Trade360 platform will mean we can continue delivering the best possible trade solutions and service to our clients for many years to come.”
Neil Sadler, Senior Vice President, UK Financial Services, at CGI, said: “Having worked closely with Barclays for the last 30 years, we knew we were in an excellent position to enhance their systems. Not only do we have a history with them and understand how they work, but part of the CGI Trade360 solution includes a proof of concept phase, which is essentially seven weeks of meetings and workshops with employees across the globe to guarantee the product’s efficiency and answer all queries. We’re delighted that Barclays chose to continue working with us and look forward to supporting them over the coming years.”
What’s the current deal with commodities trading?
By Sylvain Thieullent, CEO of Horizon Software
The London Metal Exchange (LME) trading ring has been the noisy home of metals traders buying and selling for over a hundred years. It’s the world’s oldest and largest metals market and home to the last open-outcry trading floor. Recently, however, the age-old trading ring has been closed during the pandemic and, just a few weeks ago, the LME announced that it will remain so for another six months and that it is taking steps to improve its electronic trading. This news fits a growing narrative in commodities about a shift to electronic trading that has been bubbling away under the surface.
Something certainly is stirring in commodities. The crisis has affected different raw materials differently: a weakening dollar and rising inflation risks bode well for some commodities, with precious metals looking very attractive, as seen by gold reaching all-time highs. Oil, on the other hand, has had a tough year, experiencing record lows from the Saudi-Russia pricing war. It has been a turbulent year, and now prices look set to soar: a recent analyst report from Goldman Sachs predicts a bullish market in commodities for the year ahead, with the firm forecasting that its commodities index will surge 28%, led by energy (43%) and precious metals (18%).
Increasingly, therefore, it seems that 2020 is turning out to be a watershed moment for commodities, and it’s likely that the years ahead will bring significant transformation. And whilst this evolution might have been forced in part by coronavirus, these changes have been building for some time. Commodities are one of the last asset classes to embrace electronic trading; FX was the first to take the plunge in the ‘90s, and since then equities and bonds have integrated technology into their infrastructure, which has steadily become more advanced.
The slow uptake in commodities can be explained by several truths: the volumes are smaller, there is less liquidity, and the instruments are generally less exotic – essentially meaning it has not been essential to develop such technology, at least not until now. This means that, for the most part, the technology in commodities trading is somewhat outdated. But that is changing. Commodities trading is on the cusp of taking steps towards the levels of trading sophistication we see in other asset classes, with automated and algo trading becoming ever more prominent.
Yet as commodities trading institutions upgrade their systems, they will begin to discover the extent of the job at hand. It is no easy task to upgrade how an entire trading community operates, so there is a lot to be done across these massive organisations. It requires a major technology overhaul, and exchanges and trading firms alike must be cautious in how they proceed, carefully establishing a holistic, step-by-step implementation strategy, preferably with an agile, V-model approach.
The workflow needs to be upgraded at every stage to ensure a smooth end-to-end trading experience. So, in place of the famous ring, these players will be looking to transform key elements of their trading infrastructure, including re-engineering matching engines and improving communications with clearing houses.
However, these changes extend beyond technology. For commodities players to make a success of this transformation, exchanges need highly skilled technology teams and must change the very culture of trading. All of this is currently being done against a backdrop of lockdown, which makes things much more difficult and can slow implementation.
What is clear is that coronavirus has acted as a catalyst for a reformation in commodities, and as a foreshadowing of what lies ahead for commodities trading infrastructure: a few years down the line, commodities trading could well be very different from how it is now, with the trading ring consigned to history.
Afreximbank’s African Commodity Index declines moderately in Q3-2020
African Export-Import Bank (Afreximbank) has released the Afreximbank African Commodity Index (AACI) for Q3-2020. The AACI is a trade-weighted index designed to track the price performance of 13 different commodities of interest to Africa and the Bank on a quarterly basis. In its Q3-2020 reading, the composite index fell marginally by 1% quarter-on-quarter (q/q), mainly on account of a pull-back in the energy sub-index. In comparison, the agricultural commodities sub-index rose to become the top performer in the quarter, outstripping gains in base and precious metals.
The recurrence of adverse commodity terms-of-trade shocks has been the bane of African economies. By tracking movements in commodity prices, the AACI highlights areas requiring pre-emptive measures by the Bank, its key stakeholders and policymakers in its member countries, as well as by global institutions interested in the African market, to effectively mitigate the risks associated with commodity price volatility.
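A trade-weighted index of the kind the AACI represents is, in essence, a weighted average of price relatives, rebased so the base period equals 100. The sketch below illustrates the mechanics only – the weights, prices and three-commodity basket are hypothetical (the real AACI tracks 13 commodities with its own weighting scheme):

```python
# Hypothetical trade weights and quarterly average prices for a
# three-commodity illustration of a trade-weighted index.
WEIGHTS = {"crude_oil": 0.50, "gold": 0.30, "cocoa": 0.20}  # sum to 1.0
BASE = {"crude_oil": 60.0, "gold": 1500.0, "cocoa": 2500.0}  # base period
Q3 = {"crude_oil": 43.0, "gold": 1900.0, "cocoa": 2400.0}    # current quarter

def index_level(prices, base, weights, base_level=100.0):
    """Weighted average of price relatives, rebased so base period = 100."""
    return base_level * sum(w * prices[c] / base[c] for c, w in weights.items())

level = index_level(Q3, BASE, WEIGHTS)
print(round(level, 1))  # → 93.0: weak oil drags the composite below base
```

With these made-up numbers, the drop in crude oil outweighs the rallies in gold and cocoa, pulling the composite down – the same dynamic as the Q3-2020 reading, where the energy sub-index dragged the AACI lower despite gains elsewhere.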
An overview of the AACI for Q3-2020 indicates that, on a quarterly basis:
- The energy sub-index fell by 8% due largely to a sharp drop in oil prices as Chinese demand waned and Saudi Arabia cut its pricing;
- The agricultural commodities sub-index rose 13% due in part to suboptimal weather conditions in major producing countries. Within that sub-index:
  - Sugar prices gained on expectations of firm import demand from China and fears that Thailand’s crop could shrink in 2021 following a drought;
  - Cocoa futures enjoyed a pre-election premium in Ghana and Côte d’Ivoire, despite the looming risk of bumper harvests in the 2020/21 season and the decline in the price of cocoa butter;
  - Cotton rose to its highest level since February 2020 due to the threat of storm Sally to the US cotton harvest, coupled with poor field conditions in the US;
  - Coffee rose 10% as La Niña weather conditions in Vietnam, the world’s largest producer of Robusta coffee, raised the possibility of a shortage in exports.
- The base metals sub-index rose 9% due to several factors, including ongoing supply concerns for copper in Chile and Peru and strong demand in China, especially as the State Grid boosted spending to improve the power network;
- The precious metals sub-index, the best performer year-to-date, rose 7% in the quarter as demand for haven bullion continued in the face of persistent economic challenges triggered by COVID-19 and heightening geopolitical tensions. In addition, gold enjoyed record inflows into gold-backed exchange-traded funds (ETFs), which offset major weakness in jewellery demand.
Regarding the outlook for commodity prices, the AACI highlights generally conservative market sentiment, with consensus forecasts predicting prices will stay within a tight range in the near term, with the exception of crude oil, coffee, crude palm oil, cobalt and sugar.
Dr Hippolyte Fofack, Chief Economist at Afreximbank, said:
“Commodity prices in Q3-2020 have largely been impacted by COVID-19. The pandemic has exposed global demand shifts that have seen the oil industry incur backlogs and agricultural commodity prices dwindle in the first half of the year. The outlook for 2021 is positive however conservative the markets still are. We hope to see an increase in global demand within Q1 and Q2 – 2021 buoyed by the relaxation of most COVID-19 disruptions and restrictions.”