By Laura Shepard, Director of HPC Marketing, DataDirect Networks
Banks are arguably among the biggest enterprise consumers of IT across all industries – research from Ovum in June 2013 suggested global IT spending at retail banks alone would reach $132bn by 2015. You would think, therefore, that most would be a step ahead of the game in how they use IT. Whilst that may be true across many aspects of this sector – high-frequency traders using ultra-low-latency networking, for example, or innovations such as ‘back testing’ or time-series databases – most are still using last-century infrastructure design. The infrastructure simply hasn’t matched the capability or the scale of the innovative software. Financial firms are three to five years behind academics and research institutes when it comes to state-of-the-art infrastructure design.
Decades-old enterprise computing
Quant analysts are constantly evolving and developing algorithms based on historical data to help predict market direction – important for a number of reasons, not least to make money and reduce exposure to risk. Yet it puzzles me why, in so many firms, this core function is still implemented the way it has always been done – based on practices that were commonplace in enterprise computing several years ago. The latest technologies can increase algorithm performance fivefold. Many of the ‘features’ of enterprise computing – replication, de-duplication and compression – interfere with performance. When performance matters most, only HPC techniques deliver.
Finance houses’ ‘back testing’ capability is also being undermined by enterprise computing from two decades ago. Whilst there is not necessarily anything wrong with this per se, it is limiting their ability to query and build algorithms for ever-expanding data volumes. Increasingly, financial institutions want to run back testing on multi-year and multi-location/market data, to help them more accurately predict market movements globally and reduce their risk exposure.
But today’s financial technology infrastructures, for the most part, are not tuned to deal with double-digit terabytes – and soon hundreds of terabytes – of data. Long term, it becomes impractical from a cost perspective and a ‘real estate’ standpoint to continue to use ‘90s- and ‘00s-style enterprise computing – these systems require a lot of space.
Today, finance houses want to analyse more data from more sources than just exchanges, and to run simultaneous programmes to create a better understanding of their market. This spans several types of data. Market tick data is fixed format: it moves fast, but its type is predictable. Sentiment analysis adds an additional data dimension – type variation as well as volume. It is an analytics approach whereby emerging algorithms search popular social media feeds – Twitter, for example – and other public sources for key words and phrases, spotting trends that then inform trading practice. For example, an algorithm that looks for discussions of energy, pipeline failures or oil spills will ‘e-discover’ the mood of the market and predict how the market will trade.
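The keyword-scanning idea above can be sketched in a few lines. This is a minimal illustration only – the phrase list, the feed and the crude hit-ratio ‘mood score’ are all assumptions for demonstration, not a production trading signal.

```python
# Minimal sketch of keyword-based sentiment flagging over a message feed.
# The phrase list and scoring are illustrative assumptions only.

NEGATIVE_ENERGY_TERMS = {"pipeline failure", "oil spill", "supply shortage"}

def flag_energy_risk(messages):
    """Return the fraction of messages mentioning any risk phrase."""
    if not messages:
        return 0.0
    hits = sum(
        1 for msg in messages
        if any(term in msg.lower() for term in NEGATIVE_ENERGY_TERMS)
    )
    return hits / len(messages)

# Hypothetical feed snippets standing in for a social media stream.
feed = [
    "Major pipeline failure reported in the North Sea",
    "Markets calm ahead of earnings",
    "Analysts warn of supply shortage this winter",
]
score = flag_energy_risk(feed)  # 2 of the 3 messages match a risk phrase
```

A real system would of course use far richer language models and stream processing, but the principle – scanning public text for phrases that correlate with market moves – is the one described above.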
With the sheer increase in data volumes and the desire to analyse more data from multiple sources – whether structured in databases or unstructured like social media – it is suddenly no longer cost effective to try to build enterprise systems that hold that much data in system memory or storage cache. Particle physicists and life-science researchers hit this wall five years ago, and turned to non-blocking storage and online analytics to pre-process the data before storing it, streaming gigabytes of data a second for months on end. If you were looking for the Higgs boson and someone at the back of the room put his hand up and said, “Sorry, can you run the three-month experiment again? We had a cache overflow and missed the bit in the middle,” you would probably want to review contracts. Financial services may not have the volumes of their academic peers, but they do have the same velocity problem – churning 30TB of market data faster than your competition will bring first-mover advantage to any algorithmic trader.
Embrace HPC techniques
To be competitive, firms need to look beyond client/server and even distributed systems and embrace HPC techniques, which have been tried and tested by the academic research community for at least the last ten years. In addition, technology strategy ambitions should not stop at batching data through cache. Batching is not terribly efficient; SAP HANA and SAS Grid are ushering the merger of online transaction processing and online analytics into the enterprise, and batch processing will become as archaic as mainframes. Slicing data into ten iterations might speed things up, but ten iterations will soon become 100 iterations and so on – it is inefficient to say the least.
Some organisations might look to put flash storage between the storage infrastructure and system memory. Whilst flash will provide better performance in some cases, alone it will not scale to the terabytes of data finance houses hold, nor deliver the bandwidth. Flash remains too expensive to replace spinning-disk data arrays.
There is no middle ground in addressing this data growth and “need to know more” challenge; the successful firms will embrace Big Data. And there is a new way, borrowed from HPC: visionary finance houses are starting to see the benefit of a parallel approach. It started in the systems with grid computing and is now reaching the file system, helping firms analyse more positions faster and develop more effective trading strategies, which they can deploy in less time.
The majority of finance houses running algorithmic back testing do so out of KX or other custom in-memory databases running on ‘90s-style infrastructure, which is like pairing a Ferrari engine with a Mini gearbox. STAC M3 tests showed that switching the storage to supercomputer class can accelerate the database by 800%.
By using fast, scalable, external disk systems with massively parallel access to data, researchers can perform analysis against much larger data sets delivering more effective models, faster. This means analysts can run hundreds of models or hundreds of iterations of the same model in the same time it used to take to run a few.
It’s not just trading where HPC techniques can be exploited; risk and compliance departments will want to process more data, get answers more quickly and lower capital reserving. One firm I know has reduced a risk calculation from nine minutes to two using HPC-style data storage – that gets them to market seven minutes earlier, which makes a big difference when volumes are high.
The UK regulator wants financial institutions to raise another £27 billion in capital reserves. If a firm can demonstrate that its risk control measures are strong enough to pass the stress tests, it will be able to put capital to work rather than leaving it stuck in reserve. This means dramatically increasing your ability to assess total market exposure: from only once or twice a day to multiple intra-day assessments.
Blind Case Study
At one global hedge fund, hundreds of servers capture 3GB of tick data per exchange every day. Tens of quantitative analysts work with that data to create and test equity trading strategies on hundreds more servers. The fund was previously using several large NAS filers that simply could not keep up with the volume of data or the analysts’ data-access performance requirements.
They moved to a similar-capacity DDN SFA system and, with the additional performance delivered, were able to back test three times the number of models, significantly reducing their time to deploy new strategies. The Securities Technology Analysis Center (STAC®) M3 benchmark recently demonstrated that DDN’s SFA12K-40, with a hybrid configuration of flash and spinning-disk technology, delivered performance more than eight times faster than the traditional storage average and, in some cases, almost twice the performance of flash storage.
“Before we rolled out the DDN systems, our [filer] farm just couldn’t keep up with the growth in the market and our business. With the DDN systems, our traders are getting new strategies into the market faster.” – director of IT, global hedge fund.
Organisations that want to analyse all their exchanges across multiple locations, and to query such ‘phenomena’ as sentiment (Big Data), need to be looking at adopting a parallel file approach to their storage environment.
Those building their own tools to take advantage of a parallel file system will do very well. They can use the parallel file system as the basis to stream data from multiple sources, with many different systems using the same data storage concurrently, so they do not need a dedicated copy of each data set. Parallel file systems enable multiple nodes to hit the same data and the same data sources concurrently – not only can they hit the tick database from London and Singapore at the same time, but also the sentiment analysis mentioned earlier.
Blind Case Study
A large US proprietary trading firm specialising in high frequency trading recognised early that they would need to move away from direct attached and NAS storage if they were going to handle their requirements to share and access petabytes of data across multiple types of high performance trading groups including teams in currencies, derivatives, international equities, technology equities and more.
An extreme need for speed was identified, as the firm’s success depended on being highly competitive in bringing many new strategies to bear quickly, and on retiring fading or unsuccessful strategies in as close to real time as possible. A parallel architecture was selected to meet two key criteria:
- A global namespace for more efficient data gathering and sharing
- Parallel I/O to remove the time lag of the sequential jobs necessary in NAS architectures.
Several generations of parallel infrastructure were tried, based on storage and parallel file systems – one provided by a major server vendor and another by a major storage vendor. Under production conditions, neither was able to meet even the minimum performance requirements based on current and projected needs, which run to petabytes of capacity and gigabytes per second of sustained I/O.
Impressed with DDN’s massively parallel architecture, and its open-platform approach to parallel file systems – which would allow the firm to change infrastructure if needed without throwing out its entire investment – the company evaluated and then selected DDN SFA with GPFS. In addition to surpassing the performance, availability, scalability and cost requirements, the office of the firm’s CIO also liked DDN’s Python-based infrastructure and features that would support possible future directions, such as embedding applications in the storage itself to cut I/O path process steps almost in half for extreme low latency.
“Before we rolled out the DDN systems, our [NAS] farm just couldn’t keep up with the growth in the market and our business. With the DDN systems, our traders are getting new strategies into the market faster.” – Director of IT, large proprietary trading firm.
For me, the next big thing in Big Data, and the analysis of that data, relates to risk management. The first step is CAT (Consolidated Audit Trail), but there is so much more to gain than mere compliance. There are quantifiable goals on the table, such as intra-day risk assessment, capitalisation and liquidity reporting, and predictive modelling. Beyond that, institutions would love to know their market exposure and opportunity at any given time. Knowing the position at any moment in the day will allow a firm to know which way to jump and deliver that elusive first-mover advantage.
By Paddy Osborn, Academic Dean, London Academy of Trading
Whether you’re negotiating a business deal, playing a sport or trading financial markets, it’s vital that you have a plan. Top golfers will have a strategy to get around the course in the fewest shots possible, and without this plan, their score will undoubtedly be worse. It’s the same with trading. You can’t just open a trading account and trade off hunches and hopes. You need to create a structured and robust plan of attack. This will not only improve your profitability, but will also significantly reduce your stress levels during the decision-making process.
In my opinion, there are four stages to any trading strategy.
S – Set-up
T – Trigger
E – Execution
M – Management
Good trading performance STEMs from a structured trading process, so you should have one or more specific rules for each stage of this process.
Before executing any trades, you need to decide on your criteria for making your trading decisions. Should you base your trades off fundamental analysis, or maybe political news or macroeconomic data? If so, then you need to understand these subjects and how markets react to specific news events.
Alternatively, of course, there’s technical analysis, whereby you base your decisions off charts and previous price action, but again, you need a set of specific rules to enable you to trade with a consistent strategy. Many traders combine both fundamental and technical analysis to initiate their positions, which, I believe, has merit.
What needs to happen for you to say, “Ah, this looks interesting! Here’s a potential trade”? It may be a news event, a major macro data announcement (such as interest rates, employment data or inflation), or a chart-level breakout. The key ingredient throughout is to fix specific and measurable rules (not rough guidelines that can be overridden on a whim with an emotional decision). For me, I may take a view on the potential direction of an asset (i.e. whether to be long or short) through fundamental analysis, but the actual execution of the trade is always technical, based on a very specific set of rules.
To take a simple example, let’s assume an asset has been trending higher, but has stopped at a certain price, let’s say 150. The chart is telling us that, although buyers are in long-term control, sellers are dominant at 150, willing to sell each time the price touches this level. However, the uptrend may still be in place, since each time the price pulls back from the 150 level, the selling is weaker and the price makes a higher short-term low. This clearly suggests that upward pressure remains, and there’s potential to profit from the uptrend if the price breaks higher.
Once you’ve found a potential new trade set-up, the next step is to decide when to pull the trigger on the trade. However, there are two steps to this process… finger on trigger, then pull the trigger to execute.
Continuing the example above, the trigger would be to buy if the price breaks above the resistance level at 150. This would indicate that the sellers at 150 have been exhausted, and the buyers have re-established control of the uptrend. Also, it is often the case that after a pause in a trend such as this, the pent-up buying returns and the price surges higher. So the trigger for this trade is a breakout above 150.
We have a finger on the trigger, but now we need to decide when to squeeze it. What if the price touches 150.10 for 10 seconds only? Has our resistance level broken sufficiently to execute the trade? I’d say not, so you need to set rules to define exactly how far the price needs to break above 150 – or for how long it needs to stay above 150 – for you to execute the trade. You’re basically looking for sufficient evidence that the uptrend is continuing. Of course, the higher the price goes (or the longer it stays above 150), the more confident you can be that the breakout is valid, but the higher the price you will need to pay. There’s no perfect solution to this decision, and it depends on many things, such as the amount of other supporting evidence that you have, your level of aggression, and so on. The critical point here is to fix a set of specific rules and stick to those rules every time.
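To show what “specific and measurable rules” can look like, here is a minimal sketch of a mechanical breakout trigger. The two rule parameters – a 0.5% margin above resistance, or three consecutive bars holding above it – are illustrative assumptions, not recommended values.

```python
# Sketch of a two-rule breakout trigger for the 150 resistance example.
# MARGIN and HOLD_BARS are illustrative assumptions, not trading advice.

RESISTANCE = 150.0
MARGIN = 0.005   # fire immediately if price exceeds resistance by 0.5%
HOLD_BARS = 3    # ...or if price holds above resistance for 3 straight bars

def breakout_triggered(prices, level=RESISTANCE):
    """Return True once either rule confirms the breakout."""
    above_streak = 0
    for p in prices:
        if p > level * (1 + MARGIN):
            return True              # margin rule: a decisive break
        above_streak = above_streak + 1 if p > level else 0
        if above_streak >= HOLD_BARS:
            return True              # duration rule: a sustained break
    return False

# A brief touch at 150.10 is not enough evidence to execute:
touch_only = breakout_triggered([149.8, 150.10, 149.9])
# Three consecutive bars above 150 satisfies the duration rule:
sustained = breakout_triggered([150.2, 150.3, 150.4])
```

The point is not these particular numbers, but that the decision is reduced to fixed, testable conditions rather than an in-the-moment judgment.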
Good trade management can save a bad trade, while poor trade management can turn an excellent trade entry into a loser. I could talk for days about in-trade management, since there are many different methods you can use, but the essential ingredient for every trade is a stop loss. This is an order to exit your position for a loss if the market doesn’t perform as expected. By setting a stop loss, you can fix your maximum risk on a trade, which is essential to preserving your capital and managing your overall risk limits. Some traders set their stop loss and target levels and let the trade run to its conclusion, while others manage their trades more actively, trailing stop losses, taking interim profits, or even adding to winning positions. No matter how you decide to manage each trade, it must be the same every time, following a structured and robust process.
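Fixing your maximum risk with a stop loss also determines how large the position should be. The sketch below assumes an illustrative 1% account-risk rule (a common convention, not a recommendation) and hypothetical entry and stop prices continuing the 150 example.

```python
# Sketch of stop-loss position sizing under a fixed account-risk rule.
# The 1% risk figure and the price levels are illustrative assumptions.

def position_size(account, risk_pct, entry, stop):
    """Units to buy so that a stop-out loses at most risk_pct of the account."""
    risk_per_unit = abs(entry - stop)      # loss per unit if the stop is hit
    max_loss = account * risk_pct          # total loss we are willing to take
    return max_loss / risk_per_unit

# Entry at 150.5 after the breakout, stop below the old resistance at 148.5:
# risk is 2.0 per unit; 1% of a 10,000 account is 100; so 50 units.
units = position_size(account=10_000, risk_pct=0.01, entry=150.5, stop=148.5)
```

Sizing the position from the stop distance, rather than picking a round number of units, is what makes the “maximum risk on a trade” genuinely fixed.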
The final step in the process is to review every trade to see if you can learn anything, particularly from your losing trades. Are you sticking to your trading rules? Could you have done better? Should you have done the trade in the first place? Only by doing these reviews will you discover any patterns of errors in your trading, and hence be able to put them right. In this way, it’s possible to monitor the success of your strategy. If your trades are random and emotional, with lots of manual intervention, then there’s no fixed process for you to review. You also need to be honest with yourself, and face up to your bad decisions in order to learn from them.
In this way, using a structured and robust trading strategy, you’ll be able to develop your trading skills – and your profits – without the stress of a more random approach.
Economic recovery likely to prove a ‘stuttering’ affair
By Rupert Thompson, Chief Investment Officer at Kingswood
Equity markets continued their upward trend last week, with global equities gaining 1.2% in local currency terms. Beneath the surface, however, the recovery has been a choppy affair of late. China and the technology sector, the big outperformers year-to-date, retreated last week whereas the UK and Europe, the laggards so far this year, led the gains.
As for US equities, they have re-tested, but so far failed to break above, their post-Covid high in early June and their end-2019 level. The recent choppiness of markets is not that surprising given they are being buffeted by a whole series of conflicting forces.
Developments regarding Covid-19 remain, as ever, absolutely critical, and it is a mixture of bad and good news at the moment. There have been reports of encouraging early trial results for a new treatment and a potential vaccine, but infection rates continue to climb in the US. Reopening has now been halted or reversed in states accounting for 80% of the population.
We are a long way away from a complete lockdown being re-imposed and these moves are not expected to throw the economy back into reverse. But they do emphasise that the economic recovery, not only in the US but also elsewhere, is likely to prove a ‘stuttering’ affair.
Indeed, the May GDP numbers in the UK undid some of the optimism which had been building recently. Rather than bouncing 5% m/m in May as had been expected, GDP rose a more meagre 1.8% and remains a massive 24.5% below its pre-Covid level in February.
Even in China, where the recovery is now well underway, there is room for some caution. GDP rose a larger-than-expected 11.5% q/q in the second quarter, regaining all of its decline from the previous quarter. However, the bounce back is being led by manufacturing and public sector investment, and the recovery in retail sales is proving much more hesitant.
China is a focus of attention at the moment not just because its economy is leading the global upturn, but because of increasing tensions over Hong Kong and with the US and UK. UK telecoms companies have now been banned from using Huawei’s 5G equipment in the future, and the US is talking of imposing restrictions on TikTok, the Chinese social media platform. While this escalation is not yet a major problem, it is a potential source of market volatility and another, albeit as yet relatively small, unwelcome drag on the global economy.
Government support will be critical over the coming months – and longer – if the global recovery is to be sustained. This week will be crucial in this respect for Europe and the US. The EU, at the time of writing, is still engaged in a marathon four-day summit, trying to reach an agreement on an economic recovery fund. As is almost always the case, a messy compromise will probably end up being hammered out.
An agreement will be positive but the difficulty in reaching it does highlight the underlying tensions in the EU which have far from gone away with the departure of the UK. Meanwhile in the US, the Democrats and Republicans will this week be engaged in their own battle over extending the government support schemes which would otherwise come to an end this month.
Most of these tensions and uncertainties are not going away any time soon. Markets face a choppy period over the summer and autumn with equities remaining at risk of a correction.
European trading firms begin coming to terms with the new normal
By Terry Ewin, Vice President EMEA, IPC
In recent weeks, the phrase ‘never let a good crisis go to waste’ has received a large amount of usage. Management consultancies, industry associations and organisations, including the Organisation for Economic Co-operation and Development (OECD) have all used it in order to discuss how the current crisis, caused by the Coronavirus pandemic, presents an opportunity for new and worthwhile change.
The saying is also commonly used to indicate that the destruction and damage caused by a crisis gives organisations the chance to rebuild, and to do things that would not previously have been possible. This has the potential to impact financial trading firms, where projects that would not have made much sense this time last year now appear as clear as day. In Europe, banks and brokers alike are beginning to think about what life will look like post-pandemic, and how their technology strategies may need to change.
We can think of three distinct phases when it comes to a crisis. Firstly, there is the emergency phase. This is followed by the transition period before we come to the post-crisis period.
Starting with the emergency phase: this is when firms are in critical crisis-management mode. Plans are activated to ensure business continuity, and banks and brokers work to ensure critical functions can still take place so as to continue servicing their clients. In the current crisis, both large and small European banks and brokers were able to handle this phase relatively well, partly because communications technology has reached the point where productive Work From Home (WFH) strategies are in place. For example, cloud connectivity, in addition to the use of soft turrets for trading, has enabled traders from across the continent to keep working throughout lockdown. From our work with clients, we know that they were able to make a relatively smooth transition to WFH operations.
In relation to the current coronavirus crisis, we are in the second phase – the transition period. This is the stage when financial companies begin figuring out how best to manage the worst effects of the ongoing crisis, whilst planning longer-term changes for a post-crisis world. One thing to note with this phase, is that no one knows how long it will last. There is still so much we don’t know about this virus. As such, this has an impact on when it will be safe for businesses to operate in a similar way to how they were run in a pre-pandemic world. But with restrictions across Europe starting to be eased, there is an expectation that companies will start to slowly work their way towards more on-site trading. For example, banks are starting to look at hybrid operations, whereby traders come in a couple of times a week, and WFH for the rest of the week. This will result in fewer people in the office building, which makes it easier to practise social distancing. It also means that there is a continued reliance on the technology that enables people to WFH effectively.
Finally, we have the post-crisis period. In terms of the current crisis, this stage is very unlikely to occur until a vaccine has been developed and distributed to the masses. Although COVID-19 has caused mass economic disruption, many analysts are predicting a strong rebound once the medical pieces of the puzzle are put into place. It may not be entirely V-shaped, but the resiliency displayed by the financial markets thus far suggests that it will be healthy.
Currently, many European trading firms are taking what could be described as a two-pronged approach.
The first part of this consists of planning for the possibility of an extension to phase two. Medical experts have suggested that there could be some seasonality to the virus, with the threat of a second wave of COVID-19 cases in the Autumn meaning that the risk of new restrictions remains. If this comes to fruition, there would be a need for organisations to fine-tune their current WFH strategies and measures, and for them to take greater advantage of the cloud so as to power communications apps.
The second component consists of firms starting to think about the long-term needs of their trading systems. Simply put, they are preparing themselves for the third phase.
It is in this last sense, that the idea of never letting ‘a good crisis go to waste’ resonates most clearly.