In early February, days of volatile trading slowed down apps and led to website outages at major brokerage firms and investment banks. While people speculated about whether this heralded the start of another financial crisis, I reflected on how much the online world has changed since the last crash just a decade ago. If a little turbulence is already causing apps and sites to fail, how will today’s web architecture cope with the next crisis?
The wild west of tech and finance
Looking back over the last decade of rapid, unpredictable technological development – technology that has started not just to enable business but to drive it – it should not be surprising that companies have been under constant pressure to adapt. The pace of growth and volatility of change in technology, and more specifically of the web and the possibilities it creates, has closely mirrored the wild, unbridled nature of financial markets. The share of the world’s population using the internet grew from 23.5 percent in 2008 to 51.8 percent in 2017. The GSMA counted four billion mobile connections worldwide in 2008; by 2016 that number had grown to a whopping 7.9 billion.
In 2008, mobile accounted for only a tiny share of web traffic. Fast forward eight years: by 2016, global mobile traffic had surpassed desktop traffic. By 2017 there were 2.2 million apps available on the App Store, and the Google Play Store had hit the three million mark. How could this not change everything?
But web and app traffic isn’t just human; there are also machines and bots to contend with. More than half of today’s incoming traffic comes from bots, which adds further unpredictability, given how quickly the volume of requests from an automated bot can grow.
It’s like the untamed wild west in some ways, which again parallels the movements of the financial markets.
For brokerage and investment firms this means that during the February 2018 “turbulence” their web architectures needed to cope with many more visitors and channels than a decade ago. It’s therefore no surprise that some industry watchers hold the proliferation of apps largely responsible for the recent crash of several brokerage sites.
Today’s web pages eat bandwidth
It’s not just that the amount of web traffic and the number of channels have increased, however. The average web page has grown significantly in size, which eats bandwidth. Data from the HTTP Archive shows that the average page size grew more than 5x, from 702 KB in November 2010 to 3,804 KB in February 2018. Video is a key factor here, increasing from zero to 1,099 KB. Images grew more than 4x, from 416 KB to 1,824 KB; scripts increased 4.5x to 508 KB; and fonts grew 60x to 119 KB.
Five ways to cope with wild trading times
Given how important websites and apps are to the world of investment, brokerage firms must urgently evaluate whether their online channels are fit to cope in the present environment (as well as the inevitable fluctuations of the future). For example, can apps and websites scale and maintain a stable load speed, even when multiple people access them simultaneously? The good news is that web architecture technology has evolved significantly over the last decade to overcome these challenges. Let’s have a look at some of these new technologies and examine how they can help to avoid crashes or slowdowns of apps and sites.
- Caching to the rescue
One of the basic elements of today’s stable architectures is a web cache, also known as a reverse caching proxy. Think of it as a temporary storage area that mirrors a site or application’s content. The web cache serves visitors’ requests directly, eliminating the need to fetch the requested content from the backend. This removes server overload, the most common cause of slow load times and website crashes. A web cache can serve tens of thousands of requests per second, speeding up website performance several hundred times over while reducing server load.
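As a rough illustration of the principle (a simplified sketch, not Varnish or any real proxy, with hypothetical names throughout), a web cache can be modelled as a lookup table with a time-to-live sitting in front of a slow backend:

```python
import time

class WebCache:
    """Minimal reverse-cache sketch: store backend responses for a
    fixed time-to-live (TTL) and serve repeat requests from memory."""

    def __init__(self, fetch_from_backend, ttl_seconds=60):
        self.fetch = fetch_from_backend   # the slow origin call
        self.ttl = ttl_seconds
        self.store = {}                   # url -> (body, expiry time)
        self.hits = self.misses = 0

    def get(self, url):
        entry = self.store.get(url)
        if entry and entry[1] > time.time():
            self.hits += 1                # served from cache: no backend work
            return entry[0]
        self.misses += 1                  # cold or expired: go to the backend once
        body = self.fetch(url)
        self.store[url] = (body, time.time() + self.ttl)
        return body

# Hypothetical backend standing in for an application server.
def slow_backend(url):
    return f"<html>content for {url}</html>"

cache = WebCache(slow_backend)
for _ in range(10_000):
    cache.get("/quotes/ACME")             # only the first request reaches the backend
print(cache.hits, cache.misses)           # -> 9999 1
```

The point of the sketch is the ratio: ten thousand requests produce exactly one backend fetch, which is why a cache in front of the origin absorbs traffic spikes so effectively.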
- A safety net for CDN performance
Many brokerage firms use external content delivery networks (CDNs): distributed systems of so-called edge servers spread across many data centres, with an origin server at the centre. When a user visits a site or app, they are directed to the closest edge server; if that server is down or slow, they are redirected to the next. The CDN thereby provides a protective layer, shielding the origin from attacks and overloads. One of the key technologies used at the edge is caching. Most CDN providers also offer solutions for mobile websites and mobile apps.
However, CDN providers tend to locate their servers in the same major cities. Consequently, during high traffic periods they share the same peering relationships, which can cause significant delays. To prevent this, companies have recently started to embrace a hybrid model where the CDN is complemented with a private content delivery architecture in the cloud. In high-traffic scenarios, this setup allows data to be dynamically redirected from the CDN to the private cloud CDN.
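The “redirect to the next server” behaviour described above can be sketched as a simple failover loop (server functions and names here are hypothetical stand-ins, not any CDN’s actual API):

```python
def fetch_with_failover(servers, request):
    """Try each edge server in order of proximity; if one is down
    or overloaded, fall back to the next in the list."""
    for server in servers:
        try:
            return server(request)
        except ConnectionError:
            continue   # this edge failed; try the next one
    raise ConnectionError("all edge servers unavailable")

# Two hypothetical edges: the nearest one is overloaded, the next is healthy.
def overloaded_edge(request):
    raise ConnectionError("edge overloaded")

def healthy_edge(request):
    return f"served {request} from healthy edge"

print(fetch_with_failover([overloaded_edge, healthy_edge], "/prices"))
```

In a hybrid setup, the private cloud CDN simply becomes another entry in that ordered list: traffic spills over to it when the shared edges are saturated.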
- Coping with fast-changing content
Brokerage and investment firms need to continuously update huge amounts of information on investment offerings across different channels. When website content changes, the content stored in the web cache needs to be ‘invalidated’ or deleted so visitors don’t wind up seeing old information. Where multiple systems and data centres are involved this can prove to be a challenge. And during times of wild trading this can become near unmanageable.
Cache invalidation might sound simple, but it’s so tricky that internet veteran Phil Karlton declared it one of the two ‘hard things’ in computer science. Modern caching tools make it simple to send invalidation requests to multiple caches at once, even when they sit in different locations. This allows site content to be updated simultaneously in a controlled, automated and fast way – even during wild trading conditions. That matters because the entire market must receive the same information almost to the millisecond, or the financial company risks losing its licence to operate.
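As a hedged sketch of the idea – using plain dictionaries to stand in for cache nodes, where a real tool such as Varnish would broadcast an HTTP purge or ban request instead – invalidating a path across several data centres looks like this:

```python
# Three cache nodes in different data centres, modelled as dicts.
caches = [
    {"/rates": "old rates", "/news": "old news"},   # e.g. London
    {"/rates": "old rates"},                        # e.g. Frankfurt
    {"/news": "old news"},                          # e.g. New York
]

def invalidate(caches, path):
    """Send one invalidation to every cache node, so no location
    keeps serving stale content after the backend is updated."""
    for cache in caches:
        cache.pop(path, None)   # delete if cached; ignore otherwise

invalidate(caches, "/rates")
print(any("/rates" in c for c in caches))   # -> False: no node serves stale rates
```

The next request for `/rates` at any node then misses the cache and pulls the fresh version from the backend, so every location converges on the same information.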
- Accelerating mobile page load
Brokerage firms increasingly use personalisation to fine-tune the content they display to individual users based on factors such as behaviour, previous investments and location. When content is updated, the cache contents need to change as well, and all pages containing the new content must be reloaded. Established tools like Edge Side Includes (ESI) automate this process, but the updates occur sequentially, which can slow down content delivery.
A new approach called parallel ESI loads changed content elements – as the name indicates – in parallel, which significantly speeds up the process.
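The difference can be sketched by simulating page fragments that each take time to render (the function names are illustrative, not the ESI language itself, which is expressed as markup tags):

```python
import time
from concurrent.futures import ThreadPoolExecutor

def render_fragment(name):
    """Stand-in for fetching one personalised page fragment."""
    time.sleep(0.1)               # pretend each fragment takes 100 ms
    return f"<div>{name}</div>"

fragments = ["header", "portfolio", "watchlist", "footer"]

# Sequential ESI: total time is roughly the sum over all fragments.
start = time.time()
page_seq = "".join(render_fragment(f) for f in fragments)
t_seq = time.time() - start

# Parallel ESI: fragments load concurrently, so total time is
# roughly that of the single slowest fragment.
start = time.time()
with ThreadPoolExecutor() as pool:
    page_par = "".join(pool.map(render_fragment, fragments))
t_par = time.time() - start

print(page_seq == page_par, t_par < t_seq)   # same page, delivered faster
```

With four 100 ms fragments, the sequential version takes about 400 ms while the parallel one takes about 100 ms – the gap only widens as pages become more heavily personalised.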
- An insider’s tip – encryption
A little-known fact is that using TLS encryption can improve performance for mobile visitors. Because cellular network providers cannot inspect or transcode encrypted traffic, they tend to pass TLS traffic straight through rather than queuing it for processing. It’s a small performance gain, but sometimes it can be enough to make a big impact.
Crashes are no longer an option – scaling is key
These are just five of the ways in which new technologies can help brokerage and investment firms enhance their web architectures to withstand the traffic and bandwidth challenges of wild trading times. I would encourage any CTO or CIO to evaluate them: uncertainty and high traffic peaks, though nothing new, are increasingly the new normal, and website crashes are no longer acceptable.
Author: Lars Larsson, CEO of Varnish Software