By Matt Pfeil, Chief Customer Officer and Co-Founder, DataStax
While the hype around big data has reached an all-time high over the past couple of years, the reality is that data-driven software products are changing things all around us, and not only in the banking sector. Even mundane household devices like thermostats and weighing scales are becoming connected to the Internet, and businesses are being built around this use of personal data. As people look to run their lives better, they turn to products and companies that can help them make better decisions based on the data they create.
For banks, this use of data represents a great opportunity too, but at a much bigger scale. In 2014, Gartner estimated that 73 percent of organizations either planned to invest in big data over the following 12 months, or had already started their implementations. The finance sector is leading the charge to deploy new technologies that help improve decision-making and the customer experience. In particular, banks are looking at how they can make more use of data for decisions in real time across both investment and retail banking operations.
This involves being able to answer very specific questions as they are asked. For example, should a customer be able to carry out a transaction, or is it a risky one that might be fraudulent? What impact does an individual trade have on the institution’s liquidity, and how do each of those trading positions interact with each other? What does the big picture look like for the bank in the market?
To answer these kinds of questions, IT teams have to consider how to manage the huge amounts of data they are creating at any given point, as well as how to bring this data together to support the new services in the first place. By using this data in real time, IT can expand the usefulness of existing services and support the development of new products for customers. This matters because customers want to use multiple channels to interact with their bank, shifting to online and mobile services for day-to-day transactions alongside traditional in-person visits when necessary.
Part of this approach comes back to how banks see themselves in the wider market. One CIO I spoke with mentioned that he did not see competition coming from other financial institutions; instead, he was looking at how Facebook and Google are planning new services that might enter what was previously considered the remit of the banking sector. In response to this new kind of pressure, he was focused on keeping ahead of these potential entrants rather than his direct competitors. Recommendations and user experience were therefore at the forefront of his thinking. In essence, this involves traditional companies taking the DNA of businesses that were born on the web and applying it to the business of banking.
Investment banking and big data – getting more real-time insight
There are currently two areas where banks want to use real-time data within their trading operations: pre- and post-trade risk analytics, and financial liquidity analytics. Each is concerned both with the long-term impact of decisions and with how specific decisions affect the bank right now. Looking at these areas requires both the "thousand-foot" view that comes from huge data sets acquired over time, and the ability to manage individual transactions too.
For trading, being able to model the wider impact of decisions before and after they are made can help those trades become more profitable for the organization as a whole. Understanding the overall liquidity a bank holds at any given point is equally critical for every institution. Big data is therefore essential to meeting these business requirements.
Following the financial crash of 2008, banks have to maintain much larger reserves in order to cover their trades and lending. The introduction of Basel III and guidance from the U.S. Federal Reserve will compel banks to hold more assets to protect against potential risk. They have to demonstrate that they are holding enough capital to cover their trading operations at any given point in time. Using big data, the trading desk can analyze how potential decisions would affect the bank and show whether additional liquidity will be required. Over time, this could mean a reduction in the amount of capital and assets that the bank has to hold while still meeting the needs of the regulators.
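As a rough illustration of the kind of pre-trade check this implies, the sketch below tests whether a new trade would keep a highly simplified capital ratio above a minimum. The asset classes, risk weights and 8 percent threshold are illustrative assumptions only, not the actual Basel III rules:

```python
# Simplified sketch of a pre-trade capital check, loosely modelled on a
# risk-weighted capital ratio. Weights and the minimum are illustrative;
# real Basel III calculations are far more involved.

RISK_WEIGHTS = {"sovereign": 0.0, "mortgage": 0.35, "corporate": 1.0}
MIN_CAPITAL_RATIO = 0.08  # illustrative minimum ratio

def risk_weighted_assets(positions):
    """positions: list of (asset_class, exposure) tuples."""
    return sum(RISK_WEIGHTS[cls] * exposure for cls, exposure in positions)

def can_take_trade(held_capital, positions, new_trade):
    """Would adding new_trade keep the capital ratio above the minimum?"""
    rwa = risk_weighted_assets(positions + [new_trade])
    return rwa == 0 or held_capital / rwa >= MIN_CAPITAL_RATIO

book = [("sovereign", 500.0), ("corporate", 200.0)]
print(can_take_trade(20.0, book, ("corporate", 30.0)))  # True: 20 / 230 ≈ 8.7%
```

The same check, run against a live book in real time, is what lets a trading desk see immediately whether a decision would require additional liquidity.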
Here, big data can support more efficient use of capital and assets while allowing the bank to show it is in compliance. While the implementation of more compliance regulation does mean that banks will have to hold more cash and assets to cover potential risk, this approach can lead to significant efficiency around how assets are used in order to generate profits over time.
Big data in the retail banking sector
So far, much of this discussion around big data has focused on investment banking. However, big data can be used in the retail sector too. There are two main areas for this: fraud detection analytics, and implementing new services that take different data sources into account as part of the offering.
Fraud detection is a huge area of potential big data investment, according to Gartner. While many financial institutions have some form of fraud analysis in place today, the reality is that these are often limited to a single channel. For banks that are stressing their multi-channel strategies and their capabilities in online and (more importantly) mobile banking, this is a potential gap.
Analyzing transactions for fraud across multiple channels, and in real time, is a compelling use case for big data. Banking IT teams can look at the experience that online retailers and e-commerce sites have built up and apply similar approaches. The likes of eBay have to monitor transactions and behavior for fraudulent activity in real time, whether the request is being made via mobile app, mobile web or online channels. As these B2C companies operate at huge scale, banks can learn from their approaches and apply some of the same techniques for web-scale IT.
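As a minimal sketch of what one such cross-channel rule might look like, the code below assumes a hypothetical, time-ordered event stream of (account, channel, amount, timestamp) tuples and flags accounts that transact on too many distinct channels within a short window; a real fraud engine combines many more signals than this:

```python
# Sketch of a single cross-channel fraud rule. The event format, window
# and channel limit are illustrative assumptions, not a real bank's rules.
from collections import defaultdict

WINDOW_SECONDS = 60
MAX_CHANNELS = 2  # more distinct channels than this in one window is suspicious

def flag_cross_channel(events, window=WINDOW_SECONDS, max_channels=MAX_CHANNELS):
    """events: iterable of (account, channel, amount, unix_ts), time-ordered."""
    recent = defaultdict(list)  # account -> [(ts, channel), ...]
    flagged = set()
    for account, channel, amount, ts in events:
        recent[account].append((ts, channel))
        # keep only events inside the sliding window
        recent[account] = [(t, c) for t, c in recent[account] if ts - t <= window]
        if len({c for _, c in recent[account]}) > max_channels:
            flagged.add(account)
    return flagged

events = [
    ("acct1", "mobile", 50.0, 0),
    ("acct1", "web", 200.0, 20),
    ("acct1", "atm", 300.0, 40),   # three channels in 40 seconds
    ("acct2", "mobile", 10.0, 0),
]
print(flag_cross_channel(events))  # {'acct1'}
```

The point of the multi-channel approach is that no single channel's history would have flagged acct1; only the combined stream reveals the pattern.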
This approach is also being applied to the design of new services. Banking IT teams can collaborate with the marketing division to support the extension of mobile channels with data. A good example is location: while many people have used social networking sites like Foursquare in the past, location-based services haven't been widely adopted in the banking sector. This is now changing, as mobile apps can include data like location both to improve the user experience and as part of the security around the device and service being used.
If organizations do bring additional sources of data into these mobile apps, then the availability of that data is an important consideration. When location data helps decide what service to offer, or whether a payment should be processed, that data has to be continuously available to the application; otherwise, mistakes get made and the customer experience suffers. Availability matters for online and mobile services because customers expect them to be available at all times and performing at the same level. Amazon found that every 100 milliseconds of latency translated to a 1 percent drop in revenue; for banks, the availability and speed of their services will increasingly be the yardstick by which they are judged. If services are perceived to be lacking or slow, customers will move their accounts elsewhere.
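One way to contain that risk is to degrade gracefully rather than fail outright. The sketch below assumes a hypothetical fetch_location lookup and a made-up low-value threshold; when the location service is down, the payment decision falls back to a conservative amount-based rule instead of rejecting everything:

```python
# Sketch of graceful degradation when a location service is unavailable.
# fetch_location is a hypothetical lookup; thresholds are illustrative.

LOW_RISK_LIMIT = 100.0  # illustrative limit used when location is unknown

def approve_payment(amount, home_country, fetch_location):
    try:
        country = fetch_location()
    except IOError:
        # location service down: approve only low-value payments
        return amount <= LOW_RISK_LIMIT
    # location available: approve domestic payments up to a higher limit
    return country == home_country and amount <= 1000.0

def location_down():
    raise IOError("location service unavailable")

print(approve_payment(50.0, "US", location_down))   # True (low-value fallback)
print(approve_payment(500.0, "US", location_down))  # False (too large without location)
print(approve_payment(500.0, "US", lambda: "US"))   # True (domestic, within limit)
```

The design choice here is that an outage in a supporting data source narrows the service rather than breaking it, which is what customers experience as availability.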
This shift to using data also has further implications for the business around data security. Banks have always been targets for hacking attacks, whether this is on the IT infrastructure side or targeted at individual accounts. However, with so much data available and stored by banks as part of their services, the risk from a data breach goes up. This makes it more important than ever that data security is in-depth, structured and well-managed.
Changing technology approaches
According to Gartner, 32 percent of banking organizations surveyed last year are already implementing this kind of solution. However, many IT teams are facing challenges in getting these big data implementations from initial projects into full production, mainly due to the complexities of working at scale. Traditional technologies like relational databases (RDBMS) and data warehouses are approaching the limits of the scale at which they can operate. To cope with these new channels and applications, banking IT teams are turning to new technologies like NoSQL databases and Hadoop to handle the huge amounts of data that their applications require.
NoSQL refers to a family of newer database platforms designed to cope with the volume of data that banks now generate and use within their decision-making processes. While companies like Google and Facebook originally developed NoSQL technologies to deal with the extraordinary volumes of data they process daily, platforms like Apache Cassandra are now being used to deliver the scale and availability that RDBMS platforms cannot. As the banking industry approaches the same scale and complexity, IT teams need similar tools to deliver the results that consumers and enterprise clients have come to expect.
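To illustrate the data-modelling style this implies, the pure-Python sketch below mimics how a Cassandra table might partition transactions by account and day so that a common query reads a single partition. The schema in the comment and the table layout are assumptions for illustration, not a prescribed design:

```python
# Pure-Python sketch of a Cassandra-style data model for transactions.
# In CQL this might look like:
#   CREATE TABLE txns (account text, day text, ts int, amount decimal,
#                      PRIMARY KEY ((account, day), ts));
# The compound partition key (account, day) keeps each customer's daily
# activity together, so "today's transactions" is a single-partition read.
from collections import defaultdict

table = defaultdict(list)  # (account, day) -> [(ts, amount), ...]

def insert(account, day, ts, amount):
    rows = table[(account, day)]
    rows.append((ts, amount))
    rows.sort()  # Cassandra keeps rows ordered by the clustering key (ts)

def transactions_for(account, day):
    """The cheap single-partition query this model is designed around."""
    return table[(account, day)]

insert("acct1", "2015-03-01", 1, 40.0)
insert("acct1", "2015-03-01", 3, 15.0)
insert("acct1", "2015-03-02", 2, 99.0)
print(transactions_for("acct1", "2015-03-01"))  # [(1, 40.0), (3, 15.0)]
```

Designing tables around the queries, rather than normalizing and joining as an RDBMS would, is what lets these platforms keep reads fast and available as data volumes grow.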
Alongside this, there is a general move over to using cloud computing strategies rather than more traditional big iron servers. The ability to scale up on commodity hardware is appealing as banks can manage their investments in hardware in a more linear fashion, making it more feasible economically to invest in supporting new services.
Alongside these technology changes, organizational changes have to take place too. Rather than looking at IT from a functional perspective, it's important to look at how companies can manage their operations from a data perspective instead. This encourages much more cross-team and cross-function working, which can spur greater innovation and better service delivery for customers.
Looking to the future
Banks are looking at how they can make use of data to maintain their competitive advantage. This approach means that there is more emphasis on how to deal with data in real time and at scale. At the same time, IT teams are seeing what lessons they can learn from B2C services and web-scale organizations. As businesses seek to become more data-driven, new platforms that can provide the necessary scalability, availability and speed will prove crucial to these efforts. The time has come for technologies like NoSQL to meet these needs.
Matt is Chief Customer Officer & Co-Founder at DataStax. The company supports the delivery of Apache Cassandra to enterprise environments. Prior to DataStax, Matt built and managed the Email and Apps infrastructure development group at Rackspace.
Prior to Rackspace, Matt was at Webmail.us where he worked in various management roles in infrastructure and scalability. Matt holds a BS from Virginia Tech in Computer Science.