Nick Pollard, UK General Manager, Guidance Software
Targeted cyberattacks on financial institutions show no signs of abating. A recent report from lobby group TheCityUK warned that the financial sector is “the perfect target” for cyber crime, and it’s been estimated that financial services businesses encounter security incidents 300% more frequently than other industries.
This, perhaps, stands to reason; attackers focus their efforts on the most lucrative targets – from attacks on payment systems to the theft of high-value PII (personally identifiable information). As such, the sector faces ever-mounting challenges in managing cyber risks posed by well-resourced and well-organised criminal gangs. The impact of their activities can be devastating, as was evident earlier this year when criminals used malware to compromise the Bangladesh Central Bank’s SWIFT messaging network, stealing £56 million.
The reality is that attacks such as these will become more pervasive as criminals turn their attention to digital crime. In the face of more complex challenges, there are no quick-fix answers. However, the industry can take measures to improve the way in which sensitive data is audited, tracked and controlled. Combined with fit-for-purpose detection and incident response strategies, a deep understanding of data will help reduce the surface area of risk and can mitigate the damage if a breach does occur.
Know Your Data
No amount of threat detection or prevention can make a system 100% secure. Teams charged with protecting customer data face a perfect storm of increasingly sophisticated threats: ransomware, zero-day exploits, phishing scams, bots and more. Banks rightfully invest millions in cybersecurity tools to detect and prevent attacks. However, attackers can, and will, exploit vulnerabilities (technological or human) that exist in any system.
Defending digital assets is a complex challenge, and a successful attack can cause significant damage: the direct financial cost of theft or loss of data, but also the wider impact of consequent loss of business and of legal, regulatory and reputational costs. Taken together, these impacts create a significant amount of risk for any organisation, and especially for the financial sector. The best foundation for limiting risk and improving resilience is a thorough audit of your most important digital asset: your data. This is no small task; there is more data held, across more locations, and on more endpoint devices than ever before. Add to that the host of digital information that can be shared to the cloud, via email and even on social media, and the challenge of tracking data is further exacerbated.
When conducting an audit, start by ensuring that data maps are up-to-date, that all data repositories have the correct control policies and that, above all, you fully understand where the most critical data is stored. Identify where any ‘unsecured’ data is being held and conduct regular sweeps across the network to get the full picture of data locations.
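A sweep of this kind can be automated. The Python sketch below walks a file tree and flags files matching simple PII patterns; the patterns and paths are illustrative assumptions only, and a production audit tool would use far richer detection rules and cover databases and endpoints, not just file shares.

```python
import os
import re

# Hypothetical PII patterns -- a real audit would use a far richer rule set
# (national ID formats, IBANs, matches against actual customer records, etc.).
PII_PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "card_number": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}

def sweep(root):
    """Walk a directory tree and report files that appear to hold PII."""
    findings = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                with open(path, "r", errors="ignore") as fh:
                    text = fh.read()
            except OSError:
                continue  # unreadable file: in practice, log it and move on
            hits = [label for label, pat in PII_PATTERNS.items()
                    if pat.search(text)]
            if hits:
                findings.append((path, hits))
    return findings
```

Each finding pairs a file path with the pattern labels it matched, giving a starting inventory of ‘unsecured’ data locations to fold back into the data map.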
Ensure that all the correct policies for data access, data governance and data protection are not only implemented, but also enforced.
Proactively, financial institutions should look to monitor activity and establish what normal activity looks like for their endpoint devices and servers. With a baseline of normal activity, an organisation can more quickly identify indicators of compromise (IoCs) by spotting abnormal behaviours, by systems or employees, that could pose a risk. Leveraging data from all servers and end-user devices, endpoint security analytics can give complete visibility of endpoint activities across the network, in order to detect any anomalous behaviour, areas of potential risk, and security threats before damage can spread.
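As a minimal illustration of baselining, the Python sketch below summarises a metric (say, nightly outbound data volume per endpoint) as a mean and standard deviation, then flags readings far outside that baseline. The z-score threshold is a simplifying assumption; real endpoint analytics platforms use much richer behavioural models.

```python
from statistics import mean, stdev

def baseline(samples):
    """Summarise 'normal' activity as a mean and standard deviation."""
    return mean(samples), stdev(samples)

def is_anomalous(value, mu, sigma, threshold=3.0):
    """Flag values more than `threshold` standard deviations from baseline."""
    if sigma == 0:
        return value != mu  # a flat baseline: any change is notable
    return abs(value - mu) / sigma > threshold
```

An endpoint that normally ships around 100 MB a night and suddenly ships 500 MB would be flagged for investigation, while ordinary day-to-day variation would not.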
Stress-Test Incident Response Plans
Even when incident response (IR) measures are in place, all too often the processes for decision-making, communications and deciding which systems to shut down have not been thoroughly practised. Although many organisations may think they have robust security policies in place, how well-drilled are they at responding to security incidents when they occur? All too frequently, it isn’t until a real-world incident occurs that the shortcomings of plans are uncovered.
The most significant starting point for determining the scale of a threat is assessing the type of data that has been targeted, which is why it’s important to have mapped exactly where sensitive data is held, and to prioritise responses based on these sensitive data profiles. This can save significant time in data ‘stock-taking’ that would need to be performed in the immediate aftermath of a cyber-attack.
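The prioritisation step can be as simple as ordering affected data stores by a pre-agreed sensitivity tier. The Python sketch below assumes hypothetical class names and tiers; in practice, these would come from the sensitive data map described above.

```python
# Hypothetical sensitivity tiers mapped to response priority (1 = most urgent).
# A real classification scheme would come from the organisation's data map.
SENSITIVITY = {
    "payment_cards": 1,    # regulatory exposure plus direct fraud risk
    "customer_pii": 2,
    "internal_docs": 3,
    "public_marketing": 4,
}

def triage(affected_stores):
    """Order affected data stores so responders tackle the most sensitive first."""
    return sorted(affected_stores,
                  key=lambda s: SENSITIVITY.get(s["class"], 99))
```

Because the tiers are agreed in advance, the ordering is instant at incident time, which is exactly the ‘stock-taking’ saving described above.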
Mitigating the risk and impact of breaches requires a coordinated, well-planned and fully practised security strategy. Whilst attacks are on the rise, we shouldn’t wait for the ‘hit’. It’s more important than ever to be proactive, taking all necessary measures to identify and secure data, spot the warning signs of attacks and implement the appropriate response should any incident occur.
The Coming AI Revolution
By H.P Bunaes, CEO and founder of AI Powered Banking.
There is a revolution in AI coming and it’s going to render legacy data and model governance practices obsolete.
The revolution will manifest in three ways:
- Automated machine learning platforms like DataRobot, H2O.ai, Dataiku and RapidMiner are making data scientists more productive. A lot more productive. One company told me that they were seeing 7x as many models from their data science group shortly after implementing a leading autoML platform. The increase in model output will quickly reveal bottlenecks in model validation, production implementation, and model operation and management.
- The increasing popularity of tools aimed at “citizen data scientists” (data-literate subject matter experts in the business who, without formal data science training, nevertheless know a good model and a good use case when they see it) will turn a large percentage of technically savvy business people into model developers. Models developed by citizen data scientists will quickly dwarf the volume of models created by formal data science organizations, adding further strain on existing procedures and revealing gaps in governance.
- Availability of nearly unlimited on-demand capacity for both data storage and computing power from cloud providers will lead to the proliferation of sophisticated predictive models that can learn from broad swaths of data: structured (your existing databases, for example), semi-structured (your documents) and even unstructured (such as images), sniffing out whatever is relevant to a particular prediction or population. Demand for more, and different kinds of, data for modeling, and the need to integrate model results into downstream dataflows and IT applications, will make data platforms and data flows significantly more complex and harder to manage, and will multiply points of failure.
What this all adds up to is an explosion in the volume of predictive models and of the data in motion in your organization. Where there were no models, there will suddenly be many. Where there was one model, you may find there are now hundreds. And the pipes providing data into and delivering results out of these models are going to proliferate. Operational and reputational risk from model failure will rise significantly as companies outgrow their existing data and model governance frameworks and legacy procedures.
Making this worse, many banks are starting from a weak position. The demand for more and better models (descriptive and predictive) has already led to a thicket of overlapping, partially inconsistent data flows to a multitude of models. Model outputs themselves have become part of the data flow to downstream data marts, BI, apps and even to other models as inputs. It is the rare organization that knows where all that data is coming from, where it is going, how it is being used, and can identify the potential impacts of changes to data and to the models that consume it.
Certainly there has been much improvement in recent years in data governance at most large organizations. Data quality, data standards, data integration, and data accessibility on robust platforms (increasingly cloud based) have all gotten better. And most organizations now have robust model risk management practices in place, to test and validate models before they go into production use.
But these worlds are about to collide. Data and analytics, once distinct and manageable separately are going to become inextricably intertwined. As brilliantly explained in a paper by several smart people at Google (“The Hidden Technical Debt in Machine Learning Systems”), we will rapidly reach the point where “changing anything changes everything.”
Take a simple example: what differentiates data on a client from a CRM system from data on a client created by a predictive model? The answer: nothing. Yet today they are managed by different groups. The former is typically managed by Data Governance, usually led by the Chief Data Officer. The latter is usually the province of Model Risk Management, often found in the Corporate Risk Management organization.
But when model outputs become inputs to reports, to business processes, to critical operational or client facing systems, or to other models, they need to be governed just like any other data.
The perfect illustration of this challenge is in change management. Often you will find data change management in the chief data officer’s organization and model change management in the model risk organization. But changes in the data can, and often do, affect models in sometimes unpredictable fashion. And changes to models can change outputs and have major impacts on downstream consumers of those results if they are not prepared for the coming changes.
Managing them separately and distinctly will therefore no longer be sufficient. How to tackle this?
- First and foremost, you must have a complete catalog of all models including metadata describing model inputs and their source and model outputs along with their destination and uses. There are a number of solutions now coming on the market for this purpose including Verta.ai, ModelOp, and Algorithmia.
- Second, data management needs to expand to include not only source data but also all the results (predictions, descriptions) produced by models.
- Third, model management too needs to expand its remit, not just focusing on model testing and validation prior to implementation but also monitoring model performance and managing model changes after the fact.
- Fourth, there must be formal procedures for keeping model management and data management mutually informed and closely coordinated. Data cannot change without assessing model impact, and models cannot change without assessing data impact.
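To make the first and fourth points concrete, the Python sketch below shows a toy model catalog recording each model’s inputs and outputs, plus a lookup that traces which models are impacted, directly or through upstream model outputs, when a data field changes. The model and field names are invented for illustration; commercial catalogs such as those named above hold far richer metadata.

```python
# A toy model catalog: each entry records the model's input fields and the
# output fields it produces, so lineage can be traced in both directions.
# All names here are hypothetical.
CATALOG = {
    "credit_score_v2": {"inputs": ["bureau_feed", "txn_history"],
                        "outputs": ["risk_score"]},
    "churn_model":     {"inputs": ["risk_score", "crm_contacts"],
                        "outputs": ["churn_flag"]},
}

def impacted_models(changed_field, catalog=CATALOG):
    """Return every model affected by a change to `changed_field`,
    including models fed indirectly by upstream model outputs."""
    hit, frontier = set(), {changed_field}
    while frontier:
        field = frontier.pop()
        for name, meta in catalog.items():
            if field in meta["inputs"] and name not in hit:
                hit.add(name)
                frontier.update(meta["outputs"])  # cascade downstream
    return hit
```

Note how a change to the bureau feed surfaces not only the credit model that reads it directly, but also the churn model that consumes the credit model’s score, which is exactly the “changing anything changes everything” coupling described earlier.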
Organizationally, it may be infeasible to combine legacy organizations across traditional lines of responsibility. And it may be better to leverage existing expertise across model management, data engineering, data management, and IT. But a new partnership model, new tools, and new procedures will be needed.
The explosion in AI is upon us. To use AI safely and effectively you need to get your data and analytics house in order and make sure the right mechanisms are in place to keep it so. Regulators have taken note of the risks of poorly managed AI, and it is only a matter of time before they dictate minimum standards. Combining, or at least tightly coupling, data and model governance is where to start.
How financial services organisations are using data to underpin future growth
By John O’Keeffe, Director of Looker EMEA at Google Cloud
In addition to the turmoil caused by the COVID-19 pandemic, a significant decline in venture capital investment has left many financial services organisations feeling deflated, with others struggling to survive. According to figures from trade body Innovate Finance, investment in UK fintech organisations fell 30% in Q2 of this year, with smaller challenger firms and start-ups being the most profoundly hit by our current economic problems.
As a result, both challenger banks and more established players have had to pivot their strategies in order to maintain relevance and market share. Nonetheless, the outlook for fintech in the UK and further afield looks promising. The reality of spending much of our time at home, out of reach of bricks-and-mortar services, means that many of us are becoming even more accustomed to digital banking, for example. Recent analysis of finance application usage from Adjust found that average sessions in investment apps surged 88% globally, while payment and banking app sessions increased by 49% and 26% respectively, during the COVID-19 pandemic.
However, the fact remains that investment in the sector is currently hard to come by. To help regain momentum, a review into the UK’s fintech industry was launched to identify opportunities to support growth across the industry. Data has played, and will continue to play, a key role in this push for innovation, helping organisations spot gaps in the market, predict customer behaviours and ensure that the decisions they make are based on real insights. At such a critical time, a data-led approach will help organisations ascertain exactly what is required to accelerate change and ensure the sustainability of the industry.
The financial services industry is a data-rich environment, giving organisations a potential goldmine of customer interactions, product performance and market trends. However, the difficulty often lies in bringing this data together into a coherent whole and extracting the business insights required for long-term success. This is as much about strategy and accessibility as it is about technology. Fostering a true “data culture”, where employees across the business, whether data experts or not, can access real-time intelligence that informs their day-to-day decision-making, is crucial. This may mean tweaking your onboarding and training programmes, identifying data evangelists who can catalyse others, or simply making data engaging and relatable for those who are new to the practice.
For many organisations, data is often stored within traditional business intelligence tools, third-party SQL clients or even just a simple spreadsheet, meaning that valuable data insights are siloed and often hindered by a bottleneck between a stretched analytics team and the rest of the business. There is also the all-important General Data Protection Regulation (GDPR) to consider, so data governance and having a clear view of where data is being housed, and for what purpose, is particularly pivotal.
With this in mind, it is crucial to have a “single source of truth” to bring various data streams together and enable real-time, self-serve insights to your whole employee base. As an example of this in practice, data is a great way to understand your existing clients more intimately and nip any problems in the bud early. By building a custom data dashboard incorporating, for example, number of support tickets issued, change in ticket sentiment and number of days to renewal, you can build up an accurate picture of account health and how this has changed over time. In combination with real-time metrics on which products and features are being used and how, sales teams can have more meaningful and accurate conversations with their customers, converting at-risk accounts into potential growth opportunities.
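As an illustration, the Python sketch below blends the three example signals into a single 0–100 account health score. The weights are hypothetical and would in practice be tuned against historical renewal and churn outcomes rather than hand-picked.

```python
def account_health(tickets_open, sentiment_delta, days_to_renewal):
    """Blend three illustrative signals into a 0-100 account health score.

    tickets_open:    number of currently open support tickets
    sentiment_delta: change in ticket sentiment (negative = worsening)
    days_to_renewal: days until the contract renews

    Weights below are assumptions for illustration only.
    """
    score = 100.0
    score -= min(tickets_open * 5, 40)       # many open tickets erode health
    score -= max(-sentiment_delta, 0) * 20   # penalise worsening sentiment
    if days_to_renewal < 30:
        score -= 10                          # imminent renewal raises urgency
    return max(score, 0.0)
```

Surfacing a score like this on a shared dashboard lets sales teams spot at-risk accounts before the renewal conversation, rather than after.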
Given the dip in VC investment mentioned earlier, it is more important than ever for startups and scale-ups to do more with less and set a strategic roadmap that supports rapid growth. By using data to measure and action customer feedback, these organisations can be more agile in taking new products to market and making sure these are useful and address specific pain points.
Whether a fintech scale-up or an established name, it has never been more important to shift your operations to a more data-led strategy. With an uncertain outlook ahead for business across all sectors, making data the “single source of truth” can help to navigate market trends, identify new growth opportunities and simply make an organisation’s decision-making smarter and more efficient. Through data-driven innovation and growth, one of Britain’s most valuable industries can continue to thrive in the future.
The Bank of England partners with Appvia to assist in the design, construction and assurance of a new cloud environment
The Bank of England has appointed self-service cloud-native delivery platform Appvia to support the creation of a new cloud environment.
The announcement follows a public procurement process which commenced in January 2020. The Bank of England will work with Appvia on design, construction and assurance of a modern, fit for purpose cloud environment.
During the two-year partnership, Appvia will support development and project teams within the Bank in testing and deploying code in cloud environments; work with security teams to integrate the cloud into existing operational and security processes; and implement information governance compliance so staff are able to collaborate safely and securely.
Oliver Tweedie, Head of Digital Platforms at the Bank of England, said, “We have selected Appvia as our Cloud Delivery Partner to help us realise the Bank’s cloud ambitions and unlock the potential of the Cloud. Appvia come with a great pedigree and a wealth of experience delivering Cloud services within government. Working in collaboration with Bank Technology teams, Appvia will help us shape and build the future of Cloud services across our organisation – a key part of our Technology strategy.”
Jon Shanks, CEO and Co-Founder of Appvia, said, “This is an exciting opportunity to work with the Bank as it undergoes a step-change in its approach to the cloud. Harnessing innovative cloud solutions, such as containers and Kubernetes is a real business enabler for the Bank to streamline the software development lifecycle, ways of working and cloud operating model. We look forward to working with all stakeholders at the Bank of England to support its digital transformation journey.”
Appvia, which counts the Home Office among its major clients, is a self-service platform that enables organisations to scale their infrastructure quickly, securely and easily using services such as Kubernetes. In September, Appvia launched the world’s first developer-centric tool to enable teams to predict and control cloud costs.