By Ed Airey, product marketing director for COBOL Solutions, Micro Focus
Recently, the long-standing programming language COBOL hit the headlines as organisations scrambled to meet the demand for coders with the skills to maintain and modernise systems powering crucial services. While this focus on COBOL may be new from a media perspective, for many large organisations, especially those operating within the finance and insurance sectors, it never went away.
Despite being over 60 years old, COBOL remains the basis of the core system code for the financial services (FS) industry. It was developed alongside the mainframes of the 1950s and 1960s to meet shifting business needs. Back then, mainframes were designed for commercial use: processing large amounts of data with high computing power. Fast forward to 2020, and many banks and insurance providers still run on mainframes and, in turn, on modern COBOL.
Undoubtedly, COBOL continues to play a central role in the FS sector, acting as the backbone to some of the world’s most important organisations. Although COBOL has been constantly improved upon over its 60 years, with millions invested annually, the language’s enduring status ultimately stems from its original design, which has made it irreplaceable as the core language of business-critical computing. Here are the key benefits.
A number of programming languages exist today. From C and C++ to Python, Java and Visual Basic, each differs from the others in functionality and difficulty. Even though it has a few years (or decades) on the other languages, COBOL has maintained its position in the TIOBE programming language index, having cemented itself over time as one of the most popular programming languages globally.
A large reason behind this enduring popularity is that COBOL was initially designed with the end goal of being as simple as possible. At a time when IT barely existed and only a handful of people knew about data processing, it was created to establish a way for non-computer-literate professionals to communicate with computers. The original specification therefore required the language to be open-ended, allow for change and amendments, use ‘simple or pseudo English and avoid symbols as much as possible’.
The readability characteristics of COBOL are one of its key advantages today: it can be recognised almost immediately and understood by everybody. Moreover, being easy to read, it is also easy to learn, which is important for quickly overcoming the tech skills shortage.
Alongside simplicity, another main objective of the original COBOL specification revolved around portability. COBOL’s high portability enables developers to analyse, debug, develop, test and deploy applications on different platforms.
In addition, it is relatively straightforward to recompile COBOL applications and run them in the cloud, containers, .NET and JVM, as well as in more traditional environments including mainframes, Linux, Unix and Windows.
To meet business requirements, COBOL provides speed, easy access, robustness and strong data manipulation, while also providing greater accuracy than many other programming languages thanks to its fixed-point arithmetic with up to 38 decimal digits.
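That accuracy claim is easiest to see by contrast with binary floating point. As a rough illustration in Python (not COBOL), decimal fractions that binary floats cannot represent exactly are handled precisely by fixed-point decimal arithmetic of the kind COBOL uses for its numeric fields:

```python
from decimal import Decimal, getcontext

# Binary floating point accumulates representation error on decimal fractions:
# three payments of 0.10 do not sum to exactly 0.30.
float_total = sum([0.10] * 3)

# Fixed-point decimal arithmetic represents decimal fractions exactly.
# COBOL supports up to 38 digits; Python's decimal context precision is
# configurable to match.
getcontext().prec = 38
decimal_total = sum([Decimal("0.10")] * 3, Decimal("0"))

print(float_total == 0.30)                 # False
print(decimal_total == Decimal("0.30"))    # True
```

For currency amounts processed at the volumes banks handle, exact decimal arithmetic of this kind is why the claim matters in practice.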
Openness = modernisation
Another crucial factor contributing to COBOL’s continued popularity is the openness of the technology. This has made it possible to continuously modernise the language, meaning that businesses do not have to rip and replace core functionality. In fact, a recent Micro Focus study found that 70% of businesses would rather update existing technology than completely overhaul their systems.
To support modernisation, COBOL’s design allows for cross-platform customisation, maintenance and enhancement through modern IDE frameworks. The language can also be examined and refactored to support microservices, APIs or other service-based modernisation programmes. What’s more, new tooling is now available to streamline and automate the majority of this process.
Amid the global pandemic, COBOL’s openness also means it can support a new normal where people need to work remotely, or collaboratively, or be connected to a mainframe, depending on the situation. It is therefore flexible for today’s workforce and business environment.
Lastly, ever since its creation, COBOL’s ongoing development has been championed by large companies to ensure core systems always meet the latest standards.
COBOL’s role in the FS sector
These benefits are all fundamental to the FS industry where there is no room for failure. Within banking, for instance, IT systems running on COBOL deliver too much value to be replaced.
FS sector systems handle a large volume of transactions and users at any one time. Just as when they were built 60 years ago, they need to be safe and robust, and have scalable data processing functionality. These systems demand exactly what COBOL provides: the capacity to access, process, manipulate and report on huge amounts of data, to carry out significant arithmetic activity, and to work at speed.
COBOL allows for constant modernisation of these core systems that would prove too hazardous to completely convert. If history has taught us anything, it’s that introducing new functions to an IT system is rarely an error-free process. In the case of banking, any mistakes or downtime could potentially put data and information essential to transactions and securities trading at risk.
In this sense, COBOL is key to helping IT departments prevent the risk of system failure and ensure customers’ sensitive data is protected.
Continuously evolving to meet business needs
COBOL has been the bedrock of the FS industry since its inception and will no doubt hold this position well into the future as the language continues to evolve.
Collaborations between business and academia, such as the COBOL Academic Program, alongside the 16,000-strong COBOL programmer Facebook community and a growing range of user forums, training groups and other bodies, are bringing the language to the next generation of IT talent.
Through this continual evolution and renewal, COBOL will be a key part of the IT landscape for many years to come, within the FS sector and beyond.
The Coming AI Revolution
By H.P. Bunaes, CEO and founder of AI Powered Banking.
There is a revolution in AI coming and it’s going to render legacy data and model governance practices obsolete.
The revolution will manifest in three ways:
- Automated machine learning platforms like DataRobot, H2O.ai, Dataiku and RapidMiner are making data scientists more productive. A lot more productive. One company told me it was seeing 7x as many models from its data science group shortly after implementing a leading autoML platform. The increase in model output will quickly reveal bottlenecks in model validation, production implementation, and model operation and management.
- The increasing popularity of tools aimed at “citizen data scientists” – data-literate subject matter experts in the business without formal data science training who nevertheless know a good model and a good use case when they see one – will turn a large percentage of technically savvy business people into model developers. Models developed by citizen data scientists will quickly dwarf the volume created by formal data science organizations, adding further strain on existing procedures and revealing gaps in governance.
- Availability of nearly unlimited on-demand capacity for both data storage and computing power from cloud providers will lead to the proliferation of sophisticated predictive models that can learn from broad swaths of data: structured (your existing databases, for example), semi-structured (your documents) and even unstructured (such as images), sniffing out the data relevant to any one particular prediction or population. Demand for more, and different kinds of, data for modeling, and the need to integrate model results into downstream dataflows and IT applications, will make data platforms and data flows significantly more complex and harder to manage, and increase points of failure.
What this all adds up to is an explosion in the volume of predictive models and of the data in motion in your organization. Where there were no models, there will suddenly be many. Where there was one model, you may find there are now hundreds. And the pipes providing data into and delivering results out of these models are going to proliferate. Operational and reputational risk from model failure will rise significantly as companies outgrow their existing data and model governance frameworks and legacy procedures.
Making this worse, many banks are starting from a weak position. The demand for more and better models (descriptive and predictive) has already led to a thicket of overlapping, partially inconsistent data flows to a multitude of models. Model outputs themselves have become part of the data flow to downstream data marts, BI, apps and even to other models as inputs. It is the rare organization that knows where all that data is coming from, where it is going, how it is being used, and can identify the potential impacts of changes to data and to the models that consume it.
Certainly there has been much improvement in recent years in data governance at most large organizations. Data quality, data standards, data integration, and data accessibility on robust platforms (increasingly cloud based) have all gotten better. And most organizations now have robust model risk management practices in place, to test and validate models before they go into production use.
But these worlds are about to collide. Data and analytics, once distinct and manageable separately are going to become inextricably intertwined. As brilliantly explained in a paper by several smart people at Google (“The Hidden Technical Debt in Machine Learning Systems”), we will rapidly reach the point where “changing anything changes everything.”
Take a simple example: what differentiates data on a client from a CRM system from data on a client created by a predictive model? The answer: nothing. Yet they are managed today by different groups. The former is typically managed by Data Governance, which is usually led by the Chief Data Officer. The latter is usually the province of Model Risk Management, often found in the Corporate Risk Management organization.
But when model outputs become inputs to reports, to business processes, to critical operational or client facing systems, or to other models, they need to be governed just like any other data.
The perfect illustration of this challenge is in change management. Often you will find data change management in the chief data officer’s organization and model change management in the model risk organization. But changes in the data can, and often do, affect models in sometimes unpredictable fashion. And changes to models can alter outputs and have major impacts on downstream consumers of those results if they are not prepared for the coming changes.
Managing them separately and distinctly will therefore no longer be sufficient. How to tackle this?
- First and foremost, you must have a complete catalog of all models including metadata describing model inputs and their source and model outputs along with their destination and uses. There are a number of solutions now coming on the market for this purpose including Verta.ai, ModelOp, and Algorithmia.
- Second, data management needs to expand to include not only source data but also all the results (predictions, descriptions) produced by models.
- Third, model management too needs to expand its remit: not just focusing on model testing and validation prior to implementation, but also monitoring model performance and managing model changes after the fact.
- Fourth, there must be formal procedures for keeping model management and data management mutually informed and closely coordinated. Data cannot change without assessing model impact, and models cannot change without assessing data impact.
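The first and fourth steps above can be sketched together in a few lines of Python. The field names and the two-model “catalog” are illustrative assumptions, not a reference to any of the products mentioned; the point is that once inputs and outputs are recorded per model, the change-management question (“if this data source changes, which models must be reassessed?”) becomes a simple lineage query:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class ModelCatalogEntry:
    """Illustrative metadata for one model in a model catalog."""
    name: str
    owner: str
    inputs: List[str]    # upstream data sources the model consumes
    outputs: List[str]   # downstream destinations of model results

def impacted_by(entries: List[ModelCatalogEntry], data_source: str) -> List[str]:
    """If this data source changes, which models need to be reassessed?"""
    return [e.name for e in entries if data_source in e.inputs]

# Hypothetical example: a churn model whose scores feed a pricing model.
catalog = [
    ModelCatalogEntry("churn", "risk", inputs=["crm.clients"],
                      outputs=["marts.churn_scores"]),
    ModelCatalogEntry("pricing", "finance", inputs=["marts.churn_scores"],
                      outputs=["apps.quotes"]),
]
print(impacted_by(catalog, "crm.clients"))        # ['churn']
print(impacted_by(catalog, "marts.churn_scores")) # ['pricing']
```

Note that the second query works only because the churn model’s output is catalogued as data in its own right – exactly the point of the second recommendation above.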
Organizationally, it may be infeasible to combine legacy organizations across traditional lines of responsibility. And it may be better to leverage existing expertise across model management, data engineering, data management, and IT. But a new partnership model, new tools, and new procedures will be needed.
The explosion in AI is upon us. To use AI safely and effectively you need to get your data and analytics house in order and make sure the right mechanisms are in place to keep it so. Regulators have taken note of the risks of poorly managed AI, and it is only a matter of time before they dictate minimum standards. Combining, or at least tightly coupling, data and model governance is where to start.
How financial services organisations are using data to underpin future growth
By John O’Keeffe, Director of Looker EMEA at Google Cloud
In addition to the turmoil caused by the COVID-19 pandemic, a significant decline in venture capital investment has left many financial services organisations feeling deflated, with others struggling to survive. According to figures from trade body Innovate Finance, investment in UK fintech organisations fell 30% in Q2 of this year, with smaller challenger firms and start-ups being the most profoundly hit by our current economic problems.
As a result, both challenger banks and more established players have had to pivot their strategies in order to maintain relevance and market share. Nonetheless, the outlook for fintech in the UK and further afield looks promising. The reality of spending much of our time at home, out of reach of brick-and-mortar services, means that many of us are becoming even more accustomed to digital banking, for example. Recent analysis of finance application usage from Adjust found that average sessions in investment apps surged 88% globally, while payment and banking app sessions increased by 49% and 26%, respectively, during the COVID-19 pandemic.
However, the fact remains that investment in the sector is currently hard to come by. To help regain momentum, a review into the UK’s fintech industry was launched to identify opportunities to support growth across the industry. Data has played – and will continue to play – a key role in this push for innovation, helping organisations spot gaps in the market, predict customer behaviours and ensure that the decisions they make are based on real insights. At such a critical time, enabling a data-led approach will help organisations ascertain exactly what is required to accelerate change and ensure the sustainability of the industry.
The financial services industry is a data-rich environment, giving organisations a potential goldmine of customer interactions, product performance and market trends. However, the difficulty often lies in bringing this into a coherent whole, and extracting the business insights required for long-term success. This is as much about strategy and accessibility as it is about technology. Fostering a true “data culture” where employees across the business, whether data experts or not, can access real-time intelligence that informs their day-to-day decision making in a positive way, is crucial. This may mean tweaking your onboarding and training programmes, identifying data evangelists that can catalyse others, or simply making data engaging and relatable for those who are new to the practice.
For many organisations, data is often stored within traditional business intelligence tools, third-party SQL clients or even just a simple spreadsheet, meaning that valuable data insights are siloed and often hindered by a bottleneck between a stretched analytics team and the rest of the business. There is also the all-important General Data Protection Regulation (GDPR) to consider, so data governance and having a clear view of where data is being housed, and for what purpose, is particularly pivotal.
With this in mind, it is crucial to have a “single source of truth” to bring various data streams together and enable real-time, self-serve insights to your whole employee base. As an example of this in practice, data is a great way to understand your existing clients more intimately and nip any problems in the bud early. By building a custom data dashboard incorporating, for example, number of support tickets issued, change in ticket sentiment and number of days to renewal, you can build up an accurate picture of account health and how this has changed over time. In combination with real-time metrics on which products and features are being used and how, sales teams can have more meaningful and accurate conversations with their customers, converting at-risk accounts into potential growth opportunities.
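A hedged sketch of that account-health idea in Python may make it concrete. The metrics mirror the three named in the example above; the thresholds and weights are arbitrary assumptions for illustration, not a Looker feature or a recommended scoring model:

```python
def account_health(open_tickets: int, sentiment_delta: float,
                   days_to_renewal: int) -> str:
    """Combine three illustrative dashboard metrics into a rough
    account-health flag. All thresholds are hypothetical."""
    risk_signals = 0
    if open_tickets > 5:        # unusually high support load
        risk_signals += 1
    if sentiment_delta < 0:     # ticket sentiment trending negative
        risk_signals += 1
    if days_to_renewal < 30:    # renewal decision is imminent
        risk_signals += 1
    return "at risk" if risk_signals >= 2 else "healthy"

print(account_health(open_tickets=8, sentiment_delta=-0.2, days_to_renewal=20))
# at risk
print(account_health(open_tickets=1, sentiment_delta=0.1, days_to_renewal=90))
# healthy
```

In practice such a flag would sit on a live dashboard rather than in a script, but the principle is the same: several data streams reduced to one signal a sales team can act on.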
Given the dip in VC investment mentioned earlier, it is more important than ever for startups and scale-ups to do more with less and set a strategic roadmap that supports rapid growth. By using data to measure and act on customer feedback, these organisations can be more agile in taking new products to market and making sure these are useful and address specific pain points.
Whether a fintech scale-up or an established name, it has never been more important to shift your operations to a more data-led strategy. With an uncertain outlook ahead for business across all sectors, making data the “single source of truth” can help to navigate market trends, identify new growth opportunities and simply make an organisation’s decision-making smarter and more efficient. Through data-driven innovation and growth, one of Britain’s most valuable industries can continue to thrive in the future.
The Bank of England partners with Appvia to assist in the design, construction and assurance of a new cloud environment
The Bank of England has appointed self-service cloud-native delivery platform Appvia to support the creation of a new cloud environment.
The announcement follows a public procurement process which commenced in January 2020. The Bank of England will work with Appvia on the design, construction and assurance of a modern, fit-for-purpose cloud environment.
During the two-year partnership, Appvia will support development and project teams within the Bank in testing and deploying code in cloud environments; work with security teams to integrate the cloud into existing operational and security processes; and implement information governance compliance so staff are able to collaborate safely and securely.
Oliver Tweedie, Head of Digital Platforms at the Bank of England, said, “We have selected Appvia as our Cloud Delivery Partner to help us realise the Bank’s cloud ambitions and unlock the potential of the Cloud. Appvia come with a great pedigree and a wealth of experience delivering Cloud services within government. Working in collaboration with Bank Technology teams, Appvia will help us shape and build the future of Cloud services across our organisation – a key part of our Technology strategy.”
Jon Shanks, CEO and Co-Founder of Appvia, said, “This is an exciting opportunity to work with the Bank as it undergoes a step-change in its approach to the cloud. Harnessing innovative cloud solutions, such as containers and Kubernetes is a real business enabler for the Bank to streamline the software development lifecycle, ways of working and cloud operating model. We look forward to working with all stakeholders at the Bank of England to support its digital transformation journey.”
Appvia, which counts the Home Office among its major clients, is a self-service platform that enables organisations to scale their infrastructure quickly, securely and easily using services such as Kubernetes. In September, Appvia launched the world’s first developer-centric tool to enable teams to predict and control cloud costs.