Top Stories

A Framework for Analytics Operational Risk Management




By H. P. Bunaes, founder of AI Powered Banking.

As analytics, descriptive and predictive, are embedded in business processes in every nook and cranny of your organization, managing the operational risk associated with all of this is critical. A failure of your data analytics may, at best, impact operational efficiency, but at worst it could result in reputational damage or monetary loss. What makes this tricky is that analytics can appear to be working normally, when in fact erroneous results are being produced and sent to unsuspecting internal or external recipients downstream.

When there were only a handful of models in use, and they were developed by one group who controlled them from end to end, operational risk was manageable. But analytics is becoming pervasive, and may now be fragmented across many functions and lines of business, and operational risk is rising as a result. Many analytics groups have a long backlog of requests and resources are stretched thin. Monitoring of models in production may be low on the priority list. And, it is the rare organization indeed that knows where all the analytics in operation are and how they are being used.

Some recent examples:

● A chief analytics officer at a large US bank described how a model for approving overdrafts was found deeply embedded in the deposit system. No one remembered it was there, never mind knew how it worked.
● Another described the “what the hell” moment when data critical to credit models one day simply disappeared from the data stream.
● And a consumer banking analytics head at another bank described how models used to predict delinquencies suddenly stopped working as the pandemic hit since data used to build them was simply no longer relevant.


The topic of model risk management has been well thought through, and in some sectors, such as banking, regulatory guidance is clear. But the focus of model risk management has been on model validation and testing: all the important things that need to happen prior to implementation.

But as one head of analytics told me recently “it’s what happens after the fact that is of greatest concern [now]”. A new head of Model Risk Management at a top 10 US bank told me that “operational risk management is top of mind”. And a recently retired chief analytics officer added that unfortunately “[data scientists] just don’t get operational risk.”

In many organizations, the full extent of their deployed analytics is not known. There is no consolidated inventory of analytics, so no one knows where it all is and what it does. One large US bank last year did a survey of all of their predictive models in operation and found “thousands of models” that had not been through any formal approval, validation, or testing process according to several people I spoke with.


There are tools and platforms coming on the market for managing analytics op risk (often referred to, somewhat narrowly, as “MLOps”, for machine learning operations). I’ve counted ten of them, including Algorithmia, quickpath, fiddler, Domino, ModelOp, DataKitchen, and DataRobot (whose standalone MLOps product was formerly known as ParallelM). Each vendor takes a somewhat different approach to managing analytics op risk. Oversimplifying a bit, most focus either on model monitoring or on model management; only a few try to do both. Algorithmia is strong in model management, quickpath is strong in model monitoring, and ModelOp tries to do both.

But, none of them have a prescribed operational risk management (ORM) framework. And without an effective framework for managing analytics in use, no tool will solve the problem.

In this article I will describe what an effective ORM framework for analytics should include, at a minimum.

Comprehensive Operational Risk Management Framework for Analytics



The keystone to any ORM framework is a comprehensive model inventory, a database of models including all documentation, metadata (e.g. input data used and its source and lineage, results produced and where consumed), and operational results and metrics. Knowing what and where all of your analytics are and where and how they are being used is a prerequisite for good ORM. You can’t manage what you don’t know about.

Requiring that all data about each model is captured and stored centrally prior to implementation and use is the first bit of policy I’d recommend. All of the model validation and testing done in an effective Model Risk Management process needs to be captured in the model inventory/database. And all model inputs and model outputs, their sources and their destinations need to be cataloged.
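To make that first bit of policy concrete, here is a minimal sketch in Python of a registration gate on a central inventory; all names and fields are illustrative, not drawn from any particular platform:

```python
from dataclasses import dataclass, field

@dataclass
class ModelRecord:
    """One entry in a central model inventory (fields are illustrative)."""
    model_id: str
    owner: str
    inputs: list            # data consumed: sources and lineage
    outputs: list           # results produced and where they are consumed
    validation: dict = field(default_factory=dict)  # validation/testing evidence
    users: list = field(default_factory=list)       # who uses it, and why

INVENTORY = {}

def register(record):
    """Policy gate: refuse to register a model that is not fully documented."""
    if not (record.inputs and record.outputs and record.validation):
        raise ValueError(f"{record.model_id}: documentation incomplete")
    INVENTORY[record.model_id] = record
```

The point of the gate is that an undocumented model simply cannot be deployed, which is what makes the inventory trustworthy as the system of record.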

The second bit of policy is that any use of a model must be captured centrally: who is using the model, why, and to do what? The framework falls apart if there are unknown users of models. As described in a great paper on the hidden technical debt of analytics models, a system of models can grow over time such that a change to one model can affect many downstream models: “Changing anything changes everything.”

The second critical piece to analytics operational risk management is good change management: data change management, IT change management, and model change management. Nothing ever stays the same. The environment changes, client and competitor behavior changes, upstream data sources come and go, and the IT environment is in a constant state of change. From my experience, and confirmed through many conversations with industry practitioners, the primary reason that models fail in operation is poor change management. Even subtle changes, with no obvious impact to downstream models, can have dramatic and unpredictable effects.

Changes to data need to go through a process for identifying, triaging, and remediating downstream impacts. A database of models can be used to quickly identify which models could be impacted by a change in the data. The data changes then need to be tested prior to implementation, at least for models exceeding some risk threshold. Changes to models themselves need to be tested as well when those results, even if more accurate for one purpose, are consumed by multiple applications or as inputs to other models downstream. And, of course, changes to the IT environment need to be tested to be sure that there isn’t an impact to models such as latency or performance under load.
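As a sketch of that triage step, assuming the inventory records each model's inputs, identifying potentially impacted models reduces to an overlap query (toy data, hypothetical names):

```python
# Toy inventory mapping each model to the upstream data elements it consumes.
MODEL_INPUTS = {
    "overdraft_approval": {"balance_history", "txn_stream"},
    "delinquency_score":  {"payment_history", "bureau_feed"},
    "cross_sell":         {"txn_stream", "web_clicks"},
}

def impacted_models(changed_fields):
    """Return every model whose inputs overlap the changed data elements."""
    changed = set(changed_fields)
    return sorted(m for m, needs in MODEL_INPUTS.items() if needs & changed)
```

A change to `txn_stream`, for example, would flag both the overdraft model and the cross-sell model for regression testing before the change goes live.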

People tend to dislike a change management process viewed as slow or bureaucratic, so change management has to be time and cost efficient: higher-priority changes go through first, with routine changes handled at a lower priority. If the change management process is slow and burdensome, people will inevitably try to go around it, degrading its effectiveness.


Model monitoring means actively watching models for signs of any degradation or of increasing risk of failure (prior to any measurable degradation). An analytics head at a top 10 US bank confided that “modelers just don’t think monitoring is important”. Monitoring must include watching the incoming data for drift, data quality problems, anomalies, or combinations of data never seen before. Even subtle changes in the incoming data can have dramatic downstream effects. There must be operational metrics and logs, capturing all incoming data and outgoing results, performance relative to SLAs, volumes over time, and a record of all control issues or process failures.
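One widely used measure that such drift monitoring could compute is the population stability index (PSI). A minimal version, with the commonly cited rule-of-thumb thresholds noted in the docstring, might look like:

```python
import math

def psi(expected, actual, eps=1e-6):
    """Population Stability Index between two binned distributions.

    `expected` and `actual` are lists of bin proportions (each summing to 1),
    e.g. the share of incoming records per score band at model-build time
    versus today. A common rule of thumb: < 0.1 is stable, 0.1-0.25 bears
    watching, > 0.25 warrants investigation. `eps` guards against empty bins.
    """
    return sum((a - e) * math.log((a + eps) / (e + eps))
               for e, a in zip(expected, actual))
```

Run daily against each model's input features, a check like this catches the “data quietly changed upstream” failures described earlier before they show up as bad predictions.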

Operational data on models must be captured and logged to provide an audit trail, for diagnostics, and for reporting purposes. Logs should include all incoming data used in the model and all resulting predictions output, as well as volume and latency metrics for tracking performance against SLAs. Traceability, explainability, and reproducibility will all be necessary for third-line-of-defense auditors and regulators.

Traceability means the full data lineage from raw source data through all data preparation and manipulation steps prior to model input. Explainability means being able to show how models arrived at their predictions, including which feature values were most important to the predicted outcomes. Model reproducibility requires keeping a log not only of incoming data, but of the model version, so that results can be replicated in the future after multiple generations of changes to the data and/or the model itself.
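A reproducibility log of this kind could be as simple as the following sketch (field names are hypothetical); the essential points are pinning the exact model version and capturing the full input values, not just the prediction:

```python
import datetime

AUDIT_LOG = []

def log_prediction(model_id, model_version, inputs, prediction):
    """Record everything needed to replay this result later, even after
    multiple generations of changes to the data and the model itself."""
    entry = {
        "ts": datetime.datetime.utcnow().isoformat(),
        "model_id": model_id,
        "model_version": model_version,  # the exact version that ran
        "inputs": dict(inputs),          # full feature values as seen
        "prediction": prediction,
    }
    AUDIT_LOG.append(entry)
    return entry
```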

Issue logs must be continuously updated describing any process failures (e.g. unanticipated incoming data changes), control failures (e.g. data quality problems), or outages causing models to go offline temporarily. Auditors and regulators will want to see a triage and escalation process, demonstrating that the big issues are identified and get the right level of attention quickly.


Models must be tested for bias and independently reviewed for fairness and appropriateness of data use. Reputational risk assessments should be completed, including a review of the use of any sensitive personal data. Models should be tested for bias across multiple demographics (gender, age, ethnicity, and location). Models used especially for decisioning such as credit approval must be independently reviewed for fairness. A record of declines, for example, should be reviewed to ensure that the model is not systematically declining any one demographic unfairly. It is an unavoidable consequence of building predictive models that any model trained on biased data will itself be biased. It may be necessary therefore to mask sensitive data from the model that could result in unintentional model bias.
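As an illustration of the decline-rate review described above, a first screening pass could compare approval rates across groups against the common “four-fifths” rule (toy data; a real fairness review would go much further than this):

```python
from collections import defaultdict

def decline_rates(decisions):
    """decisions: iterable of (group, approved) pairs.
    Returns the decline rate per demographic group."""
    total, declined = defaultdict(int), defaultdict(int)
    for group, approved in decisions:
        total[group] += 1
        declined[group] += not approved
    return {g: declined[g] / total[g] for g in total}

def approval_ratios(rates, reference):
    """Each group's approval rate relative to a reference group.
    The common 'four-fifths' screening rule flags ratios below 0.8."""
    ref_approval = 1 - rates[reference]
    return {g: (1 - r) / ref_approval for g, r in rates.items()}
```

A ratio well below 0.8 for any group would be the trigger for the independent fairness review, not a verdict in itself.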


Lastly, it is not enough to have an effective model management and monitoring process. One must be able to prove to auditors and examiners that it works. For that you need good reporting which includes:

● An inventory of all models in operation
● A log of all model changes in a specified time period (this quarter to date, last full quarter, year to date, etc.): new models implemented, model upgrades, and models retrained on new data
● A log of data changes: new data introduced, new features engineered, or changes in data definitions or usage
● For changes to existing models, performance metrics on out-of-sample test data before and after the enhancements
● For each model in production, the ability to generate a detailed report of model operation, including a log of data in/results out, model accuracy metrics (where ground truth can be known after the fact), and operational metrics (number of predictions made, latency, and performance under load for operationally critical models)
● Issue log: issue description, issue priority, date of issue logging and aging, status of remediation, escalation status, actions to be taken, and individual responsible for closure, new issues and closed issues in a given period
● Operational alert history: for a given period, for each model, a report of all incoming data alerts (missing data, data errors, anomalies in the data)
● Data change management logs showing what data changed and when, and which models were identified as potentially affected and tested
● IT change management logs showing changes to the infrastructure affecting models
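For instance, the issue-log portion of such a report package could be generated straight from the central log with a few lines (a sketch with hypothetical fields):

```python
import datetime

def issue_aging(issues, today):
    """Open issues with their age in days, highest priority and oldest first.
    `issues`: dicts with 'id', 'priority' (1 = highest), 'opened' (date),
    and 'status'; closed issues are excluded from the aging view."""
    open_items = [dict(i, age_days=(today - i["opened"]).days)
                  for i in issues if i["status"] != "closed"]
    return sorted(open_items, key=lambda i: (i["priority"], -i["age_days"]))
```

Producing this view on demand, rather than assembling it by hand before each exam, is one of the payoffs of logging issues centrally in the first place.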

In my experience auditors and examiners presented with a comprehensive report package for review can be satisfied that you have an effective process in place and are likely to stop there. If no such evidence is available, they will look much deeper into your organization’s use of models which will be disruptive to operations and likely result in a long list of issues for management attention.


There are multiple ways to create the right organizational partnerships for effective analytics ORM. The brute force method would be to create a new organizational unit for “analytics operations”. One could argue in favor of this approach that this new organizational unit could be built with all the right skills and expertise and could build or select the right tools and platforms to support their mission.

But a better approach might be to create a virtual organization comprised of all the key players: data scientists, data engineers (the CDO’s organization, typically), the business unit, model risk management (typically in Corporate Risk Management, but sometimes found in Finance or embedded in multiple business units), traditional IT, and audit.

Orchestrating this partnership requires clear roles and responsibilities, and well articulated and documented policies and procedures explaining the rules of the road and who’s responsible for every aspect of analytics ORM.

The latter is harder to pull off, requires more upfront thought and investment, but may yield a better and more efficient result in the long run as everyone has a stake in the success of the process and existing resources can be both leveraged and focused on the aspects of the framework they are best suited to support.


As organizations increasingly become analytics driven, a process for managing analytics operational risk will safeguard the company from unpleasant surprises and ensure that analytics continue to operate effectively. Some might argue that the process outlined here will be costly to build and operate. I would argue (a) that they are already spending more than they think on model operations, management, and maintenance; (b) that unexpected failures that cascade through the data environment are always harder and more costly to fix than proactive prevention; and (c) that a centrally managed process will free up expensive resources to do more of the high-value-add work the business needs. Companies that want to scale up analytics will find that an effective ORM framework creates additional capacity, speeds the process, and eliminates nasty surprises.

Author Bio:

H.P. Bunaes has 30 years’ experience in banking, with broad banking domain knowledge and deep expertise in data and analytics. After retiring from banking, H.P. led the financial services industry vertical at DataRobot, designing the go-to-market strategy for banking and fintech and advising hundreds of banks and fintechs on data and analytics strategy. H.P. recently founded AI Powered Banking, with a mission of helping banks and fintechs leverage their data and advanced analytics and helping technology firms craft their GTM strategy for the financial services sector. H.P. is a graduate of M.I.T., where he earned an M.S. in Information Technology.


This is a Sponsored Feature.


Oil extends losses as Texas prepares to ramp up output




By Ahmad Ghaddar

LONDON (Reuters) – Oil prices fell from recent highs for a second day on Friday as Texas energy firms began to prepare for restarting oil and gas fields shuttered by freezing weather.

Brent crude futures were down $1.16, or 1.8%, to $62.77 per barrel, by 1150 GMT, while U.S. West Texas Intermediate (WTI) crude futures fell $1.42, or 2.4%, to $59.10 a barrel.

Unusually cold weather in Texas and the Plains states curtailed up to 4 million barrels per day (bpd) of crude oil production and 21 billion cubic feet of natural gas, according to analysts.

Texas refiners halted about a fifth of the nation’s oil processing amid power outages and severe cold.

However, firms in the region on Friday were expected to prepare for production restarts as electric power and water services slowly resume, sources said.

“The market was ripe for a correction and signs of the power and overall energy situation starting to normalise in Texas provided the necessary trigger,” said Vandana Hari, energy analyst at Vanda Insights.

Oil fell despite a surprise fall in U.S. crude stockpiles in the week to Feb. 12, before the freeze. Inventories fell by 7.3 million barrels to 461.8 million barrels, their lowest since March, the Energy Information Administration reported on Thursday. [EIA/S]

The United States on Thursday said it was ready to talk to Iran about both nations returning to a 2015 agreement that aimed to prevent Tehran from acquiring nuclear weapons.

While the thawing relations could raise the prospect of reversing sanctions imposed by the previous U.S. administration, analysts did not expect Iranian oil sanctions to be lifted anytime soon.

“This breakthrough increases the probability that we may see Iran returning to the oil market soon, although there is much to be discussed and a new deal will not be a carbon-copy of the 2015 nuclear deal,” StoneX analyst Kevin Solomon said.

(Additional reporting by Roslan Khasawneh in Singapore and Sonali Paul in Melbourne; editing by Jason Neely)


Analysis: Carmakers wake up to new pecking order as chip crunch intensifies




By Douglas Busvine and Christoph Steitz

BERLIN (Reuters) – The semiconductor crunch that has battered the auto sector leaves carmakers with a stark choice: pay up, stock up or risk getting stuck on the sidelines as chipmakers focus on more lucrative business elsewhere.

Car manufacturers including Volkswagen, Ford and General Motors have cut output as the chip market was swept clean by makers of consumer electronics such as smartphones – the chip industry’s preferred customers because they buy more advanced, higher-margin chips.

The semiconductor shortage – over $800 worth of silicon is packed into a modern electric vehicle – has exposed the disconnect between an auto industry spoilt by decades of just-in-time deliveries and an electronics industry supply chain it can no longer bend to its will.

“The car sector has been used to the fact that the whole supply chain is centred around cars,” said McKinsey partner Ondrej Burkacky. “What has been overlooked is that semiconductor makers actually do have an alternative.”

Automakers are responding to the shortage by lobbying governments to subsidize the construction of more chip-making capacity.

In Germany, Volkswagen has pointed the finger at suppliers, saying it gave them timely warning last April – when much global car production was idled due to the coronavirus pandemic – that it expected demand to recover strongly in the second half of the year.

That complaint by the world’s No.2 volume carmaker cuts little ice with chipmakers, who say the auto industry is both quick to cancel orders in a slump and to demand investment in new production in a recovery.

“Last year we had to furlough staff and bear the cost of carrying idle capacity,” said a source at one European semiconductor maker, who spoke on condition of anonymity.

“If the carmakers are asking us to invest in new capacity, can they please tell us who will pay for that idle capacity in the next downturn?”


The auto industry spends around $40 billion a year on chips – about a tenth of the global market. By comparison, Apple spends more on chips just to make its iPhones, Mirabaud tech analyst Neil Campling reckons.

Moreover, the chips used in cars tend to be basic products such as micro controllers made under contract at older foundries – hardly the leading-edge production technology in which chipmakers would be willing to invest.

“The suppliers are saying: ‘If we continue to produce this stuff there is nowhere else for it to go. Sony isn’t going to use it for a Playstation 5 or Apple for its next iPhone’,” said Asif Anwar at Strategy Analytics.

Chipmakers were surprised by the panicked reaction of the German car industry, which persuaded Economy Minister Peter Altmaier to write a letter in January to his counterpart in Taiwan to ask its semiconductor makers to supply more chips.

No extra supplies were forthcoming, with one German industry source joking that the Americans stood a better chance of getting more chips from Taiwan because they could at least park an aircraft carrier off the coast – referring to the ability of the United States to project power in Asia.

Closer to home, a source at another European chipmaker expressed disbelief at the poor understanding at one carmaker of how it operates.

“We got a call from one auto maker that was desperate for supply. They said: Why don’t you run a night shift to increase production?” this person said.

“What they didn’t understand is that we have been running a night shift since the beginning.”


While Infineon, the leading supplier of chips to the global auto industry, and Robert Bosch, the top ‘Tier 1’ parts supplier, both plan to commission new chip plants this year, there is little chance of supply shortages easing soon.

Specialist chipmakers like Infineon outsource some production of automotive chips to contract manufacturers led by Taiwan Semiconductor Manufacturing Co Ltd (TSMC), but the Asian foundries are currently prioritising high-end electronics makers as they come up against capacity constraints.

Over the longer term, the relationship between chip makers and the car industry will become closer as electric vehicles are more widely adopted and features such as assisted and autonomous driving develop, requiring more advanced chips.

But, in the short term, there is no quick fix for the lack of chip supply: IHS Markit estimates that the time it takes to deliver a microcontroller has doubled to 26 weeks and shortages will only bottom out in March.

That puts the production of 1 million light vehicles at risk in the first quarter, says IHS Markit. European chip industry executives and analysts agree that supply will not catch up with demand until later in the year.

Chip shortages are having a “snowball effect” as auto makers idle some capacity to prioritize building profitable models, said Anwar at Strategy Analytics, who forecasts a drop in car production in Europe and North America of 5%-10% in 2021.

The head of Franco-Italian chipmaker STMicroelectronics, Jean-Marc Chery, forecasts capacity constraints will affect carmakers until mid-year.

“Up to the end of the second quarter, the industry will have to manage at the lean inventory level,” Chery told a recent Goldman Sachs conference.

(Douglas Busvine from Berlin and Christoph Steitz from Frankfurt; Additional reporting by Mathieu Rosemain and Gilles Gillaume in Paris; Editing by Susan Fenton)


Aussie and sterling hit multi-year highs on recovery bets




By Tommy Wilkes

LONDON (Reuters) – The Australian dollar rose to near a three-year high and the British pound scaled $1.40 for the first time since 2018 on optimism about economic rebounds in the two countries and after the U.S. dollar was knocked by disappointing jobs data.

The U.S. currency had been rising in recent days as a jump in Treasury yields on the back of the so-called reflation trade drew investors. But an unexpected increase in U.S. weekly jobless claims soured the economic outlook and sent the dollar lower overnight.

On Friday it traded down 0.3% against a basket of currencies, with the dollar index at 90.309.

The Aussie rose 0.8% to $0.784, its highest since March 2018. The currency, which is closely linked to commodity prices and the outlook for global growth, has been helped by a recent rally in commodity prices.

The New Zealand dollar also gained, and was not far off a more than two-year high, while the Canadian dollar rose too.

Sterling rose to $1.4009 on Friday, an almost three-year high amid Britain’s aggressive vaccination programme.

Given the size of Britain’s vital services sector, analysts say the faster it can reopen the economy, the better for the currency. Sterling was also helped by better-than-expected purchasing managers index flash survey data for February.

The U.S. dollar has been weighed down by a string of soft labour data, even as other indicators have shown resilience, and as President Joe Biden’s pandemic relief efforts take shape, including a proposed $1.9 trillion spending package.

Despite the recent rise in U.S. yields, many analysts think they won’t climb too much higher, limiting the benefit for the dollar.

“Our view remains that the Fed will hold the line and remain very cautious about tapering asset purchases. We think it will keep communicating that tightening is very far off, which should dampen pro-dollar sentiment,” said UBS Global Wealth Management strategist Gaétan Peroux and analyst Tilmann Kolb.

ING analysts said “the rise in rates will be self-regulating, meaning the dollar need not correct too much higher”.

They see the greenback index trading down to the 90.10 to 91.05 range.


The euro rose 0.4% to $1.2134. The single currency showed little reaction to purchasing manager index data, which showed a slowdown in business activity in February. However, factories had their busiest month in three years, buoying sentiment.

The dollar bought 105.39 yen, down 0.3% and a continued retreat from the five-month high of 106.225 reached Wednesday.

(Editing by Hugh Lawson and Pravin Char)
