By H. P. Bunaes, founder of AI Powered Banking.
As analytics, descriptive and predictive, are embedded in business processes in every nook and cranny of your organization, managing the operational risk associated with all of this is critical. A failure of your data analytics may, at best, impact operational efficiency, but at worst it could result in reputational damage or monetary loss. What makes this tricky is that analytics can appear to be working normally, when in fact erroneous results are being produced and sent to unsuspecting internal or external recipients downstream.
When there were only a handful of models in use, and they were developed by one group who controlled them from end to end, operational risk was manageable. But analytics is becoming pervasive, and may now be fragmented across many functions and lines of business, and operational risk is rising as a result. Many analytics groups have a long backlog of requests and resources are stretched thin. Monitoring of models in production may be low on the priority list. And, it is the rare organization indeed that knows where all the analytics in operation are and how they are being used.
Some recent examples:
● A chief analytics officer at a large US bank described how a model for approving overdrafts was found deeply embedded in the deposit system. No one remembered it was there, never mind knew how it worked.
● Another described the “what the hell” moment when data critical to credit models one day simply disappeared from the data stream.
● And a consumer banking analytics head at another bank described how models used to predict delinquencies suddenly stopped working as the pandemic hit since data used to build them was simply no longer relevant.
The topic of model risk management has been well thought through, and in some sectors, such as banking, regulatory guidance is clear. But the focus of model risk management has been on model validation and testing: all the important things that need to happen prior to implementation.
But as one head of analytics told me recently “it’s what happens after the fact that is of greatest concern [now]”. A new head of Model Risk Management at a top 10 US bank told me that “operational risk management is top of mind”. And a recently retired chief analytics officer added that unfortunately “[data scientists] just don’t get operational risk.”
In many organizations, the full extent of deployed analytics is not known. There is no consolidated inventory of analytics, so no one knows where it all is or what it does. According to several people I spoke with, one large US bank surveyed all of its predictive models in operation last year and found “thousands of models” that had not been through any formal approval, validation, or testing process.
TOOLS AND PLATFORMS
There are tools and platforms coming on the market for managing analytics op risk (often referred to, somewhat narrowly, as “ML ops”, for machine learning operations). I’ve counted 10 of them: Verta.ai, Algorithmia, quickpath, fiddler, Domino, ModelOp, superwise.ai, DataKitchen, cnvrg.io, and DataRobot (their standalone MLops product formerly known as Parallel M). Each vendor takes a somewhat different approach to managing analytics ops risk. Oversimplifying a bit, most focus either on model monitoring or on model management; only a few try to do both. Algorithmia is strong in model management, quickpath is strong in model monitoring. ModelOp and Verta.ai try to do both.
But, none of them have a prescribed operational risk management (ORM) framework. And without an effective framework for managing analytics in use, no tool will solve the problem.
In this article I will describe what an effective ORM framework for analytics should include, at a minimum.
The keystone to any ORM framework is a comprehensive model inventory, a database of models including all documentation, metadata (e.g. input data used and its source and lineage, results produced and where consumed), and operational results and metrics. Knowing what and where all of your analytics are and where and how they are being used is a prerequisite for good ORM. You can’t manage what you don’t know about.
Requiring that all data about each model is captured and stored centrally prior to implementation and use is the first bit of policy I’d recommend. All of the model validation and testing done in an effective Model Risk Management process needs to be captured in the model inventory/database. And all model inputs and model outputs, their sources and their destinations need to be cataloged.
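As an illustration, a minimal inventory entry might capture exactly these items: ownership, purpose, input lineage, downstream consumers, and validation evidence. This is a sketch only; the schema and field names are my assumptions, not a standard.

```python
from dataclasses import dataclass, field

@dataclass
class ModelRecord:
    """One entry in a central model inventory (illustrative schema)."""
    model_id: str
    owner: str
    business_use: str                       # why the model exists and who relies on it
    input_sources: list[str] = field(default_factory=list)     # upstream data lineage
    output_consumers: list[str] = field(default_factory=list)  # downstream systems/users
    validation_docs: list[str] = field(default_factory=list)   # links to testing evidence
    version: str = "1.0"

# A registry keyed by model_id gives one place to answer
# "what models do we have, and where are they used?"
inventory: dict[str, ModelRecord] = {}
inventory["overdraft-approval"] = ModelRecord(
    model_id="overdraft-approval",
    owner="deposits-analytics",
    business_use="approve/decline overdraft requests",
    input_sources=["core_deposits.balances", "customer.profile"],
)
```

In practice this would live in a database with enforced registration before deployment, but even a structure this simple answers the “where is it and who owns it” question.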
The second bit of policy is that any use of a model must be captured centrally: who is using the model, why, and to do what? The framework falls apart if there are unknown users of models. As described in a great paper on the hidden technical debt of analytics models, a system of models can grow over time such that a change to one model can affect many downstream models. “Changing anything changes everything.”
The second critical piece to analytics operational risk management is good change management: data change management, IT change management, and model change management. Nothing ever stays the same. The environment changes, client and competitor behavior changes, upstream data sources come and go, and the IT environment is in a constant state of change. From my experience, and confirmed through many conversations with industry practitioners, the primary reason that models fail in operation is poor change management. Even subtle changes, with no obvious impact to downstream models, can have dramatic and unpredictable effects.
Changes to data need to go through a process for identifying, triaging, and remediating downstream impacts. A database of models can be used to quickly identify which models could be impacted by a change in the data. The data changes then need to be tested prior to implementation, at least for models exceeding some risk threshold. Changes to models themselves need to be tested as well when those results, even if more accurate for one purpose, are consumed by multiple applications or as inputs to other models downstream. And, of course, changes to the IT environment need to be tested to be sure that there isn’t an impact to models such as latency or performance under load.
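That triage step can be sketched in a few lines, assuming the inventory records each model’s upstream sources as plain names (a toy structure for illustration; real lineage tracking is considerably richer):

```python
# Toy inventory: model_id -> list of upstream data sources it consumes.
inventory = {
    "overdraft_approval": ["core_deposits.balances", "customer.profile"],
    "delinquency_model": ["bureau.tradelines", "customer.profile"],
}

def models_impacted_by(changed_source: str) -> list[str]:
    """Models whose inputs include the changed source; these are the
    candidates for regression testing before the data change goes live."""
    return sorted(model_id for model_id, sources in inventory.items()
                  if changed_source in sources)

# A change to customer.profile flags both models for testing.
impacted = models_impacted_by("customer.profile")
```

The point is that the query is trivial once the lineage is captured; without a central inventory, the same question requires an organization-wide scavenger hunt.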
People tend to dislike a change management process viewed as slow or bureaucratic, so change management has to be time and cost efficient: higher-priority changes go through first, with routine changes handled at lower priority. If the change management process is slow and burdensome, people will inevitably try to go around it, degrading its effectiveness.
Model monitoring means actively watching models for signs of any degradation or of increasing risk of failure (prior to any measurable degradation). An analytics head at a top 10 US bank confided that “modelers just don’t think monitoring is important”. Monitoring must include watching the incoming data for drift, data quality problems, anomalies in the data, or combinations of data never seen before. Even subtle changes in the incoming data can have dramatic downstream effects. There must be operational metrics and logs, capturing all incoming data and outgoing results, performance relative to SLAs, volumes over time, and a record of all control issues or process failures.
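One widely used drift check is the Population Stability Index (PSI), which compares the distribution of a model input at build time against recent incoming data. A minimal sketch, assuming NumPy is available; the 10-bin layout and the 0.2 alert threshold are common conventions, not standards:

```python
import numpy as np

def population_stability_index(expected, actual, bins=10):
    """PSI between a baseline sample and a recent sample of one model
    input. A PSI above ~0.2 is a common rule-of-thumb flag for
    meaningful drift (thresholds vary by institution)."""
    # Bin edges come from the baseline (expected) distribution
    edges = np.percentile(expected, np.linspace(0, 100, bins + 1))
    edges[0], edges[-1] = -np.inf, np.inf  # catch values outside the baseline range
    expected_counts, _ = np.histogram(expected, bins=edges)
    actual_counts, _ = np.histogram(actual, bins=edges)
    # Small floor avoids log(0) for empty bins
    expected_pct = np.clip(expected_counts / len(expected), 1e-6, None)
    actual_pct = np.clip(actual_counts / len(actual), 1e-6, None)
    return float(np.sum((actual_pct - expected_pct) * np.log(actual_pct / expected_pct)))
```

Run per input feature on a schedule, a check like this catches the “data quietly changed upstream” failures described above before they show up as bad predictions.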
Operational data on models must be captured and logged to provide an audit trail, for diagnostics, and for reporting purposes. Logs should include all incoming data used in the model and all resulting predictions output, as well as volumes and latency metrics for tracking performance against SLAs. Traceability, explainability, and reproducibility will all be necessary for 3rd line of defense auditors and regulators.
Traceability means the full data lineage from raw source data through all data preparation and manipulation steps prior to model input. Explainability means being able to show how models arrived at their predictions, including which feature values were most important to the predicted outcomes. Model reproducibility requires keeping a log not only of incoming data, but of the model version, so that results can be replicated in the future after multiple generations of changes to the data and/or the model itself.
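A minimal audit record makes the reproducibility requirement concrete: tie every prediction to the exact inputs and model version that produced it, so the result can be replayed after later generations of change. The field names below are illustrative assumptions, not a standard:

```python
import json
import time

def log_prediction(model_id: str, model_version: str,
                   inputs: dict, prediction) -> str:
    """Build one append-only audit record per scoring call. Capturing
    the model version alongside the inputs is what makes the result
    reproducible after the model or data changes."""
    record = {
        "timestamp": time.time(),
        "model_id": model_id,
        "model_version": model_version,
        "inputs": inputs,          # raw feature values as scored
        "prediction": prediction,  # the output sent downstream
    }
    return json.dumps(record)  # in practice, write to durable storage
```

Replaying a logged record against the archived model version should produce the identical prediction; if it does not, either the log or the model archive is incomplete.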
Issue logs must be continuously updated describing any process failures (unanticipated incoming data changes), control failures (data quality problems), or outages causing models to go “off line” temporarily. Auditors and regulators will want to see a triage and escalation process, demonstrating that the big issues are identified and get the right level of attention quickly.
ETHICS AND MODEL BIAS
Models must be tested for bias and independently reviewed for fairness and appropriateness of data use. Reputational risk assessments should be completed, including a review of the use of any sensitive personal data. Models should be tested for bias across multiple demographics (gender, age, ethnicity, and location). Models used for decisioning, such as credit approval, must especially be independently reviewed for fairness. A record of declines, for example, should be reviewed to ensure that the model is not systematically declining any one demographic unfairly. It is an unavoidable consequence of building predictive models that any model trained on biased data will itself be biased. It may therefore be necessary to mask sensitive data from the model that could result in unintentional model bias.
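The decline-record review above can be sketched simply: compute decline rates per demographic group and flag a wide spread for independent review. The acceptable disparity threshold is a policy choice (and a legal question in regulated lending), not something this sketch decides:

```python
from collections import defaultdict

def decline_rate_disparity(decisions):
    """decisions: iterable of (demographic_group, declined_bool) pairs.
    Returns decline rates per group and the ratio of the highest rate
    to the lowest; a wide spread flags the model for independent
    fairness review."""
    totals = defaultdict(int)
    declines = defaultdict(int)
    for group, declined in decisions:
        totals[group] += 1
        declines[group] += int(declined)
    rates = {g: declines[g] / totals[g] for g in totals}
    spread = max(rates.values()) / max(min(rates.values()), 1e-9)
    return rates, spread
```

Note that a disparity in raw decline rates is a flag, not proof of unfairness; the independent review determines whether legitimate risk factors explain the gap.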
Lastly, it is not enough to have an effective model management and monitoring process. One must be able to prove to auditors and examiners that it works. For that you need good reporting which includes:
● An inventory of all models in operation
● A log of all model changes in a specified time period (this quarter to date, last full quarter, year to date, etc): new models implemented, model upgrades, and models retrained on new data
● A log of data changes: new data introduced, new features engineered, or changes in data definitions or usage
● For changes to existing models, performance metrics on out-of-sample test data before and after the enhancements
● For each model in production, ability to generate a detailed report of model operation including a log of data in/results out, model accuracy metrics (where absolute truth can be known after the fact), and operational metrics (number of predictions made, latency, and performance under load for operationally critical models)
● Issue log: issue description, issue priority, date of issue logging and aging, status of remediation, escalation status, actions to be taken, and individual responsible for closure, new issues and closed issues in a given period
● Operational alert history: for a given period, for each model, a report of all incoming data alerts (missing data, data errors, anomalies in the data)
● Data change management logs showing what data changed and when, and which models were identified as potentially affected and tested
● IT change management logs showing changes to the infrastructure affecting models
In my experience, auditors and examiners presented with a comprehensive report package for review can be satisfied that you have an effective process in place and are likely to stop there. If no such evidence is available, they will look much deeper into your organization’s use of models, which will be disruptive to operations and will likely result in a long list of issues for management attention.
There are multiple ways to create the right organizational partnerships for effective analytics ORM. The brute force method would be to create a new organizational unit for “analytics operations”. One could argue in favor of this approach that this new organizational unit could be built with all the right skills and expertise and could build or select the right tools and platforms to support their mission.
But a better approach might be to create a virtual organization comprised of all the key players: data scientists, data engineers (the CDO’s organization, typically), the business unit, model risk management (typically in Corporate Risk Management, but sometimes found in Finance or embedded in multiple business units), traditional IT, and audit.
Orchestrating this partnership requires clear roles and responsibilities, and well articulated and documented policies and procedures explaining the rules of the road and who’s responsible for every aspect of analytics ORM.
The latter is harder to pull off and requires more upfront thought and investment, but it may yield a better and more efficient result in the long run, as everyone has a stake in the success of the process and existing resources can be both leveraged and focused on the aspects of the framework they are best suited to support.
As organizations increasingly become analytics driven, a process for managing analytics operational risk will safeguard the company from unpleasant surprises and ensure that analytics continue to operate effectively. Some might argue that the process outlined here will be costly to build and operate. I would argue that (a) they are already spending more than they think on model operations, management, and maintenance (b) that unexpected failures that cascade through the data environment are always harder and more costly to fix than the cost of proactive prevention and (c) that creating a centrally managed process will free up expensive resources to do more of the high value add work the business needs. Companies that want to scale up analytics will find that an effective ORM framework creates additional capacity, speeds the process, and eliminates nasty surprises.
H.P. Bunaes has 30 years of experience in banking, with broad banking domain knowledge and deep expertise in data and analytics. After retiring from banking, H.P. led the financial services industry vertical at DataRobot, designing the go-to-market strategy for banking and fintech and advising hundreds of banks and fintechs on data and analytics strategy. H.P. recently founded AI Powered Banking (https://aipoweredbanking.net) with a mission of helping banks and fintechs leverage their data and advanced analytics, and helping technology firms craft their GTM strategy for the financial services sector. H.P. is a graduate of M.I.T., where he earned an M.S. in Information Technology.
What to Know Before You Expand Across Borders
By Sean King, Director of International Tax at McGuire Sponsel
The American retail giant, Target Corporation, has a market cap of $64 billion and access to seemingly limitless resources and advisors. So, when the company engaged in its first global expansion, how could anything possibly go wrong?
Less than two years after opening its first Canadian store in 2013, Target shut down all 133 Canadian locations and terminated more than 17,000 Canadian employees.
Expansion of an operation to another country can create unique challenges that may impact the financial viability of the entire enterprise. If Target Corporation can colossally fail in its expansion to Canada, how might Mom ‘N’ Pop LLC fare when expanding into Switzerland, Singapore, or Australia?
Successful global expansion requires an understanding of multilayered taxes, regulatory hurdles, employment laws, and cultural nuances. Fortunately, with the right guidance, global expansion can be both possible and profitable for businesses of any size.
Any company with global ambitions must first consider whether the company’s expansion outside of the U.S. will give rise to a taxable presence in the local country. In the cross-border context, a “permanent establishment” can be created in a local country when the enterprise reaches a certain level of activity, which is problematic because it exposes the U.S. multinational to taxation in the foreign country.
Foreign entity incorporation
To avoid permanent establishment risk, many U.S. multinationals choose to operate overseas through a formal corporate subsidiary, which reduces the company’s foreign income tax exposure, though it may result in an additional level of foreign income tax on the subsidiary’s earnings. In most jurisdictions, multinationals can operate their business in the foreign country as a branch, a pass-through (e.g., a partnership), or a corporation.
As a branch, the U.S. multinational does not create a subsidiary in the foreign country. It holds assets, employees, and bank accounts under its own name. With a pass through, the U.S. multinational creates a separate entity in the foreign country that is treated as a partnership under the tax law of the foreign country but not necessarily as a partnership under U.S. tax law.
U.S. multinationals can also create corporate subsidiaries in the foreign country treated as corporations under the tax law of both the foreign country and the U.S., with possibly two levels of income taxation in the foreign country plus U.S. income taxation of earnings repatriated to the U.S. as dividends.
Under U.S. entity classification rules, certain types of entities can “check the box” to elect their classification to be taxed as a corporation with two levels of tax, a partnership with pass-through taxation, or even be disregarded for U.S. federal income tax purposes. The check the box election allows U.S. multinationals to engage in more effective global tax planning.
Toll charges, transfer pricing and treaties
When establishing a foreign corporate subsidiary, the U.S. multinational will likely need to transfer certain assets to the new entity to make it fully operational. However, in many cases, the U.S. multinational cannot perform the transfer without recognizing taxable income. In the international context, the IRS imposes certain outbound “toll charges” on the transfer of appreciated property to a foreign entity, which are usually provided for in IRC Section 367 and subject to various exceptions and nuances.
Instead, the U.S. multinational may prefer to license intellectual property to the foreign subsidiary for a fee rather than transfer the property outright. However, licensing requires the company and foreign subsidiary to adhere to transfer pricing rules, as dictated by IRC Section 482. The U.S. multinational and the foreign subsidiary must interact in an arms-length manner regarding pricing and economic terms. Furthermore, any such arrangement may attract withholding taxes when royalties are paid across a border.
Are you GILTI?
Certain U.S. multinationals opt to focus on deferring the income recognition at the U.S. level. In doing so, they simply leave overseas profits overseas and delay repatriating any of the earnings to the U.S.
Despite the general merits of this form of planning, U.S. multinationals will be subject to certain IRS anti-deferral mechanisms, commonly known as “Subpart F” and GILTI. Essentially, U.S. shareholders of certain foreign corporations are forced to recognize their pro rata share of certain types of income generated by these foreign entities at the time the income is earned instead of waiting until the foreign entity formally repatriates the income to the U.S.
The end goal
Essentially, all effective international tax planning boils down to treasury management. Effective and early tax planning can properly allow a company to better achieve its initial goal: profitability.
If global expansion is on the horizon for your company, consult a licensed professional for advice concerning your specific situation.
Pandemic risks eclipse treasury priorities as businesses diversify investments to mitigate impact
The Covid-19 pandemic has shunted aside existing challenges to sit atop treasurers’ priority lists, according to “The resilient treasury: Optimising strategy in the face of covid-19”, a survey run by the Economist Intelligence Unit (EIU) and sponsored by Deutsche Bank.
The results show that treasurers are looking to diversify their investments in a bid to mitigate the pandemic’s impact, including heightened liquidity, foreign-exchange and interest-rate risks. As many as 55% plan to increase investments in long-term instruments, with 48% increasing investments in bank deposits, another 48% in local investment products, and 47% in money-market funds.
“The Covid-19 pandemic has drastically altered business plans in 2020. It has placed a certain level of strain on treasury processes, but the challenge it presents has been managed by traditional treasury skills. It is clear that pandemic risk will be on the treasury checklist for years to come, but it is one of many risks the department faces and will continue to manage,” says Melanie Noronha, the EIU editor of the report.
Despite Covid-19 looming large, other challenges wait in the wings. Notably, the replacement of the London Interbank Offered Rate was identified by 38% of respondents as the main challenge of their function.
Technology, meanwhile, continues to be a pressing issue, with treasury teams becoming increasingly reliant on IT solutions. Here, data quality is rising up the list of concerns. Already highlighted as very or somewhat concerning in 2019 by 69% of respondents, the figure rose to 78% in 2020. Acquiring the necessary skill sets to realise the full benefits of this data and technology is also a continuing priority – with some progress registered from last year. In 2020, 30% of respondents say they have all the skills they need to manage technological change, up from 22% in 2018.
“Treasury’s focus on technology is not only helping teams operate more efficiently in a remote-working environment, it has long played – and continues to play – a key role in realising their long-term priorities,” notes Ole Matthiessen, Head of Cash Management, Corporate Bank, Deutsche Bank. The survey shows that managing relationships with banks and suppliers (highlighted by 32% of respondents) and collaborating with other functions of the business (also 32%) remain top of the agenda – and seamless digital systems will help give treasurers the bandwidth and insight to be more effective partners for both internal and external stakeholders.
Based on a global survey of 300 treasury executives conducted between April and May, the report explores corporate treasurers’ attitudes towards the drivers of strategic change in the treasury function – from the pandemic through to regulation and technology – and their priorities for the next five years.
Digital collaboration: Shaping the Future of Finance
By Ryan Lester, Senior Director of Customer Experience Technologies at LogMeIn
With heightened economic uncertainty and increased customer expectations becoming the norm in the banking industry, it is understandable that the sector is struggling to stay afloat. In this precarious environment, banking institutions are doing their best to remain relevant in a competitive landscape and to ensure their customers remain a priority.
When it comes to the first half of this year, the pandemic has shown how easy it is for industries to fail. Customers and companies alike had to get used to the new normal, as physical locations started to close. The banking industry felt this first hand, as banks were made to restructure how their business ran, with restricted opening hours and a wider push to motivate people to use online banking.
While some customers had already embraced digital options prior to the pandemic, this was in stark contrast to the elderly population, who frequently visited branches to access their finances. Moving forward, banks have to adopt new methods to ensure customers get the most out of their accounts without their experience suffering.
Heightened Customer Expectations
When the pandemic reached its peak, people were encouraged to use online banking, as telephone contact was under strain with long waiting times and pressure mounting on contact centre agents. According to Fidelity National Information Services (FIS), which works with 50 of the world’s largest banks, there was a 200% jump in new mobile banking registrations in early April, while mobile banking traffic rose 85%.
With branches remaining closed, customers were continuously urged to limit their calls to the most urgent cases and to consider whether they could find answers through mobile banking or the company website. Although self-service was already being adopted in pockets of the industry, this was a real catalyst that spurred banks to up their game on digital channels and self-service tools.
Banks are challenged with precariously balancing customer needs against the cost of personalised support. With the demographic of customers shifting over the last few years, customers are becoming younger and more comfortable with technology. Influenced by the “Amazon Effect”, their expectations have risen to an all-time high, placing record strain on the sector.
Customer experience isn’t just about support anymore; it’s about serving your customer at every point in the journey. Companies have an opportunity to elevate the experience they provide by moving beyond one-and-done interactions to create continuous engagements with their customers. Customer experience is becoming a primary competitive differentiator in the market, and one where there is still little differentiation between providers. Deploying AI chatbot technology can strategically help banks improve customer experience and raise the level of support that agents provide.
Digital collaboration: Working around the Clock
The benefits of adopting digital channels and self-service tools are second to none. By implementing chatbots, fuelled by conversational AI, banks will be able to help serve a wide range of customer queries and ensure they are protected from fraud and scams.
Conversational AI is exactly what it sounds like: a computer programme that engages in a conversation with a human. When it comes to service delivery, conversational AI can be deployed across multiple channels to engage with customers in ways that effectively address evolving customer needs. At a time defined by COVID-19, self-service tools such as conversational chatbots can work around the clock to solve customer queries in a concise and timely way. Of course, self-service tools won’t completely replace human agents in the banking industry, but they will help companies re-distribute customer traffic and workflows in ways that enhance customer experience. Self-service tools fuelled by conversational AI can also improve employee experience, because service employees can handle fewer, but higher-level, service tasks that chatbots might escalate to them.
Adopting new tools to help facilitate consistent and concise answers and help maintain customer experience is at the forefront of many industry minds. Banks such as the NatWest Group have seen this first-hand and are testament to the benefits that a good digital experience can provide. Simon Johnson, Capability Consultant, Digital at NatWest Group highlights NatWest’s use of digital tools during lockdown: “Over the last few months, we’ve learnt how to use digital tools to help our employees remotely. From a banking perspective, there have been a lot of changes, including base rate changes, fee waivers, and finding the best ways of contacting our vulnerable customers, ensuring we keep them protected from frauds and scams.
“By introducing our Bold360 chatbot interface, Ella, we’ve been able to get relevant information out quickly, apply the best practice and ensure that our customer journeys are being developed correctly. Due to the volume of questions, some of our customers were finding themselves waiting longer than usual. So digital channels become essential to helping reduce the wait time. Using Bold360, we were able to mitigate issues and answer questions in a more timely way through our chatbot.
“Moving forward, as we open more digital services, we are analysing our data to see if customers will return to their usual way of banking, now that they’ve seen what a good digital experience can provide. Either way, with Ella, we are ready.”
Chatbots and Humans: The Best Option for Customer Service
Over the last year, banking institutions have recognised the power that digital collaboration can bring to their success. Delivering exceptional customer service and support is key for any business wanting to stay competitive in today’s market, and banks are especially challenged with precariously balancing customer needs against the cost of personalised support. Leveraging the right technology, such as AI-powered chatbots, will enable the banking industry to provide better support and a more robust customer experience in the long term. Other institutions must follow suit, or risk becoming obsolete.