In the global financial services industry, new technology is driving a “transformational” change as Martin Wheatley, CEO of the FCA, put it earlier this year. Whether it’s mobile banking, high frequency trading (HFT), payments technology, portfolio analysis or big data, all the main drivers behind industry change are technology-led.
This trend presents market participants with both opportunities and challenges. While many will find important new openings in these developments, Mr Wheatley also noted that for others “the skies are darker”.
“The concern here is that financial services will become a kind of tech-led Wild West, if you like – full of cybercrime, data losses, runaway algos, flash crashes and hash-crashes of the type we saw last year, when hackers took over the twitter feed of AP [Associated Press].”
On the other side of the Atlantic, US regulators are all too familiar with these difficulties. In fact, as our recent Global Enforcement Review (GER) reveals, technology is also transforming the way regulators monitor and control the industry.
Published in April, our GER report found that from 2006 to 2013, the SEC increased the number of employees by approximately 22%, but overall expenditure by 62%. For Hong Kong’s SFC, the increases were 51% and 120% in the same period.
If not on headcount, where is the extra money going? The answer is technology: investment in systems, rather than staff, accounts for the bulk of the regulators’ additional expenditure.
As SEC chairwoman Mary Jo White observed earlier this year: “It is not only our job to keep pace with this rapidly changing environment, but, where possible, also to harness and leverage advances in technology to better carry out our mission”.
Likewise, while expenditure increases have trailed the rise in headcount at the FCA, creating advanced Information Systems (IS) has been a primary objective, as stated in the regulator’s 2014/15 Business Plan.
Our research showed that in the US for 2013, nearly a quarter (24%) of all actions were enforcement actions over insider trading, market manipulation, financial fraud, issuer disclosure and similar activities. Meanwhile, in Hong Kong and the UK, insider dealing and market manipulation accounted for the lion’s share of the total value of fines against individuals. They were the primary causes of enforcement actions in Hong Kong, and the second largest in the UK. Across the world, regulators are firmly focussed on ensuring clean, fair and orderly markets.
Significant investment in technology, by regulators and firms alike, is the inevitable consequence of the regulators’ focus on market abuse.
Technology is central to detecting this abuse and developing cases against wrongdoers. This is perhaps most obvious when it comes to high frequency and algorithmic trading. Regulations, such as the EU’s revised Markets in Financial Instruments Directive (MiFID), may impose restrictions on the usage of high-tech transaction tools, but enforcement will depend on systems like the US’s National Exam Analytics Tool (NEAT) and Market Information Data Analytics System (MIDAS) or the UK’s ZEN database.
By capturing and analysing millions of transactions a day, these surveillance technologies improve the capacity of regulators to detect insider trading, front running, market manipulation and other forms of misconduct. These investments are also, in a sense, pooled: increasing cross-border cooperation. This has been demonstrated in recent market abuse cases around Libor and Forex rigging and illustrates that regulators are willing, and able, to share data to identify abuse. Creating clear channels for exchange of information across jurisdictions and with law enforcement agencies has been a significant accomplishment in global regulators’ investigative and prosecutorial procedures.
Nevertheless, it is not just regulators who must bear the burden of facing up to new technological challenges. The financial services industry itself must play its part and the regulators have made this clear.
Regulations such as MiFID and European Market Infrastructure Regulations (EMIR) already point to a greater need in terms of transaction reporting requirements from regulated firms and also non-financial counterparties. As technology enables regulators to more easily “crunch data”, the demand to supply it also grows. The growing capabilities of the regulators will also mean greater scrutiny of the data being supplied. There will be greater pressure from regulators for firms to produce accurate and timely data in the correct form.
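The pressure to supply accurate, well-formed data lends itself to automation. As a minimal sketch (the field names, the ISIN and the validation rules here are hypothetical, not drawn from any actual MiFID or EMIR reporting schema), a firm might run a completeness and format check over each transaction record before submission:

```python
from datetime import datetime

# Hypothetical minimum field set for a transaction report.
REQUIRED_FIELDS = {"trade_id", "isin", "price", "quantity", "timestamp"}

def validate_report(record: dict) -> list:
    """Return a list of problems with a transaction report record --
    the kind of completeness and format check a firm could automate
    before supplying data to a regulator."""
    problems = [f"missing field: {f}"
                for f in sorted(REQUIRED_FIELDS - record.keys())]
    if "timestamp" in record:
        try:
            datetime.fromisoformat(record["timestamp"])
        except (TypeError, ValueError):
            problems.append("timestamp not ISO 8601")
    if "price" in record and not isinstance(record["price"], (int, float)):
        problems.append("price not numeric")
    return problems

report = {"trade_id": "T-1001", "isin": "GB00B03MLX29",
          "price": "12.50", "quantity": 400,
          "timestamp": "2014-06-30T09:31:05"}
print(validate_report(report))  # ['price not numeric']
```

Checks like these are cheap to run on every record, which is precisely why regulators can now expect correctly formed data as a matter of course.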
As noted in the GER report, firms will be expected to police themselves and demonstrate that they are maintaining strong cultures and systems to detect and discourage misconduct. Manual, sample-based approaches to monitoring and even the continuing use of spread sheets in some firms will give way to more sophisticated and automated systems that capture and analyse entire sets of data. There is simply no other way to monitor transactions effectively and guard against abuse, even for firms executing only a modest number of trades.
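To make the contrast with sample-based monitoring concrete, here is a toy sketch of a full-dataset surveillance screen (the traders, symbols and thresholds are invented for illustration; real systems are far more sophisticated): flag every trade placed shortly before a large price move in the same instrument, across the entire trade population rather than a sample.

```python
from dataclasses import dataclass

@dataclass
class Trade:
    trader: str
    symbol: str
    day: int    # trading day index
    side: str   # "buy" or "sell"

# Hypothetical price-move events: (symbol, day, fractional move).
events = [("ACME", 10, 0.18), ("GLOBEX", 22, -0.12)]

def flag_pre_event_trades(trades, events, window=3, threshold=0.10):
    """Flag every trade placed within `window` days before a large
    price move in the same symbol -- a crude insider-dealing screen
    run over the full data set rather than a sample."""
    flagged = []
    for symbol, day, move in events:
        if abs(move) < threshold:
            continue
        for t in trades:
            if t.symbol == symbol and 0 < day - t.day <= window:
                flagged.append(t)
    return flagged

trades = [
    Trade("alice", "ACME", 8, "buy"),     # 2 days before an +18% move
    Trade("bob", "ACME", 2, "buy"),       # well before the move
    Trade("carol", "GLOBEX", 21, "sell"), # 1 day before a -12% move
]
print([t.trader for t in flag_pre_event_trades(trades, events)])
# ['alice', 'carol']
```

The point is not the logic itself but its coverage: a spreadsheet-and-sampling approach would only ever look at a fraction of these trades, whereas an automated screen examines all of them.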
With the advent of MiFIR and MAR, this will continue to present challenges for firms other than large sell-side investment banks, from which regulators have long required detailed transaction monitoring and reporting. Theoretically at least, those banks already have significant internal monitoring and reporting capabilities in place; in Kinetic Partners’ experience, many buy-side firms do not.
Updating technology and capacity will require investments of time and money, and considerable support from advisors. However, regardless of what regulators require, it is in the interest of businesses to embrace new technologies.
Regulators have emphasised that ultimate responsibility properly lies with individuals, rather than the firm. It is individuals who are increasingly the focus of sanctions. Even the SEC, which has historically been more focussed on actions against individuals than some other regulators, remains keen to stress its continued commitment to pursuing those individuals who are accountable.
Regulators are significantly more likely to look positively on firms which have established robust systems to detect and report abuse themselves, or at least demonstrated a proactive commitment. However, with the increased technological complexity of markets, and enhanced regulatory surveillance capabilities, firms and individuals judged to have contributed to, rather than curtailed, the Wild West, will be ever more likely to find themselves in the regulatory cross-hairs.
Uncertain Times for the Financial Sector… Is Open Source the Solution?
By Kris Sharma, Finance Sector Lead, Canonical
Financial services are an important part of the economy and play a wider role in providing liquidity and capital across the globe. But ongoing political uncertainty and the consequences of the COVID-19 crisis have deep implications for the UK’s financial services sector.
In a post-Brexit world, the industry is facing regulatory uncertainty at a whole different scale, with banking executives having to understand the implications of different scenarios, including no-deal. To reduce the risk of significant disruption, financial services firms require the right technology infrastructure to be agile and responsive to potential changes.
The role of open source
Historically, banks have been hesitant to adopt open source software. But over the course of the last few years, that thinking has begun to change. Organisations like the Open Bank Project and Fintech Open Source Foundation (FINOS) have come about with the aim of pioneering open source adoption by highlighting the benefits of collaboration within the sector. Recent acquisitions of open source companies by large and established corporate technology vendors signal that the technology is maturing into mainstream enterprise play. Banking leaders are adopting open innovation strategies to lower costs and reduce time-to-market for products and services.
Banks must prepare to rapidly implement changes to IT systems in order to comply with new regulations, which may be a costly task if firms are solely relying on traditional commercial applications. Changes to proprietary software and application platforms at short notice often have hidden costs for existing contractual arrangements due to complex licensing. Open source technology and platforms could play a crucial role in helping financial institutions manage the consequences of Brexit and the COVID-19 crisis for their IT and digital functions.
Open source software gives customers the ability to spin up instances far more quickly and respond to rapidly changing scenarios effectively. Container technology has brought about a step-change in virtualisation, providing almost equivalent levels of resource isolation to a traditional hypervisor. This in turn offers considerable opportunities to improve agility, efficiency, speed, and manageability within IT environments. In a survey conducted by 451 Research, almost a third of financial services firms saw containers and container management as a priority and planned to begin using them within the next year.
Containerisation also enables rapid deployment and updating of applications. Kubernetes, or K8s for short, is an open-source container-orchestration system for deploying, monitoring and managing apps and services across clouds. It was originally designed by Google and is now maintained by the Cloud Native Computing Foundation (CNCF). Kubernetes is a shining example of open source, developed by a major tech company, but now maintained by the community for all, including financial institutions, to adopt.
The data dilemma
The use cases for data and analytics in financial services are endless and offer tangible solutions to the consequences of uncertainty. Massive data assets mean that financial institutions can more accurately gauge the risk of offering a loan to a customer. Banks are already using data analytics to improve efficiency and increase productivity, and going forward, will be able to use their data to train machine learning algorithms that can automate many of their processes.
For data analytics initiatives, banks now have the option of leveraging the best of open source technologies. Databases today can deliver insights and handle any new sources of data. With models flexible enough for rich modern data, a distributed architecture built for cloud scale, and a robust ecosystem of tools, open source platforms can help banks break free from data silos and enable them to scale their innovation.
Open source databases can be deployed and integrated in the environment of choice, whether public or private cloud, on-premise or containers, based on business requirements. These database platforms can be cost effective; projects can begin as prototypes and develop quickly into production deployments. As a result of political uncertainty, financial firms will need to be much more agile. And with no vendor lock-in, they will be able to choose the provider that is best for them at any point in time, enabling this agility while avoiding expensive licensing.
As with any application running at scale, production databases and analytics applications require constant monitoring and maintenance. Engaging enterprise support for open source production databases minimises risk for business and can optimise internal efficiency.
Additionally, AI solutions have the potential to transform how banks deal with regulatory compliance issues, financial fraud and cybercrime. However, banks need to get better at using customer data for greater personalisation, enabling them to offer products and services tailored to individual consumers in real time. As yet, most financial institutions are unsure whether a post-Brexit world will focus on gaining more overseas or UK-based customers. With a data-driven approach, banks can see where the opportunities lie and how best to harness them. The opportunities are vast and, on the journey to deliver cognitive banking, financial institutions have only just scratched the surface of data analytics. But as the consequences of COVID-19 continue and Brexit uncertainty once again moves up the agenda, moving to data-first will become less of a choice and more of a necessity.
The number of data sets and the diversity of data is increasing across financial services, making data integration tasks ever more complex. The cloud offers a huge opportunity to synchronise the enterprise, breaking down operational and data silos across risk, finance, regulatory, customer support and more. Once massive data sets are combined in one place, the organisation can apply advanced analytics for integrated insights.
Uncertainty on the road ahead
Open source technology today is an agile and responsive alternative to traditional technology systems that provides financial institutions with the ability to deal with uncertainty and adapt to a range of potential outcomes.
In these unpredictable times, banking executives need to achieve agility and responsiveness while at the same time ensuring that IT systems are robust, reliable and managed effectively. And with the option to leverage the best of open source technologies, financial institutions can face whatever challenges lie ahead.
The end of the cookie and the new era of digital marketing
By Richard Wheaton, UK MD of data company fifty-five
If you are following the current announcements around data governance in digital marketing, you may be forgiven for thinking that digital media performance measurement is coming to an end. Europe’s GDPR and California’s CCPA have set out the legal frameworks for new privacy compliance requirements, and Apple’s ITP safeguards block the use of 3rd party cookies on its devices. Google has committed to follow suit, with similar blocks for its Chrome browser by 2022.
There are going to be some online targeting tactics that you will not be able to deploy in this new world, and for users with iPhones and Apple Mac computers, those tactics are already significantly curtailed. By 2022 there will be a whole raft of use cases in targeting tools like DMPs (Data Management Platforms) that will be rendered useless on the vast majority of devices.
So, what can you do? Are we set back to the days of purely contextual targeting, in which you bought the “eyeballs” of customers of a specific financial product or a given industrial sector by advertising in their vertical trade journal? Do we just have to send the same ad to everyone, and hope that some of them are in the market for our niche products?
Moving towards the end of 3rd party cookies
For marketers of financial products and services, these are important questions because managing your audiences is important for avoiding waste and for addressing your engaged clients and prospects in targeted and effective ways. The ability to track and remarket to customers has been a mainstay of digital marketing for the past decade. This has all been enabled by the dropping of cookies – files that sit on your PC and record your browsing activity. With moves from regulators and tech companies to protect user anonymity, the cookie while not quite dead, is certainly crumbling.
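The distinction that regulators and browsers care about is where a cookie comes from relative to the site being visited. As a minimal sketch (the domains here are invented for illustration), a cookie is “first-party” when its domain matches the site the user is on, and “third-party” when it belongs to some other domain, such as an ad tracker embedded in the page:

```python
from urllib.parse import urlparse

def cookie_party(page_url: str, cookie_domain: str) -> str:
    """Classify a cookie as first- or third-party by comparing the
    cookie's domain with the site the user is actually visiting."""
    site = urlparse(page_url).hostname or ""
    domain = cookie_domain.lstrip(".")
    # First-party: the cookie domain is the visited site or a parent
    # domain of it; anything else is third-party.
    if site == domain or site.endswith("." + domain):
        return "first-party"
    return "third-party"

print(cookie_party("https://www.examplebank.co.uk/loans",
                   "examplebank.co.uk"))   # first-party
print(cookie_party("https://www.examplebank.co.uk/loans",
                   "adtracker.example"))   # third-party
```

It is the second category, the cookie set by a domain the user never chose to visit, that ITP and the forthcoming Chrome changes shut down.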
The upshot is that marketers cannot rely on 3rd party data anymore and should be wary of how any use of 3rd party data will be viewed by the regulator. This renders even the most basic consumer journey starting with a mobile search and ending with a desktop ecommerce transaction hard to measure in many cases. But the use of anonymized 1st party data can still yield critical insights, and this is where we need to focus our efforts.
From individual to anonymized data
So, what has fundamentally changed in digital measurement? The overarching shift is from a world where brands could access the log-file data of each individual user to a world of anonymized user behaviour, based on vast pools of data.
The ownership of these data pools, and the ability to collect and model the data, sits with the brand that has direct contact with the consumer. For large banks, insurance and pension companies, this is a resource for efficient and effective digital marketing, based on aggregated insights. For the adtech giants – Google, Amazon, Facebook – this will provide increased levels of targeting options within their tools.
The adtech identity graph
Google is in some ways the most advanced in delivering a scalable solution, which is worth examining. Google uses an undisclosed, supposedly huge sample of users for whom it has first collected the right consent. It can accurately track across publishers, devices and even offline channels. From this it extrapolates to the entire population, applying machine learning or more traditional online polling. The end report is always aggregated and only available above a certain volume threshold. Facebook and Amazon are constantly developing similar aggregated audiences for targeting with their inventory.
These audience aggregation capabilities may appear to be similar to Nielsen’s panel-based measurement model. But this really is digital marketing on rocket fuel. It allows brands to leverage the tech brands’ enormous identity graphs and universal app tracking capabilities. And because their identity graphs are vast and data collection in real-time, the final estimate is not only accurate, but most of all, granular and actionable – the two main limitations of traditional panel-based systems.
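The underlying statistical move is straightforward, even if the identity graphs behind it are not. As a simplified sketch (the panel size, population and exposure rate below are invented, and this is not Google’s actual methodology), reach can be estimated by measuring exposure within a fully consented panel and extrapolating to the population:

```python
import random

random.seed(7)

# Hypothetical consented panel: 1,000 users drawn from a population
# of 50,000,000; each record says whether the user saw the ad.
population = 50_000_000
panel = [random.random() < 0.23 for _ in range(1_000)]  # ~23% exposed

def estimate_reach(panel_exposures, population_size):
    """Extrapolate panel exposure to the full population, the way an
    aggregated, consent-based measurement system reports reach."""
    rate = sum(panel_exposures) / len(panel_exposures)
    return round(rate * population_size)

print(f"estimated reach: {estimate_reach(panel, population):,}")
```

What distinguishes the adtech platforms from a classical panel vendor is not this arithmetic but the scale and freshness of the sample, which is what makes the resulting estimates granular enough to act on.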
Google launches machine-learning managed ad frequency
Practical applications of these tools are valuable to marketers. For example, Google has recently announced the launch of a new machine-learning powered feature to manage ad frequency across its Google Ad Manager platform. It will predict how many times a user was exposed to an ad for reach and frequency analysis, based on its undisclosed sample of users whom Google can track with full certainty.
This announcement comes after some similar releases across the entire stack. ‘Cross-environment estimated conversions’ will show you when customers convert on the same device, as well as when customers click a search ad on one device or browser but convert in a different environment. These are rich insights enabling you to optimize any campaigns.
Integrating the new reports into decision-making
Financial marketers will use these new features as they become available in the Google Marketing Platform, but it is also vital for these decision-makers to have their own measures of effectiveness and performance. Most of the features above are still in Beta, rarely compatible with third party systems and often difficult to export. While many media agencies are judged on ‘cost per acquisition’, this is far more difficult in a world where the conversion is estimated and therefore not auditable. This makes it essential for brand marketers to build their own measurement frameworks and integrate insights into their own sales and demand trackers.
The secret lies in your 1st party data, which is comparatively less affected than 3rd party data: Safari’s ITP imposes only a short expiry window on it, and Chrome leaves it as yet untouched. There is still scope for projects that focus on maximising how you capture and use your 1st party data, both to employ smarter segmentation of your media activation, and to obtain insights by separating your Chrome traffic from your ITP-affected traffic to tailor your campaign activation and measurement.
It will also be valuable for marketers in finance sectors to use their own data to build better targeting capabilities, and to monitor and validate the information they are getting from the large tech companies, whose insights are otherwise largely unauditable.
There is also a time-limit to the availability of these measurable insights, because when Google removes open-ended access to your 1st party cookies, the ability of brands to obtain them will be reduced to almost zero.
A measurement mindset
While people might mourn the loss of the present system, in truth very few brands have had the genuine discipline or technical knowledge to make their measurement truly insightful. And in reality, even for brands that made the effort to obtain genuinely robust data, the figures in their analyses were not 100% accurate, with data discrepancies and ad fraud skewing the findings and providing a false picture. Some will also be uncomfortable with the aggregation of power within the big tech companies, but the options for working outside their platforms will be increasingly few.
In the post-cookie world, what is really required is a change of mindset when measuring performance, which will be as important to you as the tools and technology that you use. We are in a new era and those who are prepared to adapt their thinking are best placed to succeed.
Setting up secure remote working for financial services
By Pete Watson, CEO, Atlas Cloud
Financial advisors, insurers, banks and brokers: the entire financial services sector has been forced to rethink operating models to keep staff safe and clients served throughout a global pandemic. Traditionally, the industry as a whole has been slower to embrace home working and the solutions that power it, as roles were heavily office-based, or branch-based in the case of traditional banks.
Underpinned by stringent regulations and with access to an abundance of sensitive data, the process of remote cloud-based working is not only a discussion of logistics. It also throws up huge security considerations for firms making the transition. With 95% of financial services applications now said to possess vulnerabilities, it is easy to understand any reservations these companies might have about implementing a Software-as-a-Service working environment.
However, these same concerns may also hinder success in the long run, as a more flexible digital approach moves from value-added to business-critical in a remote setting. The question is, how do firms create a setup that both minimises risk and maximises success?
Understanding your needs
As with every new technology, the first step is understanding why your business needs it. Like all modern workplaces, financial service firms expect a lot from their IT infrastructure. Users require better connectivity, access and speed from their systems and decision makers must implement a network connectivity infrastructure that can meet these demands. Today, this setup must also account for every device that employees are using to work remotely.
Next come the fundamental security considerations. It goes without saying that financial services epitomise the need for watertight IT systems. Financial service firms store their sensitive information within key applications, all of which vary based on the service requirements. With sensitive customer and company data stored in applications and on devices, their networks are at constant risk of compromise.
Protecting this information is a necessity, and managing a large scale shift to the cloud must cover every base – particularly as cybercriminals have raised their game to ‘exploit the chaos’ during lockdown. However, the right approach often differs considerably depending on where those key applications are hosted.
Setting up securely
If financial firms have older applications, then data will typically be stored on-premise on company servers. Where this is the case, moving to a cloud-hosted virtual desktop environment offers a quick way to get remote working underway securely and with minimal friction. Modern Desktop-as-a-Service (DaaS) solutions can replicate the familiar desktop setting users know and love, but allow them to access all data, email and applications from anywhere and on any device. With all data managed in a secure cloud setting, no sensitive data is stored directly on the end-user device. This improves all-important cyber resilience without having to invest large resources in upgrading to new iterations of required software applications.
In other cases, many firms have already migrated legacy on-premise applications to SaaS-based applications, which alleviates the hosting burden with access enabled through your everyday web browser. However, where sensitive data is concerned, it is not quite so easy to just give employees access to a SaaS platform, where they can use any device and download often sensitive data to uncontrolled personal devices outside of the company network.
This issue has intensified in lockdown. A recent survey found that 23% of those working in financial services are now using their laptop to work, with 43% storing work on their personal devices. Inevitably, more devices in use leads to more potential entry points. A large amount of trust is placed with individuals to make sure they’re working safely. But when handling large caches of highly sensitive data, even the smallest oversight or breach could prove disastrous when adequate levels of threat protection are not in place. With 20% of home workers having taken no action to mitigate potential cyber threats during lockdown, successful security becomes about the safeguards that firms actively put in place.
To alleviate risk, companies would previously look to IP address blocking, which restricts access to SaaS logins to only ‘whitelisted’ IP addresses, such as the company premises. When working out of the office, you could then use a VPN to route your internet back through the office and allow access. However, as many have learned, a VPN-based approach has inherent security challenges that can be exploited by attackers to gain access to a network. A weak identity or unguarded device can allow unwanted visitors access to data through a VPN, with many often flying under the radar undetected. What’s more, in a time where access to on-premise devices and networks is limited, the significant housekeeping to stay on top of patching VPNs is by no means scalable or effective.
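The allow-listing approach described above is simple to express in code. As a minimal sketch (the network ranges are documentation addresses, and the names are invented for illustration), a login gateway would admit a request only when the client’s IP falls inside an approved network:

```python
import ipaddress

# Hypothetical allow-list: the office network plus a VPN egress range.
ALLOWED_NETWORKS = [
    ipaddress.ip_network("203.0.113.0/24"),   # head office
    ipaddress.ip_network("198.51.100.0/28"),  # VPN concentrator
]

def is_allowed(client_ip: str) -> bool:
    """Return True only when the client IP falls inside an
    allow-listed network -- the pre-VPN-era access control
    described above."""
    addr = ipaddress.ip_address(client_ip)
    return any(addr in net for net in ALLOWED_NETWORKS)

print(is_allowed("203.0.113.42"))  # True: office machine
print(is_allowed("192.0.2.7"))     # False: unknown home broadband
```

The sketch also makes the weakness plain: the check trusts the network, not the user or the device, which is exactly the gap a stolen credential or compromised VPN endpoint slips through.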
Assessing your options
Hosted virtual desktops offer a convenient solution to these issues, as firms no longer need to rely on the security of every device, and all security patching and updates can be applied to all users simultaneously. Yet, for firms already running their business from SaaS applications that enable access from anywhere – albeit not as effectively – this can seem an unnecessary expense. So, how do you then unlock the security benefits without overhauling your IT approach?
There are various solutions that offer add-on security features to your existing SaaS-based setup. The likes of Citrix Workspace can deliver a secure multi-factor login into one controlled cloud environment, providing access to pre-approved company apps and file storage with one-click access. Although a seemingly simple change, this additional layer not only keeps firms in control, but also affords user access to business-critical information and apps from any Internet-enabled location.
This calibre of financial data naturally makes firms across the sector susceptible to an array of other cyber threat tactics. Financial services are no stranger to spam emails carrying viruses, or calculated impersonation attacks designed to deceive staff with malicious attachments, URLs and other pieces of content. When assessing a workspace setup, financial service companies should consider multi-level assessment solutions that deliver advanced checks to protect them from this popular method of attack. What’s more, modern disaster recovery from certified suppliers can reduce the risk of network downtime, avoiding the reputational damage (of the kind TSB suffered) and FCA fines, and maintaining continuity with data loss measured in seconds, not minutes.
Remote working is here to stay, and the time to act on security is now. While it can be a daunting task for finance companies, simply ignoring the growing cyber risks of a modern working environment could be catastrophic. By working with the right partners to put these safeguards in place, financial services can arm their workforce with secure remote working at scale, keeping threat at bay and maintaining the standard of client care needed to assure customers in times of change.