Consumer cloud services evolve so fast that they create expectations unmatched by business services. Is the “business class” cloud appearing on the horizon? Or is it something that enterprise customers and ICT should work together to create? asks James Walker, President, CEF
“What is our cloud strategy?” – A question increasingly demanded of the CIO, as if expected to unroll a definitive roadmap across territory that is still evolving as fast as tidal saltmarsh.
Blame it on the consumer cloud, where new services are constantly emerging and adapting in the face of on-going feedback from millions of users. Compared with the efficiency, immediacy, friendliness and clarity of favourite services – such as eBay, Google, Skype and Amazon books – business cloud services lag way behind.
Employees are enjoying a better online experience playing games, watching movies and shopping at home than they get at work. “Business Class” used to be the by-word for superior service – in today’s cloud it feels to younger employees more like travelling by freight than even Economy Class. Nor is it just a question of current staff blaming IT for all their woes: bright new graduates are increasingly opting for employment in companies with a more advanced strategy in terms of social media and cloud services.
Nor can this demand for cloud strategy be dismissed as a passing fad. Sure, the cloud is currently high on the Gartner Hype Cycle, but there is no denying the importance to business of its basic promises of agile, scalable services and a shift from high CapEx to the more business-friendly OpEx financial model.
In addition to this business efficiency argument, there are two other growing pressures to migrate to the cloud. The first is the increasing role of the cloud in cementing supply chain and other stakeholder partnerships – your suppliers, your business partners and your customers all expect to see more automated processes connecting them in the cloud.
Then there is the growing expectation of big data mining and its competitive advantages. For a large multinational with millions of customers the computational burden would be at supercomputer levels – a massive investment in capital expenditure that would be far better bought and paid for as a service.
The changing role of the CIO
IT still sees itself as largely responsible for building and maintaining an efficient enterprise communications infrastructure. In fact today’s CIO is increasingly playing the role of a portfolio manager.
Network and storage technology is becoming good enough to be taken for granted, like good plumbing. What really matters now is the mass of applications running on it and, in a multinational enterprise, the CIO could be responsible for a portfolio of several thousand applications and their variations.
We cannot simply grade these applications in terms of importance or criticality. There are certain functions – say employee expense reporting – that may be vital to the individual but make very little demand on the system in terms of latency and QoS, but others do require specific yet diverse standards of service.
Point of Sale services cannot be held up for more than a few seconds without causing queues, unhappy customers and an erosion of the company’s reputation for good service – they must get high priority. Although videoconferencing demands far higher levels of service in real time, it is relatively flexible in terms of scheduling, where one minute’s delay would seldom do much long-term damage. When it comes to financial trading and machine-to-machine services, then we are no longer talking about seconds or minutes, but microsecond delays. Financial information that is even slightly out of date could actually be dangerous in such systems – in this case it had better be lost than delayed.
Think in terms of portfolio management – providing so many applications and meeting such diverse demands – and cloud Software as a Service (SaaS) becomes hugely attractive. No more need to keep up with security patches, software updates, and consistent versions across the enterprise and other application lifecycle overheads: SaaS delivers shiny new, yet proven, applications on tap and you only pay for what you need.
So why does anyone still buy software? Why is SaaS from the public cloud still only responsible for around ten to fifteen per cent of all business applications? It has a lot to do with uncertainty about the security and reliability of cloud services.
Uncertainty about the security and reliability of cloud computing has itself more to do with perception than underlying reality. Instinct tells us that data is safest when we don’t let it out of our sight, when we trust it to our own systems and not to some unknown servers in an unspecified location run by strangers.
It is a similar instinct to one that many airline passengers feel when, instead of sitting in a car with a clear view of the road ahead, they are seated with hundreds of others in a vast flying building manned by a pilot they cannot see and with no view of what lies ahead. Basically, this is a sound instinct, because humans evolved to move on land and not fly above the clouds at hundreds of miles per hour.
But for that very reason the entire multi-billion pound air travel industry depends upon proving that instinct wrong by making air travel as safe as possible using every technical and strategic means. Thus it is that we regularly hear statistics proving that flying is by far the safest means to travel long distance, and that the biggest danger to the air traveller lies in the journey to the airport, and not during the flight.
So it is with public cloud computing. Of course it is risky trusting data to strangers but, for that very reason, the multi-billion pound cloud industry uses every technical and strategic means to protect that data and deliver reliable service. Data may feel safer in one’s own private cloud, but it is highly unlikely that the private cloud will have the same levels of protection from cyber-attack, the same levels of redundancy and the same quality of infrastructure as in a reputable public cloud service. Yes, public clouds do make tempting targets for hackers, just as planes make tempting targets for terrorists, but in both cases the levels of defence go way beyond what most private alternatives could afford.
As in the case of air travel, the greatest risk lies not within the cloud, but in the journey to it via the Internet. Quite apart from the risks of hacking and denial of service attacks when an enterprise uses public cloud services, there is the question of reliability in a non-deterministic Internet connection. Is it possible to do really serious business via teleconference when the Internet connection gets overloaded in mid-negotiation?
There is one very sound solution to this last problem, and that is to connect to the cloud not via public Internet but via a dedicated private, or virtual private line connection via Carrier Ethernet. We will return to this later, but first we must consider whether business cloud services really merit that connection.
Rebuilding “Business Class” in the cloud
Why is it that consumer cloud services are so far ahead of business offerings in terms of simplicity, functionality and sheer practicality?
One explanation is the massive market for consumer services. Many of these services begin by costing nothing to the user, who soon numbers in the millions. The new service is an amazing innovation, so who cares if it is a little clunky when you are paying nothing for it? But with all that feedback plus such a massive test market the service evolves very fast in competition with other services, and a brilliant public service begins to emerge.
Business services, on the other hand, begin with hundreds, not millions, of customers who have to pay for what they get. The competition is still there, but not the massive trial and error potential of a free service that allows such dynamic evolution. So business services seem clumsy, unfriendly and poor performers compared with what an employee can enjoy at home or on their smartphone.
The difference is relative, not absolute. The biggest business cloud providers such as AWS or Salesforce do have thousands of business customers and they are getting the level of feedback that allows constant refinement. The gap begins to close between the best of business services and consumer offerings. But there is a danger that this could lead to a fragmented cloud service for business, one where the enterprise becomes locked into a particular cloud service that may be very good but does not allow flexibility to choose and swap suppliers when needed.
This is not just about allowing customer mobility to ensure competition; there may also be critical reasons for wanting a service based in a local datacentre – bringing compute power close to reduce “data gravity” and ensure low latency. It can also be necessary for conformance with data protection legislation, when that becomes more important than the very high level of service that is offered by some far distant datacentre.
So, business cloud services will surely evolve with time, but will they evolve in a manner that ultimately serves the business community?
The role of the CEF
The CloudEthernet Forum believes that the development of cloud services described above needs every encouragement, but it should also be supported by a collaborative process that involves every type of cloud stakeholder – customers as well as providers and carriers – in order to ensure that cloud development is not only fast and well-targeted but also leads to an open market based on universal standards and certified conformance to recognised needs.
Why is this of particular concern to an Ethernet forum? The answer is that Ethernet is rapidly becoming the glue holding the whole cloud concept together. Within the datacentre high speed Ethernet is mostly replacing specialist technologies such as Infiniband, while Carrier Ethernet has overtaken every other protocol in the WAN connecting datacentres.
So, just as Ethernet required new attributes to make it suitable for the WAN, so does it need refinement to make it suitable for the special demands of the cloud. And, just as Carrier Ethernet was made possible by the commitment of all its stakeholders to work together to create standards, so will CloudEthernet be best developed by a wide-ranging partnership.
A business class cloud
As was suggested, the applications in the CIO’s portfolio have very diverse needs. Some are already sitting happily on the public Internet service, but others are too fussy to be trusted to existing services. Business needs levels along the lines of “Silver, Gold and Platinum Service” so it can pay for just the level needed for any application, and not be committed to top rates for the less critical work.
But it needs more than that. Basic problems remain with current cloud technology. These include the requirement for a dedicated network path: this may be due to legislation about where sensitive personal data is stored, or what routes it travels along, or it may be due to the need to predict latency for market trading, or simply to minimise it for optimal compute power. A giant cloud provider may now offer the added reliability of fail-safe transfer to a second datacentre in case of problems, but what are the legal and operational implications of this transfer to the customer?
Another issue is the time it takes to establish links between different WANs and providers in order to provide cloud service across regions and continents. Work is needed to standardise services so that business can swap providers as fast as market movements dictate.
The CEF has already laid the groundwork by identifying five fundamentals required for a universal CloudEthernet, under the headings Virtualization, Automation, Security, Programmability and Analytics (VASPA). Details of these priorities are already available in a free White Paper available from the CEF website.
Back to “Cloud strategy”
Has this article helped the CIO formulate a reply to that demand for a “cloud strategy”?
I would seriously suggest that any long term cloud strategy for a large potential cloud user, such as a multinational business, must include involvement in shaping the cloud to deliver the sort of service a business requires. And the best way to get involved is by joining the co-operative cloud providers, carriers, NEMs and users that are already CEF members and are shaping tomorrow’s cloud while it is still in the formative stages.
Secondly I would repeat the message that the public cloud really is safe and reliable, and that the greatest risk of breakdown lies in the Internet connection to it – and that is best addressed by using a dedicated Ethernet connection for critical services.
Finally, I would recommend avoiding the public versus private cloud debate and thinking simply in terms of hybrid solutions. The argument for public cloud hinges largely on economy: the “pay as you go” costing that is so much better for business than massive capital outlay. However, if you look more closely at the figures, it can still be more cost-effective to build your own private cloud and run routine work internally.
The key word here is “routine”. If business was always in steady state, running a predictable workload, then there would be many advantages in sticking to a private cloud solution. The problem lies in the spikes and sudden demands – do you have to over-supply your resources in case of the occasional peak demand? The rule of thumb here is: “own the base and rent the spikes”.
In other words, the optimal solution could be a hybrid strategy that provides in house facilities to support most everyday business, plus a flexible contract with a public cloud provider that takes care of sudden peak demands.
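To make the “own the base and rent the spikes” arithmetic concrete, here is a back-of-the-envelope sketch. Every figure in it – server counts, peak hours, per-server costs – is a hypothetical assumption for illustration, not vendor pricing:

```python
# Hypothetical cost comparison: sizing a private cloud for peak demand
# versus owning the base load and renting the spikes from a public cloud.
# All figures are illustrative assumptions.

BASE_LOAD = 100          # servers needed for routine, steady-state work
PEAK_LOAD = 250          # servers needed during occasional demand spikes
PEAK_HOURS = 500         # hours per year actually spent at peak
PRIVATE_COST_PER_SERVER_YEAR = 2000   # amortised CapEx plus operations
CLOUD_COST_PER_SERVER_HOUR = 0.50     # pay-as-you-go OpEx

# Option 1: size the private cloud for the peak; capacity sits idle most of the year.
all_private = PEAK_LOAD * PRIVATE_COST_PER_SERVER_YEAR

# Option 2: own the base, rent only the spike capacity for the hours it is needed.
hybrid = (BASE_LOAD * PRIVATE_COST_PER_SERVER_YEAR
          + (PEAK_LOAD - BASE_LOAD) * CLOUD_COST_PER_SERVER_HOUR * PEAK_HOURS)

print(f"All-private, peak-sized: {all_private:,.0f}")   # 500,000
print(f"Hybrid (base + spikes):  {hybrid:,.0f}")        # 237,500
```

Under these assumed numbers the hybrid approach costs less than half as much, because the 150 spike servers are paid for only during the 500 peak hours rather than all year; the real decision, of course, depends on an organisation’s actual load profile and prices.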
This is the way to really save money, not pitting CapEx versus OpEx but balancing the two over the longer term. And is this not what strategy is all about?
TCI: A time of critical importance
Fabrice Desnos, Head of Northern Europe Region at Euler Hermes, the world’s leading trade credit insurer, outlines the importance of less publicised measures for the journey ahead.
After months of lockdown, Europe is shifting towards rebuilding economies and resuming trade. Amongst the multibillion-euro stimulus packages provided by governments to businesses to help them resume their engines of growth, the cooperation between the state and private sector trade credit insurance underwriters has perhaps missed the headlines. However, this cooperation will be vital when navigating the uncertain road ahead.
Covid-19 has created a global economic crisis of unprecedented scale and speed. Consequently, we’re experiencing unprecedented levels of support from national governments. Far-reaching fiscal intervention, job retention and business interruption loan schemes are providing a lifeline for businesses that have suffered reductions in turnovers to support national lockdowns.
However, it’s becoming clear the worst is still to come. The unintended consequence of government support measures is delaying the inevitable fallout in trade and commerce. Euler Hermes is already seeing an increase in claims for late payments and expects this trend to accelerate as government support measures are progressively removed.
The Covid-19 crisis will have long lasting and sometimes irreversible effects on a number of sectors. It has accelerated transformations that were already underway and had radically changed the landscape for a number of businesses. This means we are seeing a growing number of “zombie” companies, currently under life support, but whose business models are no longer adapted for the post-crisis world. All factors which add up to what is best described as a corporate insolvency “time bomb”.
The effects of the crisis are already visible. In the second quarter of 2020, 147 large companies (those with a turnover above €50 million) failed; up from 77 in the first quarter, and compared to 163 for the whole of the first half of 2019. Retail, services, energy and automotive were the most impacted sectors this year, with the hotspots in retail and services in Western Europe and North America, energy in North America, and automotive in Western Europe.
We expect this trend to accelerate and predict a +35% rise in corporate insolvencies globally by the end of 2021. European economies will be among the hardest hit. For example, Spain (+41%) and Italy (+27%) will see the most significant increases – alongside the UK (+43%), which will also feel the impact of Brexit – compared to France (+25%) or Germany (+12%).
Companies are restarting trade, often providing open credit to their clients. However, there can be no credit if there is no confidence. It is increasingly difficult for companies to distinguish the clients that will emerge from the crisis from those that won’t, and to know whether or when they will be paid. In the immediate post-lockdown period, without visibility and confidence, the risk was that inter-company credit could evaporate, placing an additional liquidity strain on the companies that depend on it. This, in turn, would significantly put at risk the speed and extent of the economic recovery.
In recent months, Euler Hermes has co-operated with government agencies, trade associations and private sector trade credit insurance underwriters to create state support for intercompany trade, notably in France, Germany, Belgium, Denmark, the Netherlands and the UK. All with the same goal: to allow companies to trade with each other in confidence.
By providing additional reinsurance capacity to the trade credit insurers, governments help them continue to provide cover to their clients at pre-crisis levels.
The beneficiaries are the thousands of businesses – clients of credit insurers and their buyers – that depend upon intercompany trade as a source of financing. Over 70% of Euler Hermes policyholders are SMEs, which are the lifeblood of our economies and major providers of jobs. These agreements are not without costs or constraints for the insurers, but the industry has chosen to place the interests of its clients and of the economy ahead of other considerations, mindful of the important role credit insurance and inter-company trade will play in the recovery.
Taking the UK as an example, trade credit insurers provide cover for more than £171 billion of intercompany transactions, covering 13,000 suppliers and 650,000 buyers. The government has put in place a temporary scheme of £10 billion to enable trade credit insurers, including Euler Hermes, to continue supporting businesses at risk due to the impact of coronavirus. This landmark agreement represents an important alliance between the public and private sectors to support trade and prevent the domino effect that payment defaults can create within critical supply chains.
But, as with all of the other government support measures, these schemes will not exist in the long term. It is already time for credit insurers and their clients to plan ahead, and prepare for a new normal in which the level and cost of credit risk will be heightened and where identifying the right counterparts, diversifying and insuring credit risk will be of paramount importance for businesses.
Trade credit insurance plays an understated role in the economy but is critical to its health. In normal circumstances, it tends to go unnoticed because it is doing its job. Government support schemes helped maintain confidence between companies and their customers in the immediate aftermath of the crisis.
However, as government support measures are progressively removed, this crisis will have a lasting impact: it will accelerate transformations, lead to an increasing number of company restructurings and, in all likelihood, increase the level of credit risk. To succeed in the post-crisis environment, businesses have to move fast from resilience to adaptation. They have to adopt bold measures to protect their businesses against future crises (or another wave of this pandemic), minimize risk, and drive future growth. By maintaining trust to trade, with or without government support, credit insurance will have an increasing role to play in this.
What Does the FinCEN File Leak Tell Us?
By Ted Sausen, Subject Matter Expert, NICE Actimize
On September 20, 2020, just four days after the Financial Crimes Enforcement Network (FinCEN) issued a much-anticipated Advance Notice of Proposed Rulemaking, the financial industry was shaken, and bank stock prices saw significant declines when the markets opened on Monday. What caused this? Buzzfeed News, in cooperation with the International Consortium of Investigative Journalists (ICIJ), released what is now being tagged the FinCEN Files. These files and summarized reports describe over 200,000 transactions, totalling over $2 trillion USD, that were reported to FinCEN as suspicious in nature between 1999 and 2017. Buzzfeed obtained over 2,100 Suspicious Activity Reports (SARs) and over 2,600 confidential documents financial institutions had filed with FinCEN over that span of time.
Similar leaks have occurred previously, such as the Panama Papers in 2016, in which over 11 million documents belonging to a Panamanian law firm, containing personal financial information on over 200,000 entities, were released. This was followed a year and a half later by the Paradise Papers in 2017, a leak that contained even more documents and named more than 120,000 persons and entities. Three factors make the FinCEN Files leak significantly different from those. First, these are highly confidential documents leaked from a government agency. Secondly, they weren’t leaked from a single source: the documents came from nearly 90 financial institutions facilitating financial transactions in more than 150 countries. Lastly, while some high-profile names were released in this leak, its focus centered more on the transactions themselves and the financial institutions involved, not necessarily the names of the individuals involved.
FinCEN Files and the Impact
What does this mean for the financial institutions? As mentioned above, many experienced a negative impact to their stocks. The next biggest impact is their reputation. Leaders of the highlighted institutions do not enjoy having potential shortcomings in their operations exposed, nor do customers of those institutions appreciate seeing the institution managing their funds portrayed adversely in the media.
Where did the financial institutions go wrong? Based on the information, it is actually hard to say where they went wrong, or even ‘if’ they went wrong. Financial institutions are obligated to monitor transactional activity, both inbound and outbound, for suspicious or unusual behavior, especially behavior that could indicate illicit activities related to money laundering. If such behavior is identified, the financial institution is required to complete a Suspicious Activity Report, or SAR, and file it with FinCEN. The SAR contains all relevant information such as the parties involved, transaction(s), account(s), and details describing why the activity is deemed to be suspicious. In some cases, financial institutions will file a SAR even where there is no direct suspicion, simply because no logical explanation for the activity can be found.
So what deems certain activities to be suspicious, and how do financial institutions detect them? Most financial institutions have sophisticated solutions in place that monitor transactions over a period of time, determine typical behavioral patterns for each client, and compare that client to their peers. If any activity falls disproportionately beyond those norms, the financial institution is notified, and an investigation is conducted. Because this detection incorporates multiple transactions and compares them to historical “norms”, it is very difficult to stop a money laundering-related transaction in real time. It is not uncommon for a transaction or series of transactions to occur and later be identified as suspicious, with a SAR filed after the transaction has been completed.
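The behavioural baselining described above can be sketched in a few lines. This is a deliberately simplified illustration of the underlying idea – a z-score of a new transaction against a client’s own history – not any vendor’s actual detection logic; real monitoring engines combine many such signals (peer-group comparison, velocity, geography), and the threshold here is a hypothetical assumption:

```python
from statistics import mean, stdev

def flag_suspicious(history, new_amount, z_threshold=3.0):
    """Flag a transaction that falls disproportionately outside a
    client's historical norm, using a simple z-score.
    `history` is a list of past transaction amounts for this client."""
    mu = mean(history)
    sigma = stdev(history)          # sample standard deviation
    if sigma == 0:
        # No historical variation at all: flag any deviation from the norm.
        return new_amount != mu
    z = abs(new_amount - mu) / sigma
    return z > z_threshold

# Typical monthly transfers for a hypothetical client.
history = [120, 95, 110, 130, 105, 98, 115]
print(flag_suspicious(history, 125))    # within the client's norms -> False
print(flag_suspicious(history, 5000))   # far outside the norms -> True
```

In practice the flagged transaction would not be blocked on the spot but routed to an investigator, which is exactly why, as noted above, SARs are so often filed only after the transaction has completed.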
FinCEN Files: Who’s at Fault?
Going back to my original question: was there any wrongdoing? In this case, the financial institutions were doing exactly what they were required to do. When suspicion was identified, SARs were filed. There are two things that are important to note. Suspicion does not equate to guilt, and individual financial institutions have a very limited view of the overall flow of funds. They have visibility of where funds are coming from, or where they are going to; however, they don’t have an overall picture of the original source, or the final destination. The area where financial institutions may be at fault is where multiple suspicions or probable guilt were found, but they failed to take appropriate action. According to Buzzfeed News, instances of transactions to or from sanctioned parties occurred, and known suspicious activity was allowed to continue after it was discovered.
How do we do better? First and foremost, FinCEN needs to identify the source of the leak and fix it immediately. This is very sensitive data. Even within a financial institution, this information is only exposed to individuals with a high-level clearance on a need-to-know basis. This leak may result in relationship strains with some of the banks’ customers. Some people already have a fear of being watched or tracked, and the public revelation that all these reports are being filed by financial institutions with the federal government won’t make that any better – especially if their financial institution was highlighted as one of those filing the most reports. Next, there has been more discussion around real-time AML. Many experts are still working on defining what that truly means, especially when some activities involve multiple transactions over a period of time; however, there is definitely a place for holding certain suspect transactions in real time.
Lastly, the ability to share information between financial institutions more easily will go a long way in fighting financial crime overall. For those of you who are AML professionals, you may be thinking we already have such a mechanism in place with Section 314(b). However, the feedback I have received is that it does not do an adequate job. It’s voluntary, and getting responses to requests can be a challenge. Financial institutions need a consortium through which they can communicate effectively with each other, exchanging the critical data needed to see the complete picture of financial transactions and all associated activities. That, combined with some type of feedback loop from law enforcement indicating which SARs are “useful” versus which are either “inadequate” or “unnecessary”, will allow institutions to focus on those cases where criminal activity is really occurring.
We will continue to post updates as we learn more.
How can financial services firms keep pace with escalating requirements?
By Tim FitzGerald, UK Banking & Financial Services Sales Manager, InterSystems
Financial services firms are currently coming up against a number of critical challenges, ranging from market volatility, most recently influenced by COVID-19, to the introduction of regulations, such as the Payment Services Directive (PSD2) and Fundamental Review of the Trading Book (FRTB). However, these issues are being compounded as many financial institutions find it increasingly difficult to get a handle on the vast volumes of data that they have at their disposal. This is no surprise given that IDC has projected that by 2025, the global “datasphere” will have grown to a staggering 175 zettabytes of data – more than five times the amount of data generated in 2018.
As an industry that has typically only invested in new technology when regulations deem it necessary, many traditional banks are now operating using legacy systems and applications that haven’t been designed or built to interoperate. Consequently, banks are struggling to leverage data to achieve business goals and to gain a clear picture of their organisation and processes in order to comply with regulatory requirements. These challenges have been more prevalent during the pandemic as financial services firms were forced to adapt their operations to radical changes in customer behaviour and increased demand for digital services – all while working largely remotely themselves.
As more stringent regulations come into play and financial services firms look to keep pace with escalating requirements from regulators, consumer demand for more online services, and the ever-evolving nature of the industry and world at large, it’s vital they do two things. Firstly, they must begin to invest in the technology and processes that will allow them to more easily manage the data that traditional banks have been collecting and storing for upwards of 50 years. Secondly, they must innovate. For many, the COVID-19 pandemic will have been a catalyst for both actions. However, the hard work has only just begun.
Traditionally, due to tight budgets and no overarching regulatory imperative to change, financial institutions haven’t done enough to address their overreliance on disconnected legacy systems. Even when faced with the new wave of regulation that was implemented in the wake of the 2008 banking crash, financial services organisations generally only had to invest in different applications on an ad hoc basis to meet each individual regulation. However, as new regulations require the analysis of larger data sets within smaller processing windows, breaking down any and all data siloes is essential and this will require financial institutions that are still reliant on legacy systems to implement new technologies to meet the regulatory stipulations.
With this in mind, solutions which offer high-quality data analytics and enhanced integration will be key to the success of financial institutions and crucial to eliminate data silos. This will enable organisations to achieve a faster and more accurate analysis of real-time and historical data no matter where they are accessing the data from within smaller processing windows to keep pace with regulatory requirements, while also benefiting from low infrastructure costs.
This technology will also play a huge part in helping financial institutions scale their online operations to meet demand from customers for digital services. According to PNC Bank, during the pandemic, it saw online sales jump from 25% to 75%. Therefore, having data platforms that are able to handle surges in online activity is becoming increasingly important.
Real-time analysis of data
While the precise solution financial services institutions need will differ based on the organisation, broadly speaking, the more data they are storing on legacy solutions, the more they are going to require an updated data platform that can handle real-time analytics. Even organisations that have fewer legacy systems are still likely to require solutions that deliver enhanced interoperability to help provide a real-time view across the business and enable them to meet the pressing regulatory requirements they face. Let’s also not lose sight of the fact that moving transactional data to a data warehouse, data lake, or any other silo will never deliver real-time analytics; businesses that make risk decisions on such data in the belief that it is real-time are therefore operating on a false premise.
As such, financial services firms require a data platform that can ingest real-time transactional data, as well as from a variety of other sources of historical and reference data, normalise it, and make sense of it. The ability to process transactions at scale in real-time and simultaneously run analytics using transactional real-time data and large sets of non-real-time data, such as reference data, is a crucial capability for various business requirements. For example, powering mission-critical trading platforms that cannot slow down or drop trades, even as volumes spike.
Access to real-time data will not only enable financial institutions to meet evolving regulatory requirements; it will also allow them to make faster and more accurate decisions for their organisation and customers. With many financial services firms operating globally, this is vital for keeping up not only with evolving regulations but also with changing circumstances in different markets in light of the pandemic. This data can also help them understand how to become more agile, how to keep employees productive while working remotely, and how to build operational resilience. These insights will be equally vital as financial institutions weigh the likelihood of subsequent waves of the virus, giving them a clearer picture of what has and has not worked for their business so far.
The financial services sector is fast-paced and ever-changing. With the launch of more digital-only banks, traditional institutions need to innovate to avoid being left behind, and COVID-19 has only highlighted this further. With more than a third (35%) of customers increasing their use of online banking during this period, the banks and financial services firms with a solid online offering have been best placed to answer this demand.
As financial institutions cater to changing customer requirements, both now and in the future, implementing new technology that provides access to data in real time will help them uncover the fresh insights needed to develop new and transformative products and services for their customers. In turn, this will enable them to realise new revenue streams and potentially capture a bigger slice of the market. Access to data will, for instance, help banks better understand their customers' needs during periods of upheaval as well as in normal circumstances, allowing them to offer the specific services customers need in each period: supporting customers through difficult times while also ensuring the growth of the business. As financial institutions look not only to keep pace with their competitors but to gain an advantage over them, using data to fuel excellent customer experiences will be essential to success.
Amid the current economic uncertainty and market volatility, it is critical that financial services firms can meet the changing requirements coming at them from all angles. COVID-19 is likely to be the biggest catalyst yet for financial institutions to transform digitally, leaving them better able to cater to a rapidly evolving landscape and prepare for continued periods of remote working. As they pursue this, replacing legacy systems with innovative and agile technology will be crucial: it gives them the accurate and complete view of their enterprise data they need to comply with new and changing regulations, and to better meet the needs of consumers in an increasingly digital landscape, whether they are based in an office or working remotely.