No one group has all the expertise needed to create a perfect cloud environment – it demands knowledge of application behaviour, data centre design, network and machine virtualization, wide area networking and much more. Getting all this working together calls for collaboration, and the CloudEthernet Forum is the way to do it, says CEF President James Walker
Sometimes society needs to pause, waiting for one single word or symbol that encapsulates its fundamental need, before moving on. Like a seed crystal dropped into a super-saturated solution, the word “cloud” has become the rallying cry for a range of “as a service” business models now spreading like wildfire. And yet the basic idea – that it makes economic sense to invest heavily in central resources and save money on cheaper access to those resources – has been around since the days of the mainframe computer.
The personal computer almost destroyed that model, as people discovered that they really liked holding their own resources, but it re-emerged with client-server and the savings made possible by allowing a “thinner client”. While the rise of the web revived the idea of centralisation, with a truly thin client accessing services from the Internet, early moves in this direction gathered little momentum. This was partly because Internet access in the 1990s was still too slow, but also because people still cherished the autonomy of having all their assets in their own PC.
What really shifted this caution was the arrival of the smartphone: a new type of thin client appeared that seemed to hold the whole world in its Internet grasp. People did not have to shift perspective and embrace the SaaS model, they just found they were already using it, and the word of this new aeon was “cloud”. The result has been a surge in cloud uptake that took even its strongest advocates by surprise.
The signs are everywhere, as massive new datacentres are springing up in the coldest places: Dell’Oro Group predicts that within five years more than 75% of Ethernet ports will be sold into data centres, with similar predictions for compute and storage gear from Gartner and Forrester. So the total worldwide market for cloud infrastructure and services is expected to grow to $206B in 2016, and the cloud will be the hub for most business investments well into the next decade.
The early adopters are those who accept this total virtualisation process and are quite happy to shift their workloads to other platforms. But, as we move towards a mass cloud market, the industry is beginning to reach people still wedded to the merits of autonomy: a sense of independence backed by more tangible SLAs, of security (or at least more manageable risk), and of control over data integrity in the face of increasing legislation.
This means that the industry will soon be facing a much steeper sales incline – and this is just when it can least afford to slip. If the cloud fails now, it could send the whole market tumbling back down the slope.
The bad news is that the cracks in the cloud structure have already started to show. The good news is that this has been recognised in time and the industry has launched the CloudEthernet Forum and is already rallying to tackle fundamental issues and ensure a reliable, scalable and secure cloud for the coming generation.
A more detailed analysis of the challenges, and suggested steps to their resolution, is available in the CEF white paper The Benefits and Challenges Ahead for Cloud Operators. There are, however, two main factors that need first to be understood to provide context for the technical challenges.
The first is scale. It is understood that the market is rapidly expanding – even more rapidly than expected – but this is a familiar challenge in the IT world with lots of new users coming online. What is different is the explosion in virtual machines, unbounded by the physical limitations usually imposed by the requirement to install hardware. In a virtualised environment every VM is equivalent to a new location added to the network and, even in a low-density datacentre, we could be speaking of many tens of thousands of such “locations”. Already we hear of new giant datacentres hosting over a million VMs: string a few of these together and we will very soon be addressing tens of millions of new network locations.
Ethernet has, quite rightly so far, proved itself as the optimal technology for these datacentres, but it is worth remembering that it is based on a concept designed in the 1970s to string a few computers in the Palo Alto research centre together so they could share a printer. It has developed over three decades by adding switches to extend the service from tens to many thousands of locations. This is a natural evolution in response to growing demand. But the coming VM explosion is way beyond natural, and today’s switch designs simply don’t have the memory to hold forwarding tables for tens of millions of locations. And a move to create new-generation “super-switches” would go against the basic economics that makes Ethernet so suitable.
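The memory argument can be made concrete with a back-of-envelope sketch. The per-entry cost and table capacity used here are illustrative assumptions chosen for the example, not vendor figures:

```python
# Back-of-envelope sketch of the forwarding-table problem. The per-entry
# cost and table capacity below are illustrative assumptions, not figures
# from any vendor or standard.

BYTES_PER_ENTRY = 64             # assumed state per learned MAC/location
TYPICAL_TABLE_ENTRIES = 128_000  # assumed capacity of a commodity switch

def table_megabytes(locations: int) -> float:
    """Table memory needed for one forwarding entry per network location."""
    return locations * BYTES_PER_ENTRY / 1e6

for vms in (10_000, 1_000_000, 30_000_000):
    fits = "fits" if vms <= TYPICAL_TABLE_ENTRIES else "overflows"
    print(f"{vms:>11,} locations: {table_megabytes(vms):>8.1f} MB of table state "
          f"({fits} an assumed {TYPICAL_TABLE_ENTRIES:,}-entry table)")
```

Even under generous assumptions, tens of millions of VM “locations” demand orders of magnitude more table state than a single-datacentre switch was ever designed to hold – which is the gap the article argues a cloud Ethernet must close.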
Doesn’t SDN point the way to a solution – keeping the switches simple and centralising this massive routing burden onto the network controller? It’s an attractive idea and may well be part of the solution, but it is not really what SDN is fundamentally about. The real attraction of SDN is to use central control as a basis to deliver smart new functionality and flexibility to the overall network, virtualising it and creating a more nimble communications infrastructure. Forcing an additional massive “heavy lifting” administrative burden onto the controller shifts the emphasis from software-defined towards software-relieved – reducing what could have been a breakthrough to a sticking-plaster solution. NFV, similarly, may have a role to play, but its immediate effect would be to increase the number of functions running on VMs.
If we are to find new ways to streamline the process, reducing the grunt work rather than moving it all to the control layer, then it will need a fundamental rethink: it will need a cloud Ethernet.
The second big issue centres on collaboration: the problems are different when you begin linking remote datacentres. Yes, it also increases scale, but the real challenge is bringing together mature disciplines with already established boundaries: the people who build data centres and design applications are not WAN experts, just as telcos have much to learn about the needs of applications and data centre architecture.
An enterprise cloud solution typically brings together at least four major players in addition to network equipment manufacturers: datacentre experts, WAN service providers, cloud service providers, and usually an exchange provider such as Equinix, Telx or CoreSite, which may be hosting ten thousand logically discrete tenants in a single datacentre. These are big worlds needing to find common ground or a connecting bridge. If that does not happen, any failure in cloud delivery will widen the rift as each discipline starts blaming the others.
Collaboration is the key. Before the cloud’s Ethernet foundations start to show their cracks, we need the whole industry to work together to reinforce them. There are already giant players in this game: in 2012 AWS, Google and Microsoft accounted for 40% of all the Ethernet ports shipped worldwide. While that gives some idea of their massive investment, the fact that the total is still less than 50% also tells us that no one of these giants is yet big enough to dominate the scene and dictate its own cloud connectivity ‘standards’ for global usage. So standards need to be created before the market fragments.
Taking a familiar example: the outstanding success of Carrier Ethernet happened because vendors collaborated to create and certify global standards under the banner of the MEF – rather than battling each other to see whose technology could take the lead. Users could buy certified services and equipment without having to waste time choosing between technologies, service providers and vendors made faster sales, and world business gained from the acceleration of high-performance, lower-cost WAN services brought about by Carrier Ethernet.
A similar level of collaboration by cloud stakeholders is needed now. The CEF is gathering expertise in application behaviour, datacentre design, security, network and machine virtualization, wide area networking and much more. Those who join are collaborating to build a firm foundation for tomorrow’s cloud – a cloud Ethernet that meets the needs of scalability, determinism and availability at the speed with which VMs are created and torn down. Those who stand aside may find themselves delivering services on a creaking platform, pointing the finger of blame at everyone but themselves.
BNP Paribas joins forces with Orange Business Services to deploy SD-WAN for 1,800 retail sites in France
- Co-construction approach ensures business continuity during deployment
BNP Paribas has chosen Orange Business Services to deploy an SD-WAN solution in more than 1,800 bank branches across France. Focused on developing and integrating new digital solutions, BNP Paribas continues to provide the highest standards to improve user experience for customers and employees alike.
By integrating Flexible SD-WAN from Orange Business Services, BNP Paribas benefits from a modern and agile technological platform to accelerate its digital transformation. This enables quick and easy deployment of multiple services, such as dynamic routing and path selection, with scalability and flexibility. It also empowers administrators to monitor infrastructure performance and resolve potential network congestion through simple software modifications, thereby optimizing application performance. By deploying SD-WAN, BNP Paribas can take advantage of a fully secure hybrid network that is natively multi-cloud, multi-access and multi-application. The Bank will also benefit from optimized and centralized management and intelligent routing for its new infrastructure.
Close collaboration between business and IT for greater agility
From the start of the project, experts from Orange Business Services and BNP Paribas built the solution design together and prioritized the features to be deployed. More than 3,600 access lines – two per branch, including one Internet access line – are currently being rolled out with a focus on maintaining business continuity for each site during the migration. In addition to the SD-WAN overlay, firewalls for enhanced security are also part of this deployment.
“It was paramount for us to choose a partner who already had proven experience implementing and operating SD-WAN solutions. Orange Business Services stood out as this trusted partner. In addition to their IT expertise, the Orange teams demonstrated a great ability to understand our business challenges, and they knew what needed to be done to support our end-to-end digital transformation. This close collaboration between our teams from the very beginning of the project was one of the keys to its success and to a smooth roll out,” said Bernard Gavgani, Chief Information Officer at BNP Paribas Group.
“We are delighted to support BNP Paribas in their transformation program and deploy the first large-scale SD-WAN project in the retail banking industry for the French market. An in-depth understanding of our customers’ business needs is essential to co-develop customized and innovative solutions. Orange Business Services will continue to accompany BNP Paribas’ central and local teams to learn and develop their SD-WAN skills,” said Nadine Foulon-Belkacémi, Executive Vice President, French Major Clients at Orange Business Services.
How to ensure you bulletproof your IT in a hybrid finance workplace
Caleb Mills, Chief Technical Officer at Doherty Associates, outlines the dangers faced by finance and private equity firms when it comes to IT infrastructure in a pandemic. He warns that maintaining security is critical as firms continue to work remotely in the current lockdown while making plans to return to the new blended workplace in 2021.
2020 was a year of rapid change – for the technology sector in particular. Virtually overnight, IT firms had to meet the growing demands of businesses accelerating their technology plans in a bid to stay ahead of the new virtual business environment we suddenly found ourselves in. Covid-19 forced many organisations to abruptly relax their security policies so that employees could operate in the remote-only world which followed the UK’s first national lockdown in March.
Can personal devices ever be compliant?
When the announcement of the first March lockdown was made, employees were sent home to work, and largely did so on their personal devices: home PCs, personal mobile devices or shared laptops. Compliance calls for organisational data to be encrypted and kept private, for access to be audited, and for transmission to take place only over secure channels. Many of these requirements are not met if the use of personal devices is allowed carte blanche – so it’s very likely that some firms are falling short of their compliance obligations.
Added to this is the fact that many employees do not want to allow their organisation to install management software, enforce policies, or limit their freedom on their personal devices. They may feel that their company is infringing personal liberties or ‘spying on them’. The simplest and most effective (yet costly) solution is to issue company devices to all staff – although there may be some resistance to carrying two devices.
There is, however, an option for controlling company data on personal devices that can satisfy some compliance requirements. Technologies now exist that keep organisational data in a separate virtual container on the device, where policies such as encryption can be enforced without contravening your employees’ privacy. The company portion of the device is kept in a secure bubble, without enforcing rules on, or infringing, individuals’ freedom with their own personal devices.
New risks and responsibilities
The accelerated adoption of remote working has meant many risk and compliance teams are still rushing to catch up. Many firms have not thoroughly identified the risks associated with remote or hybrid working, which continue to evolve as the demands on businesses change. Even those who have identified risks are likely only considering the ones they understand. In many cases, compliance teams need assistance from a cyber security expert who can help define the risks they are not aware they are taking. An expert will understand the wide and varied attack vectors and provide context and insight into how they could impact risk. The changing environment might call for updates to your IT use policy, cyber security policy, or other IT-related policies.
Navigating risk and liability
The approach for managing risk must start by having a clear understanding of what your organisation’s risk appetite is. It is not possible to mitigate or eliminate all risks – there will always be some residual risk and it is important for your organisation to know what level of risk it is willing to accept.
When creating treatment plans for each of your risks, the business should consider the many different angles for controlling and mitigating. There are many technical controls which can enforce your policies, but often organisational controls such as processes or workflows can be just as effective. Choosing to adopt a program like Cyber Essentials can help to ensure that your organisation meets certain requirements. Even the very low bar of its framework can help you to ask pertinent questions about your organisation’s security posture.
Changing security boundaries
In days gone by, businesses took some comfort from knowing they had a secure network. They invested in firewalls to build a border around their network, and they trusted workers and the data they accessed to be protected against security threats. Now, many things have changed.
Data is no longer kept solely on servers in the office, it’s now stored largely in the cloud. And, thanks to Covid-19, many users are now operating outside of this safe and secure network too. The net effect of these two key changes is that the approach of building a highly secure boundary around your network no longer delivers the desired results. The post-pandemic workplace, even more so in finance and private equity, needs to be productive and secure from anywhere in the world.
The modern hacker is not just focused on defeating a firewall – they want to steal your firm’s data, and the way they typically achieve that is by hijacking an individual’s identity. Modern security therefore focuses on protecting the data and the identity of workers by using multiple layers of security controls. This multi-layer, or “onion”, approach works on the assumption that a determined attacker can breach any one or two layers of protection, so multiple overlapping controls are needed to keep your environment safe.
Securing and supervising data rooms in a hybrid world
Data rooms provide a critical function by allowing third party organisations to securely access confidential data, so it’s important that the sensitivity of this is considered before embarking on any data room project. Appropriate policies about how the data should be accessed and used can then be enforced by the technology, and these clearly defined policies will allow for tightly configured security controls to limit access appropriately.
For example, data room guests might be allowed to view documents, but prevented from downloading them or copying and pasting content from them. Modern capabilities even include the ability to “timebomb” documents – for example to block access to documents after an NDA has expired.
Finally, consider taking Cyber Insurance. This can provide help with investigations, guidance on reporting to the ICO, help with public relations and communications, and help cover other expenses incurred as part of a cyber event.
The ongoing events of 2020 have changed the way we work forever. New risks and opportunities have continued to emerge through this period, and it’s ever more apparent that the world will never go back to how it worked before. Hybrid working is here to stay, so we need to understand the implications and take appropriate steps – through a mixture of controls – to meet our compliance obligations and keep risk exposure ahead of the game.
Fraud prevention and user experience: how finance institutions can navigate the increasingly complex digital challenge
By Frank Teruel, Chief Operating Officer at Adara.
Well, here we go again. As Covid-19 cases continue to surge and local officials impose restrictions, brick-and-mortar companies are once again limited in their ability to deliver their services, forcing them to double down on their remote-delivery strategies. From curbside pickup, to Uber Eats, to an “Amazon Prime” Christmas, the surge in the last 90 days has exacerbated an already difficult business environment. And banking has been no exception. In fact, the consequences have been more significant given the nature of in-person banking… just imagine the difficulties inherent in bringing that service curbside. So, what exactly has the pandemic done to banking?
With in-branch services limited, less digitally savvy, first-time digital customers have made the jump to online banking products like Klarna for real-time loans, and to digital-only bank accounts from the likes of Starling Bank, Atom Bank, and Monzo. The move has been a digital consumer bonanza for online financial institutions which, while happy with the new customers, are left with the daunting task of determining who they are and whether to trust them.
Who are these people?
Unquestionably, the transition to digital banking has brought first-time users real benefits, such as greater convenience and transparency, but it has also resulted in a less desirable outcome: increasing levels of fraud and ever more sophisticated attacks. The reality is that many of these nascent digital customers have very little online transaction history and virtually no digital identity to serve as a reference point for verification. Consequently, with more and more of them moving their banking online, traditional methods of determining who is on the other end of a transaction have been tested, and in many cases have become less effective or completely obsolete. And here’s the rub: with little to no digital identity, limited transaction history, and never-before-seen devices, financial organisations are defaulting to more draconian verification methods – stepping up customers, challenging them with KYC questions, and generally increasing transaction friction.
Go easy on them
Yet, before we castigate the institutions in question, a degree of understanding is warranted. After all, the tsunami of new online transactions represents a significant challenge. And what’s a bank to do? If a bank falsely deems a transaction fraudulent, it ruins the customer’s online experience by creating unnecessary friction, which predictably leads to transaction abandonment. On the other hand, if the bank defaults to no friction and allows all of these new transactions to run, it could be a fraudster’s paradise… “free loans, credit cards, and other financial instruments on aisle 3!” So, determining who is on the other end of the transaction is paramount both to the customer’s experience and to a financial institution’s future success in retaining account holders and preventing fraud losses.
Fraud fighting is a full-time, real-time business…and it isn’t cheap!
Fraudsters learn, adapt, share insights, and then repeat the cycle. Always searching for vulnerabilities, they create new and complex methods to circumvent detection. For example, during the pandemic, banks and other online businesses have seen a significant increase in spear phishing, cross-site scripting, and man-in-the-middle attacks. Tack on impersonation scams, alarming intrusions like the SolarWinds hack that has left government and commercial organizations scrambling, and incessant bot attacks, and it’s clear the velocity and diversity of attacks add a whole new level of complexity – and the dollars start adding up quickly. Not only do these attacks result in higher costs for financial organisations, they also create significant brand blowback. Online businesses, especially banks, need to rethink their identity verification protocols, adopting digital identity solutions that balance identity, context, and behavior. Context is a key element in ensuring a balanced approach and mitigating overreaction, because with fraud currently surging there is a danger that financial institutions over-correct and prevent customers from completing legitimate transactions. Organisations need to resist the urge to pile on stringent checks and instead become smarter at rooting out fraudulent transactions in the first place.
Harmonizing Identity…the only real verification
Effective identity verification starts with harmonizing all of a customer’s disparate digital personas into one digital identity. Every customer has multiple digital personas with which they transact in the online world. For example, assuming email as the common credential denominator (along with an appropriate password), a customer may use Gmail to access Lloyds Bank, Yahoo for a Gumtree ad, and Hotmail to place a Tesco grocery order. While each of these is a different digital persona, they all represent the same customer. Understanding the collective personas and associating them with one harmonized identity provides the necessary confidence in the integrity of that identity. Next, understanding the context derived from prior transaction history, device information, location, and intent data such as searches and outcomes allows organizations to add a predictive element to the analysis. Together, these factors help build confidence that the customer is legitimate and, equally importantly, indicate whether to trust the customer within the context of the transaction.
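The harmonization step described above can be sketched in a few lines. Every name, email and service here is hypothetical; in practice the persona-to-identity linking table is the hard part, built from consortium data, device graphs and consented signals, and here it is simply given:

```python
# Minimal sketch of persona harmonization (hypothetical data model):
# several digital personas, keyed by email, are folded into one
# harmonized identity so signals from any persona accrue to the customer.
from collections import defaultdict

# Hypothetical persona records: (email, service used, device seen)
personas = [
    ("alice@gmail.com",   "Lloyds Bank", "iphone-A1"),
    ("alice@yahoo.com",   "Gumtree",     "iphone-A1"),
    ("alice@hotmail.com", "Tesco",       "laptop-B2"),
]

# The linking table maps each persona to one identity; producing it is
# the real work of an identity provider, so it is hard-coded here.
links = {
    "alice@gmail.com":   "customer-001",
    "alice@yahoo.com":   "customer-001",
    "alice@hotmail.com": "customer-001",
}

harmonized = defaultdict(lambda: {"personas": set(), "devices": set()})
for email, service, device in personas:
    identity = links[email]
    harmonized[identity]["personas"].add(email)
    harmonized[identity]["devices"].add(device)

print(len(harmonized["customer-001"]["personas"]), "personas,",
      len(harmonized["customer-001"]["devices"]), "devices")
# → 3 personas, 2 devices now back a single harmonized identity
```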
The Time Space Continuum
Context can help interpret behavior that would otherwise be immediately flagged for fraud. Consider a customer that has just purchased a jacket at John Lewis using a London IP address and then immediately purchases a package tour from an IP in Reykjavik. Relying solely on the IP signals would flag the second transaction as likely fraud. However, if the person routinely uses a VPN while making transactions and using the Reykjavik IP is a known transactional attribute, then the time distance conundrum is nonexistent because the 1173-mile journey is just a quick digital hop…it’s likely legitimate and hasn’t broken the laws of physics. If it’s also determined that the customer has been searching for Iceland trips and subsequently books a flight, the tour purchase becomes even more credible because every transaction that uses those elements is mapped to that digital identity and strengthens the association between that specific transactional element and the identity’s behavior.
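The time-space reasoning above can be expressed as a simple rule sketch, assuming a hypothetical store of known transactional attributes per harmonized identity (the IP addresses and search terms below are invented for illustration):

```python
# Hypothetical rule sketch: an apparent IP-distance jump is only
# suspicious when the IP is NOT already a known attribute of the
# harmonized identity, and intent data can downgrade the response.

known_attributes = {
    "customer-001": {
        # invented example IPs: a London address plus a Reykjavik VPN exit
        "ips": {"81.2.69.0", "185.112.144.0"},
        "recent_searches": {"iceland"},
    },
}

def flag_transaction(identity: str, ip: str, purchase: str) -> str:
    attrs = known_attributes.get(identity, {})
    if ip in attrs.get("ips", set()):
        return "allow: IP is a known transactional attribute"
    if any(term in purchase.lower() for term in attrs.get("recent_searches", set())):
        return "review: new IP, but intent signals match the purchase"
    return "step-up: unrecognised context"

# The Reykjavik VPN exit is already associated with this identity, so the
# tour purchase is allowed despite the apparent 1,173-mile jump.
print(flag_transaction("customer-001", "185.112.144.0", "Iceland package tour"))
```

A real decision engine would weigh many more signals probabilistically; the point of the sketch is only that context turns a physics-defying jump into a known, legitimate pattern.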
While one imaginary purchase is certainly not the whole picture, financial institutions need to make these fraud decisions in an instant. Without the proper contextual information, banks and other organisations can easily flag a genuine purchase as fraudulent and, in so doing, aggravate the customer and – through transaction abandonment – hand market share to a competitor. The only solution that fits the bill is to harmonize digital identities. Armed with a verified identity, deterministic facts (the customer is in London) and probabilistic measures (the customer is planning to visit Iceland), the transaction assessment will be accurate and eliminate friction.
Use digital identities that are:
- built on global consortiums that provide a global view of a customer
- harmonized to ensure no single digital persona is compromised or acting anomalously
- able to provide both deterministic and probabilistic insights
- and most importantly, built on consented and permissioned data
COVID-19 and the subsequent increased use of digital banking products have opened the door to fraud. While fraudsters try to take advantage of the current situation, financial institutions can foil their schemes by working with the best data partners to gain the context needed to verify transactions and reduce friction.