With the recent release of the latest 3GPP 5G New Radio (NR) standards and the first true 5G networks potentially coming on stream in 2018, 5G-Ready transport networks continue to evolve into largely unmapped territory. The forthcoming generation is not a forklift replacement so much as an extension and evolution of existing 4G mobile transport infrastructure – but “let’s wait and see” is not an option. Looking closely at their infrastructure through a 5G lens, go-ahead mobile operators see opportunities to ensure that every upgrade and extension is a step in the right direction – towards the 5G future.
Jon Baldry, Metro Marketing Director at Infinera, explains.
With the first commercial 5G rollouts now announced for 2018, it is highly likely that we can expect the first 5G handsets to be announced at next year’s Mobile World Congress – initially at a premium price for people who would demand the ultimate despite minimal 5G network availability.
It’s human nature – like desiring a Ferrari in a world with blanket 70mph speed limits.
It will make headlines, for sure, but the real news is not so obvious. With the transition from 2G to 3G to 4G, the public has got used to the idea that a new standard means a whole new network, with providers competing to be the first to roll it out. Operators are still vying to be first to market with 5G, but this time we are not replacing the 4G network, we are extending its reach into smaller 5G cells and evolving towards new 5G standards. From the outset, 4G was not set in stone: ever since its launch there has been a continuing series of further 4G releases to add new functionality, many of them geared towards supporting new 5G networks.
The real driver for 5G is not a single set of applications, but a broad group that are typically clustered around one of three centres of gravity:
Enhanced broadband naturally extends the per-user bandwidth capabilities of 4G, enabling 5G to challenge in the residential and business broadband markets. 5G will deliver at least ten times the speed of 4G, enabling cloud storage of HD video clips and support for 4K video.
Massive machine type communications is geared towards the Internet of Things (IoT) with up to a million connections per square kilometre, or 100 devices in a room, where 4G now manages a few thousand per cell. Although initial IoT deployments have majored on a population of very simple devices such as traffic sensors and smart meters sending and receiving relatively tiny pulses of data, these are just the ripples preceding a potential tsunami.
Connecting surveillance cameras to the system will add a lot of traffic, but it is the forthcoming ultra-reliable and low-latency communications applications such as driverless cars, industrial control and telemedicine that will really pile on the pressure. These applications require secure communications that never fail, with network latency cut by a factor of ten from 4G standards to an impressive 1ms, giving undetectable response times.
Many of the envisaged 5G services will use a blend of these capabilities. For example, virtual reality (VR) will require both high capacity and ultra-reliability with low latency. VR is more than just a game: it has the potential to transform education, training, virtual design and healthcare. If a surgeon is to diagnose accurately, or even perform a remote operation, the resolution of the virtual reality image must be close to the resolution of a human retina. This requires at least 300 Mbps, almost 60 times higher than current HD video, with undetectable latency and of course ultra-reliability.
These are the sort of facts, figures and exciting applications that make the headlines. The real work, however, is to provide a whole network that can support such service levels. What does this mean for the mobile provider who already has a huge investment in 4G infrastructure?
How to get there
5G achieves its massive bandwidth by operating on higher frequency bands, in the millimetre wave spectrum. At these frequencies the signals do not travel as far and they are more readily obstructed by walls, obstacles, rain or mist, requiring clear line-of-sight access.
A slow download to a smartphone, or a break in a phone conversation, is annoying but seldom disastrous – and earlier technologies were no better. Extreme reliability is, however, essential for driverless cars or other critical 5G services. We cannot afford any blind spots, where a building shadows the signal.
Full availability means that many more smaller cells must be added to the network. Existing 4G access must be extended like capillaries in a fine network of small cells feeding back to existing transport arteries. This requires a huge investment, partly compensated by the fact that 5G antennae can be much smaller and use less power. They will also conserve power by focusing signals more accurately rather than beaming equally in all directions at once.
To support this more dynamic cell behaviour, we need greater intelligence towards the edge. As well as using multiple antennae to aim signals more efficiently, 5G will also recognise the type of signals being sent and reduce power when less is needed. Having a host of small cells in close proximity also enables Coordinated Multi-Point (CoMP) – a technique whereby nearby base stations respond simultaneously and cooperate to improve quality of service.
While the new radio access network does its best to minimise latency, it comes to nothing if some signals have to travel all the way to and from a distant data centre. So another trend will be for Mobile Edge Computing (MEC) – where caching, compute power and critical applications will be pushed closer to the network edge to reduce latency and congestion in the transport network and optimize quality of service.
Existing cellular networks rely heavily on fibre optic links to connect cell towers to the core network. Although high-speed wireless can bridge the gap when time or cost makes it impossible to lay fibre, the only technology that can consistently support 5G’s surge in demand and quality of service will be fibre. Each cell of a capillary 5G network is far smaller than a typical 4G cell, but there are so many of them, and the applications are so demanding, that the total bandwidth demand in the transport network will be massive.
So it is necessary to extend fibre as close as possible to the small cells in order to meet this demand. This “fibre-deep” evolution will not be achieved by simply multiplying existing fibre equipment and building it out into the metro space as needed – that would be a colossally expensive operation both in terms of real estate and equipment costs. Instead there will be a need to install many more compact and power efficient network nodes, wherever they can be economically accommodated. This could include remote telecom huts, street cabinets, cupboards or cell sites – locations quite unsuitable for housing racks of equipment that is optimized for a controlled telco environment.
Selecting suitable equipment will no longer be a simple matter of asking a preferred supplier to meet the required performance levels. It will be necessary to look much more closely at the specifications to see if devices are sufficiently rugged, compact and power efficient to survive where space and power supply are limited, and temperature and humidity levels more extreme. With a massive increase in the number of fibre installations, commissioning and operating expenses will also soar, unless extra care is taken to choose the most compact, reliable and easy-to-maintain optical equipment.
Leading optical equipment suppliers are well aware of these challenges and are developing solutions more suitable for fibre-deep networking. The latest access optimised units can deliver 100Gbps at a mere 20 watts, packing over 400Gbps into one standard rack unit – about eight times the density of previous generation equipment.
What’s more, the industry has been working to bring the International Telecommunications Union’s (ITU) vision of autotuneable WDM-PON optics up to the performance levels required to support the reach and capacity requirements of 5G networks. This eases the pressures of commissioning and maintaining extensive DWDM optical networks by removing the technician’s burden of determining and adjusting wavelengths at every installation. Autotuneable technology will automatically select the correct wavelength without any configuration by the remote field engineer, enabling them to treat DWDM installations with the same simplicity as grey optics.
Pressure on the transport network
This far denser 5G access environment, even with greater intelligence located towards the edge, will put heavy pressure on the upstream infrastructure. In between times of change, buying patterns tend to stabilise towards the convenience of familiar, single-vendor provision. With the shift to 5G we are already seeing greater competitive pressure between mobile operators, and between wholesale operators hosting 5G transport services. This is forcing buyers to demand higher performance, greater efficiency and tighter specifications – driving a shift towards disaggregated, best-of-breed solutions.
Higher performance is not all that is needed; there are other significant changes taking place as 4G networks evolve towards 5G. Datacenter technology, such as spine-leaf switching and network slicing, will increasingly migrate to the transport network to provide the flexibility to support more distributed intelligence and the need for MEC. Where 4G started with high-performance dumb pipes connecting cell towers to the core, we are now evolving towards a more flexible, software-defined transport architecture.
As well as greater capacity, there are other demands that will not be met by many existing optical solutions. Among the refinements required for 5G, Carrier Aggregation enables the use of several different carriers in the same frequency bands to increase data throughput, rather as CoMP (described above) makes use of neighbouring cells. These solutions require new levels of synchronization precision, as well as low latency. Mobile operators now buying equipment need to look closely at the specifications to ensure that they are not investing in systems that will become obsolete as 5G rolls out. There are already some nominally 4G mobile transport networks that meet the demanding 5G synchronization and latency specifications.
5G-readiness is an ongoing development, and we can expect more early announcements of 5G services on the basis that they meet 5G speeds or other criteria, without providing the full 5G mobile service. Like owning a Ferrari, it’s a combination of marketing hype and status. Providers and nations are understandably keen to demonstrate 5G way ahead of the timescales favoured by the 3GPP standards body.
Major sporting events, with their massive global TV coverage, offer a stunning opportunity for operators to showcase their 5G capabilities. The 2012 London Olympics were the first “smartphone Olympics”, where spectators could simultaneously view the games close-up on their handsets. The 2020 Summer Olympics in Tokyo and the 2022 Winter Olympics in Beijing will vie with each other to highlight the way these nations are driving mobile 5G, as Europe once drove 3G and North America drove 4G. Europe and North America are also looking to showcase 5G, as shown by Elisa’s recent announcement of what is claimed to be the world’s first commercial 5G service in Finland. By 2022 there could be a significant number of Beijing Winter Olympics spectators using 5G virtual reality devices to spectacular effect.
Meanwhile mobile operators need to work steadily towards these capabilities with 5G-Ready mobile transport that can optimise 4G networks today and provide the high performance required for full 5G in the future. Operators can avoid investing in soon-to-be obsolete mobile transport technology by seeking advice from experts at the leading edge of optical network equipment and design.
Taking control of compliance: how FS institutions can keep up with the ever-changing regulatory landscape
By Charles Southwood, Regional VP – Northern Europe and MEA at Denodo
The wide-spread digital transformation that has swept the financial services (FS) sector in recent years has brought with it a world of possibilities. As traditional financial institutions compete with a fresh wave of challenger banks and fintech startups, innovation is increasing at an unprecedented pace.
Emerging technologies – alongside the ever-evolving concept of online banking – have provided a platform in which the majority of customer interactions now take place in a digital format. The result of this is a never-ending stream of data and digital information. If used correctly, this data can help drive customer experience initiatives and shape wider business strategies, giving organisations a competitive edge.
However, before FS organisations can utilise data-driven insights, they need to ensure that they can adequately protect and secure that data, whilst also complying with mandatory regulatory requirements and governance laws.
The regulation minefield
Regulatory compliance in the FS sector is a complex field to navigate. Whether it’s potential financial fraud or money laundering, risk comes in many different forms. Due to their very nature – and the type of data that they hold – FS businesses are usually placed under the heaviest of scrutiny when it comes to achieving compliance and data governance, arguably held to a higher standard than those operating in any other industry.
In fact, research undertaken last month discovered that the General Data Protection Regulation (GDPR) has had a greater impact on FS organisations than any other sector. Every respondent working in finance reported that the changes made to their organisation’s cyber security strategies in the last three years were, at least to some extent, as a result of the regulation.
To make matters even more confusing, the goalpost for 100% compliance is continually moving. In fact, between 2008 and 2016, there was a 500% increase in regulatory changes in developed markets. So even when organisations think they are on the right track, they cannot afford to become complacent. The Markets in Financial Instruments Directive (MiFID II), the requirements for central clearing and the second Payment Service Directive (PSD2), are just some examples of the regulations that have forced significant changes on the banking environment in recent years.
Keeping a handle on this legal minefield is only made more challenging by the fact that many FS organisations are juggling an unimaginable amount of data. This data is often complex and of poor quality. Structured, semi-structured and unstructured, it is stored in many different places – whether that’s in data lakes, on premise or in multi-cloud environments. FS organisations can find it extremely difficult just to find out exactly what information they are storing, let alone ensure that they are meeting the many requirements laid out by industry regulations.
A secret weapon
Modern technologies, such as data virtualisation, can help FS organisations to get a handle on their data – regardless of where it is stored or what format it is in. Through a single logical view of all data across an organisation, it boosts visibility and real-time availability of data. This means that governance, security and compliance can be centralised, vastly improving control and removing the need for repeatedly moving and copying the data around the enterprise. This can have an immediate impact in terms of enabling FS organisations to avoid data proliferation and ‘shadow’ IT.
In addition to this, when a new regulation is put in place, data virtualisation provides a way to easily find and access the relevant data, so FS organisations can respond – without having to worry about alternative versions of that data – and ensure that they remain compliant from the outset. This level of control extends down to the finest details. For example, it is possible to set up governance rules through which operators can easily select who has access to what information across the organisation. They can alter settings for sharing, removing silos, masking and filtering through defined, role-based data access. In terms of governance, this feature is essential, ensuring that only those with the correct permissions to access sensitive information are able to do so.
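The role-based masking idea described above can be sketched in a few lines. This is a minimal, hypothetical illustration of the concept – the roles, field names and masking rule are assumptions for the example, not Denodo's actual API or configuration model:

```python
# Hypothetical role-based masking: each role lists the fields it may
# see in clear text; every other field is masked before the record
# leaves the data layer.
ROLE_VISIBILITY = {
    "compliance_officer": {"name", "account", "balance", "ssn"},
    "analyst": {"account", "balance"},
}

def mask_record(record, role):
    """Return a copy of the record with non-permitted fields masked."""
    allowed = ROLE_VISIBILITY.get(role, set())  # unknown roles see nothing
    return {
        field: value if field in allowed else "***MASKED***"
        for field, value in record.items()
    }

record = {"name": "J. Doe", "account": "GB29-0001",
          "balance": 1024.50, "ssn": "000-00-0000"}

print(mask_record(record, "analyst"))
# account and balance pass through; name and ssn are masked
```

Centralising a rule table like this in one logical data layer, rather than re-implementing it in every downstream application, is what makes the governance model auditable.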
Compliance is a requirement that will be there forever. In fact, its role is only likely to increase as law catches up with technological advancement and the regulatory landscape continues to change. For FS organisations, failure to meet the latest legal requirements could be devastating. The monetary fines – although substantial – come second to the potential reputation damage associated with non-compliance. It could be the difference between an organisation surviving and failing in today’s climate.
No one knows what is around the corner. Whilst some companies may think they are ahead of the compliance game today, that could all change with the introduction of a new regulation tomorrow. The best way to ensure future compliance is to get a handle on your data. By providing total visibility, data virtualisation is helping organisations to gain back control and win the war for compliance.
TCI: A time of critical importance
Fabrice Desnos, Head of Northern Europe Region at Euler Hermes, the world’s leading trade credit insurer, outlines the importance of less publicised measures for the journey ahead.
After months of lockdown, Europe is shifting towards rebuilding economies and resuming trade. Amongst the multibillion-euro stimulus packages provided by governments to businesses to help them resume their engines of growth, the cooperation between the state and private sector trade credit insurance underwriters has perhaps missed the headlines. However, this cooperation will be vital when navigating the uncertain road ahead.
Covid-19 has created a global economic crisis of unprecedented scale and speed. Consequently, we’re experiencing unprecedented levels of support from national governments. Far-reaching fiscal intervention, job retention and business interruption loan schemes are providing a lifeline for businesses that have suffered reductions in turnovers to support national lockdowns.
However, it’s becoming clear the worst is still to come. The unintended consequence of government support measures is to delay the inevitable fallout in trade and commerce. Euler Hermes is already seeing an increase in claims for late payments and expects this trend to accelerate as government support measures are progressively removed.
The Covid-19 crisis will have long-lasting and sometimes irreversible effects on a number of sectors. It has accelerated transformations that were already underway and radically changed the landscape for a number of businesses. This means we are seeing a growing number of “zombie” companies, currently on life support, but whose business models are no longer adapted for the post-crisis world. These factors add up to what is best described as a corporate insolvency “time bomb”.
The effects of the crisis are already visible. In the second quarter of 2020, 147 large companies (those with a turnover above €50 million) failed; up from 77 in the first quarter, and compared to 163 for the whole of the first half of 2019. Retail, services, energy and automotive were the most impacted sectors this year, with the hotspots in retail and services in Western Europe and North America, energy in North America, and automotive in Western Europe.
We expect this trend to accelerate and predict a +35% rise in corporate insolvencies globally by the end of 2021. European economies will be among the hardest hit. For example, Spain (+41%) and Italy (+27%) will see the most significant increases – alongside the UK (+43%), which will also feel the impact of Brexit – compared to France (+25%) or Germany (+12%).
Companies are restarting trade, often providing open credit to their clients. However, there can be no credit if there is no confidence. It is increasingly difficult for companies to distinguish the clients that will emerge from the crisis from those that won’t, and to know whether or when they will be paid. In the immediate post-lockdown period, without visibility and confidence, the risk was that inter-company credit could evaporate, placing an additional liquidity strain on the companies that depend on it. This, in turn, would significantly put at risk the speed and extent of the economic recovery.
In recent months, Euler Hermes has co-operated with government agencies, trade associations and private sector trade credit insurance underwriters to create state support for intercompany trade, notably in France, Germany, Belgium, Denmark, the Netherlands and the UK. All with the same goal: to allow companies to trade with each other in confidence.
By providing additional reinsurance capacity to the trade credit insurers, governments help them continue to provide cover to their clients at pre-crisis levels.
The beneficiaries are the thousands of businesses – clients of credit insurers and their buyers – that depend upon intercompany trade as a source of financing. Over 70% of Euler Hermes policyholders are SMEs, which are the lifeblood of our economies and major providers of jobs. These agreements are not without costs or constraints for the insurers, but the industry has chosen to place the interests of its clients and of the economy ahead of other considerations, mindful of the important role credit insurance and inter-company trade will play in the recovery.
Taking the UK as an example, trade credit insurers provide cover for more than £171 billion of intercompany transactions, covering 13,000 suppliers and 650,000 buyers. The government has put in place a temporary scheme of £10 billion to enable trade credit insurers, including Euler Hermes, to continue supporting businesses at risk due to the impact of coronavirus. This landmark agreement represents an important alliance between the public and private sectors to support trade and prevent the domino effect that payment defaults can create within critical supply chains.
But, as with all of the other government support measures, these schemes will not exist in the long term. It is already time for credit insurers and their clients to plan ahead, and prepare for a new normal in which the level and cost of credit risk will be heightened and where identifying the right counterparts, diversifying and insuring credit risk will be of paramount importance for businesses.
Trade credit insurance plays an understated role in the economy but is critical to its health. In normal circumstances, it tends to go unnoticed because it is doing its job. Government support schemes helped maintain confidence between companies and their customers in the immediate aftermath of the crisis.
However, as government support measures are progressively removed, this crisis will have a lasting impact. It will accelerate transformations, lead to an increasing number of company restructurings and, in all likelihood, increase the level of credit risk. To succeed in the post-crisis environment, businesses have to move fast from resilience to adaptation. They have to adopt bold measures to protect their businesses against future crises (or another wave of this pandemic), minimize risk, and drive future growth. By maintaining trust to trade, with or without government support, credit insurance will have an increasing role to play in this.
What Does the FinCEN File Leak Tell Us?
By Ted Sausen, Subject Matter Expert, NICE Actimize
On September 20, 2020, just four days after the Financial Crimes Enforcement Network (FinCEN) issued a much-anticipated Advance Notice of Proposed Rulemaking, the financial industry was shaken and bank stock prices saw significant declines when the markets opened on Monday. So what caused this? BuzzFeed News, in cooperation with the International Consortium of Investigative Journalists (ICIJ), released what is now being tagged the FinCEN Files. These files and summarized reports describe over 200,000 transactions, totalling over $2 trillion USD, that were reported to FinCEN as suspicious in nature between 1999 and 2017. BuzzFeed obtained over 2,100 Suspicious Activity Reports (SARs) and over 2,600 confidential documents that financial institutions had filed with FinCEN over that span of time.
Similar leaks have occurred previously, such as the Panama Papers in 2016, in which over 11 million documents containing personal financial information on over 200,000 entities were leaked from a Panamanian law firm. This was followed a year and a half later by the Paradise Papers in 2017. That leak contained even more documents and included the names of more than 120,000 persons and entities. There are three factors that make the FinCEN Files leak significantly different from those mentioned. First, they are highly confidential documents leaked from a government agency. Second, they weren’t leaked from a single source: the leaked documents came from nearly 90 financial institutions facilitating financial transactions in more than 150 countries. Lastly, some high-profile names were released in this leak; however, the focus of this leak centered more around the transactions themselves and the financial institutions involved, not necessarily the names of individuals involved.
FinCEN Files and the Impact
What does this mean for the financial institutions? As mentioned above, many experienced a negative impact to their stocks. The next biggest impact is their reputation. Leaders of the highlighted institutions do not enjoy having potential shortcomings in their operations be exposed, nor do customers of those institutions appreciate seeing the institution managing their funds being published adversely in the media.
Where did the financial institutions go wrong? Based on the information available, it is actually hard to say where they went wrong, or even if they went wrong at all. Financial institutions are obligated to monitor transactional activity, both inbound and outbound, for suspicious or unusual behavior, especially activity that could indicate illicit money laundering. If such behavior is identified, the financial institution is required to complete a Suspicious Activity Report, or SAR, and file it with FinCEN. The SAR contains all relevant information, such as the parties involved, transaction(s), account(s), and details describing why the activity is deemed suspicious. In some cases, financial institutions will file a SAR even when there is no direct suspicion but no logical explanation for the activity can be found either.
So what deems certain activities suspicious, and how do financial institutions detect them? Most financial institutions have sophisticated solutions in place that monitor transactions over a period of time and determine typical behavioral patterns for each client, both against the client’s own history and against their peers. If any activity falls disproportionately beyond those norms, the financial institution is notified and an investigation is conducted. Because this detection incorporates multiple transactions and compares them to historical “norms”, it is very difficult to stop a transaction related to money laundering in real time. It is not uncommon for a transaction or series of transactions to occur and only later be identified as suspicious, with a SAR filed after the transaction has been completed.
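The "client versus own history, client versus peers" comparison described above can be illustrated with a deliberately simple sketch. Real AML engines are far more sophisticated; the z-score approach, the threshold of 3, and the sample figures below are illustrative assumptions, not any vendor's actual detection logic:

```python
from statistics import mean, stdev

def flag_suspicious(history, peer_amounts, new_amount, z_threshold=3.0):
    """Flag a transaction whose amount deviates strongly both from the
    client's own transaction history AND from amounts typical of the
    client's peer group.
    """
    def zscore(value, sample):
        s = stdev(sample)
        return 0.0 if s == 0 else (value - mean(sample)) / s

    own_z = zscore(new_amount, history)
    peer_z = zscore(new_amount, peer_amounts)
    # Escalate for investigation only if the amount is an outlier
    # against both baselines.
    return own_z > z_threshold and peer_z > z_threshold

history = [120, 95, 150, 110, 130, 105, 140]   # this client's past amounts
peers = [100, 180, 90, 200, 150, 120, 160]     # typical peer-group amounts

print(flag_suspicious(history, peers, 5000))   # far outside both baselines
print(flag_suspicious(history, peers, 160))    # in line with normal activity
```

Because the baseline is built from a window of past transactions, a verdict is only available after enough history has accumulated, which mirrors why SARs are so often filed after the money has already moved.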
FinCEN Files: Who’s at Fault?
Going back to my original question: was there any wrongdoing? In this case, the financial institutions were doing exactly what they were required to do. When suspicion was identified, SARs were filed. There are two things that are important to note. Suspicion does not equate to guilt, and individual financial institutions have a very limited view of the overall flow of funds. They have visibility of where funds are coming from, or where they are going to; however, they don’t have an overall picture of the original source or the final destination. The area where financial institutions may be at fault is if multiple suspicions or probable guilt is found, but they fail to take appropriate action. According to BuzzFeed News, instances of transactions to or from sanctioned parties occurred, and known suspicious activity was allowed to continue after it was discovered.
How do we do better? First and foremost, FinCEN needs to identify the source of the leak and fix it immediately. This is very sensitive data. Even within a financial institution, this information is only exposed to individuals with a high-level clearance on a need-to-know basis. This leak may result in relationship strains with some of the banks’ customers. Some people already have a fear of being watched or tracked, and releasing publicly that all these reports are being filed from financial institutions to the federal government won’t make that any better – especially if their financial institution was highlighted as one of those filing the most reports. Next, there has been more discussion around real-time AML. Many experts are still working on defining what that truly means, especially when some activities deal with multiple transactions over a period of time; however, there is definitely a place for certain money laundering transactions to be held in real time.
Lastly, the ability to share information between financial institutions more easily will go a long way in fighting financial crime overall. For those of you who are AML professionals, you may be thinking we already have such a mechanism in place with Section 314(b). However, the feedback I have received is that it does not do an adequate job: it is voluntary, and getting responses to requests can be a challenge. Financial institutions need a consortium to effectively communicate with each other, while being able to exchange the critical data needed to see the complete picture of financial transactions and all associated activities. That, combined with some type of feedback loop from law enforcement indicating which SARs are “useful” versus which are either “inadequate” or “unnecessary”, will allow institutions to focus on those where criminal activity is really occurring.
We will continue to post updates as we learn more.