By Alex McMullan, EMEA CTO, Pure Storage
In the same way that a CFO sets out the strategy and framework for investment and cash flow, CIOs and CTOs need to take control of making dataflow work for their organisations. Data is now a currency, but one which carries extra responsibility for the holder, especially where our personal information is involved. This data currency will come under regulation in 2018, when failing to achieve a clean audit will carry reputational and monetary consequences similar to those of failing a finance audit.
Against this backdrop, as we head into 2018, below are four factors that CIOs and CTOs should consider to ensure that they get the most value from their organisations’ data.
Cloud control – hybrid architectures will dominate
The debate around whether to use cloud technologies is history; multi-cloud deployment has become the norm. With increased awareness that uncapped or per-second pricing can spiral out of control, a return to hybrid architecture is underway, marrying the strengths of controllable-cost, high-performance on-premises systems with burstable, global cloud services.
The increased requirements for data control are going to further boost the attractiveness of the hybrid model, a trend that will only accelerate as each business seeks to balance its own technology demands against the relative TCOs of public versus on-premises cloud platforms.
As the transition from virtualization to cloud-native and containerized applications gathers momentum, there will be an increasing focus on seamless private/public cloud application and data mobility through to 2020. The technologies able to integrate security with performance and multi-tenancy to enable this mobility on demand will become the market and mind-share leaders. Being able to efficiently migrate and repatriate data is going to be a key feature of cloud capability as organisations review the splits between where applications and data are most advantageously hosted from both a legal and operational perspective. Returning data to centralised pools in each legislative region is likely to become more common. This will increasingly be augmented (even at the edge) by directly linked high-performance storage and compute at a local level, for mission-critical, real-time or highly sensitive applications.
Data stewardship must become a core competency in 2018: IT and culture must change to support that
GDPR implementation is imminent, and for some organisations the initial cost of compliance will be substantial. It fundamentally requires companies to be good data stewards. That means knowing and showing where personal data is, where it is not, and demonstrating fine-grained data control from ingestion to deletion.
Becoming an exemplary data steward, or responding effectively to security incidents, is near impossible with systems that take days to back up, index or restore data. To get to the stage where this level of control and safeguarding is possible, we are going to see significant investment during 2018 in faster networks, search and indexing. Tools and platforms to improve the visibility, manageability and performance of data pipelines in general will also see substantial investment.
That said, organisations that rely on technology alone for GDPR compliance will struggle. Significant cultural and procedural changes will also be needed to achieve and maintain GDPR compliance. The right cultural approaches need to be led by senior management, and the right tools need to be implemented to support that behaviour. IT can help, but it has to be part of an end-to-end approach, starting with the data architect and permeating the organisation from the back office through to every customer-facing representative.
Leveraging AI and Machine Learning
Data from Forrester suggests that 70% of enterprises expect to implement some form of AI over the next year. However, I believe that, across the full range of businesses, the benefits of Machine Learning (ML) are going to be felt more immediately. ML’s quick wins will be lower down the business technology stack. ML-based automation is already proven to save hours each day in the routine administration of IT infrastructure. Instrumenting systems, via the Internet of Things, and using ML to analyse the data delivers valuable, actionable information which can be used to automatically resolve issues before they have a business impact.
In my discussions with customers, several have equated the automation and guidance provided by ML based systems with having an additional infrastructure engineer on staff 24/7. This frees up IT staff to invest time in making use of the data they are storing and securing for their organisations.
Storage conversations should become data conversations
Technically, and commercially, the problem of delivering high performance, robust, simple and scalable storage has been solved. 2018 will be seen as a tipping point, where automation and orchestration technologies abstract modern infrastructure technology operations into a REST API call from Ansible, Chef, Puppet, Kubernetes etc.
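As a sketch of what that abstraction looks like from a developer's side, the snippet below builds a REST call that asks a storage array for a new block volume. The endpoint path, payload fields and array URL are illustrative assumptions, not any specific vendor's API; an orchestration tool would issue an equivalent request on the developer's behalf.

```python
import json
from urllib import request

def build_volume_request(array_url: str, name: str, size_gb: int) -> request.Request:
    """Describe a POST asking a storage array's REST API for a new volume.

    The /api/1.0/volume path and the payload fields are hypothetical,
    standing in for whatever API the platform actually exposes.
    """
    body = json.dumps({"size": f"{size_gb}G"}).encode()
    return request.Request(
        url=f"{array_url}/api/1.0/volume/{name}",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Constructing the request does not touch the network; an orchestrator
# such as an Ansible module would send it and check the response code.
req = build_volume_request("https://array.example.com", "dev-vol-01", 500)
print(req.method, req.full_url)
```

The point is that provisioning collapses into one declarative call, which is exactly what lets tools like Ansible or Kubernetes treat storage as just another resource to reconcile.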
Developers can now be given access to the block, file and object storage that they need, on a common scalable platform, with certainty over performance and ongoing, non-disruptive enhancement of the underlying technology. They no longer need to have discussions about where the next terabyte is coming from, or how it will be delivered. The simplification of data management that this offers opens up the opportunity for data scientists, researchers and designers to focus on their data pipelines and enhancing design processes, rather than talking about infrastructure.
Being able to shift data and workloads between clouds to take advantage of multi-cloud and hybrid architectures as cloud usage and the data regulation landscape evolves, is going to become a key capability for IT teams. In parallel, getting data storage infrastructure and dataflow right will be essential to equip organisations to take advantage of the machine learning and AI technologies that leading businesses are already deploying. As we head in to 2018, the onus is on CIOs and CTOs to ensure that their data storage strategy and infrastructure allows the organisation to extract maximum value from their data.
Wireless Connectivity Lights the Path to Bank Branch Innovation
By Graham Brooks, Strategic Account Director, Cradlepoint EMEA
As consumers have cautiously returned to the UK high street in the past few weeks, banks can expect customer footfall in branch to rise accordingly. But whether it’s checking in for a mortgage appointment or cashing in a cheque, awareness of the ongoing potential health risks must be top of mind.
At the same time, the pandemic has forced a transition to the future bank branch. This means that there will be fewer people and more machines – digital signs, contactless devices, and new cash deposit systems.
To ensure they continue to provide a service that attracts new customers, banks must digitise their branches. And wireless technology is going to form the underlying infrastructure that makes that possible.
Wireless WAN providing reliability
Traditional banks now face their biggest challenge in history: digital-only banking. Over two-thirds of participants in a 2020 study planned to transition to a digital-only bank in the future. It’s therefore vital that traditional banks running physical branches update in-branch customer experience to compete with the new pack on the prairie. Reliability plays a big part. So does trust.
The future of in-branch experience lies in technologies such as IoT, VR/AR, and AI, all of which are highly data-intensive. Reliable connectivity is therefore critical, and banks should be shooting for zero-downtime connectivity, allowing no room for gaps in service.
To do this, banks can deploy Gigabit-class 4G LTE (LTE Advanced) or 5G adapters that bridge to a traditional ethernet connection, providing a wireless option to the wired-line router. Then, in the rare scenario where wireless connectivity is down, at least one of the WAN connections is always guaranteed to be live. The router has the autonomy to determine when failover is necessary.
Better still, the reliability of modern Gigabit 4G LTE and 5G connectivity now means that failover is often unnecessary. A branch can, therefore, run its network independent of a wired-line connection and benefit from the security and agility of a resilient wireless network, while still providing enterprise-grade connectivity.
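The failover decision described above can be sketched as a simple priority rule over health-checked links. The link names, priorities and data structures here are assumptions for illustration, not how any particular router implements it.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class WanLink:
    name: str
    priority: int  # lower value = preferred link
    healthy: bool  # result of the router's latest health check

def select_active_link(links: List[WanLink]) -> Optional[WanLink]:
    """Pick the highest-priority healthy WAN link, mimicking router failover."""
    candidates = [link for link in links if link.healthy]
    return min(candidates, key=lambda link: link.priority) if candidates else None

# The wired primary has failed its health check, so the LTE/5G backup takes over.
links = [WanLink("wired-ethernet", 1, False), WanLink("lte-5g", 2, True)]
active = select_active_link(links)
print(active.name)  # -> lte-5g
```

When the wired link recovers and passes its checks again, the same rule moves traffic back, which is the "autonomy to determine when failover is necessary" mentioned above.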
Branch network reliability, in this way, will support the bank’s reliability as a whole. In turn, this will fuel the higher standards of customer experience needed to compete with more agile digital-only banks.
The new reality of IoT
The first organised response to stop the spread of the virus around the world was social distancing. While transparent screens can be used to block transmission, the overarching effect of these measures has been a loss of communication capabilities. This will affect banks as it has every other sector, if not more so, given how much in-branch interaction matters.
IoT technology will be core to overcoming these barriers. Digital signage, kiosks, and surveillance cameras will all contribute to improved communication and security, and a better customer banking experience. But to enable such extensive use of IoT devices operating on a single network, banks must ensure they can accommodate such high levels of data transfer. By using Gigabit 4G LTE connectivity to extend their services beyond traditional network infrastructure, banks can achieve the required levels of bandwidth.
Hybrid connections managed in the cloud
With high volumes of data being transferred across the network, security and availability should be at the top of the agenda when digitising bank branches. But these are not always easy to implement, especially in an environment with several complex networks of endpoints.
For example, marketing teams need to push personalised content to customers on digital signs and IT teams need to set visitors up on a guest WiFi network. These operations require the guarantee of security and availability, with trust and the customer experience at the core.
Wireless networks excel in this aspect as they can employ the benefits of a cloud-based management system. Cloud-based systems make it easier for bank staff working from home, who can access the same assets and applications from their sofa as they would in-branch. The service is the same.
Cloud management systems also provide improved network visibility, giving IT teams real-time endpoint information from across the network. Security patches can be pushed to devices simultaneously, leaving less time for opportunistic attacks to exploit known vulnerabilities.
Equally, by using a hybrid Gigabit 4G LTE network in tandem with a wired connection, businesses can achieve simplicity from an otherwise complex challenge. The primary wired network can be used to transmit any sensitive information securely, while a separate network using the Gigabit 4G LTE connection runs other in-branch operations.
The branch’s network, in this way, is effectively ‘air-gapped’. The secure data being processed by the operations team runs on an essentially separate network to that of the marketing team’s content. With its workload spread across the two connections, the network can also handle more traffic.
The simplest solutions are often the best. In this case, exploiting a hybrid network can address the complexities of security and availability when employing enterprise-grade connectivity.
Invest now for future 5G rewards
As banks continue to adapt their branches over the course of the pandemic, they should invest in business-wide digitisation to secure a sustainable pathway to the future. To achieve this, banks must ensure their network solution enables carrier-class connectivity. It should make use of the full spectrum of connectivity – 4G LTE to 5G – and offer the full spectrum of 5G bandwidth. Branches aren’t going anywhere soon. They must ensure that their services are optimal now, and in ten years’ time.
Fortune favours the bold, and those who choose to adopt revolutionary and innovative technology early are already on their journey. Learning from this, banks that invest now to improve their future infrastructure will thrive once 5G does arrive. Good things do not come to those who ‘wait’. They come to those who prepare well in advance.
Financial Regulations: How do they impact your cloud strategy?
By Michael Chalmers, MD EMEA at Contino
How exactly do financial regulations affect your cloud strategy? It’s a question many of our customers have been scratching their heads about. Some solutions are costly and over-complex – but by asking the right questions, the wrong solutions can be avoided.
Following the Financial Conduct Authority’s (FCA) 2020 review, it’s clear that highly regulated enterprises must work harder than ever to stay within various limits which protect customers during an economically strenuous pandemic. Below, I address three questions we’re hearing from customers about how to optimise the cloud whilst sticking to FCA regulations.
- What regulations must you consider before outsourcing to a cloud provider?
If you have an application or workload that you’re looking to put into the cloud, you will have various service levels that you’ve defined for that particular stack. When you’re looking at the cloud provider and asking yourself what services to use, you’ll need to consider how that aligns to your service levels. How do I architect it to make sure that it’s aligned and that it can tolerate failure?
At the very start of that journey, before you even start putting your workloads into the cloud, you need to set the standards that you will need to adhere to. The Shared Responsibility Model is a key asset in understanding where your responsibility lies.
There are a number of things that you need to make sure are in your contract with the cloud provider. Unless you specifically sign a contract addendum with them, you can’t guarantee that useful and knowledgeable assistance is included.
While the guidelines are very clear on a number of clauses that you need to put in your contract with the cloud provider, these regulations apply to outsourcing in general. Cloud providers are very mature, so they will come with pre-packed addendums to the standard contract they offer that are customised to comply with FCA regulations. However, if you start outsourcing IT functions in a different way, e.g. if you start using a Software-as-a-Service (SaaS) provider which is delivered using the cloud, the new provider will need to be vetted to make sure that you have the right clauses in your contract with them. While cloud providers are very mature on this, most SaaS tools are not.
- How can you control or restrict where data in the cloud moves?
When it comes to data security, there are various options available on Amazon Web Services (AWS). For example, you can securely lock particular regions into an account on AWS. It’s also worth looking at your account structure policy. If you have accounts where data can’t reside outside the EU, you can put the workloads into that bundle and you can lock it down at policy level. There is an element of trust with the provider here as well.
While AWS offers prescribed controls to block certain regions, other cloud providers have different strategies. In the case of Google Cloud (GCP), you can specify service controls so that, even for managed services such as BigQuery, you can lock your data in not just one region, but within your virtual private cloud. In other words, not only can you block or allow specific regions, you can specify as a general policy that only things within a region can access data within that region.
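As a sketch of the AWS approach, an organisation-level service control policy (SCP) can deny any API call that targets a region outside an approved EU list, using the standard `aws:RequestedRegion` condition key. The region list is an example, and real policies typically carve out exemptions for global services, which are omitted here for brevity.

```python
import json

# Example SCP denying API calls outside two EU regions. The region list
# and statement name are assumptions for illustration.
ALLOWED_EU_REGIONS = ["eu-west-1", "eu-central-1"]

eu_region_lock_scp = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "DenyOutsideEU",
            "Effect": "Deny",
            "Action": "*",
            "Resource": "*",
            "Condition": {
                "StringNotEquals": {"aws:RequestedRegion": ALLOWED_EU_REGIONS}
            },
        }
    ],
}

print(json.dumps(eu_region_lock_scp, indent=2))
```

Attached at the account or organisational-unit level, a policy like this is what "locking it down at policy level" looks like in practice: workloads in that bundle simply cannot create or access resources outside the approved regions.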
- What does the regulator need to see to approve your exit strategy?
In terms of documentation, it’s not a case of “show me your policies and test plan” but rather “show me how you exercise it”.
Most of the time it’s a months-long process and it comes down to personal relationships: you build trust over time with the regulator as you build your exit plan. You should be able to discuss what else they would like to see in there. While there used to be a template for an exit plan in the European Banking Authority (EBA) regulations in a previous version, this has since been removed.
Regulators don’t tend to look at test reports. However, they do place a lot of emphasis on audit reports and auditors. These auditors are there to check you’re doing what you say you’re doing. At the end of the day you are responsible for demonstrating your exit plan – it has to be coherent, consistent and compelling.
The truth is, most of the time, regulators are just trying to encourage you to do what works. That being said, sometimes their outlook doesn’t quite match with your view, or sometimes there’s an artificial difference that can be smoothed over or finessed. Occasionally you have to remember that we had 2008. What if in 2020 we have a massive AWS outage?
Multi-cloud is a natural strategy. There’s a number of high-level, cloud-native services that can be leveraged to simplify the implementation of multi-cloud in large financial institutions. Adhering to the multitude of guidelines can be complex and time-consuming, but forging the right path through the regulations will ensure that your multi-cloud is optimised to provide the most streamlined and efficient service possible to your business.
Post-COVID Mortgage Processing: Ripe for Intelligent Automation to Boost Organisational Resiliency
By Asheesh Mehra, Group CEO and Co-founder, AntWorks
As seen in many other countries, the COVID-19 pandemic sent a shockwave through the UK housing market, bringing the entire sector to a virtual standstill. As lockdown restrictions ease, we are now witnessing a housing boom like no other, as thousands have entered the market looking to capitalise on the UK government’s new stamp duty relief on properties priced up to £500,000. At the same time, however, the economic fallout from this financial crisis has resulted in almost 750,000 people losing their jobs and countless more being furloughed, leading to an increase in property remortgaging requests and payment holidays.
As a result, banks and mortgage companies now find themselves inundated with new mortgage, refinancing and forbearance applications. Processing this plethora of mortgage enquiries manually would demand a drastic increase in manpower. At the same time, the uncertainty of future pandemic impact, and of restrictions being imposed at a local or global level, is leaving the industry under severe pressure to deal with the demand as quickly and effectively as possible.
Like many other industries feeling the impact of the COVID-19 crisis, the mortgage sector needs to digitise its operations in order to scale faster than before or risk being left behind. Artificial Intelligence, deployed in conjunction with intelligent automation, can help ease the burden on mortgage brokers and lenders by accelerating the mortgage loan process and reducing costly errors caused by manual input.
Achieving speed and scale through intelligent automation
Automation is a viable and logical solution for mortgage lenders, as approximately 60–70 per cent of tasks in mortgage processes across the value chain are replicable, prep-based tasks that are suitable candidates for automation. Traditionally, mortgage companies frequently conduct borrowing assessments that require careful analysis and comparison of customer data. This includes checking and establishing customer credit history as well as affordability by manually processing data from income documentation such as bank statements and payslips. This is a tedious but highly necessary process known (rather un-fondly) in the industry as the “stare and compare” stage of mortgage processing.
These tasks require a huge amount of paperwork and form filling, which is not only time-consuming but also prone to human error. Furthermore, in their day-to-day routine, mortgage processors are required to literally unpackage and organise documents that are often in paper formats. This is a laborious process especially when executed across multiple mortgage applications at the same time.
This is where intelligent automation steps in to help mortgage companies take on and complete far more work, at a much faster rate and with higher accuracy. Automation can relieve mortgage workers from repetitive tasks such as manually populating the loan origination systems. This means that customers can get loans approved quickly and efficiently. In fact, a global mortgage provider leveraged the power of automation to increase the speed at which mortgage documents were being generated by up to 90 per cent without compromising the integrity of its review process. What’s more, they also managed to improve the turnaround time for the more complex documents by 100 per cent.
Cognitive Machine Reading (CMR) based solutions are the answer for companies looking to achieve straight-through processing for all their mortgage documents. CMR enables mortgage companies to overcome the challenges of digitising unstructured data and achieve faster ROI with higher accuracy and data certainty. Additionally, it can help mortgage companies to cut loan processing costs by up to two-thirds and mortgage origination time by 50 per cent.
The (fractal) science behind CMR is that it uses integrated AI capabilities to process highly complex unstructured data along with the more basic data formats. This data can then easily flow through the entire organisation via an end-to-end process achieved with little to no human interference.
Inevitably, all business data needs to be digitised so that it can feed analytics, drive automation, and provide those much-needed customer insights. CMR can help eliminate the repetitive and error-prone stare-and-compare tasks that are commonplace in mortgage processing. It is able to identify and process the context of data and validate it against existing information elsewhere. As a result, this speeds up the overall processing time for new mortgage and refinancing requests.
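A minimal sketch of that cross-validation step, assuming income figures have already been extracted from the supporting documents (the field names and 5% tolerance are hypothetical, chosen purely for illustration):

```python
from typing import Dict, List

def validate_declared_income(declared: float,
                             extracted: Dict[str, float],
                             tolerance: float = 0.05) -> List[str]:
    """Flag any extracted income figure that disagrees with the declared
    figure by more than the tolerance - an automated 'stare and compare'."""
    discrepancies = []
    for source, amount in extracted.items():
        if abs(amount - declared) > declared * tolerance:
            discrepancies.append(source)
    return discrepancies

# Figures a document-reading pipeline might have pulled from a payslip
# and a bank statement; both sit within 5% of the declared income.
extracted = {"payslip_net_monthly": 2450.0, "bank_statement_salary_credit": 2410.0}
issues = validate_declared_income(2400.0, extracted)
print(issues or "all sources agree within tolerance")
```

Only the applications flagged with discrepancies would then need a human reviewer, which is where the headline speed-ups come from.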
Avoiding common automation mistakes
Before kickstarting any digital transformation journey or automation projects, it is imperative that businesses look into avoiding the pitfalls of adopting the wrong automation tools. For example, utilising traditional Optical Character Recognition (OCR) technology for business processes can lead to significant data challenges which will slow down and impede automation goals. OCR is a simple data ingestion tool that is limited to only processing and automating structured data that comes in the form of fixed-field text. Given that 80 per cent of the data within most organisations is unstructured or does not have a predefined format (e.g. emails, images, signatures, social media feeds), OCR technology cannot ingest the vast majority of data trapped inside a mortgage process (or any other business process). In order to overcome this and improve its business process outcomes, one leading Insurance provider managed to process large volumes of unstructured data via CMR automation to achieve 75 per cent reduction in the manual data extraction of handwritten documents. Additionally, the company also achieved more than 92 per cent accuracy in identifying and processing handwritten content.
Critical, everyday business data contained in multiple formats such as emails, images, and handwritten material makes up a large part of unstructured data. This is why businesses need to put greater emphasis on researching and identifying intelligent automation solutions that can unlock this data to achieve their business goals. CMR enables mortgage companies to significantly accelerate the identification and classification of all types of documents, cutting the time for processing mortgage claims by 90 per cent with a substantial level of accuracy (75 per cent). What’s more, it enables any organisation to automate at scale, making automation a company-wide approach rather than a segregated one.
The COVID-19 pandemic has accelerated the need for businesses to embrace digital transformation. This may well be the catalyst for many mortgage organisations steeped in antiquated, legacy-based ways of working to refine and streamline their business operations via straight-through processing. It is clear that companies can successfully automate entire business operations to not only improve their operational efficiency but also achieve organisational resilience in the long run. And the faster mortgage lenders tackle their processes now, the sooner they can pass those efficiencies and savings on to customers to help rebuild the economy and bolster the housing market in the UK and elsewhere.