Keith Ricketts, Marketing Director at Becrypt, talks us through the complexities of accessing and sharing data, and explains how, no matter how strong the encryption in use, it is policies, procedures and staff behaviour that ensure a secure system.
As organisations embrace the 24/7 culture, with the sun never setting on global markets, staff are under pressure to cram ever more into their working day. Technology has been a great enabler, with people now working from any location (office, home or on the move) using a variety of devices, including PCs, laptops, tablets and smartphones. Staff now often expect all their devices to be synchronised so that they can access their documents from any device, in any location. While all this is great for employees who want more flexible working arrangements, it is not so good for the Security Officer, who is tasked with safeguarding sensitive data, whether that be intellectual property, commercially sensitive information or client data.
Becrypt recently became the first company to achieve EU certification for its disk encryption products, meaning that organisations wishing to handle EU restricted data can now do so using Becrypt’s DISK Protect. Until now there was no easy way for governments across Europe to share such data. Organisations in the commercial world face similar issues: not only do they need to keep data safe within the organisation, which, with the advent of mobile and home working, is far more complex than it used to be, they also need to be sure that when data is shared with trusted third parties it remains secure.
While encryption and other cyber security products have an important part to play, it is how they are deployed, and the policy framework around them, that ensures a robustly secure system. Modern encryption algorithms are extremely strong and are typically not attacked directly. Instead, cyber criminals, fraudsters and even disgruntled employees simply target a weak point in the system, which is often the end point.
It is vital to have strong data security policies and procedures that are communicated clearly to all employees. Equally important is to consider where your data is. As organisations take advantage of cloud computing, mobile/remote working and bring your own device (BYOD) trends, knowing where your data is is no longer straightforward, and protecting it is therefore that much more complex.
Before any data security solution is implemented, it is important to know what you are trying to achieve. First you must decide policy: who has access to which data, and what can be shared with outside organisations. You need to strike a balance between keeping data confidential, protecting its integrity (ensuring that it has not been tampered with) and providing accessibility.
Organisations need to think about where their data is actually stored and, particularly if using cloud technology, where it may end up. There could also be data protection issues when data crosses national borders. As well as encrypting data in transit to the cloud, should it also be encrypted at rest in the cloud? Could an ex-employee access that data? Processes need to be in place to ensure that this cannot happen.
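To make the at-rest question concrete, here is a minimal sketch, assuming Python and the widely used open-source cryptography library, of encrypting data client-side before it ever reaches a cloud provider. The upload_to_cloud call is a hypothetical placeholder, and key management remains the hard part.

```python
# A minimal sketch of client-side encryption before cloud upload,
# using the open-source 'cryptography' library.
from cryptography.fernet import Fernet

# Generate and securely store a key; whoever holds it controls the data.
key = Fernet.generate_key()
fernet = Fernet(key)

with open("client_records.csv", "rb") as f:
    ciphertext = fernet.encrypt(f.read())

# Only the ciphertext ever leaves the organisation's perimeter.
# upload_to_cloud("client_records.csv.enc", ciphertext)  # hypothetical call

# Revoking an ex-employee's access then becomes a key-management question,
# not a cloud-provider question: rotate or withhold the key.
plaintext = fernet.decrypt(ciphertext)
```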
While much is written about hacking, denial of service and other external attacks, and they are certainly on the increase, for most organisations it is the threat from within, i.e. staff, that poses by far the highest risk to corporate data. Whether malicious, thoughtless or just unlucky, staff will lose laptops, tablets and smartphones, and with them the data they hold. Staff education, and in some cases a complete change in culture and attitudes towards data protection, is needed. Processes must be designed to ensure that policy is maintained, and that staff understand both the policy and why they need to follow it.
Policies should ensure that a balance is maintained between keeping data secure and allowing staff the access they need to do their job. Ideally, people should have access to no more data than they need. It is often the amalgamation of data sets that makes them sensitive: for example, the combination of names, addresses and account numbers. Each data set on its own is fairly meaningless, but combined they become exponentially more sensitive, and more valuable.
Technology can be harnessed to make data security procedures easy for everyone to follow. Encryption that works in the background on desktop PCs, laptops, tablets and even smartphones is now available; once they have logged on, end users are not even aware of it. Solutions exist to ensure that only authorised USB devices can be used to store data, or indeed that certain data cannot be saved to an external device at all. Other remote working solutions enable staff to connect to the corporate network securely, even from a home PC, with no possibility of the network being compromised by malware on that PC and no trace of the work session, including saved data, left behind afterwards. Furthermore, such a secure remote working system can be carried on a USB stick, so employees do not even need to carry a laptop, and because the device itself is encrypted, no data can be accessed if it is lost.
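As an illustration of the allowlisting idea (not of any particular Becrypt product), here is a minimal Python sketch of a policy check that permits only authorised USB devices. The device identifiers are invented, and real products enforce this at driver level rather than in application code.

```python
# A minimal sketch of USB device allowlisting: only devices whose
# (vendor_id, product_id, serial) appear on a corporate allowlist
# may be mounted. All identifiers below are illustrative.
AUTHORISED_DEVICES = {
    ("0x0781", "0x5581", "SN-CORP-00417"),  # issued corporate stick
}

def may_mount(vendor_id: str, product_id: str, serial: str) -> bool:
    """Return True only for devices on the corporate allowlist."""
    return (vendor_id, product_id, serial) in AUTHORISED_DEVICES

print(may_mount("0x0781", "0x5581", "SN-CORP-00417"))  # True: issued device
print(may_mount("0x0781", "0x5581", "SN-PERSONAL-1"))  # False: personal stick
```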
All of the systems described above can be managed from a central console, meaning an engineer does not need to visit each computer to install the system or upgrade software. An audit trail can be produced, so organisations not only comply with requirements such as Sarbanes-Oxley, the Data Protection Act and FSA rules, they can also prove that they have complied. Should any issues arise, the IT department or the security officer can be alerted immediately and access rights revoked remotely if required.
With the consumerisation of IT and staff expecting to use their own devices at work, protecting data is becoming more and more complex. However, there are ways to protect sensitive data and avoid the hefty fines and bad publicity that come with a breach. Harnessing technology, adopting common sense policies and educating staff on the dangers of not treating data with the respect it deserves will go a long way towards staying out of the headlines and ensuring that intellectual property, commercially sensitive information and valuable customer data remain protected.
VP Bank Selects AxiomSL to Meet Multi-Jurisdictional Risk and Regulatory Reporting Requirements
Consolidates bank’s reporting on a single platform for financial/statistical, AnaCredit, and CRR2/Basel-driven mandates including ICAAP and ILAAP, and provides foundation for strategic expansion
AxiomSL, the industry’s leading provider of risk and regulatory reporting solutions, today announces that VP Bank, one of the largest banks in Liechtenstein, has selected AxiomSL’s ControllerView® data integrity and control platform as the foundation for its risk and regulatory compliance across Liechtenstein, Luxembourg, Singapore and Switzerland, encompassing financial and statistical reporting such as CSSF, FINMA, AnaCredit for EBA, MAS 610 for Singapore, and CRR2- and BCBS-driven requirements including ICAAP and ILAAP for FMA.
The high-performance, fully integrated, data-driven platform will enable VP Bank to manage an array of risk and regulatory mandates on a single platform, with full transparency across all processes from ingestion, calculation, reconciliation, and validation to submission. VP Bank will use the platform strategically to further data harmonization, streamline processes, enhance automation, bolster internal controls, and strengthen risk and regulatory reporting across the enterprise.
“Selecting AxiomSL will enhance the value of our investment in regulatory technology, optimize efficiency, and deliver business insights,” stated Robert Kilga, Head of Group Financial Management & Reporting, VP Bank. “With AxiomSL’s single platform, we can ingest data in its native format from multiple sources thus creating synergies between capital, liquidity, and other business functions enterprise-wide,” he continued. “AxiomSL’s system provides intuitive, hands-on transparency into all processes from inception to filing, enhancing our confidence in the data integrity and auditability of our reporting, and enabling us to meet ever-changing regulatory requirements”.
“We are thrilled that VP Bank, such a well-respected institution, has joined our esteemed user community in the DACH region and globally,” said Claudia Thurner, EMEA General Manager, AxiomSL. “In these times of global uncertainty, complying with a wide range of regulatory and risk requirements across jurisdictions is more complex, data intensive, and time sensitive than ever. Financial institutions require a reliable technology partner who can provide global coverage while understanding the intricacies of local and regional regulatory demands,” Thurner continued. “Our industry and technical expertise will enable VP Bank to streamline their processes, scale faster, and adapt swiftly and confidently to change. We look forward to a strong and strategic collaboration with VP Bank in support of their vision and growth journey”.
With the upcoming Basel IV-driven expansion, financial institutions like VP Bank face the next generation of capital requirements, which can easily overwhelm systems that lack the data transparency, proper methodologies and controls to perform calculations accurately across all risk types. These calculations may have a profound effect on a bank’s portfolio management and even its entire business model.
To address these challenges, AxiomSL’s Basel Capital Solution incorporates a flexible data dictionary architecture, seamless calculation updates, full drilldown to data and processes, transparency into model calculations, and dynamic data lineage. In addition, AxiomSL’s regulatory experts provide VP Bank with a highly efficient change-management mechanism that enables them to be current with all Basel-driven changes.
Uncertain Times for the Financial Sector… Is Open Source the Solution?
By Kris Sharma, Finance Sector Lead, Canonical
Financial services are an important part of the economy and play a wider role in providing liquidity and capital across the globe. But ongoing political uncertainty and the consequences of the COVID-19 crisis have deep implications for the UK’s financial services sector.
In a post-Brexit world, the industry is facing regulatory uncertainty at a whole different scale, with banking executives having to understand the implications of different scenarios, including no-deal. To reduce the risk of significant disruption, financial services firms require the right technology infrastructure to be agile and responsive to potential changes.
The role of open source
Historically, banks have been hesitant to adopt open source software. But over the course of the last few years, that thinking has begun to change. Organisations like the Open Bank Project and Fintech Open Source Foundation (FINOS) have come about with the aim of pioneering open source adoption by highlighting the benefits of collaboration within the sector. Recent acquisitions of open source companies by large and established corporate technology vendors signal that the technology is maturing into mainstream enterprise play. Banking leaders are adopting open innovation strategies to lower costs and reduce time-to-market for products and services.
Banks must prepare to rapidly implement changes to IT systems in order to comply with new regulations, which may be a costly task if firms are solely relying on traditional commercial applications. Changes to proprietary software and application platforms at short notice often have hidden costs for existing contractual arrangements due to complex licensing. Open source technology and platforms could play a crucial role in helping financial institutions manage the consequences of Brexit and the COVID-19 crisis for their IT and digital functions.
Open source software gives customers the ability to spin up instances far more quickly and respond effectively to rapidly changing scenarios. Container technology has brought about a step-change in virtualisation, providing almost equivalent levels of resource isolation to a traditional hypervisor. This in turn offers considerable opportunities to improve agility, efficiency, speed and manageability within IT environments. In a survey conducted by 451 Research, almost a third of financial services firms identified containers and container management as a priority they plan to begin using within the next year.
Containerisation also enables rapid deployment and updating of applications. Kubernetes, or K8s for short, is an open-source container-orchestration system for deploying, monitoring and managing apps and services across clouds. Originally designed by Google and now maintained by the Cloud Native Computing Foundation (CNCF), it is a shining example of open source: developed by a major tech company, then handed to the community for all, including financial institutions, to adopt.
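As a flavour of what orchestration looks like in practice, below is a minimal sketch using the official kubernetes Python client to deploy three replicas of a containerised service; the image name and labels are hypothetical.

```python
# A minimal sketch: deploy three replicas of a containerised service
# with the official 'kubernetes' Python client. Image and labels are
# hypothetical placeholders.
from kubernetes import client, config

config.load_kube_config()  # reads credentials from ~/.kube/config

container = client.V1Container(
    name="payments-api",
    image="registry.example.com/payments-api:1.0",  # hypothetical image
    ports=[client.V1ContainerPort(container_port=8080)],
)
deployment = client.V1Deployment(
    metadata=client.V1ObjectMeta(name="payments-api"),
    spec=client.V1DeploymentSpec(
        replicas=3,  # scaling means patching this one field
        selector=client.V1LabelSelector(match_labels={"app": "payments-api"}),
        template=client.V1PodTemplateSpec(
            metadata=client.V1ObjectMeta(labels={"app": "payments-api"}),
            spec=client.V1PodSpec(containers=[container]),
        ),
    ),
)
client.AppsV1Api().create_namespaced_deployment(namespace="default", body=deployment)
```

Kubernetes then keeps the declared and actual states in sync: if a node fails, the replicas are rescheduled elsewhere without operator intervention.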
The data dilemma
The use cases for data and analytics in financial services are endless and offer tangible solutions to the consequences of uncertainty. Massive data assets mean that financial institutions can more accurately gauge the risk of offering a loan to a customer. Banks are already using data analytics to improve efficiency and increase productivity, and going forward, will be able to use their data to train machine learning algorithms that can automate many of their processes.
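To make that concrete, the following is a minimal sketch, on entirely synthetic data, of the kind of loan-default classifier the paragraph alludes to. The features, figures and labelling rule are invented for illustration; a production model would need far richer data plus validation and explainability controls.

```python
# A minimal sketch of a loan-default classifier on synthetic data,
# using the open-source scikit-learn library.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 1000
X = np.column_stack([
    rng.normal(50_000, 15_000, n),   # annual income (invented)
    rng.uniform(0.0, 1.0, n),        # credit utilisation ratio
    rng.integers(0, 30, n),          # years of credit history
])
# Synthetic labelling rule: high utilisation plus low income -> default.
y = ((X[:, 1] > 0.7) & (X[:, 0] < 45_000)).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(f"held-out accuracy: {model.score(X_test, y_test):.2f}")
```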
For data analytics initiatives, banks now have the option of leveraging the best of open source technologies. Databases today can deliver insights and handle any new sources of data. With models flexible enough for rich modern data, a distributed architecture built for cloud scale, and a robust ecosystem of tools, open source platforms can help banks break free from data silos and enable them to scale their innovation.
Open source databases can be deployed and integrated in the environment of choice, whether public or private cloud, on-premise or containers, based on business requirements. These database platforms can be cost effective; projects can begin as prototypes and develop quickly into production deployments. As a result of political uncertainty, financial firms will need to be much more agile. And with no vendor lock-in, they will be able to choose the provider that is best for them at any point in time, enabling this agility while avoiding expensive licensing.
As with any application running at scale, production databases and analytics applications require constant monitoring and maintenance. Engaging enterprise support for open source production databases minimises risk for business and can optimise internal efficiency.
Additionally, AI solutions have the potential to transform how banks deal with regulatory compliance issues, financial fraud and cybercrime. However, banks need to get better at using customer data for greater personalisation, enabling them to offer products and services tailored to individual consumers in real time. As yet, most financial institutions are unsure whether a post-Brexit world will focus on gaining more overseas or UK-based customers. With a data-driven approach, banks can see where the opportunities lie and how best to harness them. The opportunities are vast and, on the journey to deliver cognitive banking, financial institutions have only just scratched the surface of data analytics. But as the consequences of COVID-19 continue and Brexit uncertainty once again moves up the agenda, moving to data-first will become less of a choice and more of a necessity.
The number of data sets and the diversity of data is increasing across financial services, making data integration tasks ever more complex. The cloud offers a huge opportunity to synchronise the enterprise, breaking down operational and data silos across risk, finance, regulatory, customer support and more. Once massive data sets are combined in one place, the organisation can apply advanced analytics for integrated insights.
Uncertainty on the road ahead
Open source technology today is an agile and responsive alternative to traditional technology systems that provides financial institutions with the ability to deal with uncertainty and adapt to a range of potential outcomes.
In these unpredictable times, banking executives need to achieve agility and responsiveness while at the same time ensuring that IT systems are robust, reliable and managed effectively. And with the option to leverage the best of open source technologies, financial institutions can face whatever challenges lie ahead.
The end of the cookie and the new era of digital marketing
By Richard Wheaton, UK MD of data company fifty-five
If you are following the current announcements around data governance in digital marketing, you may be forgiven for thinking that digital media performance measurement is coming to an end. Europe’s GDPR and California’s CCPA have set out the legal frameworks for new privacy-compliance requirements, and Apple’s ITP safeguards block the use of 3rd party cookies on its devices. Google has committed to follow suit, with similar blocks in its Chrome browser by 2022.
There are going to be some online targeting tactics that you will not be able to deploy in this new world, and for users with iPhones and Apple Mac computers, those tactics are already significantly curtailed. By 2022, a whole raft of use cases in targeting tools such as DMPs (Data Management Platforms) will be rendered useless on the vast majority of devices.
So, what can you do? Are we set back to the days of purely contextual targeting, in which you bought the “eyeballs” of customers of a specific financial product or a given industrial sector by advertising in their vertical trade journal? Do we just have to send the same ad to everyone, and hope that some of them are in the market for our niche products?
Moving towards the end of 3rd party cookies
For marketers of financial products and services, these are important questions, because managing your audiences is key to avoiding waste and to addressing engaged clients and prospects in targeted, effective ways. The ability to track and remarket to customers has been a mainstay of digital marketing for the past decade, all enabled by the dropping of cookies – files that sit on your PC and record your browsing activity. With moves from regulators and tech companies to protect user anonymity, the cookie, while not quite dead, is certainly crumbling.
The upshot is that marketers can no longer rely on 3rd party data, and should be wary of how any use of it will be viewed by the regulator. This renders even the most basic consumer journey, starting with a mobile search and ending with a desktop ecommerce transaction, hard to measure in many cases. But the use of anonymized 1st party data can still yield critical insights, and this is where we need to focus our efforts.
From individual to anonymized data
So, what has fundamentally changed in digital measurement? The overarching shift is from a world where brands could access the log-file data of each individual user to a world of anonymized user behaviour, based on vast pools of data.
The ownership of these data pools, and the ability to collect and model them, sits with the brand that has the direct relationship with the consumer. For large banks, insurance and pension companies, this is a resource for efficient and effective digital marketing, based on aggregated insights. For the adtech giants – Google, Amazon, Facebook – it will provide increased levels of targeting options within their tools.
The adtech identity graph
Google is in some ways the most advanced in delivering a scalable solution, and its approach is worth examining. Google uses an undisclosed, supposedly huge sample of users for whom it has first collected the right consent. It can accurately track these users across publishers, devices and even offline channels, and from this sample it extrapolates to the entire population, applying machine learning or more traditional online polling. The resulting report is always aggregated and only available above a certain volume threshold. Facebook and Amazon are developing similar aggregated audiences for targeting against their inventory.
These audience aggregation capabilities may appear similar to Nielsen’s panel-based measurement model, but this really is digital marketing on rocket fuel. It allows brands to leverage the tech giants’ enormous identity graphs and universal app tracking capabilities. And because those identity graphs are vast and data collection happens in real time, the final estimate is not only accurate but, above all, granular and actionable – overcoming the two main limitations of traditional panel-based systems.
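The volume-threshold principle is easy to illustrate. Below is a minimal Python sketch, with invented data and an assumed threshold of 50 users, of the reporting pattern described above: only aggregates are returned, and any segment too small to be safely reported is suppressed.

```python
# A minimal sketch of threshold-based aggregated reporting: aggregate
# per segment, then suppress segments below a minimum volume so that
# individuals cannot be singled out. Data and threshold are invented.
import pandas as pd

MIN_SEGMENT_SIZE = 50  # assumed reporting threshold

sessions = pd.DataFrame({
    "segment":   ["18-24"] * 30 + ["25-34"] * 120 + ["35-44"] * 80,
    "converted": [0, 1] * 15 + [0, 0, 1] * 40 + [0, 1] * 40,
})

report = (
    sessions.groupby("segment")
    .agg(users=("converted", "size"), conv_rate=("converted", "mean"))
)
# Drop segments too small to report; they would risk re-identification.
report = report[report["users"] >= MIN_SEGMENT_SIZE]
print(report)  # the 30-user "18-24" segment is suppressed
```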
Google launches machine-learning managed ad frequency
Practical applications of these tools are valuable to marketers. For example, Google recently announced a new machine-learning-powered feature to manage ad frequency across its Google Ad Manager platform. It predicts how many times a user was exposed to an ad, for reach and frequency analysis, based on Google’s undisclosed sample of users it can track with full certainty.
This announcement follows similar releases across the entire stack. ‘Cross-environment estimated conversions’ will show you when customers convert on the same device, as well as when they click a search ad on one device or browser but convert in a different environment. These are rich insights enabling you to optimize your campaigns.
Integrating the new reports into decision-making
Financial marketers will use these new features as they become available in the Google Marketing Platform, but it is also vital for these decision-makers to maintain their own measures of effectiveness and performance. Most of the features above are still in beta, rarely compatible with third-party systems and often difficult to export. And while many media agencies are judged on ‘cost per acquisition’, that metric is far harder to assess in a world where the conversion is estimated and therefore not auditable. This makes it essential for brand marketers to build their own measurement frameworks and integrate the insights into their own sales and demand trackers.
The secret lies in your 1st party data, which is comparatively less affected than 3rd party data (subject to a one-day limit in Safari and as yet unaffected in Chrome). There is still scope for projects that focus on maximizing how you capture and use your first party data, both to employ smarter segmentation in your media activation, and to obtain insights by separating out your Chrome traffic from your ITP-affected lines in order to tailor your campaign activation and measurement.
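As a simple illustration of separating ITP-affected traffic from the rest, here is a minimal Python sketch that splits analytics rows by user agent. The naive substring check is purely illustrative; real user-agent parsing is messier and better handled by a dedicated library.

```python
# A minimal sketch: split first-party analytics rows into ITP-affected
# (Safari/iOS) and non-ITP (e.g. Chrome) lines for separate analysis.
def is_itp_affected(user_agent: str) -> bool:
    """Safari UAs contain 'Safari' but not 'Chrome'; Chrome UAs contain both."""
    return "Safari" in user_agent and "Chrome" not in user_agent

hits = [
    "Mozilla/5.0 (Macintosh) AppleWebKit/605.1.15 Version/13.1 Safari/605.1.15",
    "Mozilla/5.0 (Windows NT 10.0) AppleWebKit/537.36 Chrome/84.0 Safari/537.36",
]
itp_lines = [ua for ua in hits if is_itp_affected(ua)]
chrome_lines = [ua for ua in hits if not is_itp_affected(ua)]
print(len(itp_lines), len(chrome_lines))  # 1 1
```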
It will also be valuable for marketers in finance sectors to use their own data to build better targeting capabilities, and to monitor and validate the information they are getting from the large tech companies, whose insights are otherwise largely unauditable.
There is also a time limit on the availability of these measurable insights: when Google removes open-ended access to your 1st party cookies, brands’ ability to obtain them will be reduced to almost zero.
A measurement mindset
While people might mourn the loss of the present system, in truth very few brands have had the discipline or technical knowledge to make their measurement truly insightful. In reality, even for brands that made the effort to obtain genuinely robust data, the figures in their analyses were never 100% accurate, with data discrepancies and ad fraud skewing the findings and painting a false picture. Some will also be uncomfortable with the concentration of power among the big tech companies, but the options for working outside their platforms will become increasingly few.
In the post-cookie world, what is really required is a change of mindset when measuring performance, which will be as important to you as the tools and technology that you use. We are in a new era and those who are prepared to adapt their thinking are best placed to succeed.