
Technology

‘Generation Tech’ – young, gifted but a long way from bad

Young employees take more risks with software. This doesn’t have to be a problem
By Paul Kenyon, COO, Avecto

They have been variously described as technology’s ‘Generation Y’ or ‘Generation Tech’: an undisciplined, impulsive, entitled horde of twenty-something workers that older heads are inclined to see as one of the biggest security challenges ever to hit corporate networks.

Having grown up in an age of lurching software advances, ubiquitous communication and social networking, this is not a group easily dissuaded from using any and every application by the old reasoning that software can be a ‘bit risky’. The same applies to their attitude to ‘bring your own device’ (BYOD), a trend driven by the basic social reality that workers of all age groups now depend on personal devices such as smartphones and tablets and won’t take happily to the idea of being asked to leave them at home.

If the ‘Generation Y’ label sounds a bit glib there is a small but growing body of evidence that a worker’s age does play some role in shaping attitudes to technology. A recent survey by Avecto of 1,500 IT admins visiting the TechEd US and European conferences found that workers between the ages of 20 and 35 – the Gen Y demographic – were seen by 80 percent of professionals as posing a formidable obstacle to application security.

Why? The tendency of this group to download unauthorised apps was the first big concern, with nearly forty percent of admins reporting a malware incident caused by this behaviour. Three quarters of admins weren’t even sure how many unauthorised applications had been downloaded, which makes the damage caused almost impossible to quantify.

Enough already
It’s not necessarily that older workers don’t engage in risky behaviour as well, but that Generation Y is perhaps more active and confident in finding applications for themselves, and utterly convinced of their right and need to have them. The survey implied that many admins try to cope with this by ‘flying blind’: managing assertive users with manual procedures based on assumptions and trust. Without tools they have no obvious alternative.

Because Windows often demands admin privileges when installing or updating even quite basic applications and add-ons, the easiest, if most extreme, response is either to grant such privileges in full or to block them completely. Some incorrectly assume that only esoteric apps still ask for admin rights, but this is far from the truth. Here are a few common examples that will ask for privilege elevation:

•  Java
•  Flash Installer/Updater
•  Apple iTunes
•  Google Chrome
•  Firefox
•  Adobe Acrobat Updater
•  Skype
•  Blackberry Desktop Manager
•  Citrix GoToMeeting
•  Cisco WebEx
•  HP Universal Printer Driver
•  VLC Media Player
•  Adobe AIR

To this should be added countless examples of legacy and bespoke applications. Blocking or enabling offers certainty but is counter-productive: enabling privileges allows dangerous applications to run at will, while removing them stops legitimate and even necessary ones from running at all.

The common solution to this software checkmate, available since Windows Vista, is to allow privilege escalation on demand through User Account Control (UAC), but this too comes at a price: admins are bombarded with requests for passwords to elevate application privileges, without the visibility to know whether a specific request is justified. Generation Y, meanwhile, is frustrated at even having to ask.
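As a rough illustration of the standard-user versus admin divide described above, the short Python sketch below uses only the standard ctypes module and the Win32 IsUserAnAdmin call to report whether the current process is already elevated; the messages and exit code are illustrative rather than part of any product.

import ctypes
import sys

def is_elevated() -> bool:
    # True if the process already holds administrative rights (Windows only).
    try:
        return bool(ctypes.windll.shell32.IsUserAnAdmin())
    except AttributeError:
        # ctypes.windll does not exist on non-Windows platforms.
        return False

if __name__ == "__main__":
    if is_elevated():
        print("Elevated: installers and updaters will run without a UAC prompt.")
    else:
        print("Standard user: privileged operations will trigger a UAC prompt.")
        sys.exit(1)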

The Windows 7 ‘moment’
Migration to Windows 7 has turned out to be the moment at which organisations reassessed hardened assumptions about the way employees use and access applications, and a growing number have concluded that the rational response is to invest in least privilege management. Under this design, users can request admin privileges for an application on a case-by-case basis, after authenticating themselves, in a way that gives admins audited oversight.

The user is given the privileges he or she needs and can use applications on demand with the added benefit that admins are given some visibility into which new applications are finding their way on to the ‘required’ list of the workforce. These rights can be revoked when they are no longer needed, which could be as little as minutes later.
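To make the idea concrete, here is a deliberately simplified, hypothetical Python sketch of the request-authenticate-grant-log-revoke flow; the function and field names are invented for illustration and do not correspond to Avecto’s or any other vendor’s API.

import datetime

AUDIT_LOG = []                                  # in practice a central, tamper-evident store
APPROVED_APPS = {"java_updater.exe", "goto_meeting.exe"}   # illustrative policy list

def request_elevation(user: str, application: str, reason: str) -> bool:
    # Grant temporary admin rights for a single application and record the decision.
    granted = application in APPROVED_APPS
    AUDIT_LOG.append({
        "timestamp": datetime.datetime.utcnow().isoformat(),
        "user": user,
        "application": application,
        "reason": reason,
        "granted": granted,
    })
    return granted

if request_elevation("jsmith", "java_updater.exe", "quarterly security update"):
    print("Elevation granted for this task only; it can be revoked minutes later.")

Reviewing a log like AUDIT_LOG over time is what gives admins the visibility, mentioned above, into which applications keep appearing on the workforce’s ‘required’ list.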

This model overcomes the unhelpful cultural barrier that can spring up between those whose job it is to administer software and employees who might be asking for unsanctioned but potentially beneficial applications admins haven’t even heard of.

There’s no simple answer to identifying which applications might be beneficial and which will turn out to be a productivity-sapping chore.  It depends on the type of organisation and the specific set of workers. Where might red lines be drawn?
In the blocked group will sit applications that are obviously malign (i.e. malware), illegal or simply inconvenient (e.g. bandwidth-hungry P2P or video), but in truth the overwhelming majority will be tagged rather unhelpfully as ‘grey’, their status unknown.

A good example of this is Skype, deemed appropriate for some users and organisations but not for others required to meet regulatory constraints that an encrypted channel into and out of the organisation clearly infringes. It just depends. With application and privilege management, admins will at least have an overview of an application’s popularity inside the organisation, the better to make an informed decision.

Opportunity not threat
From the point of view of traditional, centralised IT, BYOD and consumer software are inherently difficult to assimilate. Admins are instinctively wary, and with good reason: in conventional IT, users are the source of most problems, starting with the misuse of software. But here’s an intriguing thought: far from being negative and risky, perhaps the way Generation Y adopts new applications could have long-term benefits, if a way can be found to accommodate the behaviour.

It’s tempting to see the gulf that has grown up between admins and users in some organisations as a culture clash of two age groups: the LAN Generation (let’s call them ‘Generation X’, since the label conveniently references people born in the 1960s) and the younger Generation Y that has been the subject of this feature.

This would be a mistake, although it does neatly capture the different attitudes of workers who grew up with the PC and the Internet in the 1980s and 1990s, for whom the challenge was simple: get things to work. Years on, for Generation Y the challenge is less technical than social: how to change the way things work.

Age, then, is better seen as a motif for the divisions that grow up in all organisations: between hierarchies, and between those whose job it is to manage and those who carry out the organisation’s most basic functions and look for as many short cuts as possible.

What the emergence of Generation Tech suggests is that technology has changed in ways that offer huge benefits and the best response is to adapt rather than deny, and to involve workers in choosing and developing applications rather than turning them into slaves to the UAC prompt and login box.

Applications are not the enemy, and neither are the people who use (or want to use) them. They are the managers of tomorrow and the future of any organisation that wants to stick around.

Technology

VP Bank Selects AxiomSL to Meet Multi-Jurisdictional Risk and Regulatory Reporting Requirements


Consolidates bank’s reporting on a single platform for financial/statistical, AnaCredit, and CRR2/Basel-driven mandates including ICAAP and ILAAP, and provides foundation for strategic expansion

AxiomSL, the industry’s leading provider of risk and regulatory reporting solutions, today announces that VP Bank, one of the largest banks in Liechtenstein, has selected AxiomSL’s ControllerView® data integrity and control platform as a foundation for its risk and regulatory compliance across Liechtenstein, Luxembourg, Singapore and Switzerland – encompassing financial and statistical reporting such as CSSF, FINMA, AnaCredit for EBA, MAS 610 for Singapore, and CRR2- and BCBS-driven requirements including ICAAP and ILAAP for FMA.

The high-performance, fully integrated, data-driven platform will enable VP Bank to manage an array of risk and regulatory mandates on a single platform, with full transparency across all processes from ingestion, calculation, reconciliation, and validation to submission. VP Bank will use the platform strategically to further data harmonization, streamline processes, enhance automation, bolster internal controls, and strengthen risk and regulatory reporting across the enterprise.

“Selecting AxiomSL will enhance the value of our investment in regulatory technology, optimize efficiency, and deliver business insights,” stated Robert Kilga, Head of Group Financial Management & Reporting, VP Bank. “With AxiomSL’s single platform, we can ingest data in its native format from multiple sources thus creating synergies between capital, liquidity, and other business functions enterprise-wide,” he continued. “AxiomSL’s system provides intuitive, hands-on transparency into all processes from inception to filing, enhancing our confidence in the data integrity and auditability of our reporting, and enabling us to meet ever-changing regulatory requirements”.

“We are thrilled that VP Bank, such a well-respected institution, has joined our esteemed user community in the DACH region and globally,” said Claudia Thurner, EMEA General Manager, AxiomSL. “In these times of global uncertainty, complying with a wide range of regulatory and risk requirements across jurisdictions is more complex, data intensive, and time sensitive than ever. Financial institutions require a reliable technology partner who can provide global coverage while understanding the intricacies of local and regional regulatory demands,” Thurner continued. “Our industry and technical expertise will enable VP Bank to streamline their processes, scale faster, and adapt swiftly and confidently to change. We look forward to a strong and strategic collaboration with VP Bank in support of their vision and growth journey”.

With the upcoming Basel IV-driven expansion, financial institutions like VP Bank are faced with the next generation of capital requirements that can easily overwhelm systems if they lack the data transparency, proper methodologies and controls to perform calculations accurately across all risk types. These calculations may have a profound effect on the banks’ portfolio management and even the entire business model.

To address these challenges, AxiomSL’s Basel Capital Solution incorporates a flexible data dictionary architecture, seamless calculation updates, full drilldown to data and processes, transparency into model calculations, and dynamic data lineage. In addition, AxiomSL’s regulatory experts provide VP Bank with a highly efficient change-management mechanism that enables them to be current with all Basel-driven changes.


Technology

Uncertain Times for the Financial Sector… Is Open Source the Solution?


By Kris Sharma, Finance Sector Lead, Canonical

Financial services are an important part of the economy and play a wider role in providing liquidity and capital across the globe. But ongoing political uncertainty and the consequences of the COVID-19 crisis have deep implications for the UK’s financial services sector.

In a post-Brexit world, the industry is facing regulatory uncertainty at a whole different scale, with banking executives having to understand the implications of different scenarios, including no-deal. To reduce the risk of significant disruption, financial services firms require the right technology infrastructure to be agile and responsive to potential changes.

The role of open source

Historically, banks have been hesitant to adopt open source software. But over the course of the last few years, that thinking has begun to change. Organisations like the Open Bank Project and Fintech Open Source Foundation (FINOS) have come about with the aim of pioneering open source adoption by highlighting the benefits of collaboration within the sector. Recent acquisitions of open source companies by large and established corporate technology vendors signal that the technology is maturing into mainstream enterprise play. Banking leaders are adopting open innovation strategies to lower costs and reduce time-to-market for products and services.

Banks must prepare to rapidly implement changes to IT systems in order to comply with new regulations, which may be a costly task if firms are solely relying on traditional commercial applications. Changes to proprietary software and application platforms at short notice often have hidden costs for existing contractual arrangements due to complex licensing. Open source technology and platforms could play a crucial role in helping financial institutions manage the consequences of Brexit and the COVID-19 crisis for their IT and digital functions.

Open source software gives customers the ability to spin up instances far more quickly and respond effectively to rapidly changing scenarios. Container technology has brought about a step-change in virtualisation, providing almost the same level of resource isolation as a traditional hypervisor. This in turn offers considerable opportunities to improve agility, efficiency, speed, and manageability within IT environments. In a survey conducted by 451 Research, almost a third of financial services firms identified containers and container management as a priority, with plans to begin using them within the next year.

Containerisation also enables rapid deployment and updating of applications. Kubernetes, or K8s for short, is an open-source container-orchestration system for deploying, monitoring and managing apps and services across clouds. It was originally designed by Google and is now maintained by the Cloud Native Computing Foundation (CNCF). Kubernetes is a shining example of open source, developed by a major tech company, but now maintained by the community for all, including financial institutions, to adopt.
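As a flavour of how that orchestration is driven in practice, the minimal sketch below uses the official Kubernetes Python client to list the Deployments in a namespace together with their ready replicas; it assumes the kubernetes package is installed and a valid kubeconfig is available, and the "default" namespace is simply an example.

from kubernetes import client, config

def list_deployments(namespace: str = "default") -> None:
    # Print each Deployment in the namespace with its ready-replica count.
    config.load_kube_config()          # authenticate using the local kubeconfig
    apps_v1 = client.AppsV1Api()
    for dep in apps_v1.list_namespaced_deployment(namespace).items:
        ready = dep.status.ready_replicas or 0
        print(f"{dep.metadata.name}: {ready}/{dep.spec.replicas} replicas ready")

if __name__ == "__main__":
    list_deployments("default")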

The data dilemma


The use cases for data and analytics in financial services are endless and offer tangible solutions to the consequences of uncertainty. Massive data assets mean that financial institutions can more accurately gauge the risk of offering a loan to a customer. Banks are already using data analytics to improve efficiency and increase productivity, and going forward, will be able to use their data to train machine learning algorithms that can automate many of their processes.

For data analytics initiatives, banks now have the option of leveraging the best of open source technologies. Databases today can deliver insights and handle any new sources of data. With models flexible enough for rich modern data, a distributed architecture built for cloud scale, and a robust ecosystem of tools, open source platforms can help banks break free from data silos and enable them to scale their innovation.

Open source databases can be deployed and integrated in the environment of choice, whether public or private cloud, on-premise or containers, based on business requirements. These database platforms can be cost effective; projects can begin as prototypes and develop quickly into production deployments. As a result of political uncertainty, financial firms will need to be much more agile. And with no vendor lock-in, they will be able to choose the provider that is best for them at any point in time, enabling this agility while avoiding expensive licensing.
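By way of example, connecting to and querying an open source database such as PostgreSQL needs nothing more than a standard client library; in the Python sketch below, the host, credentials and table are placeholders invented for illustration, not a reference to any real system.

import psycopg2

conn = psycopg2.connect(
    host="db.internal.example",   # could equally be a container or a managed cloud instance
    dbname="risk",
    user="analyst",
    password="change-me",
)
try:
    with conn.cursor() as cur:
        cur.execute("SELECT counterparty, SUM(exposure) FROM positions GROUP BY counterparty")
        for counterparty, exposure in cur.fetchall():
            print(counterparty, exposure)
finally:
    conn.close()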

As with any application running at scale, production databases and analytics applications require constant monitoring and maintenance. Engaging enterprise support for open source production databases minimises risk for business and can optimise internal efficiency.

Additionally, AI solutions have the potential to transform how banks deal with regulatory compliance issues, financial fraud and cybercrime. However, banks need to get better at using customer data for greater personalisation, enabling them to offer products and services tailored to individual consumers in real time. As yet, most financial institutions are unsure whether a post-Brexit world will focus on gaining more overseas or UK-based customers. With a data-driven approach, banks can see where the opportunities lie and how best to harness them. The opportunities are vast and, on the journey to deliver cognitive banking, financial institutions have only just scratched the surface of data analytics. But as the consequences of COVID-19 continue and Brexit uncertainty once again moves up the agenda, moving to data-first will become less of a choice and more of a necessity.

The number of data sets and the diversity of data is increasing across financial services, making data integration tasks ever more complex. The cloud offers a huge opportunity to synchronise the enterprise, breaking down operational and data silos across risk, finance, regulatory, customer support and more. Once massive data sets are combined in one place, the organisation can apply advanced analytics for integrated insights.
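A trivial illustration of that combine-then-analyse step, using the open source pandas library: the file names and columns below are invented, but the pattern – join two formerly siloed data sets, then derive an integrated metric – is the point.

import pandas as pd

risk = pd.read_csv("risk_positions.csv")        # e.g. columns: customer_id, exposure
support = pd.read_csv("support_tickets.csv")    # e.g. columns: customer_id, open_tickets

combined = risk.merge(support, on="customer_id", how="left").fillna({"open_tickets": 0})

# One integrated insight: average exposure of customers with unresolved issues.
flagged = combined[combined["open_tickets"] > 0]
print("Average exposure with open tickets:", flagged["exposure"].mean())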

Uncertainty on the road ahead

Open source technology today is an agile and responsive alternative to traditional technology systems that provides financial institutions with the ability to deal with uncertainty and adapt to a range of potential outcomes.

In these unpredictable times, banking executives need to achieve agility and responsiveness while at the same time ensuring that IT systems are robust, reliable and managed effectively. And with the option to leverage the best of open source technologies, financial institutions can face whatever challenges lie ahead.


Technology

The end of the cookie and the new era of digital marketing


By Richard Wheaton, UK MD of data company fifty-five

If you are following the current announcements around data governance in digital marketing, you may be forgiven for thinking that digital media performance measurement is coming to an end. Europe’s GDPR and California’s CCPA have set out the legal frameworks for new privacy-compliance requirements, and Apple’s Intelligent Tracking Prevention (ITP) blocks the use of 3rd party cookies on its devices. Google is committed to following suit, with similar blocks for its Chrome browser by 2022.

There are going to be some online targeting tactics that you will not be able to deploy in this new world, and for users with iPhones and Apple Mac computers, those tactics are already significantly curtailed. By 2022 there will be a whole raft of use cases in targeting tools like DMPs (Data Management Platforms) that will be rendered useless on the vast majority of devices.

So, what can you do? Are we set back to the days of purely contextual targeting, in which you bought the “eyeballs” of customers of a specific financial product or a given industrial sector by advertising in their vertical trade journal? Do we just have to send the same ad to everyone, and hope that some of them are in the market for our niche products?

Moving towards the end of 3rd party cookies

For marketers of financial products and services, these are important questions, because managing your audiences is essential for avoiding waste and for addressing your engaged clients and prospects in targeted and effective ways. The ability to track and remarket to customers has been a mainstay of digital marketing for the past decade. This has all been enabled by the dropping of cookies – files that sit on your PC and record your browsing activity. With moves from regulators and tech companies to protect user anonymity, the cookie, while not quite dead, is certainly crumbling.

The upshot is that marketers cannot rely on 3rd party data anymore and should be wary of how any use of 3rd party data will be viewed by the regulator. This renders even the most basic consumer journey starting with a mobile search and ending with a desktop ecommerce transaction hard to measure in many cases. But the use of anonymized 1st party data can still yield critical insights, and this is where we need to focus our efforts.

From individual to anonymized data

So, what has fundamentally changed in digital measurement? The overarching shift is from a world where brands could access the log-file data of each individual user to a world of anonymized user behaviour, based on vast pools of data.

The ownership of these data pools, and the ability to collect and model them, sits with the brand that has direct contact with the consumer. For large banks, insurance and pension companies, this is a resource for efficient and effective digital marketing, based on aggregated insights. For the adtech giants – Google, Amazon, Facebook – it will provide increased levels of targeting options within their tools.

The adtech identity graph

Google is in some ways the most advanced in delivering a scalable solution, which is worth examining. Google uses an undisclosed, supposedly huge sample of users for whom it has first collected the right consent. It can accurately track them across publishers, devices and even offline channels. From this it extrapolates to the entire population, applying machine learning or more traditional online polling. The end report is always aggregated and only available above a certain volume threshold. Facebook and Amazon are constantly developing similar aggregated audiences for targeting with their inventory.

These audience aggregation capabilities may appear similar to Nielsen’s panel-based measurement model, but this really is digital marketing on rocket fuel. It allows brands to leverage the tech giants’ enormous identity graphs and universal app tracking capabilities. And because those identity graphs are vast and data collection happens in real time, the final estimate is not only accurate but, most of all, granular and actionable – addressing the two main limitations of traditional panel-based systems.

Google launches machine-learning managed ad frequency
Practical applications of these tools are valuable to marketers. For example, Google has recently announced a new machine-learning powered feature to manage ad frequency across its Google Ad Manager platform. For reach and frequency analysis, it will predict how many times a user was exposed to an ad, based on Google’s undisclosed sample of users whom it can track with full certainty.

This announcement comes after some similar releases across the entire stack. ‘Cross-environment estimated conversions’ will show you when customers convert on the same device, as well as when customers click a search ad on one device or browser but convert in a different environment. These are rich insights enabling you to optimize any campaigns.

Integrating the new reports into decision-making

Financial marketers will use these new features as they become available in the Google Marketing Platform, but it is also vital for these decision-makers to have their own measures of effectiveness and performance. Most of the features above are still in beta, rarely compatible with third-party systems and often difficult to export. And while many media agencies are judged on ‘cost per acquisition’, this is far harder in a world where the conversion is estimated and therefore not auditable. That makes it essential for brand marketers to build their own measurement frameworks and integrate the insights into their own sales and demand trackers.

The secret lies in your 1st party data, which is comparatively less affected than 3rd party data: subject to the one-day limit in Safari and as yet unaffected in Chrome. There is still scope for projects that focus on maximising how you capture and use your first-party data, both to employ smarter segmentation of your media activation and to obtain insights by separating your Chrome traffic from your ITP-affected traffic, so that campaign activation and measurement can be tailored to each.
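As a sketch of what separating your Chrome lines from your ITP lines might look like on your own first-party analytics export, the short pandas snippet below splits sessions into ITP-affected and unaffected segments and compares conversion rates; the file name and column names are assumptions about your data, not a standard schema.

import pandas as pd

sessions = pd.read_csv("first_party_sessions.csv")   # assumed columns: browser, converted (0/1)

itp_browsers = {"Safari", "Mobile Safari"}            # browsers where ITP limits cookie lifetime
sessions["segment"] = sessions["browser"].apply(
    lambda b: "ITP" if b in itp_browsers else "non-ITP"
)

# Conversion rate per segment, so measurement and activation can be tailored to each.
print(sessions.groupby("segment")["converted"].mean())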


It will also be valuable for marketers in finance sectors to use their own data to build better targeting capabilities, and to monitor and validate the information they are getting from the large tech companies, whose insights are otherwise largely unauditable.

There is also a time limit on the availability of these measurable insights, because when Google removes open-ended access to your 1st party cookies, the ability of brands to obtain them will be reduced to almost zero.

A measurement mindset

While people might mourn the loss of the present system, in truth very few brands have had the genuine discipline or technical knowledge to make their measurement truly insightful. And in reality, even for brands that made the effort to obtain genuinely robust data, the figures in their analyses were not 100% accurate, with data discrepancies and ad fraud skewing the findings and providing a false picture. Some will also be uncomfortable with the aggregation of power within the big tech companies, but there will be increasingly few options for working outside their platforms.

In the post-cookie world, what is really required is a change of mindset when measuring performance, which will be as important to you as the tools and technology that you use. We are in a new era and those who are prepared to adapt their thinking are best placed to succeed.

