

How technology can help tackle unconscious bias in the workplace


By Navin ‘nuvs’ Jain, Product Evangelist, Workhuman

Implicit bias appears in approximately 20-30% of written workplace communications, even in the most positive settings. Given that unconscious bias is unintentional, how can technology help organisations to address it and promote a more inclusive environment?

If someone says, “I don’t think women are suited to leadership roles”, this type of overt bias is readily noticeable and can be confronted directly. But identifying and addressing implicit bias is much harder. Words that might seem neutral or even friendly in intent can sometimes have an offensive or negative impact. In fact, even in the most positive work environments, Workhuman research finds that about 20 to 30 percent of written communications contain implicit bias.

Because it is unintentional, it’s important to proactively identify biased language and educate people about why specific statements may carry bias. This is where technology can play a crucial role in addressing and mitigating these biases.

By combining machine learning with expert research and data on linguistic patterns, technology tools can recognise and highlight various forms of bias, and even offer tailored suggestions in real time. This helps people steer clear of non-inclusive language in their written communications.
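
As a rough, hypothetical sketch of that kind of hybrid approach (not a description of Workhuman’s actual implementation), the short Python example below pairs a small, expert-curated pattern list with tailored suggestions; the pattern table, categories and function names are all invented for illustration.

# A minimal sketch assuming a hand-written pattern table -- not any vendor's
# real implementation. Expert-curated patterns flag non-inclusive phrasing
# and return a tailored suggestion; a production tool would combine this
# with a trained language model rather than relying on patterns alone.
import re
from dataclasses import dataclass

@dataclass
class Flag:
    phrase: str       # the text that triggered the flag
    category: str     # e.g. "ageist", "gendered"
    suggestion: str   # a more inclusive alternative

# Hypothetical patterns and suggestions, invented for this example.
PATTERNS = [
    (re.compile(r"old enough to be my", re.I), "ageist",
     "focus on the person's experience rather than their age"),
    (re.compile(r"role model for other women", re.I), "gendered",
     "say 'role model' without limiting it to one gender"),
]

def review(message: str) -> list[Flag]:
    """Return any flagged phrases in the message, each with a suggestion."""
    return [Flag(m.group(0), category, suggestion)
            for pattern, category, suggestion in PATTERNS
            if (m := pattern.search(message))]

for flag in review("You may be old enough to be my grandpa, but you've still got it!"):
    print(f"[{flag.category}] flagged '{flag.phrase}': {flag.suggestion}")

In practice, the machine-learning side would score phrasing that a fixed pattern list cannot anticipate, but the real-time flag-and-suggest loop is the key idea.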

Beyond once-a-year training sessions, what can organisations do to tackle unconscious bias?

Annual training can help educate employees and teach them the skills to address bias in the workplace, but to tackle unconscious bias effectively, organisations need ongoing efforts that go beyond one-off workshops. On its own, infrequent training is less effective because it offers only a brief, isolated exposure to diversity, equity, and inclusion (DEI) topics. Organisations can tackle unconscious bias by implementing targeted, continuous strategies, such as providing employees with daily, in-the-moment learning opportunities that reinforce DEI principles.

Workhuman has developed ‘Inclusion Advisor’ as part of the Workhuman Cloud® – a digital platform where employees can share recognition for people’s work contributions. Inclusion Advisor is an AI-powered micro-coaching tool integrated into our Social Recognition platform. It screens employee recognition messages, flags unconscious bias to the author, and makes suggestions to improve inclusivity and promote ongoing learning. In this way, employees learn where their gaps might be in communicating inclusively, become more aware of how their language is perceived, and carry that awareness into the recognition messages they write and share.

Why was Inclusion Advisor developed by Workhuman and how does it work?

Workhuman’s core ethos revolves around building workplaces where employees feel recognised, valued and empowered to be themselves at work, in turn driving an organisation’s strategic vision, growth, and success.

Words matter, and inclusive and respectful language is an essential part of creating a culture of inclusion and belonging. Avoiding bias in workplace communications is difficult because implicit bias can be covert and even embedded in the phrases people use daily.

The idea for Inclusion Advisor was born from a desire to examine workplace language, unpack unconscious bias, and empower people to communicate in an inclusive way. Even in the most positive of contexts, such as when people are sharing a message recognising a co-worker for a job well done, implicit bias can be present.

Inclusion Advisor is a ‘smart assistant’ that flags biased phrases in written recognition messages and provides specific, actionable advice, helping people write more inclusive, meaningful messages. For Inclusion Advisor to flag bias and recommend improvements, the author has to proactively request coaching, making it a completely voluntary learning process that actively engages people in using inclusive language.

The technology uses artificial intelligence, natural language processing, and carefully crafted patterns determined by a team of linguistic experts to classify the underlying sentiment of a message and provide personalised recommendations.

Can you share some examples of the specific ways in which the Inclusion Advisor suggests rewriting language to be more inclusive and free of bias?

Implicit bias can come in many different forms. For instance, ageist language is a common type of unconscious bias. In the message, “You may be old enough to be my grandpa, but you’ve still got it! Your experience helped us problem-solve the client’s issue”, Inclusion Advisor might suggest instead saying, “Your methods are tried and true, and you’re a remarkable teammate that helped us problem-solve the client’s issue.”

Similarly, gender is another bias-prone area. When a message states, “Kate is a great role model for other women, and was so proactive and creative on the project”, it may suggest she is only able to impact those of her own gender. Inclusion Advisor might recommend more gender-neutral language, such as “Kate is a great role model, and was so proactive and creative on the project”. These recommendations don’t change the core message, but they can completely transform the impact. Instead of receiving a message that might be perceived as backhanded or biased, the recipient hears the author’s genuine intention: that their work is valued and their contributions are highly appreciated, fostering a more positive and inclusive environment.
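
To illustrate how little of the message actually changes, here is a hypothetical snippet that applies the two rewrites above as simple substitutions; the substitution table is hand-written for this article, not output from Inclusion Advisor.

# Hand-written rewrites mirroring the two examples above; illustrative only.
REWRITES = {
    "You may be old enough to be my grandpa, but you've still got it! Your experience":
        "Your methods are tried and true, and you're a remarkable teammate that",
    "Kate is a great role model for other women":
        "Kate is a great role model",
}

def make_inclusive(message: str) -> str:
    """Swap known biased phrasings, leaving the rest of the message intact."""
    for biased, inclusive in REWRITES.items():
        message = message.replace(biased, inclusive)
    return message

print(make_inclusive("Kate is a great role model for other women, "
                     "and was so proactive and creative on the project"))
# -> Kate is a great role model, and was so proactive and creative on the project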

A correction like this doesn’t just mitigate the impact of unconscious bias for the message recipient; it has ripple effects throughout the entire organisation. Workhuman’s Social Recognition platform comes with the option to make recognition moments public on the organisation’s social feed so everyone can celebrate a job well done. Ensuring that the messages that make it out into your organisation are free of unconscious bias is extremely important, so that when colleagues read the messages others have exchanged, they don’t come away with a negative opinion of an author who may have inadvertently offended or put down the person they were trying to celebrate. It’s equally important that they don’t come to believe their company would condone language or behaviour that jeopardised employees’ sense of belonging.

How receptive are people to addressing the issue of unconscious bias in workplace communications, particularly in written messages?

In an initial pilot of Inclusion Advisor, employees changed language it identified as biased in their written recognition and reward messages 75% of the time. This suggests that people are highly receptive to correcting biased language and to improving their language awareness in order to make the workplace more inclusive.

Can you share any success stories that demonstrate the positive impact of addressing unconscious bias in written communications in the workplace?

Organisations using Inclusion Advisor have seen measurable results. Pharmaceutical company Merck found that, over a six-month period, 74% of its employees chose to change their recognition message after Inclusion Advisor flagged potentially biased language. Participants reported that Inclusion Advisor not only influenced their recognition messages, but also positively affected their interactions in other areas.

Likewise, during LinkedIn’s pilot of Inclusion Advisor with 1,500 employees across the globe, employees sent 5,500 recognition messages and appreciated the automatic feedback; about 22 percent of these messages used Inclusion Advisor. Based on the number of employees who sent recognition messages during the pilot, it’s estimated that more than 20,000 awards each year could be made more inclusive in their language.

What role does combatting unconscious bias play in talent retention and competitiveness in the marketplace?

Internalised stereotypes and biases about different groups can negatively impact how people relate to one another, as well as people’s sense of belonging and inclusion in the workplace. Combatting unconscious bias isn’t just the right thing to do; it also helps organisations achieve better business results. According to Gartner research, inclusive, gender-diverse teams outperform less inclusive ones by 50% on average, and three out of four organisations with diverse decision-makers will exceed their financial targets.

Additionally, employees in diverse and inclusive workplaces tend to be significantly more emotionally invested in their organisation: Catalyst research shows that 35% of workers’ emotional investment and 20% of their desire to stay at a company are linked to feelings of inclusion. And given that 72% of workers say DEI is important to their decision to stay with their organisation, it’s clear that tackling unconscious bias is crucial to retaining talent.

Implicit bias and the road ahead

Inherent biases can inadvertently harm others, and new technologies like Inclusion Advisor are paving the way for greater awareness of them and for more inclusive workplaces. By flagging these biases and suggesting alternatives in a natural, voluntary way, such tools give employees opportunities to learn in real time.

Many employees report that this micro-coaching influences their language more widely, in emails and chats with colleagues, and even in discussions outside work. As AI’s capabilities continue to grow, tools like Inclusion Advisor, which are integrated and in-the-moment, are likely to become increasingly common.
