How to find and root out unconscious bias

Many tech firms have a mission to develop products that suit the broadest possible sweep of the population – things that anyone would feel confident using.

That’s a taller order than it might at first sound. We’ve known for a long time that people with different backgrounds have different understandings of the world. Many organisations intellectually recognise the merits of a diverse workforce, which can bring different views based on different life experiences to the table when coming up with new ideas. Developers might even employ social anthropologists to help them better understand users in different parts of the community.

But there’s another, more insidious problem that can be harder to deal with: unconscious bias. How does an organisation know whether it has that kind of bias within it, and, once identified, how can it be rooted out?

What is unconscious bias?

To understand something, we first need to know what it looks like. Hilary Stephenson, MD at Sigma – a UK user experience (UX) and design agency that works with the likes of the Department for Education on UX, accessibility and inclusivity projects – defined unconscious bias as: “Making decisions and communicating from a position of privilege, and in some cases prejudice, without being aware it’s a problem or even part of your thinking or behaviour.”

Within that come all sorts of sub-categories, two of which – affinity bias and confirmation bias – are among the most common in business.

Affinity bias means warming to people because of characteristics that feel familiar or comfortable – in recruitment, for example, feeling positive about a candidate because you recognise their school, university or previous employer. It’s often seen where companies make hires for good ‘cultural fit’, and ultimately it means the range of views available when ideas are discussed is narrow rather than broad. Confirmation bias – the tendency to favour evidence that supports a view you already hold, such as a first impression of a candidate – has a similar effect. Again, it narrows the pool of talent rather than broadening it.

The problem is that such biases can be buried deep, with people having no idea they hold these attitudes. Even when they’re presented with evidence through tests or training, some will dismiss it, believing they are ‘better than that’.

External help is useful – but beware

It’s not just people who can hold biases, either. Unconscious bias in particular can often bleed into data, which in turn creates prejudices within systems like artificial intelligence (AI). As Big Data LDN founder Bill Hammond explains to IT Pro: “Data disproportionately weighted is unreliable and low-quality. It can impact all business functions which rely on the data, including an organisation’s decision-making, algorithms which determine customer interactions or the business’s ability to digitally transform.”
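To make Hammond’s point concrete, the short Python sketch below shows one simple way a team might check whether a dataset is disproportionately weighted before it feeds a model. It is only an illustration under assumed conditions: the file name (applicants.csv), the ‘gender’ column, the equal-split baseline and the tolerance threshold are all hypothetical, and do not describe Hammond’s own process or any specific tool.

import pandas as pd

def check_representation(df: pd.DataFrame, column: str, tolerance: float = 0.1) -> bool:
    # Flag any category in `column` whose share of the rows deviates from an
    # even split by more than `tolerance`. This is a crude proxy for
    # "disproportionately weighted" data, intended only as a prompt for review.
    shares = df[column].value_counts(normalize=True)
    expected = 1 / len(shares)  # naive baseline: equal representation of each category
    skewed = shares[(shares - expected).abs() > tolerance]
    if not skewed.empty:
        print(f"Possible imbalance in '{column}':")
        print(skewed.round(2).to_string())
        return False
    return True

# Hypothetical usage: a recruitment dataset with a 'gender' column.
applicants = pd.read_csv("applicants.csv")
check_representation(applicants, "gender")

An equal split is rarely the right baseline in practice – comparing against a known population or customer distribution is usually more meaningful – but even a crude check like this can surface skew before it reaches the algorithms that shape decisions.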

Hammond says that external help can be useful in overcoming problems with identifying unconscious bias in an organisation and then rooting it out. “The definition of unconscious bias itself states that businesses are often unaware their data is disproportionately weighted,” he explains. “External expertise can often see bias that has gone unnoticed to date, bringing it to the forefront and also offering strategies with which to identify and eliminate the issue.”

Sigma’s Stephenson adds a word of caution on using external help, however, saying: “There is some criticism of organisations doing one-off training sessions. That’s often just the starting point or the diagnosis and it takes a plan and commitment to build from the theory. Listen to the experts, who will have a good view on where you need to improve.”

Striving for continuous improvement

For Stephenson, in her world of UX design, rooting out unconscious bias is a process of co-production, and this is about much more than the odd workshop. “It’s about building and sharing power between people who come together to focus on user needs, and to understand constraints around technology, and what financial exclusion or low digital confidence feels like,” she tells IT Pro. “This is how you make a product or service better for people. I’ve seen this done to great effect for people who’ve experienced homelessness, or those managing ongoing mental health challenges.”

Looking again at the question of data and guidance from outside an organisation, Hammond says: “External experts can also implement business processes and train or upskill internal teams on specific processes to identify and remove disproportionately weighted data moving forward. These processes can include how to neutralise data collection and data analysis.”

Both agree that there is no one-off cure-all for rooting out unconscious bias. For Hammond, firms need to ensure they are “continuously working to identify and remove data bias; they need to invest in data-driven initiatives and implement a data-driven culture throughout the organisation”.

Meanwhile, Stephenson looks at continuous commitment from another angle, noting that organisations should avoid a “one size fits all” approach, even – or especially – in unexpected situations and moments of crisis, as it will undoubtedly lead to exclusion. Giving a real-world example, she says: “During the pandemic, our team has worked hard to find ways of engaging people in research, design and testing activities who might be excluded because they are older, shielding, or they have specific access needs that mean remote design processes need to be adapted.

“We have hired people who have a personal understanding of bias and exclusion, but we can always do more. Continually reviewing your processes and methods from the perspective of how you might be excluding, hindering or harming is useful.”

Sandra Vogel
Freelance journalist
