Half of UK adults don’t trust computer algorithms

BCS found that as few as 7% of respondents trusted the education sector to use algorithms

More than half (53%) of UK adults don’t trust computer algorithms to make decisions on issues ranging from welfare to education, according to a survey conducted by BCS, The Chartered Institute for IT.

The research was conducted in the aftermath of the A-levels results scandal, in which 36% of young people had their grades marked down by an algorithm. Although the algorithm was ultimately scrapped in favour of teachers' predicted grades, the episode may have heightened distrust in automated decision-making, which is becoming increasingly prevalent.

BCS found that as few as 7% of respondents trusted the education sector to use algorithms, on a par with social services and the armed forces.

Trust in algorithms also differed between age groups: while only one in twenty (5%) over-55s expressed confidence in the use of algorithms, the figure was more than three times as high among 18-24-year-olds, at 16%.

Older people were generally less trusting of the use of algorithms in public life, with 63% of over-55s feeling negative about the idea, compared with 42% of 18-24-year-olds.

The disparity was also reflected in attitudes towards computerised decisions made by the NHS, private healthcare providers, and local councils: almost one in three (30%) 18-24-year-olds said they trusted the use of algorithms in these sectors, compared with 14% of over-55s.

Overall, automated decision-making was most trusted when used by the NHS (17%), followed by financial services (16%) and intelligence agencies (12%). These services use algorithms for tasks such as medical diagnosis, credit scoring, and national security. Police forces and tech giants were among the least trusted to use algorithms to make personal decisions about respondents, both at 11%.

BCS director of policy Dr Bill Mitchell said that, despite the deep distrust in algorithms, "there is little understanding of how deeply they are embedded in our everyday life".

"People get that Netflix and the like use algorithms to offer up film choices, but they might not realise that more and more algorithms decide whether we’ll be offered a job interview, or by our employers to decide whether we’re working hard enough, or even whether we might be a suspicious person needing to be monitored by security services," he said.

According to Dr Mitchell, the government and businesses face problems with "balancing people’s expectations of instant decisions, on something like credit for a sofa, with fairness and accounting for the individual, when it comes to life-changing moments like receiving exam grades".

"That’s why we need a professionalised data science industry, independent impact assessments wherever algorithms are used in making high-stakes judgements about people’s lives, and a better understanding of AI and algorithms by the policymakers who give them sign-off," he added. 

Following the release of the algorithm-based A-levels results, the UK government and Ofqual faced at least three legal challenges, including one over a potential GDPR violation.
