Half of UK adults don’t trust computer algorithms

BCS found that as few as 7% of respondents trusted algorithms to be used by the education sector

More than half (53%) of UK adults don’t trust computer algorithms to make decisions on issues ranging from welfare to education, according to a survey conducted by BCS, The Chartered Institute for IT.

The research was conducted in the aftermath of the A-levels results scandal, which saw 36% of young people have their grades marked down by an algorithm. Although the system was ultimately scrapped in favour of teachers’ predictions, the episode may have heightened distrust in automated decision-making, which is becoming increasingly prevalent.

BCS found that as few as 7% of respondents trusted algorithms to be used by the education sector, on a par with social services and the armed forces.

The trust in algorithms also differed between age groups. While only one in twenty (5%) over-55s expressed confidence in the use of algorithms, the percentage was over three times higher in the 18-24 age group at 16%.

Older people were generally less trusting about the use of algorithms in public life, with 63% of over-55s feeling negative about the idea, compared with 42% of 18-24-year-olds. 

The disparity was also reflected in attitudes towards computerised decisions made by the NHS, private healthcare, and local councils. Almost one in three (30%) 18-24-year-olds said they trusted the use of algorithms in these sectors, while for those over 55, it was 14%.

Overall, automated decision-making was most likely to generate trust when used by the NHS, at 17%, followed by financial services, at 16%, and intelligence agencies, at 12%. These services use algorithms to determine issues such as medical diagnosis, credit scoring, and national security. Police forces and tech giants were among the least trusted when it came to using algorithms to make personal decisions about the respondents, both at 11%.

BCS director of policy Dr Bill Mitchell said that, despite the deep distrust in algorithms, "there is little understanding of how deeply they are embedded in our everyday life".

"People get that Netflix and the like use algorithms to offer up film choices, but they might not realise that more and more algorithms decide whether we’ll be offered a job interview, or by our employers to decide whether we’re working hard enough, or even whether we might be a suspicious person needing to be monitored by security services," he said.

According to Dr Mitchell, the government and businesses face problems with "balancing people’s expectations of instant decisions, on something like credit for a sofa, with fairness and accounting for the individual, when it comes to life-changing moments like receiving exam grades".

"That’s why we need a professionalised data science industry, independent impact assessments wherever algorithms are used in making high-stakes judgements about people’s lives, and a better understanding of AI and algorithms by the policymakers who give them sign-off," he added. 

Following the release of the algorithm-based A-levels results, the UK government and Ofqual, the exams regulator, faced at least three legal challenges, including a potential GDPR violation.

