The A-level results crisis has once again shown the limits of AI

When will those in government learn that technology isn’t magic?

Let’s be clear: artificial intelligence (AI) and its offspring are wonderful things. They can recognise patterns faster than humans can, meaning things like security threats can be detected more quickly and accurately. They can be used to recognise hand gestures, which could have a positive impact on surgery, for example. In self-driving cars they can – theoretically at least – detect hazards and react to them more quickly than humans.

But when it comes to making decisions about people’s lives, AI is often the worst possible tool to throw at the problem, as this year’s A-level results fiasco (and, undoubtedly, the GCSE results fiasco, when they’re released on 20 August) shows.

This isn’t news to anyone, either – or at least it shouldn’t be. Algorithms have repeatedly determined that black people, especially black men, are criminals. And that’s if they can even differentiate between one black person and another. Asian people, meanwhile, are terrorists, algorithms have determined. If you’re white, however, then please come this way, sir or madam, as you’re clearly virtuous thanks to your skin colour – or so a now-shelved Home Office AI system determined.

While the algorithm being used to determine this year’s exam results isn’t racially biased, it’s defective in multiple other ways and is ruining people’s lives.

As revealed in tweets from BBC Newsnight’s policy editor, Lewis Goodall, the AI system being used to decide students’ grades has penalised outstanding students based on the performance of past students, sometimes to an extraordinary extent. The cases he highlighted include a student predicted a B or C who was instead awarded a U by the algorithm, apparently because of a faulty decision process. Another student, predicted AAA by his school and A*A*A* by UCAS, was given BBB by the algorithm. And there are many, many more.
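To see how this kind of outcome can happen, consider the following simplified sketch. It is a hypothetical illustration, not Ofqual’s actual model: it assumes grades are handed out by ranking this year’s students and fitting them to their school’s historical grade distribution, with the student names and numbers invented for the example.

```python
# Simplified, hypothetical illustration of cohort-based standardisation.
# It shows how fitting this year's students to a school's past grade
# distribution can override individual predicted grades entirely.

GRADES = ["A*", "A", "B", "C", "D", "E", "U"]  # best to worst

def standardise(ranked_students, historical_counts):
    """Assign grades to students (ranked best to worst by their school)
    purely from the school's historical grade distribution."""
    grade_slots = []
    for grade in GRADES:
        grade_slots.extend([grade] * historical_counts.get(grade, 0))
    # Each student simply inherits the grade at their rank position.
    return {student: grade_slots[i] for i, student in enumerate(ranked_students)}

# A school that historically awarded mostly Cs and below:
history = {"A*": 0, "A": 0, "B": 1, "C": 3, "D": 3, "E": 2, "U": 1}

# This year's cohort, ranked by the school, with teacher predictions
# for the top three students (all names are invented):
cohort = ["Ava", "Ben", "Cara", "Dan", "Eli", "Fay", "Gus", "Hana", "Ian", "Jo"]
predicted = {"Ava": "A*", "Ben": "A", "Cara": "B"}

awarded = standardise(cohort, history)
for name in predicted:
    print(f"{name}: predicted {predicted[name]}, awarded {awarded[name]}")
# Ava: predicted A*, awarded B
# Ben: predicted A, awarded C
# Cara: predicted B, awarded C
```

However strong an individual student’s predictions are, in a scheme like this their school’s history caps what they can be awarded, which mirrors the pattern in the cases described above.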

Of course, every year there are some students who are disappointed by their final grades, but this is not the same. This year’s students are effectively being penalised for not having taken their exams at all, which, through no fault of their own, they were unable to do.

As much as some people may declare that A-level, GCSE and Scottish Higher results don’t matter, the reality is that they do. They matter emotionally, in the moment, to the children who have taken them, and they matter because they are a stepping stone to the next stage of their education or to entering the workforce.

I can’t shake the feeling that the UK government, and governments elsewhere, have fallen in love with a technology they don’t fully understand. But as those blinded by love so often do, they’re sticking by the object of their affection, despite clear evidence that it’s fundamentally broken.

As I’ve said before, technology isn’t magic. In fact, it can plunge us into a dystopian world where our fate is determined by a faulty AI that pays no regard to our own character or achievements. Instead, a black box algorithm pulls in data about where we live, what school we went to, the colour of our skin.

There’s no sci-fi hero coming to save us from the misuse of technology, though – it’s up to us as a society to push back against it, before it’s too late.
