The A-level results crisis has once again shown the limits of AI

When will those in government learn that technology isn’t magic?

Let’s be clear: artificial intelligence (AI) and its offspring are wonderful things. They can recognise patterns faster than humans can, meaning things like security threats can be detected more quickly and accurately. They can be used to recognise hand gestures, which could have a positive impact on surgery, for example. In self-driving cars they can – theoretically at least – detect hazards and react to them more quickly than humans.

But when it comes to making decisions about people’s lives, AI is often the worst possible tool to throw at the problem, as this year’s A-level results fiasco (and, undoubtedly, the GCSE results fiasco when those results are released on 20 August) shows.

This isn’t news to anyone, either – or at least it shouldn’t be. Algorithms have repeatedly determined that black people, especially black men, are criminals. And that’s if they can even differentiate between one black person and another. Asian people, meanwhile, are terrorists, algorithms have determined. If you’re white, however, then please come this way, sir or madam, as you’re clearly virtuous thanks to your skin colour – or so a now-shelved Home Office AI system determined.

While the algorithm being used to determine this year’s exam results isn’t racially biased, it’s defective in multiple other ways and is ruining people’s lives.

As revealed in tweets from BBC Newsnight’s policy editor, Lewis Goodall, the AI system being used to decide students’ grades has penalised outstanding students based on the performance of past students at their schools, sometimes to an extraordinary extent. These stories include a student predicted a B or C who was instead awarded a U through an apparently faulty decision process. Another student, predicted AAA by his school and A*A*A* by UCAS, was given BBB by the algorithm. And there are many, many more.
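To see how that kind of outcome can happen, here is a deliberately simplified, hypothetical sketch (not Ofqual’s actual model): when a grade is assigned purely from a student’s rank within their class and the school’s historical grade distribution, the teacher’s prediction never enters the calculation at all.

```python
# A deliberately simplified, hypothetical sketch -- NOT Ofqual's actual model.
# It only illustrates how standardising grades against a school's historical
# results can override an individual's predicted grade entirely.

GRADES = ["U", "E", "D", "C", "B", "A", "A*"]  # worst to best

def standardised_grade(predicted, rank_in_class, class_size, historical_grades):
    """Assign the grade at the student's rank within the school's past
    distribution. Note that `predicted` is never used: the teacher's
    prediction plays no part in the outcome."""
    # Sort the school's historical results from best to worst.
    ordered = sorted(historical_grades, key=GRADES.index, reverse=True)
    # Map the student's class rank onto that historical distribution.
    position = (rank_in_class - 1) * len(ordered) // class_size
    return ordered[min(position, len(ordered) - 1)]

# A student predicted an A, ranked 3rd of 10 in their class, at a school
# whose previous cohort mostly achieved Cs and Ds, is pulled down to a C.
history = ["B", "C", "C", "C", "D", "D", "D", "E", "E", "U"]
print(standardised_grade("A", rank_in_class=3, class_size=10,
                         historical_grades=history))  # -> C
```

The real standardisation model was more elaborate than this, but the underlying objection is the same: an individual’s demonstrated ability can be displaced entirely by a statistical summary of other people’s past results.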

Of course, every year there are some students who are disappointed by their final grades, but this is not the same. This year’s students are effectively being penalised for not having taken exams they were never given the chance to sit, through no fault of their own.

As much as some people may declare that A-level, GCSE and Scottish Higher results don’t matter, the reality is that they do. They matter in the moment emotionally to the children who have taken them, and they matter because they are a stepping stone to the next stage of their education or to entering the workforce.

I can’t shake the feeling that the UK government, and governments elsewhere, have fallen in love with a technology they don’t fully understand. But as those blinded by love so often do, they’re sticking by the object of their affection, despite clear evidence that it’s fundamentally broken.

As I’ve said before, technology isn’t magic. In fact, it can plunge us into a dystopian world where our fate is determined by a faulty AI that pays no regard to our own character or achievements. Instead, a black box algorithm pulls in data about where we live, what school we went to, the colour of our skin.

There’s no sci-fi hero coming to save us from the misuse of technology, though – it’s up to us as a society to push back against it, before it’s too late.
