The A-level results crisis has once again shown the limits of AI

When will those in government learn that technology isn’t magic?

Let’s be clear: artificial intelligence (AI) and its offspring are wonderful things. They can recognise patterns faster than humans can, meaning that things like security threats can be detected more quickly and accurately. They can be used to recognise hand gestures, which could have a positive impact on surgery, for example. In self-driving cars they can – theoretically at least – detect hazards and react to them more quickly than humans.

But when it comes to making decisions about people’s lives, AI is often the worst possible tool to throw at the problem, as this year’s A-level results fiasco (and, undoubtedly, the GCSE results fiasco, when they’re released on 20 August) shows.

This isn’t news to anyone, either – or at least it shouldn’t be. Algorithms have repeatedly determined that black people, especially black men, are criminals. And that’s if they can even differentiate between one black person and another. Asian people, meanwhile, are terrorists, algorithms have determined. If you’re white, however, then please come this way, sir or madam, as you’re clearly virtuous thanks to your skin colour – or so a now shelved Home Office AI system determined.

While the algorithm being used to determine this year’s exam results isn’t racially biased, it’s defective in multiple other ways and is ruining people’s lives.

As revealed in tweets from BBC Newsnight’s policy editor, Lewis Goodall, the AI system that’s being used to decide students’ grades has penalised outstanding students based on the performance of past students, sometimes to an extraordinary extent. These stories include a student with a predicted B or C who was instead awarded a U by the algorithm, thanks to an apparently faulty decision process. Another student, predicted AAA by his school and A*A*A* by UCAS, was given BBB by the algorithm. And there are many, many more.

Of course, every year there are some students who are disappointed by their final grades, but this is not the same. This year’s students are effectively being penalised for not having taken the exam at all, which they were unable to do through no fault of their own.

As much as some people may declare that A-level, GCSE and Scottish Higher results don’t matter, the reality is that they do. They matter in the moment emotionally to the children who have taken them, and they matter because they are a stepping stone to the next stage of their education or to entering the workforce.

I can’t shake the feeling that the UK government, and governments elsewhere, have fallen in love with a technology they don’t fully understand. But as those blinded by love so often do, they’re sticking by the object of their affection, despite clear evidence that it’s fundamentally broken.

As I’ve said before, technology isn’t magic. In fact, it can plunge us into a dystopian world where our fate is determined by a faulty AI that pays no regard to our own character or achievements. Instead, a black box algorithm pulls in data about where we live, what school we went to, the colour of our skin.

There’s no sci-fi hero coming to save us from the misuse of technology, though – it’s up to us as a society to push back against it, before it’s too late.
