Streaming AI for visa applications is biased, rights group claims
The Home Office is facing a legal challenge over a "secret" algorithm used to filter visa applicants
A streaming algorithm used by the Home Office to filter UK visa applications is to be the subject of a legal case launched by immigration rights campaigners.
The legal challenge will examine how this artificial intelligence program affects immigration policy decisions, with the campaigners arguing the system is racially biased and unlawful.
The challenge is being brought by the Joint Council for the Welfare of Immigrants and a human rights group called Foxglove. The two organisations are seeking a judicial review to find out what the algorithm is and what it is used for.
Foxglove formed in 2015 as an organisation to stand up to big tech companies and governments whose digital services harm or mistreat the general public. It is made up of lawyers, tech experts and communications specialists, and takes its name from a European wildflower (Digitalis sp.) that can both harm and cure.
The Home Office has declined a request to explain how the software works, according to Foxglove's operational director Martha Dark. She said that it is "hiding behind an immigration exemption in the Freedom of Information Act".
"Remember Theresa May's racist migration policy that put 'Go Home' vans on our streets and gave us the Windrush scandal if you thought it was history, think again," Dark wrote on a fundraising page. "The hostile environment continues. And it is now going digital.
"We've discovered a secret technology that basically works as a digital hostile environment. We need your help to expose it and to take the Home Office to court."
According to Foxglove, the Home Office is using a "secretive algorithm", described as a digital "streaming tool", to sift visa applications. The algorithm scans applications and directs them into a fast lane, denoted by the colour green, a slow lane which is yellow, or a "full digital pat-down" which is red.
"They've given a shadowy, computer-driven process the power to affect someone's chances of getting a visa," Dark said. "And as best as we can tell, the machine is using problematic and biased criteria, like nationality, to choose which 'stream' you get in. People from rich white countries get Speedy Boarding; poorer people of colour get pushed to the back of the queue."
The campaigners say the algorithm may be discriminating on the basis of crude characteristics like nationality or age, rather than assessing applicants fairly, on their merits. IT Pro contacted the Home Office, which said the tool "complies fully with the relevant legislation under the Equality Act 2010".
"We have always used processes that enable UK Visas and Immigration to allocate cases in an efficient way," a Home Office spokesperson said. "The streaming tool is only used to allocate applications, not to decide them."
Gracie Bradley, Liberty policy and campaigns manager, said that decisions which impact people's rights should be made by people and not machines.
"It's vital immigration decisions are fully transparent so people can understand and challenge them - opaque, unaccountable algorithms make that impossible," she told IT Pro. "It's especially concerning this tool appears to stream people based on nationality - a recipe for discriminatory decision-making."