Technology used to process UK visa applications is to be scrapped by the government after a legal challenge.
Campaigners argued the tool was “racist” and “discriminatory” for filtering applicants based on nationality.
The Home Office confirmed the system, which has been used in some form since 2015, will end on 7 August, adding its interim replacement will not take nationality into account.
The move has been welcomed by the two groups that had been seeking a judicial review.
Foxglove and the Joint Council for the Welfare of Immigrants (JCWI) were asking the courts to declare the system unlawful and in breach of the Equality Act 2010.
Their claim centred on a computer algorithm used to assign a traffic light score to all UK visa applications, before they were even seen by human case workers.
This initial “red, amber or green risk rating” played “a major role in determining” whether the application was approved or denied, according to the JCWI.
Campaigners say some nationalities were automatically given a red traffic light risk score, and that people from these countries were more likely to be denied a visa.
The system used data on past visa denials to decide which countries should be deemed higher risk in the future, thereby reinforcing bias in the system over time.
The problem is known as a “feedback loop” – where past discrimination is fed into a computer program, which learns from it, and then amplifies it in the future.
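The dynamic described above can be illustrated with a small simulation. The code below is a hypothetical sketch, not the Home Office's actual algorithm: it assumes a simple rule that maps a nationality's historical denial rate to a traffic-light score, and that a "red" score raises the chance of denial, which in turn raises the recorded denial rate fed into the next round.

```python
# Hypothetical illustration of a scoring feedback loop. The thresholds
# and denial probabilities are invented for demonstration only.

def risk_score(denial_rate):
    """Map a historical denial rate to a traffic-light rating."""
    if denial_rate > 0.5:
        return "red"
    if denial_rate > 0.2:
        return "amber"
    return "green"

def simulate(initial_denials, rounds=5, applications_per_round=100):
    """Each round, applicants from 'red'-rated countries face a higher
    assumed denial probability, which raises the denial rate used to
    score that country in the next round."""
    denials = dict(initial_denials)  # country -> (denied, total)
    history = []
    for _ in range(rounds):
        updated = {}
        for country, (denied, total) in denials.items():
            score = risk_score(denied / total)
            # Assumed effect of the score on outcomes (illustrative).
            denial_prob = {"red": 0.7, "amber": 0.3, "green": 0.1}[score]
            new_denials = int(applications_per_round * denial_prob)
            updated[country] = (denied + new_denials,
                               total + applications_per_round)
        denials = updated
        history.append({c: risk_score(d / t)
                        for c, (d, t) in denials.items()})
    return history

# Country "A" starts just above the red threshold and stays locked in red;
# country "B" starts green and stays green.
history = simulate({"A": (55, 100), "B": (10, 100)})
```

Running the sketch shows the lock-in effect: once a country crosses the "red" threshold, the elevated denial rate it causes keeps it red in every subsequent round, with no new discriminatory input required.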
Chai Patel, legal policy director of JCWI, said: “This streaming tool took decades of institutionally racist practices, such as targeting particular nationalities for immigration raids, and turned them into software.”
The successful challenge is thought to be the first of its kind against an algorithm-driven decision-making system in the UK.
The pressure groups involved want more scrutiny of how such tools are used across government.