The UK is expanding its use of facial recognition technology developed by the Israeli company Corsight AI, whose software has a controversial record of use by the Israeli military and documented problems with accuracy and racial bias. Critics warn that the widening surveillance rollout, endorsed by the Home Secretary, risks wrongful targeting, disproportionately affects minority communities, and threatens to create a dystopian “Big Brother” society.
The UK Home Secretary, Shabana Mahmood, is advancing the largest deployment of facial recognition technology for police in England and Wales. The current use of 10 live facial recognition vans is set to expand to over 50, aiming to help law enforcement identify more individuals on watch lists. This expansion raises significant privacy and civil liberties concerns, especially regarding the technology’s origins and accuracy.
The facial recognition software powering these police vans is developed by the Israeli company Corsight AI, introduced into the UK market through its contractor Digital Barriers. Corsight AI’s technology has a controversial history, having been tested and used by the Israeli military’s spy unit 8200 during operations in Gaza. This unit is notorious for mass surveillance, including capturing and storing private phone calls of Palestinians, highlighting the invasive nature of the technology.
Corsight AI’s system works by scanning faces and automatically matching them against lists of wanted individuals, often without the consent of those being scanned. In Gaza, Israeli soldiers equipped with Corsight AI cameras set up checkpoints to scan Palestinians fleeing conflict zones, identifying those allegedly linked to Hamas. This process has raised serious ethical concerns, as the AI helps determine military targets from facial matches, increasing the risk of wrongful targeting.
The technology is also known for its inaccuracies. Israeli security officials have admitted that Corsight AI has mistakenly flagged innocent civilians as Hamas militants. Despite these flaws, Essex Police in the UK has already adopted the technology for live facial recognition, sparking fears of wrongful arrests and the erosion of civil liberties. Campaigners warn that such surveillance tools contribute to a “Big Brother” society and disproportionately affect minority communities.
Moreover, the Home Office has acknowledged that facial recognition technology is more likely to misidentify Black and Asian people than white people, deepening concerns about racial bias. Nevertheless, the Home Secretary has pressed ahead with the expansion, which critics liken to a modern-day panopticon. They warn that this approach threatens privacy and civil rights, drawing dystopian parallels to George Orwell’s 1984 as the UK moves toward widespread facial recognition by 2026.