The Court Has Spoken, The Cameras Are Staying
On 21 April 2026, the High Court of England and Wales handed down a judgment that will either reassure you or terrify you, depending on how you feel about cameras that can identify your face in a crowd before you've had your morning coffee. The Metropolitan Police's Live Facial Recognition technology (LFR to its friends, and it has many friends in high places) has been declared entirely lawful. Sleep soundly, London.
The case was brought by two claimants with rather compelling personal stakes in the matter. First, there's Shaun Thompson, a Black community volunteer who was stopped, detained, and threatened with arrest after LFR mistook him for his own brother, who, to add a delicious twist, was wanted for allegedly assaulting Mr Thompson himself. The machine managed to victimise the victim twice. Second, there's Silkie Carlo, director of the civil liberties organisation Big Brother Watch, who stopped attending the Notting Hill Carnival and chose not to join a protest at the King's Coronation precisely because the police had announced LFR would be deployed there. In a democratic society, choosing not to exercise your right to protest because the state is watching is called a "chilling effect." In this judgment, it is acknowledged, noted, and then largely moved past.
The claimants were not arguing that facial recognition technology is evil in principle. They were raising a narrower, more technical point: does the Metropolitan Police's policy governing LFR have sufficient "quality of law"? Under the European Convention on Human Rights, any interference with your right to privacy, free expression, or freedom of assembly must not merely have some basis in domestic law; it must be accessible and foreseeable, meaning it cannot leave so much discretion to individual officers that deployment becomes, in effect, a matter of whim. The standard, borrowed from centuries of legal tradition, is that government must be a matter of laws, not of men. Or, in modern London, not of men standing next to a van with a camera on top.
The court traced this question through the Metropolitan Police's revised policy, introduced in September 2024 after an earlier version was quietly shelved following, you guessed it, this very lawsuit. The new policy is genuinely elaborate. It divides deployments into three "Use Cases." Use Case A covers crime hotspots, defined as small geographical hexagons whose crime data places them in the top 25% for their borough. Use Case B covers protective security operations, such as airports or large events where there is specific intelligence of a threat. Use Case C covers situations where intelligence suggests a specific wanted person will be at a specific location. There are authorising officers, proportionality assessments, community impact assessments, equality assessments, a Central Tasking and Co-ordination Group, Gold Commanders, Silver Commanders, and Bronze Commanders. It is, on paper, the most bureaucratically conscientious surveillance apparatus imaginable.
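If you prefer your policy documents executable, here is a minimal sketch of how the Use Case A hotspot rule might work in practice. The hexagon identifiers, crime counts, and thresholding method are invented for illustration; the Met's actual data pipeline is not public, and this is one plausible reading of "top 25% of its borough", not the real implementation.

```python
from statistics import quantiles

# Invented crime counts per hexagon, grouped by borough.
# The real inputs, hexagon sizes, and method may differ.
crime_counts = {
    "Camden":  {"hex_01": 4, "hex_02": 19, "hex_03": 7, "hex_04": 31},
    "Hackney": {"hex_05": 2, "hex_06": 11, "hex_07": 26, "hex_08": 9},
}

def hotspot_hexagons(counts_by_borough):
    """Flag hexagons in the top 25% of crime counts *within their own
    borough* -- one reading of the Use Case A criterion."""
    hotspots = set()
    for borough, hexes in counts_by_borough.items():
        # 75th percentile of this borough's hexagon counts.
        threshold = quantiles(hexes.values(), n=4)[2]
        hotspots |= {h for h, count in hexes.items() if count >= threshold}
    return hotspots

print(hotspot_hexagons(crime_counts))
# e.g. {'hex_04', 'hex_07'} under these invented figures
```

Note the detail doing the work: the quartile is computed per borough, so a "hotspot" in a quiet borough can have far less crime than an ordinary street in a busy one.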
The claimants' lawyers, led by Dan Squires KC, argued that despite all this procedural scaffolding, the "where" question, where exactly can the police point a camera capable of scanning thousands of faces per minute, remained troublingly vague. To support this, they commissioned a Professor of Operational Research from UCL to crunch the numbers. His conclusion: under Use Case A alone, LFR could theoretically be deployed across somewhere between 62% and 85% of the entire Metropolitan Police District. The Metropolitan Police countered with their own modelling, arriving at a more modest 7.6% of deployable public space. Both sets of figures were, in the end, dismissed by the court as irrelevant: a policy may cover a large or a small proportion of a city and still be lawful or unlawful, because geography alone tells you nothing about legality. The expert evidence was excluded in its entirety. Several hundred pages of mathematical back-and-forth were, judicially speaking, a game of ping pong that nobody asked for.
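How do two competent modellers get 85% and 7.6% from the same rule? The judgment does not reproduce either side's methodology, but a toy grid model shows how much the answer depends on one assumption: whether "deployable" means the hotspot cells themselves, or anywhere a camera could plausibly stand near one. Everything below (grid size, scores, the buffer mechanic) is invented to make that single point, and it does not reproduce either party's figures.

```python
import random

random.seed(1)

# Toy city: a 40x40 grid of cells, of which the top 25% by a
# synthetic "crime score" are hotspot-eligible (cf. Use Case A).
SIZE = 40
scores = {(x, y): random.random() for x in range(SIZE) for y in range(SIZE)}
cutoff = sorted(scores.values())[int(0.75 * SIZE * SIZE)]
eligible = {cell for cell, score in scores.items() if score >= cutoff}

def coverage(buffer_radius):
    """Fraction of the city deployable if a camera may stand anywhere
    within `buffer_radius` cells (Chebyshev distance) of an eligible cell."""
    covered = set()
    for (x, y) in eligible:
        for dx in range(-buffer_radius, buffer_radius + 1):
            for dy in range(-buffer_radius, buffer_radius + 1):
                if 0 <= x + dx < SIZE and 0 <= y + dy < SIZE:
                    covered.add((x + dx, y + dy))
    return len(covered) / (SIZE * SIZE)

print(f"strict model (eligible cells only): {coverage(0):.0%}")  # exactly 25%
print(f"generous model (one-cell buffer):   {coverage(1):.0%}")  # far higher
```

The buffer assumption, not the underlying eligibility rule, does most of the work, which is perhaps why the court decided the duelling percentages answered no legal question at all.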
What the court found persuasive was the interconnected architecture of the policy itself. The "why," "who," and "where" questions are linked. You cannot simply decide to point a facial recognition camera at a shopping street because it seems like a nice day for it. There must be a defined use case, a watchlist constructed according to specific criteria, an authorised offence type relevant to the area, and a mandatory proportionality check that explicitly requires officers to consider whether deployment near a protest, a mosque, or a school is justified given its likely chilling effect on fundamental rights. The court found this regime substantially more robust than the one South Wales Police had been operating when a previous facial recognition case, R (Bridges) v Chief Constable of South Wales Police, found their policy unlawful in 2020.
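Read as a decision procedure, the architecture the court approved looks roughly like the checklist below. This is a hedged paraphrase of the conditions the judgment describes, with invented field names; the real authorisation process is paperwork and command chains, not Python.

```python
from dataclasses import dataclass

@dataclass
class DeploymentRequest:
    # Field names are illustrative, drawn from the checks the
    # judgment describes, not from any real MPS system.
    use_case: str                    # "A" (hotspot), "B" (protective security),
                                     # "C" (specific wanted person)
    watchlist_criteria_met: bool     # watchlist built to defined criteria
    offence_relevant_to_area: bool   # authorised offence type fits the location
    proportionality_assessed: bool   # includes the chilling-effect check near
                                     # protests, places of worship, schools

def may_deploy(request: DeploymentRequest) -> bool:
    """The 'interconnected architecture': every condition must hold.
    No single condition, and no single officer's whim, suffices."""
    return (
        request.use_case in {"A", "B", "C"}
        and request.watchlist_criteria_met
        and request.offence_relevant_to_area
        and request.proportionality_assessed
    )

# A camera on a shopping street "because it seems like a nice day for it":
print(may_deploy(DeploymentRequest("A", True, True, False)))  # False
```

The conjunction is the point: the court's answer to the arbitrariness objection is that no one gate, taken alone, authorises anything.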
The Equality and Human Rights Commission intervened to remind the court that the world has changed rather significantly since 2020, that artificial intelligence has accelerated dramatically, that LFR at protests poses particular democratic risks, and that perhaps this all deserves a fresh appraisal. The court noted these concerns politely, agreed they might be important, and concluded they were not the question before it. The question before it was whether the current policy has the quality of law. It does.
There is something genuinely interesting and genuinely unsettling about how this judgment works. The court is explicit that it is not evaluating whether LFR is a good idea, whether it is disproportionately deployed in areas with higher Black populations, whether the chilling effect on protest is acceptable in a democracy, or whether handing the state a tool that can scan a million and a half faces in eight months is wise public policy. Those are questions for Parliament. The court's job is narrower: does the policy contain sufficient constraints to avoid arbitrariness? And here, it does.
Which means the Metropolitan Police can continue scanning your face on the way to the shops, provided a sufficiently detailed form has been filled in first. Mr Thompson, who was detained for being the wrong person, received a settlement in a separate part of the case. Ms Carlo, who stopped attending public events to avoid being watched, lost. The cameras will keep running.
There is, of course, an appeal process. There is also, apparently, a Notting Hill Carnival.
You’re already paying for the cameras with your taxes, so why not pay us for the privilege of knowing they’re there? Support our work and help us buy enough coffee to survive the next 900-page judicial review. Thank you!
References
R (Thompson and Carlo) v Commissioner of Police of the Metropolis [2026] EWHC 915 (Admin). High Court of Justice, King's Bench Division, Administrative Court (Divisional Court), 21 April 2026. Available at: https://www.judiciary.uk/wp-content/uploads/2026/04/AC-2024-LON-001764-R-Thompson-and-Carlo-version-for-hand-down-21-04-2026.pdf (Accessed: 22 April 2026).
Data Protection Act 2018 c.12. Available at: https://www.legislation.gov.uk/ukpga/2018/12/contents (Accessed: 22 April 2026).
Human Rights Act 1998 c.42. Available at: https://www.legislation.gov.uk/ukpga/1998/42/contents (Accessed: 22 April 2026).
Metropolitan Police Service (2024) MPS Overt LFR Policy Document. Metropolitan Police Service, London. Available at: https://www.met.police.uk/SysSiteAssets/media/downloads/force-content/met/advice/lfr/policy-documents/lfr-policy-document2.pdf (Accessed: 22 April 2026).
Metropolitan Police Service (2024) Data Protection Impact Assessment. Metropolitan Police Service, London. Available at: https://www.met.police.uk/SysSiteAssets/media/downloads/force-content/met/advice/lfr/impact-assessments/lfr-dpia2.pdf (Accessed: 22 April 2026).
Metropolitan Police Service (2024) Overt LFR Legal Mandate. Metropolitan Police Service, London. Available at: https://www.met.police.uk/SysSiteAssets/media/downloads/force-content/met/advice/lfr/new/lfr-legal-mandate.pdf (Accessed: 22 April 2026).
Metropolitan Police Service (2024) Equality Impact Assessment. Metropolitan Police Service, London. Available at: https://www.met.police.uk/SysSiteAssets/media/downloads/force-content/met/advice/lfr/impact-assessments/lfr-eia2.pdf (Accessed: 22 April 2026).
Equality and Human Rights Commission (2022) Artificial Intelligence: Meeting the Public Sector Equality Duty. EHRC, London. Available at: https://www.equalityhumanrights.com/guidance/artificial-intelligence-meeting-public-sector-equality-duty-psed (Accessed: 22 April 2026).
National Physical Laboratory (2023) Facial Recognition Technology in Law Enforcement Equitability Study Final Report, March 2023. NPL, Teddington. Available at: https://science.police.uk/site/assets/files/3396/frt-equitability-study_mar2023.pdf (Accessed: 22 April 2026).
Big Brother Watch (2024) Live Facial Recognition. Available at: https://bigbrotherwatch.org.uk (Accessed: 22 April 2026).