There is already a “comprehensive legal framework” in place to govern the police’s use of live facial recognition (LFR), the UK government has said in response to a Lords committee that found the technology is being deployed without proper scrutiny or accountability.
Writing to the Home Secretary on 27 January 2024, the Lords Justice and Home Affairs Committee (JHAC) outlined the findings of its brief investigation into the use of LFR by UK police, noting there is no clear legal basis for these deployments, and no rigorous standards or systems of regulation in place to control how the technology is used by police.
On the back of its findings, the committee made a number of recommendations, which included, for example, creating a new legislative framework specifically for facial recognition; publishing national regulations on how “extensive crowd-scanning activity” is being assessed for lawfulness, including key questions around proportionality and necessity; and carrying out regular assessments of public attitudes towards the technology.
Responding to the JHAC in a letter published on 8 April 2024, the government said it is committed to empowering the police to use the tools and technology they need, adding that LFR has already helped forces quickly and accurately identify people wanted for serious crimes or who pose a high risk of harm.
It said that while only four police forces in the UK have deployed LFR (the Met, South Wales Police, Northamptonshire and Essex), all forces are “routinely” applying retrospective facial recognition (RFR) to images captured by CCTV to identify suspects in the footage.
“Use [of facial recognition] is governed by data protection, equality and human rights laws, and can only be used for a policing purpose where necessary, proportionate and fair,” it said, adding that the legal framework also includes common law powers to prevent and detect crime, the Police and Criminal Evidence Act 1984 (PACE), the College of Policing Authorised Professional Practice (CoP’s APP) on LFR, and various policies published by forces themselves.
For each recommendation made by the JHAC, the government either explained why it considers current oversight sufficient, or claimed it is already doing what was recommended.
Previous investigation
A previous investigation by the JHAC into how police are using a variety of algorithmic technologies described the situation as “a new Wild West” characterised by a lack of strategy, accountability and transparency from the top down.
In July 2022, the government also rejected the findings and recommendations of that investigation, claiming there is already “a comprehensive network of checks and balances” to manage how police are using various algorithmic technologies.
Responses to specific concerns
Lords said that while they accept the value of LFR to police, they are “deeply concerned” that its use is being expanded without proper scrutiny or accountability.
“We believe that, as well as a clear, and clearly understood, legal foundation, there should be a legislative framework, authorised by Parliament for the regulation of the deployment of LFR technology,” said the JHAC. “We also believe that the government must lead a wider public debate about the use of LFR technology, as used now and as it develops, to ensure public confidence and support.”
Lords added that “government should not wait for the legality of LFR deployment to be tested again in the courts”, referring to an August 2020 court decision that found South Wales Police (SWP) used the tech unlawfully after failing to conduct a data protection impact assessment or fulfil its Public Sector Equality Duty (PSED) to consider how its policies and practices could be discriminatory.
Based on senior police officers’ insistence that LFR is reserved for only the most serious criminality, Lords also questioned how this is defined (given that people are included on watchlists for shoplifting and unspecified traffic offences), and how that definition impacts the police’s assessments of necessity and proportionality.
Writing back to the committee, the government reiterated its claim that there is already a “comprehensive legal framework” six times throughout the response, and further noted that the Bridges v SWP dispute was a test case brought to clarify the law around LFR.
It said the issues raised by the case have since been resolved, as the CoP APP now outlines which categories of people can be included on watchlists and under what circumstances the technology can be used, while independent testing of the recognition algorithms means forces are now satisfying their PSED.
On the question of how serious criminality is defined, and how this impacts necessity and proportionality decisions, the government said: “While the focus of LFR use is on tackling the most serious crimes and vulnerabilities, as described to you by the police representatives, it can be used at the same time to tackle less serious offences. Indeed, the Divisional Court said that by including all those who were wanted on warrant, there was potentially a considerable additional benefit to the public interest.”
The government added that police forces are best placed to determine the composition of watchlists, and that safeguards around people’s images being included are already in place (including the need for reasonable suspicion and approval by a higher-ranking officer): “In each case there needs to be appropriate justification and authorisation, always passing the tests of necessity, proportionality and use for a policing purpose.”
In her appearance before the JHAC in December 2023, the Met’s director of intelligence, Lindsey Chiswick, outlined how watchlists are “pulled together not based on an individual, but based on those crime types” attached to people’s custody images: “It’s then taken to approval from an authorising officer. In the Met, the authorising officer is superintendent-level or above.”
Academic Karen Yeung, an interdisciplinary professorial fellow in law, ethics and informatics at Birmingham Law School, challenged the proportionality and necessity of this approach during the evidence session, claiming the coercive power of the state means police must be able to justify each entry to the watchlists based on the specific circumstances involved, rather than their blanket inclusion via “crime types”.
Commenting on the fact there have been relatively few arrests despite the technology having scanned hundreds of thousands of faces across dozens of deployments, the government added: “The value of LFR to policing and the public cannot be assessed by simply comparing the total numbers of those passing the LFR system to the resulting arrests only, but needs to look wider to its ability to disrupt and deter criminality and keep the public safe.”
In response to a specific recommendation from Lords to make it mandatory to include police facial recognition algorithms in the Algorithmic Transparency Recording Standard (ATRS), the government said the Home Office will work with the Department for Science, Innovation and Technology (DSIT) and the Cabinet Office to assess the best approaches.
While the government expanded the ATRS in February 2024, making it mandatory for central government bodies to register their algorithms, it does not currently extend to local bodies such as councils or police forces.
A long-running issue
Both Parliament and civil society have repeatedly called for new legal frameworks to govern law enforcement’s use of biometrics – including the UK’s former biometrics commissioner, Paul Wiles; an independent legal review by Matthew Ryder QC; the UK’s Equality and Human Rights Commission; and the House of Commons Science and Technology Committee, which called for a moratorium on LFR as far back as July 2019.
In an exclusive interview with Computer Weekly, the outgoing biometrics and surveillance camera commissioner for England and Wales, Fraser Sampson, also highlighted a number of issues with how UK police had approached deploying their facial-recognition capabilities, and warned that the future oversight of police tech is at risk as a result of the government’s proposed data reforms.
Sampson specifically noted the thin evidential basis for the technology’s effectiveness in tackling serious crimes, and further highlighted the risk of the UK slipping into “all-encompassing” facial-recognition surveillance.
In its 2024 Spring Budget, the government announced that an additional £230m would be invested in a range of “time and money-saving technology” for police, including LFR, artificial intelligence and automation.