Facial recognition technology and the police
What's the cost to privacy?
A police van moves slowly down the street. It has mounted cameras on its roof.
The cameras are scanning the faces of every pedestrian who walks past, and Artificial Intelligence (AI) technology inside the van is comparing them with faces on a predetermined police watchlist of wanted criminals.
When there is a match, the computer alerts the officers on duty. An approach, and possibly an arrest, is made.
This is live facial recognition technology in action.
For many, this scenario conjures images from Orwell's 1984, with its all-seeing eye of the surveillance state. For others, it is an effective and harmless use of AI technology to crack down on crime.
At the end of 2023, live facial recognition had been used in five London boroughs. As of May 2024, it has been used in 14.
There are plans to bring it to more.
If it hasn't already arrived, live facial recognition technology might be coming to a borough near you.
What is it, and what is its cost to privacy?
Live security cameras are ubiquitous in London. The police have begun rolling out new technology that uses AI to "recognise" people on the street if they match a face on a predetermined watchlist.
"We've got to make sure the government justifies itself every inch of the way."
Lord Clement-Jones
“Keep asking.” Lord Clement-Jones, former Chair of the Select Committee on Artificial Intelligence, is talking to me from his office in the House of Lords. He’s rushed for time – the bells are about to ring for a vote on the government’s controversial Rwanda bill. It’s a three-line whip, and he’s nervous about missing it. But he wants me to finish asking my questions – this is an issue he feels passionately about.
I’m trying to understand Clement-Jones’s concerns about the use of live facial recognition technology – concerns that he raised in a Lords debate in April 2022, when he said that we are “sleepwalking into a surveillance society”.
In London, the police use live facial recognition technology to compare a live video feed of faces to a predetermined watchlist of around 10,000 people. The technology effectively “recognises” a person when there is a match and alerts the police officers on duty.
"We are sleepwalking into a surveillance society."
Police in the UK first trialled it at Notting Hill Carnival in August 2016, using police cameras to capture live images of people in the crowd.
South Wales Police was the first to start using the technology regularly on the street in 2017.
Now, in 2024, the Metropolitan Police is rolling it out rapidly all over London.
In April this year, the Daily Mail reported that there had been 152 arrests in London made using live facial recognition technology since April 2023.
According to tweets from police forces alerting local residents to the use of the technology, live facial recognition technology has been deployed regularly in London since January 2022.
"Just because you can do something, doesn't mean you do."
Lord Clement-Jones is calling for a moratorium on the use of the technology.
“This government talks about pro-innovation policy,” Clement-Jones says, “and it talks about regulation in disparaging terms. It doesn’t really believe in regulation. Because it believes that by regulating, you’re stopping this wonderful innovation. I believe in innovation for a purpose – for a responsible purpose. Just because you can do something, doesn’t mean you do.”
Even over Zoom, Clement-Jones’s frustration with the government’s eagerness to roll out live facial recognition technology is clear.
In October 2023, Chris Philp, the Policing Minister, wrote to police chiefs to say that “developing facial recognition technology as a crime fighting tool is a high priority”. He said that he expects “all forces to use it to its full potential,” and that “with a concerted effort it should be possible to double the number of searches by May 2024”.
Clement-Jones thinks that Philp is pushing for the technology without sufficient concern about the effect it could have on our civil liberties.
“All they want is a technology that they can use to demonstrate that they’re doing something,” Clement-Jones says, “or that the police are doing something. But I’m afraid I don’t think that’s good enough in a society where we have civil liberties and human rights.
“We’ve got to put markers down and make sure the government justifies itself every inch of the way.”
Clement-Jones is not alone in calling for a moratorium on the use of the controversial technology. Big Brother Watch has called the technology a “mortal threat to privacy”.
It has organised a campaign for a moratorium supported by 65 parliamentarians across the parties, including David Davis, Sir Ed Davey, Caroline Lucas, and Shami Chakrabarti.
London boroughs where live facial recognition technology was used up to December 2022. Map from Wikimedia Commons, Creative Commons Licence.
London boroughs where live facial recognition technology was used up to December 2023. Map from Wikimedia Commons, Creative Commons Licence.
London boroughs where live facial recognition technology has been used as of May 2024. Map from Wikimedia Commons, Creative Commons Licence.
The concerns of critics are four-fold. Firstly, in Clement-Jones' words, “there’s no legal basis at the moment”.
In 2020, a Court of Appeal judgment found that South Wales Police had been deploying live facial recognition technology unlawfully. The judgment said that there are “fundamental deficiencies in the legal framework currently in place” regarding the technology.
Specifically, the court said that “too much discretion is currently left to individual police officers”.
“It is not clear,” the judgment said, “who can be placed on the watchlist, nor is it clear that there are any criteria for determining where AFR [automated facial recognition] can be deployed”.
The Home Office says that the Court of Appeal found that “there is a legal framework for police to use LFR [live facial recognition],” and that the court's issue was specifically with how it had been deployed by South Wales Police.
The Home Office also says that the College of Policing has released guidance on the use of live facial recognition technology.
But critics say that there needs to be specific legislation enacted beyond internal guidelines.
The technology has never been officially debated in Parliament.
In May this year, Big Brother Watch, an organisation dedicated to the protection of privacy in the UK, announced that it was launching legal action on behalf of two members of the public whom the technology “wrongly flagged as criminals”.
“Sometimes it doesn’t feel like there are that many people talking about this stuff.”
Secondly, according to Clement-Jones, “it’s not accurate”.
Live facial recognition technology works by attempting to match live facial images to images saved in a police database. When there is a match, the police are notified, and they proceed to stop the individual and make an on-the-ground assessment.
Inevitably, there will occasionally be an erroneous match, and someone will be stopped despite not being on the watchlist. The Daily Mail reported that, as of 10th April 2024, 10 of the 362 people flagged by the technology had been mistakenly identified as matching the face of a wanted criminal on the watchlist – a false alert rate of 2.8%.
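The reported rate follows directly from those two figures; a minimal sketch of the arithmetic (the variable names are illustrative, not taken from any police system):

```python
# Figures reported by the Daily Mail, as of 10th April 2024:
# 10 of the 362 people flagged were mistakenly identified.
false_alerts = 10
total_flagged = 362

# Proportion of flags that were erroneous, as a percentage
false_alert_rate = false_alerts / total_flagged * 100
print(f"False alert rate: {false_alert_rate:.1f}%")  # → 2.8%
```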
For some, like Lord Clement-Jones, that’s too high. For the police, it is not high enough to justify curtailing the technology’s use.
"Live facial recognition technology turns our city streets into mass-scale police line-ups."
It's not just the accuracy that is the problem. Clement-Jones stresses that “it’s the surveillance that’s the issue”.
This third concern, articulated in detail by Big Brother Watch, is that the use of the technology constitutes an invasion of privacy.
They say that the technology “turns our city streets into mass-scale police line-ups”. While the police do tweet when and where they will be using the technology on the day, and put up signs on the streets they are surveying, campaigners against live facial recognition are concerned about a lack of consent to being face-scanned, and the implications this holds for a surveillance society.
The court case against South Wales Police was brought and won by Ed Bridges, who said that his data had already been captured before he could see the sign warning him that the technology was in use.
It was, he argued, “a fundamental invasion of my privacy”.
Fourthly, there is the issue of the watchlists. Specifically, concerns arise regarding who is on them, and how they are selected.
Lord Clement-Jones tells me that he is worried that the technology is “scooping people who have been accused of misdemeanours”. Searching for these minor offenders, the argument goes, does not warrant such a vast surveillance programme.
The police say that the watchlists only include dangerous criminals, such as people convicted of sex offences, people who have violated bail conditions, and people who have failed to respond to a court summons.
"This is coming down the track inexorably."
The BBC reported that, during one week in March 2024, the Met arrested 17 people using live facial recognition technology in Croydon and Tooting.
One of these people was a 23-year-old who was found with two rounds of blank ammunition. He was on the watchlist because he was wanted for possession of points and blades.
The police contend that the technology is making our streets safer. Others suggest that the scale of the action is disproportionate to the types of crime it is targeting.
“This is coming down the track inexorably,” says Lord Clement-Jones, who is about to leave to vote on the Rwanda bill. “Without proper control over our biometric data, our civil liberties are being eroded. We’ve just had 1984 quoted by an ex-Conservative commissioner in the EU, Lord Tugendhat. People do quote 1984 still, which is pretty worrying.
“Sometimes it doesn’t feel like there are that many people talking about this stuff.”
CCTV is a common sight in Croydon.
Private companies also provide security services using live cameras.
The Metropolitan Police is adding to its security measures with live facial recognition technology.
"Biometrics in this country is a bit of a dirty word."
Professor Richard Guest
Wherever there are biometrics, there is controversy. Live facial recognition is no different.
Biometrics is the automated recognition of individuals by means of their unique physical characteristics.
Many already feel uncomfortable with the ubiquitous biometric security systems used to unlock smartphones, such as Apple's Face ID.
When it comes to the use of biometrics by the police, such as in live facial recognition technology, the controversy only heightens.
“Biometrics in this country is a bit of a dirty word,” Professor Richard Guest, a computer scientist, tells me. “When people hear about it, it’s always in a negative context. It’s always: ‘here’s the latest thing that will violate your civil rights’. And I’m not dismissing that.”
Richard Guest is a Professor of Biometric Systems Engineering at the University of Kent, and co-author of the government’s Biometrics & Forensic Ethics Group paper on ‘Ethical issues arising from public-private collaboration in the use of live facial recognition technology’, published in 2021.
Professor Guest stresses to me that he is not an authority on the deployment of the technology. What he sees is from the perspective of a computer scientist in the biometrics field, and what he often sees - rightly or wrongly - is public scepticism.
"There’s nothing wrong with using the technology in a way that’s going to reduce crime and capture the bad guys."
“As biometric computer scientists, we can come up with the best algorithms, ensuring that they are used ethically and fairly and without bias and all that sort of stuff, but there will be people who are always thinking: ‘Well, it’s got to be for malicious purposes. Are they really just trying to find five criminals that they really want? Or do they want to do more than that?’
“In my mind, there’s nothing wrong with using the technology in a way that’s going to reduce crime and capture the bad guys. However, people say: ‘Why do they need to capture everyone who is walking down the street to do that?’”
Scepticism is a natural reaction. The sight of police vans with mounted cameras scanning people as they walk down the street does have an Orwellian feel. And knowing that those cameras are capturing images of one's face, and that of every other pedestrian, is scary.
Professor Guest acknowledges the concerns with biometrics being used in police surveillance. At the same time, he points out that in the case of live facial recognition technology, it is important to remember that these images are not being stored anywhere. If there’s no match, they disappear from the system immediately.
For Professor Guest, there is also a concern about accuracy. Live facial recognition software is developed by private companies before being bought by the government. The technology used by British police is developed by NEC, a Japanese company.
"There have been trials, but I don't think they have gone nearly far enough."
“Testing is so important with these things. It’s up to the police, or the Home Office, or whomever, to test these things out. Because, if it’s been developed in one part of the world, on a certain data set, does it make the flight across? Does it still work on a high street in the UK?
“There have been trials of systems. But I don’t think those trials have gone nearly far enough. I think we really need to revisit this.”
Metropolitan Police data on the use of live facial recognition technology, as of 6th April 2024. Data source: Daily Mail
Once accuracy is assured, the challenge for computer scientists and policy makers, Professor Guest says, becomes about public communication.
The difficulty arises because live facial recognition systems are hard to understand.
“The trouble is, there’s no transparency in how these things operate. They are black boxes. They are software systems where you put an image in and it will ping you out a result.
“If we want the confidence of people to trust the use of these systems, now it falls to the scientists to say we’re going to be transparent in how we test these things, and we’re going to be able to communicate these things in a way that is independent, and in a way that people can understand.”
Professor Guest says that, in particular, it is crucial that the public understands “what’s happening to the facial image once it gets processed”.
"I don't think the communication has advanced at all."
“It’s not going to go into a big database to be used forever for detecting me as I walk around town on the weekend.”
Since his report was published in 2021, Professor Guest has seen little sign of progress in this communication.
“I don’t think the communication has advanced at all, to tell you the truth.”
The police will continue to roll out live facial recognition technology, and they are prepared to defend it as they do so. But as long as there is insufficient understanding of the technology, the concerns of people like Lord Clement-Jones will grow stronger and stronger.
Survey on the opinion of the UK public on the use of live facial recognition technology. Source: "Public attitudes towards the use of automatic facial recognition technology in criminal justice systems around the world" (2021), Ritchie, K., et al., originally from "Beyond face value: public attitudes to facial recognition technology" (2019), Ada Lovelace Institute.
"We've got nothing to hide."
The Metropolitan Police is adamant that the roll-out of live facial recognition technology is an effective and proportionate way of making our streets safer.
At a community policing meeting for the borough of Brent, Inspector Craig Hands attempts to justify the police usage of the technology to an audience that is, at times, sceptical.
“We’ve got nothing to hide with it,” Inspector Hands says. “It’s just a very, very good piece of technology that has proven results of bringing very nasty, high harm offenders to justice.”
However, many of the criminals who are identified by live facial recognition technology are those who have evaded a court summons or violated their bail conditions. They are not, some argue, the hardened criminals of whom Hands speaks. In other words, it’s not worth the time, the resources, and the invasion of privacy.
"I know not everybody’s going to like it, but hopefully we can get some really good results and we can feed that back into the community and the community can see some successes."
Hands disagrees. “In terms of proportionality, this technology has identified multiple sex offenders and rapists. These are the most serious and the most dangerous people out on the street that have evaded coming into court or a police station. I personally believe that us spending time and effort bringing these people to justice is very important."
Hands does not see the technology as an entirely new type of policing, but rather as a new way of improving current policing methods.
On a regular day without live facial recognition technology, Hands explains, police will be briefed on who they should be looking out for on the street. Live facial recognition technology just gives them a new, and incredibly efficient, helping hand.
“It’s no different than our day-to-day policing. It just uses new technology to assist us and the accuracy is very, very high.”
Hands may not convince everyone that he should be rolling out the technology. But he is adamant that it is the right thing to do.
“My commitment to the public is that I will do everything I can to tackle crime. I’m trying to do it as openly and transparently and as fairly as I can. I know not everybody’s going to like it, but hopefully we can get some really good results and we can feed that back into the community and the community can see some successes.”
The Metropolitan Police has been encouraged to implement live facial recognition technology.
Croydon has been the focus of media attention around the technology.
Police have defended its use, saying it helps reduce crime.
Inspector Craig Hands said that "it’s just a very, very good piece of technology that has proven results of bringing very nasty, high harm offenders to justice."
Regardless of one's opinion on the use of live facial recognition technology, police vans with mounted cameras and AI capabilities will become increasingly common on London streets.
Privacy is, and always will be, a major concern when it comes to the use of AI in police surveillance.
Police are confronting a tricky balancing act between harnessing modern technology to increase police efficiency and preserving civil liberties.
Whether they have struck the right balance will continue to be debated.