Facial recognition has arrived in London. So how should we regulate it?

The technology is being used on a regular basis – but the laws around it are muddled

As the first step on the road to a powerful, high tech surveillance apparatus, it was a little underwhelming: a blue van topped by almost comically intrusive cameras, a few police officers staring intently but ineffectually at their smartphones and a lot of bemused shoppers.

As unimpressive as the moment may have been, however, the decision by London's Metropolitan Police to expand its use of live facial recognition (LFR) marks a significant shift in the debate over privacy, security and surveillance in public spaces. Despite dismal accuracy results in earlier trials, the Metropolitan Police Service (MPS) has announced that it is pushing ahead with the roll-out of LFR at locations across London.

The MPS says that cameras will be focused on a small targeted area "where intelligence suggests [they] are most likely to locate serious offenders," and will match faces against a database of individuals wanted by police. The cameras will be accompanied by clear signposting and officers handing out leaflets (it is unclear why the MPS thinks that serious offenders would choose to walk through an area full of police officers handing out leaflets to passersby).
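In broad outline, a watchlist check of this kind reduces to comparing a numerical "embedding" of each detected face against stored embeddings of wanted individuals, and raising an alert when the similarity clears a threshold. The sketch below is illustrative only: the embedding function, threshold and watchlist are all assumptions, and none of it reflects the MPS's actual system, whose internals are not public.

```python
import numpy as np

# Hypothetical embedding model standing in for a trained face network.
# Real systems map a face crop to a vector such that the same person's
# faces land close together; this stub just produces a unit vector.
def embed_face(face_image: np.ndarray) -> np.ndarray:
    vec = np.resize(face_image.astype(float).ravel(), 128)
    return vec / (np.linalg.norm(vec) + 1e-9)

# Watchlist of individuals wanted by police (embeddings are illustrative).
rng = np.random.default_rng(0)
watchlist = {"suspect_001": rng.standard_normal(128)}
watchlist = {k: v / np.linalg.norm(v) for k, v in watchlist.items()}

MATCH_THRESHOLD = 0.6  # illustrative; real deployments tune this empirically

def check_against_watchlist(face_image: np.ndarray):
    """Return (identity, score) for the first watchlist hit, else None."""
    probe = embed_face(face_image)
    for identity, ref in watchlist.items():
        score = float(np.dot(probe, ref))  # cosine similarity of unit vectors
        if score >= MATCH_THRESHOLD:
            return identity, score
    return None  # most passers-by should fall below the threshold

# Usage: a random "face" almost certainly produces no alert.
print(check_against_watchlist(rng.standard_normal((64, 64))))
```

The choice of threshold is where accuracy debates live: lower it and more wanted people are caught, but more innocent passers-by are wrongly flagged.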

It seems clear that, reliable or not, the MPS is committed to using LFR technology in some capacity. It is not the only police force or law enforcement body in the world that wants to use the technology to aid policing.

Here’s the problem: it’s not entirely clear whether it’s legal in the UK. No law explicitly authorises the use of LFR in the UK. In the absence of an explicit legal framework, the MPS is relying on a grab-bag of other legislation, including common law, the Human Rights Act 1998, the Protection of Freedoms Act 2012, the Data Protection Act 2018 and the Regulation of Investigatory Powers Act 2000.

An independent analysis of the MPS's earlier trials argued that this hodge-podge of legal justifications might not be enough. “Without explicit legal authorisation in domestic law, it is highly possible that police deployment of LFR technology may be held unlawful if challenged before the courts,” wrote the report's authors, Pete Fussey and Daragh Murray of the University of Essex.

"We have argued, and continue to argue, that the common law is an insufficient basis to authorise LFR use," says Professor Fussey. "The arguments that support the use of the common law in this way would also apply to authorising a huge range of intrusive surveillance measures without adequate safeguards. We only need to look at the limitations placed on the use of other forms of intrusive surveillance – such as bulk surveillance warrants, directed surveillance of suspects, ‘equipment interference' (hacking), and phone tapping - to note that the law recognises the need for additional safeguards around the use of such technologies."

In September 2019, the UK High Court ruled that the use of facial recognition by South Wales Police in public spaces did not breach privacy or human rights laws. However, experts have been quick to point out that this does not amount to a green light for all police uses of facial recognition technology. The case is being appealed.

"I personally feel that the [High Court] decision is so narrow that it should not be read as authorising all forms of automated facial recognition," says Philip Chertoff of the Harvard Law School. Chertoff has researched the legal basis for facial recognition in the UK. "But some authorisation is better than no authorisation, and I would not be surprised to see law enforcement invoking the decision as justification for uses of automated facial recognition beyond the narrow implementation challenged in this particular case."

MPS commissioner Cressida Dick has added her voice to the calls from civil society and tech companies themselves for greater regulation, saying "the best way to ensure that the police use new and emerging tech in a way that has the country's support is for the government to bring in an enabling legislative framework that is debated through Parliament, consulted on in public, and which will outline the boundaries for how police should or should not use tech."

The public broadly supports the use of facial recognition technology by police. According to a survey by the Ada Lovelace Institute, 71 per cent of the UK public support police use of facial recognition in public spaces if it helps to reduce crime. A majority (55 per cent), however, support regulation limiting its use to specific circumstances, and half (50 per cent) think that the private sector should not sell facial recognition technology to police.

Despite broad agreement on the need for regulation, however, it is still not clear exactly what form it should take. Here are three questions which need to be answered.

Who should be responsible for regulating the use of LFR?

Who takes the lead on regulating LFR will be driven in part by the nature of the regulations themselves. NYU's Policing Project has identified three broad types of LFR legislation: general regulations that ban, pause, or study facial recognition technology (FRT); operations-based regulations that control how FRT is deployed; and data-based regulations that restrict the images used to operate FRT.

The Scottish Parliament has effectively called for a pause in police use of LFR, demanding that Scottish police demonstrate a legal basis for the technology, including its compliance with human rights and data protection obligations. For much of the rest of the UK, some combination of the latter two, operations-based and data-based regulations, seems like the most probable outcome. But the overlapping jurisdictions involved in regulating operations and the data they use are already creating confusion.

"There is currently no law regulating the use of automated LFR in the UK, but there are laws which regulate the components of such systems, like the storage of biometric data, procedures for using surveillance cameras, and how and when data can be collected," Chertoff says.

This piecemeal regulation creates confusion over who exactly is responsible for regulating LFR technology. While the Home Office clearly has a role to play, it is not clear whether LFR should also fall under the specific remits of the Biometrics Commissioner, the Surveillance Camera Commissioner, or the Information Commissioner's Office (ICO). Both the Biometrics Commissioner and the ICO released statements sounding notes of caution in response to the MPS's announcement.

"Many of the commissioners have indicated that they disagree on how LFR should be regulated. With different opinions from different commissioners, this confusion among law enforcement is magnified and will likely lead to different forms of self-regulation being adopted by different law enforcement agencies," Chertoff says.

Even if it were clear which commissioner should take the lead on regulating police use of LFR, it's uncertain whether they would currently have the power to enforce their decisions. "Only the ICO has any real regulatory power, i.e. the power of censure, fines and so on," Fussey says. "The ICO have also acknowledged that data protection is an insufficient tool to regulate the wider harms of LFR, a view we have advanced in our work."

Any attempt to regulate the use of LFR by police will therefore first need to clarify who holds responsibility, and then ensure that the responsible body has the teeth necessary to enforce its decisions.

What should be regulated?

"A lot of the discussions around facial recognition technology assume that 'facial recognition technology' is one thing, one specific device or piece of software with the same set of applications," says Ellen Broad, a senior fellow with the 3A Institute and member of the Australian government's Data Advisory Council. "There are a range of technological applications that fall under the umbrella of facial recognition technology, from face detection settings on digital cameras which help to improve picture focus, through to identity matching."

There is also the problem of technologies that fall outside the definition of facial recognition but fulfil much the same purpose, and so can easily be substituted for it. Such technologies allow police to circumvent rules aimed at a particular technology in a way that obeys the letter of the law but not its spirit. In Marbella in Spain, for example, regional legislation bans the use of LFR data without consent; instead, authorities have implemented a surveillance system that conducts 'appearance searches', detecting unique facial traits, the colour of a person's clothes, age, shape, gender and hair colour.
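The substitution is easy to picture in code. Below is a minimal sketch of an attribute-based 'appearance search'; the field names, values and design are illustrative assumptions, not details of the Marbella system or any vendor's product. The structural point is that nothing here matches a face against a database of known identities, which is what narrowly drafted facial recognition rules tend to restrict.

```python
from dataclasses import dataclass

@dataclass
class Sighting:
    # Descriptive attributes of the kind such systems reportedly extract;
    # all field names and values here are illustrative assumptions.
    camera_id: str
    clothing_colour: str
    hair_colour: str
    apparent_age: int
    gender: str
    face_descriptor: tuple  # opaque signature derived from facial traits,
                            # used only to group sightings of one person,
                            # never matched to a database of named people

def appearance_search(sightings, **query):
    """Filter sightings by descriptive attributes, not by identity."""
    return [s for s in sightings
            if all(getattr(s, k) == v for k, v in query.items())]

def track_individual(sightings, descriptor):
    """Follow one (unnamed) person across cameras via their descriptor -
    tracking without any formal act of 'identification'."""
    return [s for s in sightings if s.face_descriptor == descriptor]

# Usage: find everyone in red clothing, then follow one of them around.
crowd = [
    Sighting("cam1", "red", "dark", 30, "male", (0.12, 0.98)),
    Sighting("cam2", "red", "dark", 30, "male", (0.12, 0.98)),
    Sighting("cam1", "blue", "grey", 55, "female", (0.77, 0.31)),
]
in_red = appearance_search(crowd, clothing_colour="red")
trail = track_individual(crowd, in_red[0].face_descriptor)
print(len(trail))  # 2: the same unnamed person seen on two cameras
```

The practical effect on a person being followed through a city is much the same as LFR; only the legal label differs.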

If the goal of regulation is to prevent unchecked mass surveillance, the ease with which technologies can be substituted for the same purposes suggests that any future regulation should not be tied too closely to a narrow definition of facial recognition.

There are precedents for police circumventing restrictions on surveillance technologies in other ways. Immigration and Customs Enforcement agents in the US have reportedly sidestepped the need for warrants to access phone location data by simply purchasing it from commercial providers.

It is not difficult to imagine something similar playing out with facial recognition, which implies that regulation needs to cover police use of, or access to, LFR surveillance, not simply police operation of the technology. At a deeper level, it also suggests a need to foster a culture within police departments that respects the spirit as well as the letter of the law, rather than seeking ways around legitimate legal constraints such as the requirement to obtain a warrant.

How will we know if it’s working?

Once an explicit and adequate legal basis is established for the use of LFR, safeguards will still need to be implemented, according to Fussey. "This includes proper and robust authorisation of LFR use. This is already enacted for other forms of intrusive surveillance. Given the intrusive nature of LFR, and the way it engages the privacy rights of everyone who passes the cameras, this authorisation process should be available for scrutiny by independent bodies or regulators."

Fussey says that a full human rights impact assessment should be conducted, as LFR affects more than just privacy. "Any oversight and regulation processes should include an element of review, to hold users of these surveillance tools to account that the technology was used for the purposes claimed, avoided harm and held utility in that capacity."

The high inaccuracy rates in previous trials underscore the need for oversight and transparency in any implementation of LFR, particularly in light of the well-documented risks of bias and discrimination in facial recognition systems. Establishing independent mechanisms for testing the accuracy of LFR systems is important, according to Broad.

"We don’t have any specific transparency and accountability mechanisms for facial recognition. There’s no requirements that software be independently verified, tested or scrutinised, or that companies disclose certain aspects of the workings of that system to organisations licensing their services," Broad says. "[Oversight] doesn’t mean making software open source – it can mean requiring independent review by approved third party suppliers. We use forms of independent review and validation across sectors like food inspectors or laboratory audits that aren’t about exposing those things to the world, but are about making sure they can be checked as following some standards of best practice. We don’t have any of that yet in this space."

Oversight isn't just about accuracy; it's also about engendering trust and a sense of democratic accountability, and providing a mechanism for responding to the evolving concerns of the community. "I have no doubt that law enforcement would prefer to use [LFR] in a way that is authorised by and complies with the law. But without a clear authority who can set the rules or even provide advisory opinions, law enforcement does not have any clear sense of what rules or standards are required," Chertoff says.

"There are clear trust issues between law enforcement and civilians. The adoption of pervasive surveillance technologies like [LFR] without consulting with or taking feedback from the populace will only further degrade this relationship. These technologies must be debated, in the open, and any adoption should respect the privacy concerns of the public."


Digital Society is a digital magazine exploring how technology is changing society. It's produced as a publishing partnership with Vontobel, but all content is editorially independent. Visit Vontobel Impact for more stories on how technology is shaping the future of society.


This article was originally published by WIRED UK