Met Police deploy facial recognition in Westminster

Police in London deployed live facial recognition (LFR) technology in Westminster on Friday 28 January, leading to the arrest of four people and drawing significant criticism from civil rights groups.

The Metropolitan Police Service (MPS) said its facial recognition rollout – which took place the day after the British government relaxed mask-wearing requirements – was part of a wider operation to tackle serious and violent crime in the borough.

According to the MPS, one arrest was of a man wanted on an extradition warrant in connection with suspected drug offences and aggravated assault, while the other three were for unspecified drug offences, an unspecified traffic offence, and alleged death threats made by a man already wanted by police.

The suspects were engaged and detained by officers following alerts from the van-mounted LFR system, which allows police to identify people in real time by scanning their faces as they walk past and matching them against a database of facial images, known as a “watchlist”.

Computer Weekly has contacted the MPS about various aspects of the deployment – including the number of vans deployed in Westminster, the number of match alerts generated by the system, the number of stops made, and the number of people whose biometric information was processed – but it had not provided answers more than a week later.

Silkie Carlo, director of civil liberties group Big Brother Watch, who was present at one of the deployments at Oxford Circus, told Computer Weekly she witnessed four arrests while there.

“Of the four stops we saw, two should not have happened – one was based on outdated data and another was a straightforward misidentification,” she said, adding that the person wrongly identified, a young black boy, had his fingerprints taken. “There may have been many more misidentifications – that was just during the short time I was there.”

Carlo said the experience of being misidentified by facial recognition could have “a profound effect” on the individuals subjected to it. “This boy who was stopped on Friday was surrounded by four or five police officers; they took his fingerprints and barked questions at him,” she said.

“The police aren’t saying, ‘You fit the description’ – which is obviously what a lot of teenagers in London have had to put up with for a long time – but ‘You’ve been flagged by our system’. That must be incredibly disempowering, because they then have to start proving who they are and trying to prove they’re not the person they [the police] think they are.”

Regarding the four arrests made, Carlo said the MPS needed to be clearer about the nature of the offences, particularly given its claim that the technology was only deployed to search for “serious and violent offenders”.

“If [the traffic offence] is something like speeding, a lot of people are going to think very, very differently about facial recognition,” she said.

“We’ve always had this problem, where they put in their press releases, ‘We only look for serious and violent offenders’, but in the arrests we see, that’s very rarely the case. A 31-year-old man wanted for drug-related offences? Is that possession of a small amount of marijuana? Do we really need facial recognition for that?”

A necessary and proportionate deployment?

Before it can deploy facial recognition technology, the MPS must meet a number of requirements related to necessity, proportionality and legality.

For example, the MPS’s legal mandate document – which sets out the complex patchwork of legislation the force says enables it to deploy the technology – states that “authorising officers must decide that the use of LFR is necessary and not just desirable to enable the MPS to achieve its legitimate aim”.

The MPS’s Data Protection Impact Assessment (DPIA) also states that “all images submitted for inclusion in a watchlist must be lawfully held by the MPS”.

In 2012, a High Court ruling found the Metropolitan Police’s retention of custody images – which are used as a primary source for watchlists – to be unlawful, because information on unconvicted people was being retained in the same way as information on those who were ultimately convicted. The court also found that the minimum six-year retention period was not proportionate.

Speaking to Parliament’s Science and Technology Committee on 19 March 2019, then biometrics commissioner Paul Wiles said there was a “very poor understanding” of custody image retention rules among police forces in England and Wales.

“I’m not sure the legal case [for retention] is very strong, and I’m not sure it would stand up to another legal challenge,” he said.

Big Brother Watch’s Carlo told Computer Weekly that a police officer on the ground during the Westminster LFR deployment informed her that there were 9,500 images on the watchlist for that deployment.

“It’s not a targeted, specific deployment based on a pressing need – it’s a fishing net,” she said.

In July 2019, a report from the Human Rights, Big Data & Technology Project – based at the University of Essex Human Rights Centre, and which marked the first independent review of the Metropolitan Police’s trials of LFR technology – highlighted a discernible “presumption to intervene” among police officers using the technology, meaning they tended to trust the outcomes of the system and engage people it said matched the watchlist in use, even when they did not.

Computer Weekly contacted the MPS about these issues – including how the force decided the Westminster deployments were necessary, the basis on which they were deemed proportionate, how it has resolved the issue of lawful retention (and whether it could guarantee that every custody image on the 28 January watchlists was lawfully held), and how it dealt with the “presumption to intervene” – but it had still not responded more than a week after the deployment.

Civil society groups react

In response to the Westminster rollout, Emmanuelle Andrews, director of policy and campaigns at human rights group Liberty, described facial recognition as “oppressive by design”, saying “its inaccuracy and intrusion will fall hardest on people of colour, especially black men who face routine police oppression”.

Andrews added: “The Court of Appeal has recognised that this technology violates our rights and threatens our freedom. Yet the Met has tested it repeatedly. These tools are neither necessary nor compatible with the kind of society we want to live in. To keep everyone safe, we must reject divisive and oppressive surveillance technology, we must reject ever-expanding and unaccountable police powers, and demand that government work with communities to develop strategies based on fairness, participation and support.”

Carlo said the facial recognition van in Oxford Street on 28 January was accompanied by a large police presence, including around 25 uniformed officers and 25 plainclothes officers.

“The Metropolitan Police have been mired in scandals all year, and there are serious trust issues… to see such a large deployment of police experimenting with highly intrusive technology while handing out leaflets and having to explain to members of the public why they’re standing there staring at iPads, waiting for match alerts and scanning people’s faces, feels extraordinary to me and very, very misguided,” she said.
