Met Statistics

This post first appeared on 4 March 2022.

There are three types of lies: lies, damned lies, and statistics. — Unknown

The Metropolitan Police in London uses Live Facial Recognition (LFR) to find offenders on watch-lists in public places.

The practice is controversial, to say the least. It has been challenged in UK courts, and Amazon has indefinitely halted the provision of its service, Rekognition, to law enforcement agencies.

In the spirit of accountability, the Met keeps records of when it deploys LFR. The document¹ lists three uses since January 2020.

The relevant part of the document is reproduced here (I have omitted a deployment that was cancelled due to a technical fault):

| Date        | False alerts | True alerts | Faces seen | False alert rate |
|-------------|-------------:|------------:|-----------:|-----------------:|
| 11 Feb 2020 | 0            | 0           | 4600       | 0%               |
| 27 Feb 2020 | 7            | 1           | 8600       | 0.08%            |
| 28 Jan 2022 | 1            | 10          | 12120      | 0.008%           |

At first blush these seem like excellent results for a technology mired in accusations about its effectiveness. Many of the concerns raised centre on claims that it is less accurate on people of colour and women. An error rate below 0.1% is a good start.

But let's focus on the second line. The maths is clear enough: 7 ÷ 8600 ≈ 0.08%. This much is true, but it should be read as the percentage of people who walked by and were falsely flagged.

A fairer calculation might be to consider the percentage of people who were flagged, but flagged falsely. Of the 8 alerts that day, 7 were false: 7 ÷ 8 ≈ 88%.

By either metric things improved greatly by the time the technology was next used², though the number here might more honestly be 9% (1 of the 11 alerts was false) rather than 0.008%.
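If you want to check the arithmetic yourself, here is a minimal Python sketch (the function names are my own, and the figures come from the table above):

```python
# Two ways of summarising the same LFR deployment data.

def false_alert_rate(false_alerts: int, faces_seen: int) -> float:
    """False alerts as a share of everyone who walked past the camera
    (the figure the Met reports)."""
    return false_alerts / faces_seen

def false_discovery_rate(false_alerts: int, true_alerts: int) -> float:
    """False alerts as a share of the people actually flagged."""
    return false_alerts / (false_alerts + true_alerts)

# (date, false alerts, true alerts, faces seen), from the table above.
# The 11 Feb 2020 deployment is skipped: with zero alerts, the share
# of alerts that were false is 0/0, which is undefined.
deployments = [
    ("27 Feb 2020", 7, 1, 8600),
    ("28 Jan 2022", 1, 10, 12120),
]

for date, fp, tp, faces in deployments:
    print(f"{date}: {false_alert_rate(fp, faces):.3%} of passers-by falsely alerted, "
          f"but {false_discovery_rate(fp, tp):.0%} of all alerts were false")
```

The same data yields 0.08% or 88% for the 2020 deployment, and 0.008% or 9% for the 2022 one, depending on which question you ask.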

Which of these two statistics is reasonable to report is debatable, and some may argue that there are no lies in this data. However, I contend that boiling the data down to a single number is rather misleading, and the false alert rate given, particularly so. It's unlikely most readers will take the time to check the numbers, and lawmakers have better things to do³.

It behoves the Met to think about how its data may be interpreted and to present it in a fair and just manner.

Even if this comes to pass, there are moral and societal questions that need answering. LFR is controversial, and it's likely that the practice will need to be regulated and overseen. When making decisions about such things, lawmakers will need a good understanding of how accurate the technology is.


This post grew out of some research I was doing as part of my internship with the Naked Scientists. The views expressed here are entirely my own and do not reflect those of any other organisation. If you’d like to hear my piece on facial recognition, head over here.


  1. Downloaded 11 Feb 2022 from the link on this page. ↩︎

  2. Interestingly, there were three deployments in 2020 and then none again until 2022. I'm not sure if this is due to the pandemic or the court case against South Wales Police regarding LFR. ↩︎

  3. …which is to say that someone else should be presenting them with the facts, not that they have better things to do than deal with the facts. The point of a parliament is to make laws. When it comes to science and the like, it's bodies such as POST and the select committees that should ensure they are well informed. ↩︎