# Met Statistics

> There are three types of lies: lies, damned lies, and statistics. — Unknown

The Metropolitan Police in London uses Live Facial Recognition (LFR) to find offenders on watch-lists in public places.

The practice is controversial, to say the least. It has been challenged in UK courts, and Amazon has indefinitely halted the provision of its service, Rekognition, to law enforcement agencies.

In the spirit of accountability, the Met keeps records of when it deploys LFR. The document1 lists three uses since January 2020.

The relevant part of the document is repeated here (I have removed the deployment cancelled due to technical fault):

| Date        | False alerts | True alerts | Faces scanned | False alert rate |
|-------------|--------------|-------------|---------------|------------------|
| 11 Feb 2020 | 0            | 0           | 4600          | 0%               |
| 27 Feb 2020 | 7            | 1           | 8600          | 0.08%            |
| 28 Jan 2022 | 1            | 10          | 12120         | 0.008%           |

At first blush these seem like excellent results for a technology mired in doubts about its effectiveness. Many of the concerns centre on claims that it is less accurate for people of colour and for women. An error rate below 0.1% is a good start.

But let's focus on the second line. The arithmetic is straightforward: 7 ÷ 8600 ≈ 0.08%. That much is true, but it should be read as the percentage of everyone who walked past the cameras who was falsely flagged.

A fairer calculation might be the percentage of alerts that turned out to be false. Of the 8 alerts raised that day, 7 were false: about 88%.

By any metric things had improved greatly by the time they next tried the technology2, though here the honest figure might be 9% (1 false alert out of 11) rather than 0.008%.
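The two rates are easy to confuse, so here is a minimal sketch of both calculations, assuming the table's columns are false alerts, true alerts, and total faces scanned:

```python
# Sanity check on the Met's published LFR figures.
# Each entry: (false alerts, true alerts, faces scanned) — column
# meanings are my reading of the table, not the Met's own labels.
deployments = {
    "27 Feb 2020": (7, 1, 8600),
    "28 Jan 2022": (1, 10, 12120),
}

for date, (false_alerts, true_alerts, faces) in deployments.items():
    # The rate the Met reports: false alerts as a share of everyone scanned.
    false_alert_rate = false_alerts / faces * 100
    # The arguably fairer rate: what fraction of the alerts were wrong?
    false_share_of_alerts = false_alerts / (false_alerts + true_alerts) * 100
    print(f"{date}: reported {false_alert_rate:.3f}% of passers-by, "
          f"but {false_share_of_alerts:.1f}% of alerts were false")
```

Both numbers are computed from the same three figures; the choice of denominator is what makes one look reassuring and the other alarming.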

Which of these two statistics is reasonable to report is debatable, and some may argue that there are no lies in this data. However, I contend that boiling the data down to a single number is rather misleading, and the false alert rate given, particularly so. It's unlikely most readers will take the time to check the numbers, and lawmakers have better things to do3.

However, it behoves the Met to think about how their data may be interpreted and to present it in a fair and just manner.

Even if this comes to pass, there are moral and societal questions that need answering. LFR is controversial, and it's likely that the practice will need to be regulated and overseen. When making decisions about such things, lawmakers will need a good understanding of how accurate the technology is.

This post grew out of some research I was doing as part of my internship with the Naked Scientists. The views expressed here are entirely my own and do not reflect those of any other organisation. If you’d like to hear my piece on facial recognition, head over here.