Automated facial recognition (AFR) technology used by London's Metropolitan Police is designed to find persons of interest within large crowds by comparing the biometrics of attendees caught on camera with records already stored on law enforcement databases.
That figure is the highest of those given by United Kingdom police forces surveyed by the campaign group Big Brother Watch as part of a report that urges the police to stop using the technology immediately.
If the concept of this dystopian and authoritarian policing tool turning us all into walking ID cards weren't enough on its own, there are huge problems with both the technology and the police's intrusive and oppressive use of it.
The system, though, hasn't had much success with positive identifications either: the report showed there have been just two accurate matches, and neither person was a criminal. A Met Police spokesperson said that all alerts against its watch list are deleted after 30 days, and faces that do not generate an alert are deleted immediately.
The Metropolitan Police also targeted people known to have mental health issues, who were not wanted for any crime, at Remembrance Sunday in 2017, while South Wales Police targeted peaceful and lawful demonstrators in Cardiff in March.
The Met Police's facial recognition system had the worst track record, with a matching accuracy of just 2%: the remaining 98% of matches wrongly identified innocent people.
Privacy Commissioner John Edwards said that despite the technology being "cutting edge", it is not particularly reliable, even in law enforcement.
"We have been extremely disappointed to encounter resistance from the police in England and Wales to the idea that such testing is important or necessary", Big Brother Watch said in the report.
While she welcomed both the recent appointment of a member of the NPCC to govern the use of facial recognition in public spaces and the establishment of an oversight panel including herself, the Biometrics Commissioner and the Surveillance Camera Commissioner, Denham also noted that she is "deeply concerned about the absence of national level co-ordination in assessing the privacy risks and a comprehensive governance framework to oversee FRT deployment".
SWP - which has used AFR at 18 public places since it was first introduced in May 2017 - has fared only slightly better.
"For the use of FRT to be legal, the police forces must have clear evidence to demonstrate that the use of FRT in public spaces is effective in resolving the problem that it aims to address, and that no less intrusive technology or methods are available to address that problem". The system led to 15 arrests, or 0.005% of the total matches.
"Regarding 'false' positive matches, we do not consider these as false positive matches because additional checks and balances are in place to confirm identification following system alerts", it said in a statement.
"We use multiple strategies to protect our people, customers and product and we make no apology for this".
It said a "number of safeguards" prevented any action being taken against innocent people.
The UK's independent Biometrics Commissioner, Paul Wiles, told The Independent that the technology is "not yet fit for use", judging by the figures outlined in the report. This means that they remain on the system unless a person asks for them to be removed.
"'Costs' are not an acceptable reason for the British Government not to comply with the law", it said.
The Home Office has spent £2.6m funding the technology in South Wales alone, according to a report by the group Big Brother Watch (BBW).