Freedom of Information requests by the Big Brother Watch organisation have shown that 91 per cent of so-called "matches" found by South Wales Police's technology were wrong.
The UK's Information Commissioner has threatened to take legal action over the use of facial recognition in law enforcement if the police and government cannot prove the technology is being deployed legally.
Police have been rolling out the software to be used at major events such as sporting fixtures and music concerts, including a Liam Gallagher concert and global rugby games, aiming to identify wanted criminals and people on watch lists.
While she welcomed both the recent appointment of a member of the NPCC to govern the use of facial recognition in public spaces and the establishment of an oversight panel including herself, the Biometrics Commissioner and the Surveillance Camera Commissioner, Denham also noted that she is "deeply concerned about the absence of national level co-ordination in assessing the privacy risks and a comprehensive governance framework to oversee FRT deployment".
It claims the Met has a failure rate of 98%, and that during last year's Notting Hill Carnival police misidentified 95 people as criminals.
"We do not consider these as false positive matches because additional checks and balances are in place to confirm identification following system alerts," a spokesperson told The Register. If you think this isn't worth worrying about, bear in mind that on the basis of an incorrect match the police have the power to stop you in the street and require you to identify yourself, in order to prove you aren't the person their computer tells them you are.
Technological advances in the last 20 years have rapidly increased the ability of online systems to identify individuals.
At Remembrance Sunday in 2017, the Metropolitan Police also targeted people known to have mental health issues who weren't wanted for any crimes, while South Wales Police targeted peaceful and lawful demonstrators in Cardiff in March.
The group described this as a "chilling example of function creep" and of the harmful effect the technology could have on the rights of marginalised people.
Privacy Commissioner John Edwards said his office had not examined the facial recognition CCTV systems used by Foodstuffs in the North Island, but any such technology "runs the risk of misidentifying people".
This is compounded by the fact that the commercial software used by the Met - and also South Wales Police (SWP) - has yet to be tested for demographic accuracy biases.
But for now the Big Brother Watch report says the benefits are missing, because the technology does not work.
SWP - which has used AFR at 18 public places since it was first introduced in May 2017 - has fared only slightly better: the system led to just 15 arrests, or 0.005% of the total matches. What protections are there for people who are of no interest to the police?
It also raised concerns that photos of any "false alarms" were sometimes kept by police for weeks.
Further details are expected in the long-awaited biometrics strategy, which is slated to appear in June. The use of images collected when individuals are taken into custody is of concern; there are over 19 million images in the Police National Database.
"If we move forward on this path, these systems will mistakenly identify innocent people as criminals or terrorists and will be used by unscrupulous governments to silence unwelcome voices".
In March, Williams said that because images can only be deleted manually, weeding out innocent people "will have significant costs and be hard to justify given the offsetting reductions forces would be required to find to fund it".