Highlights
- Facial recognition vans deployed in Surrey and Sussex on 26 November spark privacy debate.
- Councillors cite early trial error rates as high as 81 per cent, with the worst inaccuracies affecting ethnic minority individuals.
- Surrey Police defend technology, saying two arrests already made and no statistical bias in current system.
Vans equipped with facial recognition technology were rolled out on the streets of Surrey and Sussex on 26 November. However, independent, Labour and Liberal Democrat councillors on Woking Borough Council are calling for the scheme to be halted.
The vans are fitted with cameras that feed into specialist software designed to catch criminals, suspects and those wanted on recall to prison. Police have stated that images of people not on the watchlist will be instantly deleted from the system, minimising "impact on their human rights".
Councillor Bonsundy-O'Bryan warned the technology "risks dangerous bias, incorrectly misidentifying people from ethnic minority backgrounds and women". He told the BBC that the decision was made "with minimal consultation or transparency about data processing, sharing, and deletion".
He described the inability to opt out of being filmed as "a fundamental breach of the right to privacy that underpins a free society".
According to the council, error rates in early trials reached as high as 81 per cent, with the most severe inaccuracies impacting Black, Asian and ethnic minority individuals.
Surrey Police acknowledged that earlier systems showed potential gender and ethnic bias but said this has been greatly reduced as the technology has developed. The force said the national algorithm currently in use shows no statistical bias.
"It is our responsibility to use every tactic and innovation available to us to keep the public safe," Surrey police told BBC, adding that two people had already been arrested due to the cameras.
The force emphasised that the rollout of the technology was carefully planned and executed, following full consideration of privacy and equality concerns.