
Woking councillors challenge police facial recognition cameras over privacy concerns

Cross-party group warns technology risks bias against ethnic minorities


The vans are fitted with cameras that feed into specialist software designed to catch criminals and suspects

Getty Images

Highlights

  • Facial recognition vans deployed in Surrey and Sussex on 26 November spark privacy debate.
  • Councillors cite early trial error rates as high as 81 per cent, with the most severe inaccuracies affecting ethnic minorities.
  • Surrey Police defend the technology, saying two arrests have already been made and the current system shows no statistical bias.

A cross-party group of Woking councillors has written to Surrey Police demanding the suspension of facial recognition cameras deployed in the town, citing concerns over privacy rights and potential bias against ethnic minority communities.

Vans equipped with facial recognition technology were rolled out on the streets of Surrey and Sussex on 26 November. However, independent, Labour and Liberal Democrat councillors on Woking Borough Council are calling for the scheme to be halted.

The vans are fitted with cameras that feed into specialist software designed to catch criminals, suspects and those wanted on recall to prison. Police have stated that images of people not on the watchlist will be instantly deleted from the system, minimising "impact on their human rights".


Councillor Bonsundy-O'Bryan warned the technology "risks dangerous bias, incorrectly misidentifying people from ethnic minority backgrounds and women". He told the BBC that the decision was made "with minimal consultation or transparency about data processing, sharing, and deletion".

He described the inability to opt out of being filmed as "a fundamental breach of the right to privacy that underpins a free society".

According to the council, error rates in early trials reached as high as 81 per cent, with the most severe inaccuracies affecting Black, Asian and ethnic minority individuals.

Surrey Police acknowledged that historically there was potential for gender and ethnic bias, but said this has been greatly reduced by technological development. The force said the national algorithm currently in use shows no statistical bias.

"It is our responsibility to use every tactic and innovation available to us to keep the public safe," Surrey police told BBC, adding that two people had already been arrested due to the cameras.

The force emphasised that the rollout of the technology was meticulously planned and executed following careful consideration of privacy and equality concerns.
