A MAN falsely identified as a suspect by London police's live facial recognition cameras launched landmark legal action on Tuesday (27) over the force's sweeping use of the technology.
The High Court in London began a judicial review into whether the Metropolitan Police is acting lawfully in its use of live facial recognition (LFR) cameras in public places.
The Met uses the technology to scan the faces of passers-by and compare their biometrics with thousands on a watchlist. The UK is the only European country to deploy the technology on a large scale.
Shaun Thompson, a volunteer who helps young people affected by street crime, was questioned by police in 2023 after the technology wrongly identified him as someone on a watchlist.
He has launched the legal action alongside Silkie Carlo, director of civil liberties campaigning organisation Big Brother Watch.
Their lawyer, Dan Squires, told the court that the police can use the technology in more than half of London's public areas without sufficiently clear justification for the locations selected.
The police's lawyer, Anya Proops, argued the case was "entirely lacking in merit" and could have "very dramatic implications" if it limited use of the technology to locations connected with people being sought.
The Met's policy allows it to deploy the cameras in crime hotspots and at critical national infrastructure such as key roads and transport networks. The two sides disagreed on how much of London this covers.
The judicial review hearing is due to finish on Wednesday, with a judgment expected at a later date.
Two judges are tasked with assessing whether the Met's policy on using facial recognition technology provides adequate constraints or is arbitrary.
The police force said in its written argument that last year 801 arrests were made as a result of LFR.
"The primary value of live facial recognition for the police is it is an immensely effective and powerful tool enabling us to locate people when we don't know where they are," Proops said.
The technology operates "at incredible speed with enormous numbers of people, and the numbers have grown exponentially", Squires said, calling it "an unblinking eye".
The issue is "the mass scale of the innocent people whose biometric data is taken", Squires added.
Last year, the Met used live facial recognition more than 200 times, capturing around four million faces, Squires said.
On average, there is one "false alert" per 33,000 people viewed, or around 10 false alerts per month, Squires said, claiming that what happened to Thompson was not "extraordinary".
The Met said the technology intrudes only "minimally" on the public's privacy because deployments are required to be clearly signposted.
(AFP)