
King’s Cross developers probed for facial recognition technology


The London mayor has called for new laws to clarify how facial recognition technology can be used. (Photo: Leon Neal/AFP/Getty Images)

THE UK’s data protection watchdog is investigating King’s Cross developers for scanning the public with facial recognition cameras, it was reported on Thursday (15).

The King’s Cross estate includes offices, colleges, restaurants and shops, and thousands of people pass through the area each day. The Information Commissioner considers scanning these people as they lawfully go about their daily lives a threat to privacy.

“I remain deeply concerned about the growing use of facial recognition technology in public spaces, not only by law enforcement agencies but also increasingly by the private sector,” Elizabeth Denham, the Information Commissioner, was quoted as saying by the Independent.

“My office and the judiciary are both independently considering the legal issues and whether the current framework has kept pace with emerging technologies and people’s expectations about how their most sensitive personal data is used.”

This probe comes just days after Sadiq Khan, the mayor of London, wrote to the King’s Cross Central Limited Partnership demanding more information on the use of facial recognition technology.

“There are serious and widespread concerns about the legal framework for the use of this technology, and I have called on the government to legislate in order to provide certainty about exactly how it can be legally used in the UK,” Khan reportedly wrote in the letter.

“I am writing to request … assurance that you have been liaising with government ministers and the Information Commissioner’s Office to ensure it is fully compliant with the law as it stands.”

Cameras using facial recognition technology are used by the police to scan faces in large crowds and public places. Images are then compared to a database of suspects.
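In broad terms, such systems reduce each detected face to a numeric "embedding" and compare it against stored embeddings of known suspects, flagging a match when the similarity exceeds a threshold. A minimal illustrative sketch of that comparison step (the embeddings, suspect names, and threshold below are invented for illustration, not taken from any real system):

```python
import math

def cosine_similarity(a, b):
    # Measure how closely two face embeddings point in the same direction.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def best_match(probe, database, threshold=0.9):
    # Return the database entry most similar to the probe face,
    # or None if no entry clears the match threshold.
    best_id, best_score = None, threshold
    for person_id, embedding in database.items():
        score = cosine_similarity(probe, embedding)
        if score > best_score:
            best_id, best_score = person_id, score
    return best_id

# Hypothetical watch-list of suspect embeddings.
database = {
    "suspect_a": [0.1, 0.9, 0.2],
    "suspect_b": [0.8, 0.1, 0.5],
}
# Embedding extracted from a face seen on camera.
probe = [0.12, 0.88, 0.21]
print(best_match(probe, database))  # -> suspect_a
```

The threshold is the crux in practice: set it too low and innocent passers-by are misidentified, too high and genuine matches are missed, which is why the accuracy disparities described below matter.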

However, recent research has shown racial disparities in the accuracy of facial recognition technology. A 2018 study from the Massachusetts Institute of Technology revealed that the software more often misidentifies darker-skinned people. The software had an error rate of 34.7 per cent for darker-skinned women, compared with 0.8 per cent for lighter-skinned men, the study showed.