The Information Commissioner’s Office is embarking on a full-scale investigation into the use of facial recognition software in public places following claims the technology in place at the 67-acre King’s Cross Central site – home to King’s Cross and St Pancras stations, as well as restaurants, shops and cafés – is riding roughshod over consumers’ privacy.
The firm behind the scheme – property development giant Argent – claims it “creates places that are enjoyable, vibrant and welcoming, where people want to live, to work and to be”. It denies any wrongdoing, and reckons it uses the technology to “ensure public safety”.
Argent said: “These cameras use a number of detection and tracking methods, including facial recognition, but also have sophisticated systems in place to protect the privacy of the general public.”
It is not known whether the firm has installed the technology in other schemes it has been involved in, including the redevelopment of the Manchester Piccadilly area and Birmingham’s Brindleyplace, but if the ICO finds the firm in breach of GDPR it could face a crushing fine.
The omens are not good for Argent as Information Commissioner Elizabeth Denham has already insisted that “scanning people’s faces as they lawfully go about their daily lives, in order to identify them, is a potential threat to privacy that should concern us all. That is especially the case if it is done without people’s knowledge or understanding”.
Denham added: “I remain deeply concerned about the growing use of facial recognition technology in public spaces, not only by law enforcement agencies but also increasingly by the private sector. My office and the judiciary are both independently considering the legal issues and whether the current framework has kept pace with emerging technologies and people’s expectations about how their most sensitive personal data is used.”
Confirming that the technology is now a major priority area for the ICO, Denham said that, when necessary, the regulator “will not hesitate to use our investigative and enforcement powers to protect people’s legal rights”.
She added: “We have launched an investigation following concerns reported in the media regarding the use of live facial recognition in the King’s Cross area. As well as requiring detailed information from the relevant organisations about how the technology is used, we will also inspect the system and its operation on-site to assess whether or not it complies with data protection law.
“Put simply, any organisations wanting to use facial recognition technology must comply with the law – and they must do so in a fair, transparent and accountable way. They must have documented how and why they believe their use of the technology is legal, proportionate and justified.
“We support keeping people safe but new technologies and new uses of sensitive personal data must always be balanced against people’s legal rights.”
Earlier this week, London mayor Sadiq Khan wrote to the chief executive of Argent demanding answers; he said there were “serious and widespread concerns” about the legality of the technology.