Southern Co-op has been accused of using “unlawful” facial recognition software in its stores, with claims that the technology is “infringing the data rights of a significant number of UK data subjects”.
The retailer uses facial recognition software developed by UK company Facewatch in 35 stores across Portsmouth, Bournemouth, Bristol, Brighton and Hove, Chichester, Southampton, and London.
The system, which the retailer says is designed to protect staff and customers from unacceptable violence and abuse, allows supermarket staff to add individuals to a facial recognition “blacklist”.
However, Big Brother Watch claims shoppers are not informed if their facial biometric data is stored or added to the blacklist, where it can be kept for up to two years.
The privacy group also claims that photos of shoppers who are not on any watchlist may also be kept for several days for Facewatch to “improve its system”.
Big Brother Watch director Silkie Carlo has branded the retailer’s use of the facial recognition system as “Orwellian in the extreme” and has demanded immediate action from the Information Commissioner’s Office.
In a statement, she said: “The supermarket is adding customers to secret watchlists with no due process, meaning shoppers can be spied on, blacklisted across multiple stores, and denied food shopping despite being entirely innocent. This would sound extreme even in an episode of Black Mirror, and yet it is taking place right now in Britain.
“This is a deeply unethical and frankly chilling way for any business to behave and I’d strongly recommend that people do not shop at the Southern Co-op whilst they continue to spy on their shoppers.”
According to correspondence between Southern Co-op and Big Brother Watch, staff do not pass photos to the police; instead, the biometric profiles are used to trigger an alert when certain customers enter a store and to share allegations of unwanted conduct between staff in different stores.
But the privacy rights group claims biometric images of “subjects of interest” can be shared with other companies which have access to Facewatch software.
“Shoppers’ photos can be shared in an 8 mile radius from where they are taken from stores in London, or up to a 46 mile radius in rural locations,” the privacy group maintains.
In a statement to the BBC, Southern Co-op said it only uses the facial recognition system in stores with a history of crime or misconduct.
The retailer said it welcomes any “constructive feedback” from the ICO as a result of the complaint.
A spokesperson said: “We take our responsibilities around the use of facial recognition extremely seriously, and work hard to balance our customers’ rights with the need to protect our colleagues and customers from unacceptable violence and abuse.
“The safety of our colleagues and customers is paramount and this technology has made a significant difference to this, in the limited number of high-risk locations where it is being used.”
The use of facial recognition technology has been mired in controversy for years, but perhaps the best-known case is that of Clearview AI Inc, which was fined £7.5m by the ICO in May for using images of people in the UK, collected from the web and social media, to create a global online database.
The fine – substantially less than the £17m penalty proposed in the ICO’s notice of intent, but still the third biggest UK GDPR fine to date – came after a joint investigation with the Office of the Australian Information Commissioner (OAIC), which focused on Clearview AI Inc’s use of people’s images, data scraping from the Internet and the use of biometric data for facial recognition.