Simon Mackenzie, a security guard at the discount retailer QD Stores outside London, was short of breath. He had just chased down three thieves who had stolen packets of laundry soap. Before the police arrived, he sat at his desk in a back room to do one important thing: capture the thieves' faces.
On an aging desktop computer, he pulled up security camera footage, pausing to zoom in and save pictures of each thief. He then logged into Facewatch, the facial recognition program his store uses to identify shoplifters. The next time those individuals enter any Facewatch-enabled store within a few miles, store workers will be alerted.
“It’s like someone telling you, ‘That guy you caught last week just came back,’” Mr Mackenzie said.
Police use of facial recognition technology has come under intense scrutiny in recent years, but private industry applications have received less attention. Now, as technology advances and costs drop, these systems are moving even further into people’s lives. No longer just the purview of government agencies, facial recognition is increasingly being used to identify shoplifters, problem customers and legal opponents.
Facewatch is a British company whose service is used by retailers across the country frustrated by petty crime. For £250 (about $320) a month, Facewatch offers access to a custom watchlist that stores near one another share. When Facewatch spots a flagged face, an alert is sent to a smartphone at the store, where employees decide whether to watch the person closely or ask them to leave.
Mr Mackenzie said he was adding a new face or two every week, mainly people stealing diapers, groceries, pet supplies and other inexpensive goods. He said he sympathized with their financial hardship, but the thefts had grown out of control and facial recognition was necessary. Facewatch usually alerted him at least once a day that someone on the watchlist had entered the store.
Facial recognition technology is proliferating as Western countries grapple with advances brought about by artificial intelligence. The European Union is drafting rules to restrict the use of facial recognition technology, while New York City Mayor Eric Adams has encouraged retailers to experiment with using the technology to fight crime. MSG Entertainment, owner of Madison Square Garden and Radio City Music Hall, has used automated facial recognition to deny entry to lawyers whose firms are suing the company.
Among democracies, the UK is at the forefront of the use of real-time facial recognition technology, with courts and regulators sanctioning its use. Police in London and Cardiff are trialing the technology to identify wanted people walking down the street. In May, it was used to scan the crowds at the coronation of King Charles III.
But the retailer’s use has drawn criticism that the solution is disproportionate for petty crimes. Individuals have little idea whether they are on a watch list or how to file an appeal. Civil society group Big Brother Watch called it “extremely Orwellian” in a legal complaint last year.
Fraser Sampson, Britain’s biometrics and surveillance camera commissioner, who advises the government on policy, said there was “nervousness and hesitation” about facial recognition technology due to privacy concerns and algorithms that had performed poorly in the past.
“But I think facial recognition technology can be a real game-changer in some areas in terms of speed, scale, accuracy and cost,” he said. “That means its arrival and deployment is probably inevitable. It’s just a matter of when.”
‘You can’t expect the police to come’
Facewatch was founded in 2010 by Simon Gordon, owner of a popular 19th-century pub in central London known for its cellar-like interior and popularity with pickpockets.
At the time, Gordon hired software developers to create an online tool to share security camera footage with authorities, hoping it would save police time in filing incident reports and lead to more arrests.
There was limited interest, but Mr Gordon’s fascination with security technology was piqued. He followed the development of facial recognition and came up with the idea of a watchlist that retailers could share and contribute to. It would be like the pictures of shoplifters that stores pin up next to the checkout line, but supercharged into a collective database that identifies bad actors in real time.
By 2018, Mr Gordon believed the technology was ready for commercial use.
“You have to help yourself,” he said in an interview. “You can’t expect the police to come.”
Facewatch licenses facial recognition software made by RealNetworks and Amazon and is now used in almost 400 stores in the UK. Trained on millions of images and videos, the system reads the biometric information of people’s faces as they walk into a store and checks it against a database of flagged individuals.
Facewatch watchlists keep growing as stores upload photos of shoplifters and problem customers. Once added, a person remains there for a year before being removed.
“Mistakes are rare, but they do happen”
Whenever Facewatch’s system recognizes a flagged face, it sends a notification to a vetted “super recognizer” (a person with a special talent for remembering faces). Within seconds, the super recognizer must confirm the match against the Facewatch database before an alert is sent to the store.
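The workflow described above, an automated match that is only turned into a store alert after human confirmation, can be sketched in a few lines of code. This is purely illustrative: the embeddings, the cosine-similarity threshold, and every name below are made up for demonstration and are not Facewatch's actual implementation.

```python
# Illustrative sketch (not Facewatch's actual code): a face match above a
# similarity threshold is queued for a human reviewer, the "super
# recognizer", and an alert goes out only if the reviewer confirms it.
from dataclasses import dataclass
from typing import Callable, Optional, Sequence


@dataclass
class WatchlistEntry:
    person_id: str
    embedding: Sequence[float]  # face embedding from a recognition model


def cosine_similarity(a: Sequence[float], b: Sequence[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sum(x * x for x in a) ** 0.5
    norm_b = sum(x * x for x in b) ** 0.5
    return dot / (norm_a * norm_b)


def find_candidate(face: Sequence[float],
                   watchlist: Sequence[WatchlistEntry],
                   threshold: float = 0.85) -> Optional[WatchlistEntry]:
    """Return the closest watchlist entry, but only if it clears the threshold."""
    best = max(watchlist,
               key=lambda e: cosine_similarity(face, e.embedding),
               default=None)
    if best is not None and cosine_similarity(face, best.embedding) >= threshold:
        return best
    return None


def alert_if_confirmed(candidate: Optional[WatchlistEntry],
                       human_confirms: Callable[[WatchlistEntry], bool]) -> Optional[str]:
    """Send a store alert only after the human reviewer confirms the match."""
    if candidate is not None and human_confirms(candidate):
        return f"ALERT: {candidate.person_id} entered the store"
    return None
```

The key design point, mirroring the article, is that the automated matcher alone never triggers an alert; the `human_confirms` callback stands in for the super recognizer's review step.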
But despite the company’s safeguards against misidentifications and other errors, mistakes do happen.
In October, a woman buying milk at a supermarket in Bristol, England, was confronted by a staff member and ordered to leave. She was told that Facewatch had flagged her as a shoplifter.
The woman, who asked not to be named due to privacy concerns and whose account was corroborated by her lawyer and Facewatch, said it must have been a mistake. When she contacted Facewatch a few days later, the company apologized, saying it was a case of mistaken identity.
After the woman threatened legal action, Facewatch dug up her records. The investigation found that the woman had been added to a watch list in connection with an incident involving £20 (about $25) worth of goods that occurred 10 months earlier. Facewatch said the system “worked flawlessly.”
Although the technology correctly identified the woman, the process left little room for human discretion. Neither Facewatch nor the store where the incident occurred contacted her to tell her she was on a watchlist and ask what had happened.
The woman said she had no recollection of the incident and had never shoplifted. She said she likely left after not realizing her debit card payment failed to complete at the self-checkout kiosk.
Madeleine Stone, legal and policy officer at Big Brother Watch, said Facewatch was “normalizing airport-style security checks for everyday activities like buying a pint of milk.”
Gordon declined to comment on the events in Bristol.
In general, he said, “mistakes are rare, but they do happen.” He added, “If this occurs, we acknowledge our mistake, apologize, delete any relevant data to prevent it from happening again, and offer proportionate compensation.”
Approved by the Privacy Office
Civil liberties groups have raised concerns about Facewatch, saying its deployment to prevent petty crime could be illegal under UK privacy laws, which require biometrics to have a “substantial public interest”.
The UK’s Information Commissioner’s Office (the privacy watchdog) conducted a year-long investigation into Facewatch. The office concluded in March that Facewatch’s system was permitted by law, but only if the company changed the way it operated.
Stephen Bonner, the office’s deputy commissioner for regulation, said in an interview that the investigation led Facewatch to change its policies: it would post more signage in stores, share among stores only information about serious and violent offenders, and send alerts only about repeat offenders. That means people will not be placed on a watchlist after a single petty offense, as happened to the woman in Bristol.
“This reduces the amount of personal data held, reduces the chances of individuals being unfairly added to such lists and makes them more likely to be accurate,” Mr Bonner said. The technology, he said, is “no different than having really good security guards.”
Liam Ardern, operations manager at Lawrence Hunt, which has 23 Spar convenience stores using Facewatch, estimates the technology has saved the company more than £50,000 since 2020.
He called the privacy risks of facial recognition overblown. The only misidentification he could recall was when a man was mistaken for his identical twin, who had shoplifted. Critics, he said, overlook the razor-thin margins of stores like his.
“It’s very easy for them to say, ‘No, this is a violation of human rights,’” Mr Ardern said. Without a reduction in shoplifting, his stores would have to raise prices or cut staff, he said.