Facial Recognition Spreads as Tool to Fight Shoplifting

Simon Mackenzie, a security officer at the discount retailer QD Stores outside London, was short of breath. He had just chased after three shoplifters who had taken off with several packages of laundry soap. Before the police arrived, he sat at a back-room desk to do something important: Capture the culprits’ faces.

On an aging desktop computer, he pulled up security camera footage, pausing to zoom in and save a photo of each thief. He then logged in to a facial recognition program, Facewatch, which his store uses to identify shoplifters. The next time those people enter any shop within a few miles that uses Facewatch, store staff will receive an alert.

“It’s like having somebody with you saying, ‘That person you bagged last week just came back in,’” Mr. Mackenzie said.

Use of facial recognition technology by the police has been heavily scrutinized in recent years, but its application by private businesses has received less attention. Now, as the technology improves and its cost falls, the systems are reaching further into people’s lives. No longer just the purview of government agencies, facial recognition is increasingly being deployed to identify shoplifters, problematic customers and legal adversaries.

Facewatch, a British company, is used by retailers across the country that are frustrated by petty crime. For as little as 250 pounds a month, or roughly $320, Facewatch offers access to a customized watchlist that stores near one another share. When Facewatch spots a flagged face, an alert is sent to a smartphone at the shop, where employees decide whether to keep a close eye on the person or ask the person to leave.

Mr. Mackenzie adds one or two new faces every week, he said, mainly people who steal diapers, groceries, pet supplies and other low-cost goods. He said their economic hardship made him sympathetic, but that the number of thefts had gotten so out of hand that facial recognition was needed. Usually at least once a day, Facewatch alerts him that somebody on the watchlist has entered the store.

Facial recognition technology is proliferating as Western countries grapple with advances brought on by artificial intelligence. The European Union is drafting rules that would ban many of facial recognition’s uses, while Eric Adams, the mayor of New York City, has encouraged retailers to try the technology to fight crime. MSG Entertainment, the owner of Madison Square Garden and Radio City Music Hall, has used automated facial recognition to refuse entry to lawyers whose firms have sued the company.

Among democratic nations, Britain is at the forefront of using live facial recognition, with courts and regulators signing off on its use. The police in London and Cardiff are experimenting with the technology to identify wanted criminals as they walk down the street. In May, it was used to scan the crowds at the coronation of King Charles III.

But its use by retailers has drawn criticism as a disproportionate response to minor crimes. Individuals have little way of knowing they are on the watchlist or how to appeal. In a legal complaint last year, Big Brother Watch, a civil society group, called it “Orwellian in the extreme.”

Fraser Sampson, Britain’s biometrics and surveillance camera commissioner, who advises the government on policy, said there was “a nervousness and a hesitancy” around facial recognition technology because of privacy concerns and poorly performing algorithms in the past.

“But I think in terms of speed, scale, accuracy and cost, facial recognition technology can in some areas, you know, literally be a game changer,” he said. “That means its arrival and deployment is probably inevitable. It’s just a case of when.”

Facewatch was founded in 2010 by Simon Gordon, the owner of a 19th-century wine bar in central London known for its cellarlike interior and its popularity among pickpockets.

At the time, Mr. Gordon hired software developers to create an online tool to share security camera footage with the authorities, hoping it would save the police time filing incident reports and result in more arrests.

There was limited interest, but Mr. Gordon’s fascination with security technology was piqued. He followed facial recognition developments and had the idea for a watchlist that retailers could share and contribute to. It was like the photos of shoplifters that stores keep next to the register, but supercharged into a collective database to identify bad guys in real time.

By 2018, Mr. Gordon felt the technology was ready for commercial use.

“You’ve got to help yourself,” he said in an interview. “You can’t expect the police to come.”

Facewatch, which licenses facial recognition software made by RealNetworks and Amazon, is now inside nearly 400 stores across Britain. Trained on millions of pictures and videos, the systems read the biometric information of a face as the person walks into a shop and check it against a database of flagged people.
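In broad strokes, that check resembles a nearest-neighbor search over face embeddings: a camera frame is reduced to a numeric vector, which is then compared against the stored vectors of flagged people. The sketch below is a generic illustration of that idea, not Facewatch’s actual code; the embedding size, similarity threshold and function names are assumptions.

```python
# Illustrative sketch only: a generic embedding-based watchlist check.
# Facewatch's actual models, thresholds and data handling are not public,
# so the embedding size, threshold and function names here are assumptions.
import numpy as np

EMBEDDING_DIM = 512      # typical size for face-embedding models (assumed)
MATCH_THRESHOLD = 0.6    # cosine-similarity cutoff (assumed)


def normalize(v: np.ndarray) -> np.ndarray:
    """Scale a vector to unit length so dot products equal cosine similarity."""
    return v / np.linalg.norm(v)


def check_against_watchlist(probe: np.ndarray, watchlist: dict[str, np.ndarray]):
    """Compare one face embedding against every flagged embedding.

    Returns the best-matching subject ID and its score when the score clears
    the threshold (a candidate match that a human reviewer would then confirm),
    otherwise None.
    """
    probe = normalize(probe)
    best_id, best_score = None, -1.0
    for subject_id, stored in watchlist.items():
        score = float(np.dot(probe, normalize(stored)))
        if score > best_score:
            best_id, best_score = subject_id, score
    return (best_id, best_score) if best_score >= MATCH_THRESHOLD else None


# Example with random vectors standing in for real embeddings.
rng = np.random.default_rng(0)
watchlist = {"subject-001": rng.normal(size=EMBEDDING_DIM)}
probe = watchlist["subject-001"] + rng.normal(scale=0.05, size=EMBEDDING_DIM)
print(check_against_watchlist(probe, watchlist))
```

Tuning that threshold is a trade-off between false alerts and missed matches, which is one reason a candidate match goes to a human reviewer for confirmation before any alert reaches a store.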

Facewatch’s watchlist is constantly growing as stores upload photos of shoplifters and problematic customers. Once added, a person remains there for a year before being deleted.

Every time Facewatch’s system identifies a shoplifter, a notification goes to a person who passed a test to be a “super recognizer” — someone with a special talent for remembering faces. Within seconds, the super recognizer must confirm the match against the Facewatch database before an alert is sent.

But while the company has created policies to prevent misidentification and other errors, mistakes happen.

In October, a woman buying milk in a supermarket in Bristol, England, was confronted by an employee and ordered to leave. She was told that Facewatch had flagged her as a barred shoplifter.

The woman, who asked that her name be withheld because of privacy concerns and whose story was corroborated by materials provided by her lawyer and Facewatch, said there must have been a mistake. When she contacted Facewatch a few days later, the company apologized, saying it was a case of mistaken identity.

After the woman threatened legal action, Facewatch dug into its records. It found that the woman had been added to the watchlist because of an incident 10 months earlier involving £20 of merchandise, about $25. The system “worked perfectly,” Facewatch said.

But while the technology had correctly identified the woman, it did not leave much room for human discretion. Neither Facewatch nor the store where the incident occurred contacted her to let her know that she was on the watchlist and to ask what had happened.

The woman said she did not recall the incident and had never shoplifted. She said she may have walked out after not realizing that her debit card payment failed to go through at a self-checkout kiosk.

Madeleine Stone, the legal and policy officer for Big Brother Watch, said Facewatch was “normalizing airport-style security checks for everyday activities like buying a pint of milk.”

Mr. Gordon declined to comment on the incident in Bristol.

In general, he said, “mistakes are rare but do happen.” He added, “If this occurs, we acknowledge our mistake, apologize, delete any relevant data to prevent reoccurrence and offer proportionate compensation.”

Civil liberties groups have raised concerns about Facewatch and suggested that its deployment to prevent petty crime might be illegal under British privacy law, which requires that biometric technologies have a “substantial public interest.”

The U.K. Information Commissioner’s Office, the privacy regulator, conducted a yearlong investigation into Facewatch. The office concluded in March that Facewatch’s system was permissible under the law, but only after the company made changes to how it operated.

Stephen Bonner, the office’s deputy commissioner for regulatory supervision, said in an interview that an investigation had led Facewatch to change its policies: It would put more signage in stores, share among stores only information about serious and violent offenders and send out alerts only about repeat offenders. That means people will not be put on the watchlist after a single minor offense, as happened to the woman in Bristol.

“That reduces the amount of personal data that’s held, reduces the chances of individuals being unfairly added to this kind of list and makes it more likely to be accurate,” Mr. Bonner said. The technology, he said, is “not dissimilar to having just very good security guards.”

Liam Ardern, the operations manager for Lawrence Hunt, which owns 23 Spar convenience stores that use Facewatch, estimates the technology has saved the company more than £50,000 since 2020.

He called the privacy risks of facial recognition overblown. The only example of misidentification that he recalled was when a man was mistaken for his identical twin, who had shoplifted. Critics overlook that stores like his operate on thin profit margins, he said.

“It’s easy for them to say, ‘No, it’s against human rights,’” Mr. Ardern said. If shoplifting isn’t reduced, he said, his shops will have to raise prices or cut staff.
