Stores across North America and Europe are using software that employs artificial intelligence to identify potential shoplifters. The technology is receiving praise from retailers, but it has raised concerns over the potential for false positives.
Developed by the French AI surveillance company Veesion, the software integrates with existing surveillance camera networks to flag gestures and body movements it deems suspicious in real time.

Among the U.S. businesses using the software are Ace Hardware, 7-Eleven and Grocery Outlet, according to Veesion’s website. The company recently raised $43 million to further expand its operations in the United States.
Testing at 5,000 locations
The tool is currently being tested at 5,000 locations, including retail stores, supermarkets and pharmacies, according to Newsweek. Veesion says its system has been trained on countless hours of footage of shoplifters in action, teaching the program to recognize the behavior patterns that indicate theft.
The system generates alerts that can be sent to store employees, such as loss prevention personnel or security guards, on their phones, tablets or even checkout terminals.
Veesion co-founder Benoît Koenig told Business Insider that more than 85 percent of the alerts generated at businesses using the software were deemed relevant. One business, Koenig said, cut its missing-inventory rate in half after installing the software.
Koenig also said Veesion can act as a deterrent to would-be shoplifters.
“They know there is an AI in the cameras,” he said. “So they’re going to be careful with what they do.”
Concerns about errors, profiling
AI-based crime prediction software has faced widespread criticism, particularly tools designed for law enforcement. Predictive policing software purchased by a New Jersey police department turned out to be accurate less than 1 percent of the time, Wired reported in 2023.
The Office of the United Nations High Commissioner for Human Rights said in a report last year that “predictive policing can exacerbate the historical over-policing of communities along racial and ethnic lines.”
Koenig says his software addresses such concerns by collecting and analyzing nothing beyond a subject’s body movements.
“The algorithm doesn’t care about what people look like,” he said. “It just cares about how your body parts move over time.”