
The New Orleans Police Department secretly installed artificial intelligence-driven facial recognition cameras to monitor city streets for suspects and wanted criminals, a Washington Post investigation revealed in May. Since the cameras went up, crime rates are down, but concerns about privacy are up.
The NOPD’s surveillance program relied on Project NOLA, a nonprofit organization headquartered at the University of New Orleans that operates a network of crime cameras. Project NOLA has no formal contract with the city but has worked directly with police officers.
In collaboration with Project NOLA, the NOPD accessed nearly 200 facial recognition cameras to monitor the streets of New Orleans for two years.
These specialized crime cameras use AI to track license plates and recognize faces. The technology scans physical features and compares them against a database of images drawn from mug shots, driver’s licenses or social media to identify possible matches. Once a match is identified, police officers are immediately notified through a mobile app.
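For readers curious about the mechanics, the sketch below shows, in Python, how a generic match-and-alert pipeline of this kind might work. The watchlist, the 128-number face embeddings, the similarity threshold and the notify_officers function are all hypothetical stand-ins for illustration, not a description of Project NOLA’s actual software.

```python
# Illustrative sketch only: a generic face-matching pipeline of the kind
# described above. All names, data and thresholds are hypothetical.
import numpy as np

# Hypothetical watchlist: each entry maps a person ID to a precomputed
# face "embedding" (a numeric vector summarizing facial features).
WATCHLIST = {
    "person_001": np.random.rand(128),
    "person_002": np.random.rand(128),
}

MATCH_THRESHOLD = 0.92  # assumed similarity cutoff; real systems tune this


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Compare two embeddings; 1.0 means they point in the same direction."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def notify_officers(person_id: str, score: float) -> None:
    # Stand-in for the mobile-app push notification described in the article.
    print(f"ALERT: possible match for {person_id} (similarity {score:.2f})")


def check_frame(face_embedding: np.ndarray) -> None:
    """Compare a face seen on camera against the watchlist and alert on a match."""
    for person_id, reference in WATCHLIST.items():
        score = cosine_similarity(face_embedding, reference)
        if score >= MATCH_THRESHOLD:
            notify_officers(person_id, score)


# Example: a camera produces an embedding for a detected face. Reusing a
# watchlist vector here guarantees the demo alert fires.
check_frame(WATCHLIST["person_001"])
```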
According to The Washington Post, the surveillance program has contributed to dozens of arrests, four of them involving people charged with nonviolent crimes. Project NOLA reported that the cameras reduced gun violence in “one extremely high-crime area” of New Orleans by 80%. In 2024, murder rates dropped 23% nationally and 39% in New Orleans.
Police at times arrested suspects based solely on AI matches, without additional independent evidence, increasing the chances of a false arrest. AI facial recognition software has historically been less accurate at identifying people of color, women and older people.
The widespread use of AI surveillance in New Orleans violates a 2022 New Orleans City Council ordinance, which limits facial recognition technology to investigations of specific individuals suspected of violent crimes, not broad surveillance of the public.
While police forces across the country use facial recognition software, New Orleans’ program is the first known citywide effort to use AI to make immediate arrests.
Brian Lagarde, the founder of Project NOLA, said the New Orleans network was the first citywide facial recognition effort he had conducted.
Roughly 5,000 cameras are scattered across New Orleans, 200 of which are equipped with facial recognition software. Despite this broad network, many students remain unaware that the cameras even exist.
Ben Axelrad, a freshman at Tulane who frequents the French Quarter, said he had never heard about the use of AI in security cameras downtown.
“I think it’s a really good idea, especially on Bourbon, where it’s so crowded … it would make me feel safer if something did happen,” Axelrad said.
The use of facial recognition cameras also raises serious privacy concerns. “I feel like it is a bit of an invasion of privacy,” Axelrad said. “I would hope that … they’re not going to be using anything they see … against you.”
Project NOLA said it “strongly protects people’s privacy” by storing footage for only two weeks and not using computer-based predictive behavior algorithms.
Recent legal changes in Louisiana have introduced new complications.
A recent state law prohibits law enforcement from purposefully ignoring or withholding data from U.S. Immigration and Customs Enforcement, a provision that covers all forms of video surveillance, including footage collected by Project NOLA. Critics of the initiative worry that facial recognition data will be misused for immigration enforcement and put undocumented residents at risk.
