WDSU Investigates: Face Value – The Battle Between Crime Fighting and Controversy


The Role of Facial Recognition in New Orleans

Walk into the French Quarter and you are constantly under observation, not just by tourists and business owners, but by facial recognition cameras that scan faces in real time. New Orleans has become the first city in the country to deploy citywide facial recognition alerts, drawing on a network of more than 5,000 cameras, 200 of them equipped with high-definition facial recognition. The system, managed by a nonprofit organization called Project NOLA, has helped law enforcement track fugitives, close major cases, and even prevent violence. Recently, however, the system has gone silent for the New Orleans Police Department.

WDSU Investigates conducted an in-depth examination of the technology, spoke with those behind it, and raised critical questions: Why did NOPD voluntarily pause alerts? And what risks is the city facing while waiting to make a decision?

How Facial Recognition Works in New Orleans

Unlike other cities, the system in New Orleans is not run by the police or the government. It is managed by Project NOLA, a private nonprofit organization based at the University of New Orleans. According to Bryan Lagarde, founder of Project NOLA, the system scans only for individuals with felony warrants for offenses such as homicide, stabbings, home invasions, and other serious crimes.

Project NOLA’s network of cameras is mounted on homes, churches, and small businesses at no cost to the city. The system uses artificial intelligence to track faces, clothing, and even body language. If a match is found in its “wanted library,” an alert is immediately sent to law enforcement.
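The article does not detail the underlying software, but the basic flow it describes, a camera detects a face, the system compares it against the “wanted library,” and a strong match triggers an alert with a location and timestamp, can be sketched roughly as follows. This is a minimal, hypothetical illustration: the function names, similarity threshold, and data fields are assumptions, not Project NOLA’s actual code.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

# Hypothetical sketch only; Project NOLA has not published its software.
MATCH_THRESHOLD = 0.92  # assumed similarity cutoff; real systems tune this carefully


@dataclass
class WantedEntry:
    name: str
    warrant_type: str  # e.g., "homicide", "home invasion"
    embedding: list    # numeric face template, not a stored photo


def similarity(a: list, b: list) -> float:
    """Cosine similarity between two face-embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sum(x * x for x in a) ** 0.5
    norm_b = sum(x * x for x in b) ** 0.5
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0


def check_face(embedding: list, camera_id: str, wanted_library: list) -> Optional[dict]:
    """Compare one detected face against the wanted library; return an alert or None."""
    for entry in wanted_library:
        if similarity(embedding, entry.embedding) >= MATCH_THRESHOLD:
            return {
                "camera": camera_id,
                "match": entry.name,
                "warrant": entry.warrant_type,
                "timestamp": datetime.now(timezone.utc).isoformat(),
            }
    return None  # no match: nothing is flagged or sent to law enforcement
```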

Putting the Tech to the Test

To test the accuracy of the system, WDSU’s Cassie Schirm walked the French Quarter alongside Lagarde. He volunteered to be loaded into the system as a simulated “wanted subject.” In just 90 minutes, Lagarde’s face was flagged 74 times by different cameras at different angles, each triggering a live alert with his location and timestamp.

“We can identify people even in massive crowds,” Lagarde said. “Mardi Gras, the Super Bowl, thousands of people shoulder to shoulder, we can still find one.”

Project NOLA’s cameras can also identify individuals from up to three city blocks away, with resolution high enough to make out small details such as tattoos, logos, or even how many fingers someone is holding up.

Proven Results: Terror Threats and Escaped Inmates

The technology has helped close dozens of cases and led to at least 43 arrests this year alone. On January 1, a driver plowed into a crowd in the French Quarter in what police later described as a terrorist attack. Surveillance footage from Project NOLA’s cameras helped officers track the suspect, confirm he acted alone, and head off a wider panic.

In May, 10 inmates escaped from a local facility. Minutes after the alert went out, Project NOLA’s system identified two of the escapees in the French Quarter. Both were captured quickly.

“This system works,” Lagarde said. “And when it’s paused, we lose time. And sometimes, that can mean losing lives.”

Why the Alerts Were Paused

Despite the system’s success, the NOPD asked Project NOLA to pause all alerts in April 2025. The reason? Questions about whether the alerts aligned with a 2021 city ordinance that limits how facial recognition can be used. That ordinance requires that NOPD:

  • Only use still images (not real-time alerts)
  • Report each use to the City Council
  • Not solicit alerts from external providers

However, Project NOLA operated outside of city government and sent unsolicited alerts, creating what many call a “gray area.”

“There’s a provision in the ordinance that says NOPD can use facial recognition if they didn’t request it or control the system,” said Rafael Goyeneche with the Metropolitan Crime Commission. “So, technically, no law was broken.”

Lagarde says the program has been reviewed by the district attorney, the independent police monitor, and NOPD leadership. “None of them found anything wrong,” he said. “So why are we still on pause?”

Public Safety at Risk

Since the pause, Project NOLA says wanted suspects have walked through camera zones undetected. In one case, a woman wanted for sex trafficking was spotted seven times before she was finally arrested through traditional police work.

“If we had alerts active, she could’ve been caught the first time,” Lagarde said.

Melanie Talia, head of the Police and Justice Foundation, agrees. “What’s at risk is public safety,” she said. “We know this technology helped get violent people off the street faster. Slowing it down has real consequences.”

Independent Report Calls for More Tech, Not Less

Following the New Year’s Day attack, the Police and Justice Foundation commissioned an independent analysis by national law enforcement consulting firm Teneo. The findings were clear: Technology must play a bigger role in modern policing.

The Teneo report specifically cited NOPD’s severe officer shortage, noting that the department lacks the manpower to respond in real time without automated help. It recommended expanding tools like facial recognition to act as a “force multiplier” and allow for faster identification of suspects, especially in the absence of sufficient boots on the ground.

“It’s not just about solving crime—it’s about preventing it,” the Police and Justice Foundation stated.

Civil Liberties and Community Concerns

Not everyone is convinced that facial recognition technology belongs in everyday policing. Alanah Odoms, executive director of the American Civil Liberties Union of Louisiana, says the technology poses serious risks to civil liberties, particularly in Black and brown communities.

“This gives law enforcement a lot of power, and that power must come with strong checks,” Odoms said. “There have been wrongful arrests in other parishes due to facial recognition. If New Orleans moves forward, we’ll be first in the nation again, and we need to get it right.”

She points to studies showing that facial recognition algorithms have significantly higher error rates when scanning people of color, women, and younger individuals.

“This kind of surveillance disproportionately impacts Black and brown communities,” Odoms said. “It creates a chilling effect on public life when people don’t feel free to move without being watched.”

She also raised concerns about mission creep, where a technology introduced for serious crimes might gradually be used more broadly.

Some residents echoed her worries.

“It reminds me of something out of 1984,” one French Quarter worker said. “You just don’t know who’s watching.”

In a recent headline-making case, a man in Jefferson Parish was wrongfully arrested after being misidentified by facial recognition software. He later sued and received a financial settlement.

Concerns About City-Owned Facial Recognition

The proposed ordinance would give the NOPD the authority to build and operate its own facial recognition system. But some question whether the city is ready.

Many of the crime cameras already owned and operated by the city are outdated or broken. Council members and law enforcement sources confirm that numerous city-run cameras provide low-quality video or do not work at all.

“If the city’s current surveillance network isn’t functioning properly, how can we trust a city-operated facial recognition system to be accurate or fair?” one resident asked.

Some fear that without major infrastructure upgrades, a city-run system could lead to more misidentifications and fewer reliable alerts.

Concerns About Immigration Use

Among the concerns raised by civil rights advocates is whether facial recognition could one day be used for immigration enforcement. Community members and organizations like the ACLU have warned that powerful surveillance tools, once approved, might be expanded beyond their original intent.

But Project NOLA founder Lagarde says that is not possible with his system.

Lagarde explained that the system is designed to recognize only individuals with felony warrants or those under court order. The database used for matches does not include immigrants lacking permanent legal status or individuals involved in noncriminal immigration issues.

“If someone isn’t wanted for a violent crime, a felony or listed in a protective order, the system won’t flag them,” he said. “We built this to support public safety in our community — not to target people based on status.”

He added that the nonprofit’s mission is rooted in privacy protections, with data retention limited to about 30 days and no face data stored unless there is an active match in their system.
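As a rough illustration of the retention policy Lagarde describes, roughly 30 days of retention, with nothing kept longer unless it is tied to an active match, a purge rule might look like the sketch below. The field names and the exact purge logic are assumptions; the nonprofit has not published its data-handling code.

```python
from datetime import datetime, timedelta, timezone

RETENTION_DAYS = 30  # stated policy is "about 30 days"; the exact window is not public


def purge_expired(clips: list) -> list:
    """Keep only clips inside the retention window or tied to an active match."""
    cutoff = datetime.now(timezone.utc) - timedelta(days=RETENTION_DAYS)
    return [
        clip for clip in clips
        if clip["recorded_at"] >= cutoff or clip.get("active_match", False)
    ]


# Example: a 45-day-old clip is dropped unless it is linked to an active match.
old = {"recorded_at": datetime.now(timezone.utc) - timedelta(days=45), "active_match": False}
new = {"recorded_at": datetime.now(timezone.utc) - timedelta(days=2), "active_match": False}
print(len(purge_expired([old, new])))  # prints 1
```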

Community Control and the ‘Kill Switch’

While Project NOLA manages the facial recognition technology and camera network, the system relies on community cooperation to operate, and that includes a built-in safeguard.

According to Lagarde, every participating business, homeowner, or organization hosting a Project NOLA camera has the power to turn it off at any time.

“If we ever violate public trust, they can unplug us immediately,” Lagarde said. “Every camera has what we call a ‘kill switch.’ It gives control back to the people who made this possible.”

This decentralized model means that no camera is forced on anyone. If a host no longer supports the mission or questions the system’s transparency, they can opt out without delay.

Lagarde says this is one of the fundamental reasons the network has grown over the years — because it’s voluntary, not mandated by the city, and rooted in trust between the organization and the neighborhoods it serves.

Nonprofit Privacy Protections

Another key distinction between Project NOLA and city-operated surveillance systems is its nonprofit status. Because Project NOLA is a private organization, its video footage is not subject to public records laws the way city-run crime camera footage is.

That means members of the public, attorneys, or media cannot file a public records request to obtain Project NOLA footage. This provides an added layer of privacy protection for residents, business owners, and crime victims whose activity may be captured by the system.

Lagarde says this helps strike a balance between safety and privacy.

“We don’t answer to politics — we answer to our community,” he said. “This allows people to share video that helps fight crime, without worrying it’ll end up online or in the wrong hands.”

Supporters say this privacy shield makes Project NOLA a safer and more trusted system, especially in neighborhoods that have historically been wary of government surveillance.

What’s Next?

The City Council has delayed the vote on the new ordinance three times, citing the need for more public input. Supporters say they want to get it right. Opponents worry that delays are stalling tools that were already working.

Council member Eugene Green says the proposed ordinance is being carefully reviewed to ensure it reflects public feedback and includes strict oversight.

“It’s a sensitive subject. We want discussion without last-minute amendments. We’ve heard from the public and are incorporating that. It doesn’t have to be done overnight, but it has to be done right,” Green said.

Meanwhile, the pause on facial recognition alerts remains in place. Officers are back to relying on traditional surveillance tools and delayed video review rather than the real-time alerts the system once provided.

The decision to restart or reshape the city’s use of facial recognition is no longer in the hands of the police chief. It now rests with the City Council.

A New Ordinance on the Table

The City Council is now considering a new, more detailed ordinance that would:

  • Allow the NOPD and the Real-Time Crime Center to use facial recognition for:
      • Violent felonies
      • Protective order violations
      • Imminent threats to public safety
      • Missing persons
  • Require supervisor approval for every use
  • Ban the use of facial recognition as the sole basis for arrest
  • Mandate quarterly reporting (a sample record layout follows these lists), including:
      • Officer name and badge number
      • Suspect race, gender and age
      • Software and database source
      • Whether it led to an arrest
  • Allow NOPD to run its own facial recognition system
  • Require third parties like Project NOLA to sign a formal agreement and follow all rules or be cut off

Under the ordinance:

  • NOPD and RTCC would be explicitly authorized to use facial recognition technology for approved crimes
  • The department could purchase or license software, run its own database, and operate in real time so long as it follows new reporting and oversight protocols
  • If the NOPD chooses to use a third party like Project NOLA, that group must sign a contract with the city and follow all transparency requirements
  • If it does not, its access would be cut off
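To make the quarterly reporting requirement concrete, here is one hypothetical way a single use record could be structured, covering the fields the proposed ordinance lists: the officer, the suspect’s demographics, the software and database source, and whether the use led to an arrest. The field names and layout are illustrative assumptions, not language from the ordinance.

```python
from dataclasses import dataclass, asdict
from datetime import date

# Hypothetical record layout for the quarterly reports the proposed ordinance would
# require; the field names are invented, but the contents mirror the list above.


@dataclass
class FacialRecognitionUseRecord:
    officer_name: str
    badge_number: str
    suspect_race: str
    suspect_gender: str
    suspect_age: int
    software_source: str
    database_source: str
    led_to_arrest: bool
    date_of_use: date


# Placeholder values only, for illustration.
record = FacialRecognitionUseRecord(
    officer_name="Officer Example",
    badge_number="0000",
    suspect_race="unknown",
    suspect_gender="unknown",
    suspect_age=0,
    software_source="example vendor",
    database_source="example database",
    led_to_arrest=False,
    date_of_use=date(2025, 1, 1),
)
print(asdict(record))  # one row in a quarterly report to the City Council
```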

Bottom Line

Facial recognition in New Orleans has helped police respond faster and close serious cases, but critics warn that without firm safeguards, the technology can be flawed and harmful.

As the City Council prepares to make a decision, the question is not just about whether the alerts should return, but how the system should be governed.

What happens next could shape the future of surveillance, safety, and civil rights in New Orleans for years to come.
