The New Orleans Police Department's suspension of its real-time facial recognition program, following a 2025 Washington Post investigation, has become a case study in what happens when a city tries to regulate a fast-moving technology without building the institutional muscle to enforce its own rules. According to the Post, NOPD officers had received live facial-recognition alerts from a network of roughly 200 privately operated cameras run by the nonprofit Project NOLA — a workaround that appears to have functioned for years despite a 2022 city ordinance that explicitly limited the department's use of live facial recognition to a narrow set of violent-crime investigations with supervisor sign-off.
The City Council is now weighing new oversight rules. How it responds will matter well beyond Louisiana. New Orleans was one of the first major U.S. cities to attempt a middle-path framework — neither San Francisco-style prohibition nor unrestricted deployment — and its apparent failure to operationalize that compromise will be cited in every state and municipal debate this year.
What Actually Went Wrong
The facts as reported suggest a governance failure, not a technology failure. The 2022 ordinance permitted facial recognition only in specific circumstances: investigations into violent crimes, post-hoc image searches against mugshot databases, and with documented supervisor approval. Live, continuous scanning of public-space camera feeds was not contemplated.
Project NOLA, a private nonprofit operating a citywide camera network largely funded by business and resident subscribers, reportedly pushed real-time alerts directly to officers' phones. Because the alerts originated outside NOPD's systems, the department appears to have treated them as tips rather than departmental facial-recognition use — a distinction nowhere supported by the ordinance text. The result: a system that the City Council thought it had constrained was effectively operating without the documentation, audit trail, or judicial accountability the law required.
The Wrong Lesson Would Be Prohibition
It is tempting to read this episode as proof that facial recognition cannot be safely deployed in public spaces. That conclusion overshoots the evidence.
Facial recognition technology has improved dramatically. The most recent National Institute of Standards and Technology (NIST) Face Recognition Vendor Test evaluations show top-tier algorithms achieving error rates below one percent on high-quality images, with the demographic disparities that drew justified criticism in 2018-2019 substantially narrowed in the best-performing systems. The operational benefits are real, too: U.K. Metropolitan Police deployments of live facial recognition have produced documented arrests of wanted suspects, including for serious violent offenses, with false-positive rates that current evaluations place well below one in a thousand subjects scanned in operational conditions.
Refusing to use a tool that demonstrably works to find violent offenders — when properly configured, audited, and constrained — trades one harm for another. The right question is not whether to permit the technology, but how to ensure that permission carries enforceable conditions.
What Proportionate Regulation Looks Like
A workable framework for public-space biometric surveillance has several non-negotiable components, most of which New Orleans nominally had but failed to enforce:
- Statutory scope limits: Use restricted to identifying suspects in serious crimes or missing persons against narrowly defined watchlists — not general population surveillance.
- Source-agnostic coverage: The rules must apply regardless of whether the camera or matching system is owned by the police, a contractor, or a nonprofit partner. The NOLA episode shows how easily a private intermediary can be used to launder a prohibited capability.
- Mandatory logging and independent audit: Every match alert should generate an immutable log entry reviewable by an external body. Tennessee, Virginia, and Washington State have moved in this direction with varying degrees of rigor.
- Public reporting: Annual disclosures of use volume, hit rates, false-positive incidents, and demographic breakdowns. Without this, the public cannot evaluate whether the technology is delivering its promised benefits.
- Procurement transparency: Vendor algorithms used in operational deployments should be submitted to NIST or an equivalent independent benchmark, with results published.
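The "immutable log" requirement above has a well-understood technical shape: an append-only log in which each entry embeds a hash of the previous one, so any retroactive edit or deletion breaks the chain and is detectable by an external auditor. A minimal sketch follows; the field names (`camera_id`, `watchlist_id`, `match_score`) are illustrative assumptions, not drawn from any actual NOPD or vendor schema.

```python
import hashlib
import json


def append_alert(log, alert):
    """Append a facial-recognition match alert to a hash-chained audit log.

    Each entry records the SHA-256 hash of the previous entry, so
    tampering with any earlier entry invalidates every hash after it.
    Field names here are hypothetical.
    """
    prev_hash = log[-1]["entry_hash"] if log else "0" * 64
    entry = {
        "timestamp": alert["timestamp"],
        "camera_id": alert["camera_id"],
        "watchlist_id": alert["watchlist_id"],
        "match_score": alert["match_score"],
        "prev_hash": prev_hash,
    }
    # Canonical JSON (sorted keys) so the hash is reproducible.
    entry["entry_hash"] = hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()
    ).hexdigest()
    log.append(entry)
    return entry


def verify_chain(log):
    """Recompute every hash; return True only if the chain is intact."""
    prev_hash = "0" * 64
    for entry in log:
        if entry["prev_hash"] != prev_hash:
            return False
        body = {k: v for k, v in entry.items() if k != "entry_hash"}
        expected = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()
        ).hexdigest()
        if entry["entry_hash"] != expected:
            return False
        prev_hash = entry["entry_hash"]
    return True
```

The point of the design is that the auditor does not need to trust the department's database administrators: periodically publishing the latest `entry_hash` to an external body is enough to make silent deletion or alteration of past alerts detectable.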
Federal Vacuum, State Patchwork
Part of what makes the New Orleans situation difficult is the absence of a federal framework. Congress has considered facial recognition legislation in successive sessions without enacting it. State approaches range from Illinois's Biometric Information Privacy Act (BIPA), which primarily targets private-sector collection, to outright municipal bans in San Francisco, Boston, and Portland, to permissive regimes with light oversight in much of the South and Mountain West.
A federal floor — establishing minimum logging, audit, and watchlist standards while leaving states free to layer stricter rules — would prevent both the race to the bottom and the patchwork compliance burden that disadvantages smaller agencies most. The American Law Institute's draft Principles of the Law on Policing, which addresses surveillance technology governance, offers a credible starting template.
The Council's Choice
The New Orleans City Council faces a binary framing in public debate — ban or permit — that obscures the real question: can the city build the oversight infrastructure to make its existing rules mean something? An ordinance is not self-enforcing; if the answer is no, prohibition is the honest second-best option. But the better answer is to invest in the inspector-general capacity, technical expertise, and reporting mechanisms that make calibrated rules credible.
The risk of overreaction is that other cities watching New Orleans will conclude that the only safe regulatory posture is prohibition, forgoing public-safety tools whose responsible use is well within the capacity of competent democratic institutions. That would be a loss not just for innovation, but for the residents that better-equipped police departments are meant to protect.