
India's Face-Scanning Police State Meets a Data Law That Looks the Other Way

The DPDP Act's Section 17 carves police FR deployments out of the new privacy regime — leaving the 200+ systems tracked by Project Panoptic running without an enabling legal framework.

[Infographic: "India's Unregulated Face-Scanning Build-Out" (People of Internet Research, peopleofinternet.com): 200+ FR systems logged by IFF's Project Panoptic; NAFRS tendered by the NCRB in 2019, now being integrated nationally; Section 17 of the DPDP Act allows broad carve-outs for state agencies; Puttaswamy (2017) laid down the proportionality test.]

Key Takeaways

India is becoming one of the world's largest live laboratories for police facial recognition (FR) — and one of the least legally constrained. From Delhi Police's AFRS, deployed during the 2019–20 Citizenship Amendment Act protests and the 2020–21 farmers' agitation, to Telangana's TSCOP and Hawk Eye apps, to Uttar Pradesh Police's drone-and-AI surveillance grid over the Maha Kumbh Mela in Prayagraj earlier this year, FR has gone from pilot to default tool of policing. The National Crime Records Bureau's National Automated Facial Recognition System (NAFRS), first tendered in 2019, is now being progressively threaded into the Crime and Criminal Tracking Network & Systems (CCTNS), giving every state police force a potential hook into a national face database.

The Internet Freedom Foundation's Project Panoptic tracker now logs more than 200 FR systems across central and state agencies. Amnesty International's Ban the Scan: Hyderabad campaign has identified that city as one of the most heavily surveilled in the world. And yet, as the Digital Personal Data Protection Act, 2023 (DPDP Act) enters its implementation phase with MeitY's draft DPDP Rules out for consultation, the single most consequential question — how the law applies to police FR — has effectively been answered: largely, it doesn't.

Section 17: The hole at the centre of the new privacy regime

The DPDP Act is, on paper, India's long-awaited answer to Justice K.S. Puttaswamy v. Union of India (2017), which recognised a fundamental right to privacy and laid down a three-part proportionality test for any state intrusion: legality, legitimate aim, and necessity. The Act introduces familiar building blocks — consent, purpose limitation, notice, data fiduciary obligations, a Data Protection Board.

But Section 17 hollows these out. Section 17(1)(c) exempts processing in the interest of the prevention, detection, investigation or prosecution of offences from most of the Act's obligations, and Section 17(2)(a) goes further, empowering the Union government to exempt any 'instrumentality of the State' from the Act almost entirely. There is no statutory requirement of judicial authorisation, no independent oversight body for police data use, no purpose-limitation backstop, and no data subject right of access against law-enforcement processing. Biometric data — the most sensitive category in every comparable regime, from the EU's GDPR to Brazil's LGPD — is not separately classified as sensitive personal data under the DPDP Act at all.

The practical effect: when Delhi Police runs AFRS against a protest crowd, or UP Police runs FR-equipped drones over a religious gathering of tens of millions, the DPDP Act's consent and notice architecture simply does not apply. The new regulator has no clear mandate to audit those deployments.
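The scale problem is worth making concrete. A back-of-envelope base-rate calculation (all figures below are hypothetical, chosen only to illustrate the arithmetic) shows why even a seemingly accurate FR system, swept across a crowd of millions rather than matched against a narrow watchlist, produces overwhelmingly false alerts:

```python
# Illustrative base-rate arithmetic (hypothetical numbers): why an FR system
# with strong headline accuracy still yields mostly false matches when run
# against an entire crowd instead of a narrow watchlist.

def fr_match_stats(crowd_size, watchlist_subjects_present, tpr, fpr):
    """Expected (true matches, false matches, precision) for one sweep.

    tpr: true-positive rate (share of genuine watchlist subjects flagged)
    fpr: false-positive rate (share of innocent faces wrongly flagged)
    """
    true_matches = watchlist_subjects_present * tpr
    false_matches = (crowd_size - watchlist_subjects_present) * fpr
    precision = true_matches / (true_matches + false_matches)
    return true_matches, false_matches, precision

# A crowd of 10 million with 1,000 genuine watchlist subjects present,
# a 99% true-positive rate and a 0.5% false-positive rate -- all assumed.
tp, fp, prec = fr_match_stats(10_000_000, 1_000, tpr=0.99, fpr=0.005)
print(f"true matches ~{tp:,.0f}, false matches ~{fp:,.0f}, precision {prec:.1%}")
```

Under these assumed figures, false alerts outnumber true matches by roughly fifty to one, so the overwhelming majority of people flagged are innocent. This is the statistical core of the argument for narrow watchlists and published accuracy audits.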

The courts are doing the legislature's job

With Parliament silent on FR, the work has fallen to the judiciary. In SQ Masood v. State of Telangana, the Telangana High Court is examining whether the use of Hawk Eye and TSCOP — including reported instances of officers stopping pedestrians to photograph them on the street — meets the Puttaswamy proportionality test. The petitioner's argument is straightforward and, in our view, correct: without an enabling statute, defined retention periods, accuracy audits, or redress mechanisms, indiscriminate FR fails the 'legality' prong before any necessity analysis even begins.

That is the right judicial instinct. But constitutional litigation, deployment by deployment, is not a substitute for a coherent national framework.

A pro-innovation case for guardrails

We are firmly pro-innovation and supportive of India's growing public-tech and AI ecosystem. Indian firms like Innefu Labs, Staqu, and others are building serious computer vision products, and there are legitimate, proportionate uses for FR — missing-person reunification, identification of serious-crime suspects against narrow watchlists, victim identification in trafficking cases. The NCRB's original public framing of NAFRS emphasised exactly these.

The problem is not the technology; it is the absence of rules. Unconstrained FR is bad for innovation too: it exposes vendors and police forces to litigation risk, corrodes the public trust on which adoption depends, and invites the kind of blunt regulatory overcorrection that outright bans represent.

What proportionate regulation looks like

A workable Indian framework — one that preserves legitimate law-enforcement use without normalising mass biometric surveillance — would include:

  1. An enabling statute, satisfying Puttaswamy's legality prong, rather than deployment by executive tender.
  2. Prior judicial or independent authorisation for FR searches, limited to serious offences and narrow, documented watchlists.
  3. Classification of biometric data as sensitive personal data, with defined retention periods and deletion requirements.
  4. Mandatory, published accuracy audits of deployed systems.
  5. Notice and redress mechanisms, including a data subject right of access against law-enforcement processing, backed by an independent oversight body with audit powers.

The window is closing

The DPDP Rules are still in draft. Parliament can still revisit Section 17 — and should. India does not have to choose between an Orwellian default and an outright ban. But it does have to choose. Right now, by design or by drift, it has chosen the surveillance default. That is not a pro-innovation outcome; it is an under-regulated one, and the bill — in litigation, in public trust, and eventually in harsher overcorrection — will come due.

Sources & Citations

  1. IFF — Project Panoptic (FR tracker)
  2. MeitY — Digital Personal Data Protection Act, 2023
  3. Justice K.S. Puttaswamy v. Union of India (2017) — Supreme Court of India
  4. Amnesty International — Ban the Scan: Hyderabad
  5. Internet Freedom Foundation — SQ Masood v. State of Telangana