

In The Media
Carnegie India

View: From Protests to Chai, Facial Recognition is Creeping up on us

The Indian Ministry of Home Affairs has proposed a nationwide Automated Facial Recognition System (AFRS) that will use images from CCTV cameras, newspapers, and raids to identify criminals against existing records in the Crime and Criminal Tracking Network and Systems (CCTNS) database.

By Vidushi Marda
Published on Jan 7, 2020


Facial recognition technology is no longer the realm of movies and science fiction. Just last month, newspapers reported that Delhi Police used facial recognition to identify “habitual protestors” and “rowdy elements”. At the PM’s rally on December 22, it was used to deny entry to “miscreants who could raise slogans and banners”. In Chennai, it identifies “suspicious looking people” in crowded areas, while Punjab Police use it to investigate crimes and gather intelligence in real time. What’s more, the Ministry of Home Affairs has proposed a nationwide Automated Facial Recognition System (AFRS) that will use images from CCTV cameras, newspapers, and raids to identify criminals against existing records in the Crime and Criminal Tracking Network and Systems (CCTNS) database.

Those who use it claim that it introduces efficiency and speed, and reduces costs, in both state and retail efforts. Law enforcement agencies, for instance, have stated that they will use it to find missing children, catch criminals, and preserve law and order. As for fears about the threat to privacy, these are brushed off with the assurance that the technology is directed only at criminals, not law-abiding citizens.

The ‘civilian’ benefits of this technology are also touted. DigiYatra promises a seamless, paperless, hassle-free experience at airports by eliminating long security lines and check-in procedures. Retail users such as a tea chain claim to make customers’ experience more enjoyable by billing them through facial recognition instead of requiring them to reach for their wallets.

These paint a picture of progress and efficiency at first glance, but in reality buy into at least one (and often all) of the following fallacies:

The Legal Fallacy: Facial recognition is not merely a collection of pictures. It creates a biometric map of one’s face, which is then used either for verification of a person (1:1 matching) or for identification of the person from an existing database (1:many matching). Facial recognition is thus, by definition, a threat to privacy. In 2017, the Supreme Court recognised the fundamental right to privacy and explicitly noted that this right extends to public spaces. Further, it laid down that any infringement of this right must be necessary, proportionate, in pursuit of a legitimate aim, and have a rational nexus with that aim. Applying this four-part test in 2019, the Bombay high court held that the State cannot simply invoke law and order or security to infringe on the right to privacy, but must, rather, demonstrate that its action meets the proportionality test. Current deployments do not satisfy this legal requirement. In fact, no legal basis for law enforcement use of facial recognition exists. Responding to the Internet Freedom Foundation, the home ministry traces the legality of the AFRS to a Cabinet note from 2009. However, a Cabinet note is a document of procedure, not law, and does not qualify as a valid legal basis. Similarly, Delhi Police’s use of facial recognition was first directed in January 2018 by the Delhi high court for a very specific use — to find missing children. Its current usage has evolved without any legal oversight, and now includes the monitoring of peaceful protests. The legal fallacy thus has a high cost — it paves the way for mission creep, which is particularly worrying in the absence of data protection safeguards.
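The verification-versus-identification distinction above can be sketched in a few lines of code. This is a toy illustration only, not the method of any system discussed here: the three-number “embeddings”, the database names, and the 0.8 threshold are all invented for the example, and real systems derive such face templates with neural networks.

```python
# Toy sketch: 1:1 verification vs 1:many identification over face
# "embeddings", here just short lists of invented numbers.
import math

def cosine_similarity(a, b):
    """Similarity of two embedding vectors (1.0 means identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def verify(probe, claimed_template, threshold=0.8):
    """1:1 matching: does the probe face match one claimed identity?"""
    return cosine_similarity(probe, claimed_template) >= threshold

def identify(probe, database, threshold=0.8):
    """1:many matching: search an entire database for the best match.

    Returns the best-matching record name, or None if nothing in the
    database crosses the threshold.
    """
    best_name, best_score = None, threshold
    for name, template in database.items():
        score = cosine_similarity(probe, template)
        if score >= best_score:
            best_name, best_score = name, score
    return best_name
```

The point of the sketch is that identification quietly compares every face in view against every record held, which is why a database built for one purpose can be repurposed for another.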

The Efficiency Fallacy: While it is assumed that efficiency is a natural consequence, in reality the technology is dangerously unreliable. The Delhi high court has repeatedly expressed concern over the deplorable accuracy of the facial recognition systems in use, which stood at less than 1% in August 2019. In fact, the technology had a difficult time doing something as rudimentary as differentiating between boys and girls. Research from around the world has also shown that the likelihood of false positives, i.e. a person being wrongly identified, is particularly high in the case of women, children, elderly people, and ethnic minorities.

The Convenience Fallacy: In reality, being subject to a facial recognition system is far more inconvenient than simply paying for a cup of tea with cash or standing in an airport queue. In the absence of data protection safeguards, sensitive personal information about individuals can be used for any number of purposes, and shared, sold and processed in a plethora of ways. In the context of law enforcement, it can be used to create blacklists of “suspicious people” and “miscreants”. These systems do not afford individuals the luxury of knowing when they are included in such lists, or of having transparency and accountability mechanisms to fall back on.

Given these fallacies, and evidence that shows us the cost of buying into them, it is crucial that we understand and evaluate these systems before we make them ubiquitous.

This article was originally published by the Economic Times.

About the Author

Vidushi Marda

Former Nonresident Research Analyst, Carnegie India

Vidushi Marda was a nonresident research analyst at Carnegie India. She is a legal researcher who focuses on the interplay between emerging technologies, policy, and society.


Carnegie does not take institutional positions on public policy issues; the views represented herein are those of the author(s) and do not necessarily reflect the views of Carnegie, its staff, or its trustees.
