
In The Media
Carnegie India

View: From Protests to Chai, Facial Recognition is Creeping up on us

The Indian Ministry of Home Affairs has proposed a nationwide Automated Facial Recognition System (AFRS) that will use images from CCTV cameras, newspapers, and raids to identify criminals against existing records in the Crime and Criminal Tracking Network and Systems (CCTNS) database.

By Vidushi Marda
Published on Jan 7, 2020

Source: Economic Times

Facial recognition technology is no longer the realm of movies and science fiction. Just last month, newspapers reported that the Delhi Police used facial recognition to identify “habitual protestors” and “rowdy elements”. At the prime minister’s rally on December 22, it was used to deny entry to “miscreants who could raise slogans and banners”. In Chennai, it identifies “suspicious looking people” in crowded areas, while the Punjab Police use it to investigate crimes and gather intelligence in real time. What’s more, the Ministry of Home Affairs has proposed a nationwide Automated Facial Recognition System (AFRS) that will use images from CCTV cameras, newspapers, and raids to identify criminals against existing records in the Crime and Criminal Tracking Network and Systems (CCTNS) database.

Those who use it claim that it introduces efficiency and speed, and reduces costs, in both state and retail settings. Law enforcement agencies, for instance, have stated that they will use it to find missing children, catch criminals, and preserve law and order. As for fears about the threat to privacy, these are brushed off with the assurance that the technology is directed only at criminals, not law-abiding citizens.

The ‘civilian’ benefits of the technology are also touted. DigiYatra promises a seamless, paperless, hassle-free experience at airports by eliminating long security lines and check-in procedures. Retail users such as a tea chain claim to make customers’ experience more enjoyable by billing them through facial recognition instead of requiring them to reach for their wallets.

At first glance, these claims paint a picture of progress and efficiency, but in reality they buy into at least one (and often all) of the following fallacies:

The Legal Fallacy: Facial recognition is not merely a collection of pictures. It creates a biometric map of one’s face, which is then used either to verify a person’s identity (1:1 matching) or to identify the person against an existing database (1:many matching). Facial recognition is thus, by definition, a threat to privacy. In 2017, the Supreme Court recognised the fundamental right to privacy and explicitly noted that this right extends to public spaces. Further, it laid down that any infringement of this right must be necessary, proportionate, in pursuit of a legitimate aim, and have a rational nexus with that aim. Applying this four-part test in 2019, the Bombay High Court held that the State cannot simply invoke law and order or security to infringe on the right to privacy, but must demonstrate that its action meets the proportionality test. Current deployments do not satisfy this legal requirement. In fact, no legal basis for law enforcement use of facial recognition exists. Responding to the Internet Freedom Foundation, the Home Ministry traced the legality of the AFRS to a Cabinet note from 2009. However, a Cabinet note is a document of procedure, not law, and does not qualify as a valid legal basis. Similarly, Delhi Police’s use of facial recognition was first directed in January 2018 by the Delhi High Court for a very specific purpose: to find missing children. Its current usage has evolved without any legal oversight, and now includes the monitoring of peaceful protests. The legal fallacy thus carries a high cost: it paves the way for mission creep, which is particularly worrying in the absence of data protection safeguards.
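The difference between 1:1 verification and 1:many identification can be sketched in a few lines of code. This is a minimal illustration assuming a generic embedding-plus-similarity design; the `cosine_similarity` helper, the 0.8 threshold, and the toy vectors are all hypothetical, and none of this describes the AFRS or any deployed system:

```python
# Illustrative sketch only: face images are reduced to embedding
# vectors, and matching is a similarity comparison on those vectors.
import math

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

THRESHOLD = 0.8  # arbitrary cutoff, chosen here for illustration

def verify(probe, claimed):
    """1:1 matching: is the probe the person they claim to be?"""
    return cosine_similarity(probe, claimed) >= THRESHOLD

def identify(probe, database):
    """1:many matching: who in the database best matches the probe, if anyone?"""
    best_id, best_score = None, THRESHOLD
    for person_id, embedding in database.items():
        score = cosine_similarity(probe, embedding)
        if score >= best_score:
            best_id, best_score = person_id, score
    return best_id
```

The 1:many case is what makes mass surveillance possible: the same probe image is silently compared against every record in the database, with no action required from the people enrolled in it.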

The Efficiency Fallacy: While efficiency is assumed to be a natural consequence, in reality the technology is dangerously unreliable. The Delhi High Court has repeatedly expressed concern over the deplorable accuracy of the facial recognition systems in use, which stood at less than 1% in August 2019. In fact, the technology struggled with something as rudimentary as differentiating between boys and girls. Research from around the world has also shown that the likelihood of false positives, i.e. a person being wrongly identified, is particularly high for women, children, elderly people, and ethnic minorities.

The Convenience Fallacy: In reality, being subject to a facial recognition system is far more inconvenient than simply paying for a cup of tea with cash or standing in an airport queue. In the absence of data protection safeguards, sensitive personal information about individuals can be used for any number of purposes, and shared, sold and processed in a plethora of ways. In the context of law enforcement, it can be used to create blacklists of “suspicious people” and “miscreants”. These systems do not afford individuals the luxury of knowing when they are included in such lists, or of having transparency and accountability mechanisms to fall back on.

Given these fallacies, and evidence that shows us the cost of buying into them, it is crucial that we understand and evaluate these systems before we make them ubiquitous.

This article was originally published by the Economic Times.

About the Author

Vidushi Marda

Former Nonresident Research Analyst, Carnegie India

Vidushi Marda was a nonresident research analyst at Carnegie India. She is a legal researcher who focuses on the interplay between emerging technologies, policy, and society.


Carnegie does not take institutional positions on public policy issues; the views represented herein are those of the author(s) and do not necessarily reflect the views of Carnegie, its staff, or its trustees.
