In the Media

View: From Protests to Chai, Facial Recognition Is Creeping Up on Us


Published by Economic Times on January 7, 2020


Facial recognition technology is no longer the realm of movies and science fiction. Just last month, newspapers reported that Delhi Police used facial recognition to identify “habitual protestors” and “rowdy elements”. At the PM’s rally on December 22, it was used to deny entry to “miscreants who could raise slogans and banners”. In Chennai, it identifies “suspicious looking people” in crowded areas, while Punjab Police use it to investigate crimes and gather intelligence in real time. What’s more, the Ministry of Home Affairs has proposed a nationwide Automated Facial Recognition System (AFRS) that will use images from CCTV cameras, newspapers, and raids to identify criminals against existing records in the Crime and Criminal Tracking Network and Systems (CCTNS) database.

Those who use it claim that it improves efficiency and speed and reduces costs in both state and retail efforts. Law enforcement agencies, for instance, have stated that they will use it to find missing children, catch criminals, and preserve law and order. As for fears about the threat to privacy, these are brushed off with the assurance that the technology is directed only at criminals, not law-abiding citizens.

The ‘civilian’ benefits of this tech are also touted. DigiYatra promises a seamless, paperless, hassle-free experience at airports by eliminating long security lines and check-in procedures. Retail users such as a tea chain claim to make the customer experience more enjoyable by billing patrons through facial recognition instead of requiring them to reach for their wallets.

At first glance, these claims paint a picture of progress and efficiency, but in reality they buy into at least one (and often all) of the following fallacies:

The Legal Fallacy: Facial recognition is not merely a collection of pictures. It creates a biometric map of one’s face, which is then used either to verify a person’s identity (1:1 matching) or to identify the person from an existing database (1:many matching). Facial recognition is thus, by definition, a threat to privacy. In 2017, the Supreme Court recognised the fundamental right to privacy and explicitly noted that this right extends to public spaces. Further, it laid down that any infringement of this right must be necessary, proportionate, in pursuit of a legitimate aim, and have a rational nexus with that aim. Applying this four-part test in 2019, the Bombay high court held that the State cannot simply invoke law and order or security to infringe on the right to privacy, but must instead demonstrate that its action meets the proportionality test. Current deployments do not satisfy this legal requirement. In fact, no legal basis for law enforcement use of facial recognition exists. Responding to the Internet Freedom Foundation, the home ministry traces the legality of the AFRS to a Cabinet note from 2009. However, a Cabinet note is a document of procedure, not law, and does not qualify as a valid legal basis. Similarly, Delhi Police’s use of facial recognition was first directed in January 2018 by the Delhi high court for a very specific purpose: to find missing children. Its current usage has evolved without any legal oversight, and now includes monitoring peaceful protests. The legal fallacy thus has a high cost: it paves the way for mission creep, which is particularly worrying in the absence of data protection safeguards.
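To make the 1:1 versus 1:many distinction concrete, here is a minimal, hypothetical sketch in Python. The embeddings, similarity threshold, and database records below are illustrative assumptions, not any real system; deployed systems use trained face-embedding models, but the matching logic follows the same pattern.

```python
import numpy as np

# Hypothetical face "embeddings": fixed-length vectors produced by a
# face-recognition model. Random values stand in for real embeddings here.
rng = np.random.default_rng(0)
EMBEDDING_DIM = 128

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face embeddings (1.0 = identical direction)."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def verify(probe: np.ndarray, claimed: np.ndarray, threshold: float = 0.8) -> bool:
    """1:1 matching: does the probe face match one claimed identity?"""
    return cosine_similarity(probe, claimed) >= threshold

def identify(probe: np.ndarray, database: dict, threshold: float = 0.8):
    """1:many matching: search the probe against every record in a gallery
    (e.g. a CCTNS-style database) and return the best match above threshold."""
    best_name, best_score = None, threshold
    for name, embedding in database.items():
        score = cosine_similarity(probe, embedding)
        if score >= best_score:
            best_name, best_score = name, score
    return best_name

# Toy gallery of enrolled embeddings, plus a probe that is a noisy
# re-capture of one enrolled face.
database = {f"record_{i}": rng.normal(size=EMBEDDING_DIM) for i in range(1000)}
probe = database["record_42"] + rng.normal(scale=0.05, size=EMBEDDING_DIM)

print(verify(probe, database["record_42"]))   # 1:1 check -> True
print(identify(probe, database))              # 1:many search -> "record_42"
```

Note how 1:many identification requires comparing every face against the entire gallery; this is what allows a camera pointed at a crowd to scan everyone in it, not just a person who volunteered to be verified.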

The Efficiency Fallacy: While it is assumed that efficiency is a natural consequence, in reality the technology is dangerously unreliable. The Delhi high court has repeatedly expressed concern over the deplorable accuracy of the facial recognition systems in use, which stood at less than 1% in August 2019. In fact, the technology struggled with something as rudimentary as differentiating between boys and girls. Research from around the world has also shown that the likelihood of false positives, i.e. a person being wrongly identified, is particularly high for women, children, elderly people, and ethnic minorities.
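A back-of-the-envelope calculation shows why false positives dominate when a 1:many system scans crowds. The figures below are illustrative assumptions, not measurements from any deployed system: even with a modest 1% false-match rate, innocent people vastly outnumber genuine matches among the alerts.

```python
# Illustrative base-rate arithmetic for 1:many facial recognition.
# All numbers are assumptions for the sake of the example.

crowd_size = 10_000          # faces scanned at, say, a rally
persons_of_interest = 10     # people in the crowd actually on a watchlist
false_match_rate = 0.01      # 1% chance an innocent face triggers a match
true_match_rate = 0.90       # 90% chance a listed person is correctly flagged

true_alerts = persons_of_interest * true_match_rate
false_alerts = (crowd_size - persons_of_interest) * false_match_rate

precision = true_alerts / (true_alerts + false_alerts)
print(f"True alerts:  {true_alerts:.0f}")     # 9
print(f"False alerts: {false_alerts:.0f}")    # ~100 innocent people flagged
print(f"Chance a given alert is correct: {precision:.1%}")  # roughly 8%
```

Under these assumptions, more than nine out of ten alerts point at the wrong person, which is why accuracy claims for such systems deserve scrutiny before they are trusted for policing.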

The Convenience Fallacy: In reality, being subject to a facial recognition system is far more inconvenient than simply paying for a cup of tea with cash or standing in an airport queue. In the absence of data protection safeguards, sensitive personal information about individuals can be used for any number of purposes, and shared, sold and processed in a plethora of ways. In the context of law enforcement, it can be used to create blacklists of “suspicious people” and “miscreants”. These systems do not afford individuals the luxury of knowing when they are included in such lists, or of having transparency and accountability mechanisms to fall back on.

Given these fallacies, and evidence that shows us the cost of buying into them, it is crucial that we understand and evaluate these systems before we make them ubiquitous.

This article was originally published by the Economic Times.