Earlier this year, the National Crime Records Bureau (NCRB) invited bids to set up a centralised Automated Facial Recognition System (AFRS). The platform is expected to help the police identify people by matching facial features against data drawn from any existing database. However, the proposal brought to the fore concerns about invasion of privacy and the absence of a legal basis. The initial deadline has been extended four times, first from November 7 to November 30, and most recently to January 2020. The NCRB hasn't given a reason for the extensions.
The Internet Freedom Foundation (IFF), a Delhi-based non-profit organisation, has sent a legal notice to the NCRB questioning the idea of setting up a centralised AFRS. “The system is a gross violation of our right to privacy and has no legal basis,” says Joanne D'Cunha, IFF’s associate counsel. The NCRB responded by saying the AFRS does not violate privacy and that it “automates the existing police procedure of comparing suspects’ photos with those listed in LEA’s [law enforcement agency] databases”. However, unlike fingerprinting, there are no clear laws or regulations governing facial recognition.
“Although there is a Supreme Court judgment, there is no statutory procedural law which would govern this kind of data collection,” says Divij Joshi, technology policy fellow at Mozilla Foundation, a global non-profit. He adds that even the Criminal Procedure Code and the Evidence Act—the primary legislation governing the use of evidence and criminal procedure by the police—say nothing about facial recognition. “Facial recognition is completely unregulated.” Although the NCRB states that the AFRS is legal, D'Cunha points out that the NCRB derives its claim of legality from the Cabinet Note for Crime and Criminal Tracking Network Systems, which “is not a legal authority”.
Adds Joshi, “There is a possibility that it will misidentify certain categories of faces more than others, which will mean that those categories will be more likely to face police scrutiny and investigation.” The risk of bias, stereotyping and flawed assumptions is high, warns the IFF.
However, this is not the first time such technology is being used in the country; some state police departments have already attempted to use it. The IFF says the facial recognition software used by the Delhi Police had a 1 percent match rate and misidentified images of boys as girls. On some Indian private players developing such systems, Joshi says, “Their algorithms are based on Bollywood movies. Such data is hardly representative for the purpose of building probabilistic identification systems.”
Facial recognition systems have been banned in San Francisco and Massachusetts. However, countries such as China, Japan, the UAE and Singapore continue to use them for airport security and for evaluating student behaviour in schools.
(This story appears in the 06 December, 2019 issue of Forbes India.)