The tech vs privacy face-off

Rampant and unregulated use of inaccurate facial recognition systems by government agencies threatens the Right to Privacy of citizens

Published: Aug 24, 2021 12:08:50 PM IST
Updated: Jun 3, 2022 04:03:25 PM IST

Can facial recognition technology be adopted without looking at its human rights implications?
Illustration: Chaitanya Dinesh Surpur


Every day we embrace technology with equal parts delight and horror: delight that devices such as Alexa can wake us up with our preferred music, horror at the knowledge that they also capture our snores and sleep talk. But while Alexa is a relatively innocuous (if still disturbing) presence in a digital society, something genuinely terrifying is being purchased and installed during this pandemic.

Facial recognition technology (FRT), which I have spent the better part of a pandemic year investigating, documenting and advocating against, is being deployed at a terrifying speed. As tender after tender is issued for its purchase, we at the Internet Freedom Foundation (IFF) have documented close to ₹1,248 crore already earmarked for it by various government authorities at the state and central levels.

But does it serve a legitimate purpose, especially during a pandemic when public resources may be better utilised elsewhere? Such surveillance is nonetheless being institutionalised in the name of security: expenditure on surveillance architecture is one of the major elements of the Safe City initiative (under the Ministry of Home Affairs) and the Smart City mission (a central government scheme), which have estimated budgets of ₹2,919.55 crore and ₹48,000 crore respectively. Can such technology be adopted without looking at its human rights implications?

Proliferation of FRT in India

FRT is being used both for authentication of identity and for identification by security agencies in the United States, the European Union, the United Kingdom, China and several other jurisdictions. In India, the IFF’s Project Panoptic is monitoring 64 FRT systems across the country at the central and state government levels. These projects include biometric authentication and attendance systems run by the ministries of power, road transport and highways, defence, and others. The Ministry of Civil Aviation has announced the Digi Yatra programme, which aims to speed up the processing of passengers from the moment they enter an airport until they board their flight by using their facial biometrics as their boarding pass. The government has begun rolling out Digi Yatra on a trial basis at airports including Hyderabad (July 2018), Delhi (September 2018) and Bengaluru (December 2018).

An even more invasive identification system, touted as the “world’s largest FRT system”, is being developed by the Ministry of Home Affairs’ National Crime Records Bureau. The Request for Proposals (RFP) for the project invites bids for the creation of a “National Automated Facial Recognition System” (AFRS), a national database of photographs intended to help swiftly identify criminals. It would gather existing data from the passport database, the Crime and Criminal Tracking Network System, the Interoperable Criminal Justice System, the prisons database, the Ministry of Women and Child Development’s Khoya-Paya portal, the National Automated Fingerprint Identification System, and any other image database available with the police or any other government entity. The Ministry of Railways has also announced its own FRT-based security and surveillance system.

The use of FRT is also growing swiftly among the police departments of various states and cities, including Delhi, Punjab, Uttar Pradesh, Chennai, Uttarakhand, Bihar, Vadodara, Kerala and Telangana. The Delhi Police acquired FRT under a Delhi High Court order for the specific purpose of finding missing children. The technology is now used for criminal investigations as well as on protesters, as during the anti-CAA protests in 2019 and the farmers’ protests in 2020. The Delhi Police has also reportedly used FRT to identify people involved in the 2020 North East Delhi riots and the 2021 Republic Day violence at the Red Fort.

According to reports, the Delhi Police used FRT in 137 of the 1,800 riot-related arrests, and identified over 250 people in connection with the Red Fort violence. Beyond these cases, it has arrested 42 people with the help of FRT since August 2020, and had conducted 18,968 FRT scans till March 23, 2021. All this despite the Delhi Police admitting that its FRT has an accuracy rate of 2 percent and cannot even distinguish between male and female children. It is time to examine how a technology that is widely perceived to be free of bias adds to the problem.

The issues of inaccurate FRT

The foremost issue with FRT is that the technology is inaccurate. Complete accuracy in matching faces has not been achieved even under ideal laboratory conditions. Deploying the technology in the real world, where environmental factors heavily affect image quality, therefore carries two kinds of harm: misidentification (false positives) and failure to identify (false negatives).

Misidentification occurs when a person is identified as someone else, which could lead to innocent people being falsely implicated in criminal investigations and could strengthen existing biases against certain communities. Because the technology cannot produce a faultless result, you could have a higher chance of being picked up and harassed by the police on suspicion of a crime simply because you resemble the actual culprit. A failure to identify, on the other hand, could lock a person out of their workplace (in the case of attendance systems) or exclude them from government schemes and benefits.
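
To make these two failure modes concrete, here is a minimal, purely illustrative sketch in Python of how a typical FRT pipeline works: faces are reduced to numerical “embeddings”, and two faces are declared a match when their similarity crosses a threshold. The embeddings, the noise level and the 0.5 threshold below are all invented for illustration; this is not the code of the Delhi Police system or of any vendor discussed here.

```python
# Illustrative sketch only: a generic similarity-threshold matcher of the
# kind most FRT systems use. Embeddings and the threshold are invented;
# this is not any specific agency's or vendor's system.
import numpy as np

rng = np.random.default_rng(0)

def cosine_similarity(a, b):
    """Cosine similarity between two face-embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical 128-dimensional embedding for one enrolled person.
enrolled = rng.normal(size=128)

# A genuine probe: the same face degraded by real-world noise
# (lighting, camera angle, image quality).
genuine_probe = enrolled + rng.normal(scale=0.6, size=128)

# An impostor probe: a different person altogether.
impostor_probe = rng.normal(size=128)

THRESHOLD = 0.5  # the cut-off at the heart of the accuracy debate

for label, probe in [("genuine", genuine_probe), ("impostor", impostor_probe)]:
    score = cosine_similarity(enrolled, probe)
    decision = "match" if score >= THRESHOLD else "no match"
    print(f"{label}: score={score:.2f} -> {decision}")

# Lowering THRESHOLD catches more genuine faces but flags more innocent
# look-alikes (false positives); raising it does the reverse (false
# negatives). No threshold eliminates both kinds of error at once.
```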

While there have been claims of fully accurate FRT systems, none has been corroborated by independent reviews and audits. The National Institute of Standards and Technology (NIST) has extensively tested FRT systems for both 1:1 verification and 1:many identification, and has examined how their accuracy varies across demographic groups. Research has shown that FRT is biased not just on the basis of race but also of gender, with error rates highest when identifying women of colour.
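
The distinction between these two NIST test scenarios is worth spelling out, because errors scale differently in each. The sketch below, again with invented data and an assumed 0.5 threshold, contrasts 1:1 verification (checking a probe image against a single claimed identity) with 1:many identification (searching an entire gallery), where every additional gallery entry adds another chance of a false match.

```python
# Illustrative sketch contrasting the two tasks NIST benchmarks.
# Gallery size, names and the threshold are assumptions for illustration.
import numpy as np

rng = np.random.default_rng(1)
DIM, THRESHOLD = 128, 0.5

def similarity(a, b):
    """Cosine similarity between two face-embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# A toy gallery of 1,000 enrolled face templates.
gallery = {f"person_{i}": rng.normal(size=DIM) for i in range(1000)}

# A probe image of person_42, degraded by real-world noise.
probe = gallery["person_42"] + rng.normal(scale=0.6, size=DIM)

# 1:1 verification: does the probe match ONE claimed identity?
print("verification:", similarity(probe, gallery["person_42"]) >= THRESHOLD)

# 1:many identification: search the whole gallery for the best match.
scores = {name: similarity(probe, t) for name, t in gallery.items()}
best = max(scores, key=scores.get)
print("identification:", best if scores[best] >= THRESHOLD else "no hit")

# Every extra gallery entry is one more chance for a stranger to score
# high by accident, which is why 1:many error rates exceed 1:1 rates and
# why they compound at the scale of a national photo database.
```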

FRT & the Right to Privacy

While the use of the technology is troublesome on its own, the problems are amplified by the fact that it is being deployed in a legal vacuum. There are currently no regulations specific to FRT that would set quality standards or demarcate an accuracy threshold a system must pass before any authority may use it. Nor is there a data protection law to prevent misuse of the personal data these systems collect, or to create accountability mechanisms when misuse occurs. Protection of personal data is necessary to ensure that all processing takes place within the bounds of the law and respects the Fundamental Right to Privacy; that it is restricted to a specific, lawful purpose authorised by the appropriate authorities; and that the data is not used or shared beyond that purpose.

The Right to Privacy stands to be violated if the personal data collected by FRT systems is used without the informed consent of the individual. Such use also deprives individuals of the liberty to share their information in some contexts and remain anonymous in others. We have already seen how the absence of a personal data protection law allows government agencies to use FRT for purposes other than those for which it was authorised, as in the case of the Delhi Police. This phenomenon is called ‘function creep’, wherein a technology or system gradually widens its scope from its original purpose to encompass wider functions.

India’s current surveillance architecture also fails us here: it provides only for the interception of calls, under the Indian Telegraph Act, and of messages, under the Information Technology Act, and does not account for how advanced technologies like FRT, or spyware such as Pegasus, may be used against Indian citizens. The draft Personal Data Protection law, which is soon to be passed, is equally disappointing, as it gives government agencies wide leeway in data collection and contains no strong surveillance safeguards to protect the privacy of Indian citizens.

The Right to Privacy of Indian citizens will be severely impacted if such unregulated use of FRT is allowed to continue. The damage will not be limited to privacy alone, as various other rights depend on the Right to Privacy for their full realisation. This was the view of the Supreme Court bench that, in its 2017 Puttaswamy judgment, held the Right to Privacy to be a Fundamental Right under Article 21 of the Constitution of India. The decision also held that any justifiable intrusion by the state into the people’s Right to Privacy must meet four thresholds: legality, necessity, proportionality and procedural safeguards.

At present, all ongoing FRT projects in India fail to meet these thresholds. If the use of FRT continues unregulated, it will have a chilling effect on the Right to Freedom of Speech and Expression, as well as on the rights to protest, form associations and move freely throughout the territory of India, as the fear of being identified and retaliated against by the state would deter individuals from exercising these rights. FRT will thus affect not only the privacy of individuals but also their personal liberties and autonomy.

The writer is associate counsel (surveillance & transparency), Internet Freedom Foundation

(This story appears in the 27 August, 2021 issue of Forbes India.)
