Ethical issues of Facial Recognition Technology use

A discussion paper on Facial Recognition Technology (FRT) floated by the NITI Aayog has suggested strong data protection norms for FRT applications, which has set the tone for public discourse on ethical issues related to the application of Artificial Intelligence (AI) in India. Transparency in the FRT regime is critical to ensure that ethical questions of privacy and racial bias are not left unanswered. The paper, titled "Responsible AI for All: Adopting the Framework - A use case approach on Facial Recognition Technology," has flagged data privacy concerns in the application of "Digi Yatra", a biometric-based boarding system pushed by the Civil Aviation Ministry which automatically scans passengers using FRT. Launched in 2018, Digi Yatra has been rolled out at Delhi, Hyderabad and Bengaluru airports, where the passenger's face is the new boarding pass, and is proposed to be implemented at other airports by 2023 amidst criticism by privacy rights defenders for pushing the technology without data protection laws.

FRT use is based on the principles of facial detection, matching and recognition: AI is used to match a digital or video image of a human face against a database of faces to verify a person's identity. FRT is widely used on smartphones for enhanced security in locking phones while keeping the unlocking process fast, easy and smooth for the user. However, as most smartphone users are not familiar with how their facial data is collected, stored and used, they grapple with unanswered questions about data privacy and the accuracy of FRT for identification at public places such as airports, which also gives rise to apprehension that stored data may be used by government agencies for surveillance.
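To make the matching step concrete, here is a minimal, purely illustrative sketch of how one-to-many face recognition typically works: a face image is first reduced to a numeric embedding vector by a detection/feature-extraction model (not shown), and that probe vector is then compared against a database of enrolled embeddings using a similarity score and a decision threshold. The vectors, names and threshold below are hypothetical toy values, not any real system's data or the Digi Yatra implementation.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def match_face(probe, database, threshold=0.9):
    """Return the best-matching enrolled identity, or None if no score
    reaches the threshold (a 1:N recognition query)."""
    best_id, best_score = None, threshold
    for identity, embedding in database.items():
        score = cosine_similarity(probe, embedding)
        if score >= best_score:
            best_id, best_score = identity, score
    return best_id

# Toy database of enrolled embeddings (hypothetical values).
db = {"alice": [0.9, 0.1, 0.4], "bob": [0.1, 0.95, 0.2]}

print(match_face([0.88, 0.12, 0.41], db))  # probe close to alice's embedding
print(match_face([0.0, 0.0, 1.0], db))     # probe unlike any enrolled face
```

The threshold is where the accuracy concerns discussed above enter: set it too low and strangers are misidentified as enrolled passengers; set it too high and legitimate passengers are rejected at the checkpoint.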
The Office of the High Commissioner for Human Rights (UN Human Rights) has called for a moratorium on the use of FRT in the context of peaceful protests until States meet certain human rights conditions before deploying it, including effective, independent oversight of its use; strict privacy and data protection laws; and full transparency about the use of image recordings and FRT in the context of assemblies. The stated objectives and benefits of the Digi Yatra programme include delivering a seamless, paperless and hassle-free experience to all passengers across all checkpoints at Indian airports; letting passengers walk through security scanners swiftly owing to advanced biometric security solutions; enhancing security at Indian airports using "Digi Yatra ID"-based identification with real-time biometrics; validating the boarding pass or e-ticket with the airline system in real time; using face biometrics for processing passengers at airport checkpoints; and extending biometric validation to passengers without Aadhaar or a Digi Yatra ID. The NITI Aayog's paper says that while the Digi Yatra Policy states that the programme is completely voluntary in nature, if its use is made mandatory in any way, then it must comply with the principles laid down in K.S. Puttaswamy v. Union of India relating to the legality, necessity and proportionality of the policy. The paper notes the Digi Yatra Policy's statement that facial biometrics are deleted from the local airport's database 24 hours after the departure of the passenger's flight, but underlines the need for the policy to clearly set out the rules for deletion of other information collected from passengers, as well as any facial biometrics stored in other registries.
It points out that the Digi Yatra Policy mentions that users may also be able to provide consent for value-added services at the airport, for which purpose their data may be shared with other entities such as cab operators and other commercial entities. The think tank's discussion paper recommends that specific care be taken to ensure that such consent is meaningfully provided and is not bundled by default, with an 'opt-in' instead of an 'opt-out' provision so that the default is a passenger's data not being shared with any third party unless the passenger authorises it through the 'opt-in'. Another recommendation in the paper is that there must be frequent cybersecurity audits and vulnerability testing of the Digi Yatra platform so that reliability, usability and information security in the ecosystem remain subjects of continuous engagement, adaptive to the rapidly evolving threats in this sphere; this is crucial to building public trust in the stated objectives. Establishing a mechanism for algorithmic audits by independent and accredited auditors, prior to system deployment and at periodic intervals thereafter, as suggested, will be critical for preventing system security from being compromised and data from being misused owing to wrong identification. The note of caution that "the deployment of FRT systems automatically raises a risk of data breaches and unauthorised access" is a strong reminder for policymakers not to rush AI solutions for FRT without taking care of data protection and privacy issues.

Sentinel Assam
www.sentinelassam.com