Pixels & Privacy: The Delhi High Court’s Landmark Ruling on Reporting Non-Consensual Intimate Images

Mrs. X v. Union of India & Ors. (2023:DHC:2806)

Facts of the Case-

  1. The Petitioner herein is a married woman with a nine-year-old son. In 2019, the Petitioner became acquainted with the Accused, who approached her through social media and introduced himself as a British Chartered Accountant. In July 2020, the Accused came over to the Petitioner’s place and forced himself upon her. He allegedly not only clicked explicit pictures of the Petitioner but also transferred to himself, from her phone, explicit pictures that she had taken for the purpose of sharing with her husband. 
  1. The Accused involved the minor son of the Petitioner in various sexual acts as well. Thereafter, the Petitioner lodged a complaint against the Accused at P.S. Lajpat Nagar, on the basis of which a Zero FIR was registered. The Accused threatened the Petitioner that he would leak her sexually explicit photographs on various pornographic websites and kill her son if she did not pay him huge amounts of money. 
  1. The Petitioner was extorted into paying lakhs of rupees to the Accused, along with handing over all her jewellery. When the Petitioner was unable to pay any more, the Accused leaked her explicit images on various pornographic websites without her consent. This led to the Petitioner addressing a complaint dated 03.08.2021 to the SHO at P.S. Lajpat Nagar. The complaint stated that the Accused had made a YouTube channel in the Petitioner’s name and had been posting her explicit videos and photographs on a daily basis. 
  1. Despite approaching the Grievance Cells of various intermediaries (Google, YouTube, Bing, etc.) and filing cyber complaints, her explicit images were not taken down. Thus, the Petitioner approached the Delhi High Court under Article 226 read with Section 482 CrPC, seeking the blocking of certain sites exhibiting her intimate images and the registration of an FIR arising out of the complaint dated 03.08.2021.

The Hon’ble Court’s Analysis & Decision-

*The scope of the instant Writ Petition under Article 226 was expanded, and the directions rendered were limited to search engines, MEITY and the Delhi Police.* 

  1. The Court analysed NCII (non-consensual intimate images) vis-à-vis the IT Act & Rules: Rule 3(2)(b) of the IT Rules, which lays down the grievance redressal mechanism to be followed by an intermediary, more or less defines NCII as any content which prima facie exposes the private area of any individual, shows such individual in full or partial nudity, shows or depicts such individual in any sexual act or conduct, or is in the nature of impersonation in an electronic form, including artificially morphed images. Rule 3(2)(b) is not a penal provision. It is only under Section 66E of the IT Act that violation of an individual’s privacy is punished, with imprisonment which may extend to three years, or with fine not exceeding two lakh rupees, or with both.
  1. Emphasis was placed on the role of search engines (para 30): “Search engines do not themselves store and transmit content, they allow users to locate and visit content. Search engines further rank the content in their order of relevance in a bid to solve the user’s query at the earliest. It is relevant to note that as search engines do not host content per se, they cannot take down the content available on a third-party platform. However, they can de-index specific URLs that can render the said content impossible to find due to the billions of webpages available on the internet and, consequently, reduce traffic to the said website significantly.” 
  1. Although NCII abuse is perpetrated by a third-party user and the harm is caused to a stranger, the intermediary becomes liable for the conduct of that third-party user. Further, the IT Rules devise a mechanism for the user/victim to directly approach intermediaries for removal of NCII content without having to obtain a Court order. Therefore, apart from making their own reasonable efforts not to publish offending content, intermediaries can be requested to take down offending content upon being informed by a Court order, by an order of the appropriate Government, or by the user themselves. 
  1. If the individual has the right to informational privacy, it also subsumes the individual’s right to be forgotten, which has been held to be a consequence of the dignity of an individual and, thus, a facet of the right to privacy. A Division Bench of the Kerala High Court, in Vysakh K.G. v. Union of India and Ors., 2022 SCC OnLine Ker 7337, while adjudicating upon the right to privacy vis-à-vis the right to information, recently observed that, in the digital context, the “right to delisting” and “right to oblivion” are facets of the right to be forgotten. 
  1. The argument advanced in the present case by the learned Senior Counsel appearing for the Respondents (intermediaries) is that, as search engines merely provide access to content and are not responsible for hosting it, directions must be rendered to the publishers and not to the search engines themselves. It is at this stage that a search engine’s role in ensuring that one’s right to privacy is not contravened comes into prominence, especially in light of Rule 3(1)(m), which states that the intermediary shall respect all the rights accorded to citizens under the Constitution, including Articles 14, 19 and 21. It is further essential to state that the continued existence of NCII content on the internet does not serve any public interest and is punishable under Section 66E of the IT Act. The argument put forth on behalf of the intermediaries was therefore not accepted by the Hon’ble Court. 
  1. Social Responsibility of Search Engines (para 46 onwards): The newly amended Rule 3 of the IT Rules explicitly pronounces the obligation of the intermediary not only to “inform”, but to make “reasonable efforts” to ensure that its users do not publish content that is prohibited under Rule 3(1)(b). Thus, any directions given herein fall squarely within the statutory regime governing the obligations of intermediaries. 
  1. Search engines play an important role in the dissemination of content, and their power in connecting that content to consumers is undeniable. These intermediaries have a social obligation to be proactive in de-indexing such links once it comes to their knowledge that the content is illegal. The Hon’ble High Court found untenable the suggestion that the user/victim must approach either the intermediary in question or the Courts every single time the NCII content is duplicated. Such a suggestion also frustrates the legislative intent behind the IT Rules, which devise a time-bound schedule for the removal of such content. The Hon’ble High Court further observed that an approach that entails the victim/user having to sift through the internet to identify and then share every URL hosting their NCII is unconscionable.
  1. Moreover, search engines cannot hide under the garb of not possessing adequate technology to remove NCII content that has been reported, without the victim/user having to approach the intermediary again and again. As per the affidavit of Google LLC, hash-matching technology, which generates a unique identifier/fingerprint (“hash”) for a piece of content, already exists for the purpose of removing CSAM. This technology allows detection and removal of content matching that which has previously been removed. For the purposes of removal of NCII, once such content has been identified and removed, the hash-matching technology can store the unique identifier pertaining to the NCII content, and in the event that such content is re-uploaded, it can filter it out by checking its database of such fingerprints. Similar tools have already been built by Meta and Microsoft. YouTube has also developed CSAI (Child Sexual Abuse Imagery) Match, which is used by NGOs and other companies to identify abusive content. 
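The hash-matching workflow described in the affidavit can be illustrated with a minimal sketch. This is a simplified illustration, not any intermediary's actual implementation: it uses a plain SHA-256 digest as the fingerprint, whereas production systems (e.g. PhotoDNA, Meta's PDQ) use perceptual hashes that survive re-encoding and minor edits. The class and method names are hypothetical.

```python
import hashlib


class NCIIHashRegistry:
    """Stores only fingerprints of removed content, never the content itself."""

    def __init__(self):
        self._hashes = set()

    @staticmethod
    def fingerprint(content: bytes) -> str:
        # SHA-256 stands in here for a perceptual hash (e.g. PDQ/PhotoDNA).
        return hashlib.sha256(content).hexdigest()

    def register_removed(self, content: bytes) -> str:
        """Called once content is identified and taken down; only the hash is kept."""
        h = self.fingerprint(content)
        self._hashes.add(h)
        return h

    def is_known_ncii(self, upload: bytes) -> bool:
        # A re-upload of previously removed content matches its stored hash
        # and can be filtered out without the victim reporting it again.
        return self.fingerprint(upload) in self._hashes


registry = NCIIHashRegistry()
registry.register_removed(b"reported-image-bytes")
print(registry.is_known_ncii(b"reported-image-bytes"))   # re-upload: matched
print(registry.is_known_ncii(b"unrelated-image-bytes"))  # new content: no match
```

The design point the Court relies on is that the database holds only fingerprints, so re-uploads can be blocked automatically without re-storing or re-circulating the offending content.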
  1. The Hon’ble High Court stated that entities of the nature of Google and Microsoft, considering their ubiquity, cannot abscond or withdraw from their duties to the public at large in the name of reducing the liability they might incur. At the same time, the Hon’ble Court was inclined to agree with the submissions of the learned Senior Counsel appearing for Google and Microsoft that any direction necessitating pro-active filtering on the part of intermediaries may have a negative impact on the right to free speech. No matter the intention behind deploying such technology, its application may lead to consequences that are far worse and dictatorial.
  1. One of the concerns that arises when we consider the right to privacy of an individual under Article 21 is its impact on the right to freedom of speech and expression. This issue requires an interpretation of the phrase “such content” in Rule 3(2)(b): whether it means a specific instance of identified NCII, as contended by the intermediaries, or all such content of an identical nature, as submitted by the learned Amicus Curiae. The Hon’ble High Court observed that construing the phrase “such content” as “all content” is necessary to reduce the burden on the user/victim; however, “all content”, access to which is to be disabled, must pertain to NCII abuse that has already been reported.
  1. Search engines, being intermediaries, cannot hide behind the argument that they merely provide access to third-party websites, as the due diligence required under Rule 3 is applicable to all intermediaries. In addition to “actual knowledge” as defined in Shreya Singhal v. Union of India, i.e. a Court order or notification by the appropriate Government, Rule 3(2)(b) and (c) of the IT Rules now allows the victim/user to approach the intermediary directly with their grievance, and mandates a timeline that must be adhered to when disabling access to, or de-linking, the offending content. Read holistically, requiring the user/victim to approach the intermediary with each specific URL again and again would only frustrate the purpose of the timelines and the grievance redressal mechanism expounded under the IT Rules. 
  1. It was submitted that the sustained practice with regard to content removal under the IT Act has been to provide specific URLs; however, this practice fails to account for the grievance redressal mechanism available to the user/victim, and it is not justifiable, morally or otherwise, to suggest that an NCII abuse victim must constantly subject themselves to trauma by scouring the internet for NCII content relating to them and approaching the authorities again and again. Once the content has been reported by the user/victim, or a Court order or an order of the appropriate Government has been rendered, the search engine cannot contend that any subsequent filtering of the content is proactive in nature; it can only be termed as being in pursuance of the reporting of such content specific to an individual, or of a judicial order. 
  1. The fact that search engines do not host, publish or create content themselves is of no consequence when it comes to the question of removal of access to the offending content. It is undeniable that they have the ability, the capacity, and the legal obligation to disable access to the offending content; this responsibility of the search engine cannot be brushed under the carpet on the ground that it does not host content. 
  1. The Hon’ble High Court, in the said judgment, painfully notes the abysmal absence of the collaborative effort that should ideally be undertaken by the intermediaries and the State. The focus of such entities and authorities should be on the quick redressal of the complaint brought before them rather than the shirking of blame or making submissions on the onerous nature of their duties. In the process of shirking responsibility, precious time is lost in removing the offending content, which enables the offender to keep reposting it. The endeavour of every entity involved should be to resolve the issue expeditiously. 

Directions & Recommendations by the Hon’ble Delhi High Court:

  1. On approaching the Court for a takedown order in a matter involving NCII content, the Petitioner must, along with the petition, file an affidavit in a sealed cover identifying the specific audio, visual images and keywords complained against, in addition to the allegedly offending URLs, for ex facie determination of their illegality. 
  1. The Grievance Officer appointed by the intermediary must be appropriately sensitised. The definition of NCII abuse must be interpreted liberally by intermediaries to include sexual content obtained without consent as well as sexual content obtained within, and intended for, a private and confidential relationship. 
  1. The “Online Cybercrime Reporting Portal” must have a status tracker for the complainant, covering everything from the filing of a formal complaint to the removal of the offending content. The portal must display the various redressal mechanisms that can be accessed by the victim in cases of NCII, in all languages specified in the Eighth Schedule. The Portal, along with every other website of the Delhi Police, should also display the contact details of each District Cyber P.S. present in the NCT of Delhi.
  1. On receipt of information disclosing the nature of NCII content, the Delhi Police must immediately register a formal complaint in order to initiate an investigation and bring the perpetrators to book as soon as possible, so as to prevent the repeated upload of the content. 
  1. Every District Cyber P.S must have an assigned Officer who must liaise with the intermediaries against which grievances have been raised by the victim who has approached the Delhi Police and an endeavour should be made to ensure that the grievance is resolved within the time schedules stipulated under the IT Rules. The intermediaries are directed to cooperate unconditionally as well as expeditiously respond to Delhi Police.
  1. A fully-functioning helpline available round-the-clock should be devised for the purpose of reporting NCII content. Operators and individuals manning this helpline must be sensitised about the nature of NCII content and must, under no circumstances, indulge in victim-blaming or shaming the victim. These operators should also have a database of organisations with registered counsellors, psychologists and psychiatrists available for reference to the victims. The Delhi Legal Services Authority may also be apprised and engaged in case the victims need legal aid.
  1. Search engines must employ the already existing mechanism with the relevant hash-matching technology on the lines of the one developed by Meta as has been discussed above. They cannot be allowed to avoid their statutory obligations by stating that they do not have the necessary technology, which is patently false as has been exhibited during the course of hearing. 
  1. The reporting mechanism under Rule 3(2)(c) of the IT Rules must be conveyed to the users by the intermediaries by way of prominent display of the same on the website of the intermediary. It is necessary for users to be made aware of the reporting mechanism and the onus for educating the users lies on the intermediaries.
  1. The timeframe stipulated under Rule 3 of the IT Rules must be strictly followed without any exceptions; if there is even a minor deviation from the said timeframe, the protection from liability under Section 79 of the IT Act cannot be invoked by the search engine. When a victim approaches a Court or a law enforcement agency and obtains a takedown order, a token- or digital-identifier-based approach must be adopted by search engines to ensure that the de-indexed content does not resurface. 
  1. As a long-term suggestion, a trusted third-party encrypted platform may be developed by MEITY in collaboration with various search engines under Rule 3(2)(c) for registering the offending NCII content or the communication link by the user/victim. Accordingly, the intermediaries in question may assign cryptographic hashes/identifiers to the said NCII, and automatically identify and remove the same through a safe and secure process.

Your Guide to Managing Data Subject Access Requests

DSAR stands for Data Subject Access Request; it is one of the rights that a data subject, i.e. an individual, enjoys under the General Data Protection Regulation (GDPR). 

  1. A data subject is anyone whose data is collected, shared and processed by a data controller.
  2. A data controller is a company, organization or anyone who deals with the personal data/information of the data subjects. 

Under the GDPR, this protection extends to data subjects who are in the European Union.

Recital 63 of the GDPR states:

“A data subject should have the right of access to personal data which have been collected concerning him or her, and to exercise that right easily and at reasonable intervals, in order to be aware of, and verify, the lawfulness of the processing.”

  1. Reasons to have a DSAR process
| S.No. | Reason(s) for a DSAR |
| --- | --- |
| 1. | Confirming whether your organization/business processes the personal data of an individual (the data subject). |
| 2. | Accessing the personal data/information of a data subject. |
| 3. | Determining whether such processing of the subject’s data has a lawful basis. |
| 4. | Knowing the duration/period for which such data has been stored by your organization/business. |
| 5. | Enquiring how the data subject’s personal information/data was obtained by your organization/business. |
| 6. | Obtaining information about automated decision-making and profiling based on the data subject’s personal information. |
| 7. | Obtaining the names and further details of the third parties with whom your organization/business shares the personal information of the data subject(s). |

This isn’t an exhaustive list; a data subject has a right under the GDPR to submit such a request (DSAR) to the data controller at any time, without giving any reason. The data controller may only ask questions in order to verify the data subject’s identity. 

  1. Principles for DSAR

The GDPR in its entirety is based on the following principles, and it is the data controller’s responsibility and obligation to process data in accordance with them.

Article 5 of the GDPR lays down the following principles:

  1. Lawfulness, fairness and transparency
  2. Purpose limitation
  3. Data minimisation
  4. Accuracy
  5. Storage limitation
  6. Integrity and confidentiality
  7. Accountability

The DSAR, on the other hand, is based on the rights granted to data subjects under the GDPR:

| Article | Right of the data subject |
| --- | --- |
| Art. 15 | Grants the data subject the right to access his/her personal data held by the data controller. |
| Art. 16 | Grants the data subject the right to have inaccurate personal data rectified by the data controller without undue delay. |
| Art. 17 | Grants the data subject the right to be forgotten, i.e. to have his/her personal data erased by the data controller without undue delay. |
| Art. 18 | Grants the data subject the right to restrict the processing of his/her personal data. |
| Art. 20 | Grants the data subject the right to receive his/her personal data in a machine-readable format and to transmit it to another controller. |
| Art. 21 | Grants the data subject the right to object to the processing of his/her personal data. |
| Art. 22 | Grants the data subject the right not to be subjected to automated decision-making, including profiling. |
  1. Steps to perform as a Data Controller-
| S.No. | Steps to be taken |
| --- | --- |
| 1. | Verify the data subject’s identity and record the DSAR in your system. |
| 2. | Collect and categorize the personal data that you have stored. |
| 3. | Review the data subject’s request in order to understand what the DSAR requires. The reply to such a request must be made within one month (commonly treated as 30 days), as mandated by the GDPR, and without undue delay. |
| 4. | Before sharing the response with the data subject, gather all of the data subject’s personal data into the response; the GDPR also encourages providing remote access to such data. |
| 5. | Ensure that the delivery of the data to the data subject is secure, as data leaks and breaches are expensive and damage user trust and your reputation/goodwill. |
| 6. | Once you have followed all the required steps, send the response to the data subject. |
| 7. | Remind data subjects about their privacy rights; you may do so by adding a few lines at the end of your response. |
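The controller-side steps above can be sketched as a minimal workflow. This is a sketch under stated assumptions, not a compliance implementation: the `DSAR` dataclass, the in-memory `records` store, and the names used are all hypothetical, and the one-month response window is modelled simply as 30 days.

```python
from dataclasses import dataclass, field
from datetime import date, timedelta

RESPONSE_WINDOW_DAYS = 30  # GDPR Art. 12(3): respond within one month


@dataclass
class DSAR:
    subject_id: str
    received_on: date
    identity_verified: bool = False
    data: dict = field(default_factory=dict)

    @property
    def deadline(self) -> date:
        return self.received_on + timedelta(days=RESPONSE_WINDOW_DAYS)


def handle_dsar(request: DSAR, records: dict) -> dict:
    """Steps 1-7: verify, collect, review, compile, and deliver the response."""
    # Step 1: verify the data subject's identity before disclosing anything.
    if not request.identity_verified:
        raise PermissionError("Identity must be verified before responding")
    # Step 2: collect all personal data held about this subject.
    request.data = records.get(request.subject_id, {})
    # Steps 3-7: compile one response; delivery must happen before the deadline
    # over a secure channel, with a reminder of the subject's privacy rights.
    return {
        "subject": request.subject_id,
        "personal_data": request.data,
        "respond_by": request.deadline.isoformat(),
        "rights_notice": "You may rectify, erase, restrict, port, or object (Arts. 16-21).",
    }


req = DSAR("user-42", received_on=date(2024, 1, 1), identity_verified=True)
records = {"user-42": {"email": "user42@example.com"}}
response = handle_dsar(req, records)
print(response["respond_by"])  # 2024-01-31
```

Recording the deadline at intake, as done here, is what lets a controller track each request against the GDPR's time limit rather than discovering overruns after the fact.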

Aarlin Moncy: Discussing Law & Technology

Hello everyone! I am yours truly, LawyerStrange, aka Aarlin Moncy!

Thank you for visiting my page. Here, you will find blogs and video content on topics such as (but not limited to) Data Protection & Privacy, Cyber Law, Constitutional Law, Contract Law, movies and comics.

But the idea is to make this platform an exclusive page for Technology Law. Help me in this journey to build a community for tech & comic geeks. Let’s grow together.

Feel free to contact me and do share your suggestions.

Thanks again!

Stay safe and Take care!