In a significant development, the Ministry of Electronics and Information Technology recently notified certain amendments to the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021. The amendments expand the scope of existing obligations of intermediaries, impose new obligations on them and provide for a grievance appellate mechanism.
On 28 October 2022, the Central Government notified the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Amendment Rules, 2022 (2022 Amendment), with immediate effect. Shortly after, in official press statements, the Government emphasised that the 2022 Amendment ushers in a collaborative accountability framework to build an open, safe and trusted cyberspace for Indian netizens. Intermediaries that do not comply with the 'due diligence' obligations mandated under the 2022 Amendment (which amends the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 (2021 IT Rules)) risk losing the safe harbour protection (i.e., immunity from liability for any third-party content) under Section 79 of the Information Technology Act, 2000 (IT Act).
The 2022 Amendment largely mirrors the draft circulated for public consultation in June 2022, with some key additions. While the precise manner in which some of the amendments will be enforced is unclear, we have set out an overview below.
1. Terms of service, privacy policy, annual terms/policies reminders, and other agreements of intermediaries must be made available to users in 22 Indian languages
Rules 3(1)(a) and 3(1)(b) of the 2021 IT Rules required intermediaries to (i) prominently publish their rules, regulations, privacy policies and user agreements on their websites and/or applications (collectively, Platform Policies); and (ii) inform users through such Platform Policies of statutorily identified types of prohibited content (Prohibited Content). These rules have been amended; intermediaries must now publish their Platform Policies and inform users of the Prohibited Content in a user's preferred language, which may be English or any of the 22 languages specified in the Eighth Schedule to the Constitution of India (Regional Languages). As the 2022 Amendment is effective immediately, intermediaries must make their Platform Policies available to users in Regional Languages at the earliest, given that a user may elect to access such Platform Policies in any of these languages.
Similarly, annual reminders regarding the Platform Policies or changes made to them (required under Rule 3(1)(f)) must also now be in English or a Regional Language of the user's choice. However, it remains to be seen how this will be implemented in practice.
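By way of illustration only, the sketch below shows one way an intermediary's engineering team might serve its Platform Policies in a user's chosen language; the language codes, file layout and English fallback are assumptions made for the example, not requirements prescribed by the 2022 Amendment.

    # Illustrative sketch (assumed layout): serve a Platform Policy in the user's
    # preferred language, falling back to English if a translation is unavailable.

    EIGHTH_SCHEDULE_LANGUAGES = {
        "as", "bn", "brx", "doi", "gu", "hi", "kn", "ks", "kok", "mai", "ml",
        "mni", "mr", "ne", "or", "pa", "sa", "sat", "sd", "ta", "te", "ur",
    }  # assumed ISO 639 codes for the 22 Eighth Schedule languages
    SUPPORTED_LANGUAGES = EIGHTH_SCHEDULE_LANGUAGES | {"en"}

    def policy_path(policy: str, language: str) -> str:
        """Return the hypothetical file path of a Platform Policy in the chosen language."""
        if language not in SUPPORTED_LANGUAGES:
            language = "en"  # fall back to English for unsupported codes
        return f"policies/{language}/{policy}.html"

    # Example: a user who selects Hindi is served policies/hi/privacy.html
    print(policy_path("privacy", "hi"))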
2. Obligation on intermediaries to 'ensure compliance' with their Platform Policies and take 'reasonable steps' to keep Prohibited Content off their platforms
The 2022 Amendment imposes a positive obligation on intermediaries to 'ensure' user compliance with the Platform Policies (amended Rule 3(1)(a)), significantly expanding their earlier obligation under the 2021 IT Rules to merely publish these Platform Policies. The amended and closely related Rule 3(1)(b), on the other hand, adopts a lower threshold of diligence, requiring intermediaries to make 'reasonable efforts' to cause their users not to upload, share or store any Prohibited Content. The 2022 Amendment does not clarify the disjunction between these standards. However, given that the 2022 Amendment is primarily concerned with regulating unlawful content on online platforms, a possible harmonious reading of the two phrases could mean that 'ensuring compliance' with Platform Policies is limited to making 'reasonable efforts' to ensure that users steer clear of the Prohibited Content identified under Rule 3(1)(b).
The 2022 Amendment is, nonetheless, silent on (i) how these two undefined phrases interact with each other; and (ii) the level of diligence required under each amended sub-rule. For instance, the 2022 Amendment does not clarify how intermediaries are to 'ensure compliance' with their Platform Policies. Doing so will likely require an intermediary to implement proactive monitoring measures to (i) prevent users from hosting, uploading or disseminating Prohibited Content; and (ii) identify such content once it has been posted.
It is equally unclear how implementing such measures to 'ensure compliance' will impact an intermediary's safe harbour status. While voluntary actions taken by an intermediary (i.e., on its own or based on a complaint) to remove content in violation of its Platform Policies are expressly protected under the proviso to Rule 3(1)(d), the 2022 Amendment does not clarify if this protection extends to actions taken by intermediaries to 'ensure compliance' as required under amended Rule 3(1)(a). For example, if intermediaries proactively prevent users from uploading certain content to ensure compliance with their Platform Policies, it is unclear if they will continue to enjoy safe harbour immunity for such content. This is because, by doing so, intermediaries are arguably selecting who receives the content and what content they receive (or, in this case, do not receive), whereas Section 79(2)(b) of the IT Act requires intermediaries to abstain from selecting the 'receiver' or the 'information' in a transmission as a precondition to availing safe harbour immunity.
3. Revised list of Prohibited Content and expedited grievance redressal for removing Prohibited Content
Rule 3(1)(b), which specifies categories of Prohibited Content, has also been amended to:
a. specifically include content that promotes enmity between different groups on the grounds of religion or caste with the intent to incite violence, as well as content that communicates misinformation; and
b. delete certain categories present under the 2021 IT Rules, namely content that is (i) defamatory; (ii) libellous; or (iii) patently false and untrue, and written or published in any form with the intent to mislead or harass a person, entity or agency for financial gain or to cause any injury to any person.
The Government has, in official press statements, indicated that it envisages a three-pronged approach to handling Prohibited Content:
a. Intermediaries must adopt appropriate content moderation policies and ensure user compliance with the same (amended Rule 3(1)(a));
b. They must take reasonable measures (e.g., through algorithms or manual review) to cause their users not to engage in any form of Prohibited Content on their platforms (amended Rule 3(1)(b)); and
c. Intermediaries should resolve, within 72 hours of receipt, complaints received from users or victims regarding Prohibited Content (new proviso to Rule 3(2)(i)).
Notably, the expedited 72-hour resolution timeline does not apply to complaints relating to all categories of Prohibited Content. The 15-day resolution timeline under the 2021 IT Rules will continue to apply to Prohibited Content that (a) belongs to another person and to which the user does not have any right; (b) infringes any patent, trademark, copyright or other proprietary rights; or (c) violates any law for the time being in force.
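Purely as an illustration of the two timelines described above, the sketch below computes a resolution deadline for a complaint; the category labels and fields are hypothetical and do not track the statutory wording.

    # Illustrative sketch (assumed categories): 72 hours for complaints covered by
    # the new proviso, 15 days for the categories that remain on the default timeline.
    from datetime import datetime, timedelta

    DEFAULT_TIMELINE_CATEGORIES = {
        "belongs_to_another_person",
        "infringes_intellectual_property",
        "violates_applicable_law",
    }  # assumed labels for the categories that stay on the 15-day timeline

    def resolution_deadline(received_at: datetime, category: str,
                            is_prohibited_content: bool) -> datetime:
        """Return the date by which the complaint should be resolved."""
        if is_prohibited_content and category not in DEFAULT_TIMELINE_CATEGORIES:
            return received_at + timedelta(hours=72)  # expedited timeline
        return received_at + timedelta(days=15)       # default 2021 IT Rules timeline

    # Example: a misinformation complaint received on 1 November 2022 at 10:00
    print(resolution_deadline(datetime(2022, 11, 1, 10, 0), "misinformation", True))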
Although the 2022 Amendment also permits intermediaries to develop 'appropriate safeguards' to avoid misuse of the grievance redressal mechanism by users, it is presently unclear what this entails.
4. Intermediaries to adhere to constitutional standards and ensure accessibility, due diligence, privacy and transparency
The newly inserted Rule 3(1)(n) mandates that intermediaries must respect the constitutional rights guaranteed to citizens, including but not limited to the fundamental rights guaranteed under Articles 14 (equality before the law), 19 (right to freedom) and 21 (protection of life and personal liberty) of the Constitution. In effect, this Rule appears to make the protection of fundamental rights horizontally applicable to private parties. The precise way this will now be implemented or interpreted by writ courts – where this argument is frequently made and routinely rebutted – remains unclear. The Government has, however, indicated that this rule is expected to negate intermediaries' arguments to the contrary, i.e., that, as private parties, they are not required to ensure such constitutional rights. The Government explained that this would be particularly relevant where, for example, intermediaries arbitrarily remove content or user accounts without meeting the requirements of natural justice (e.g., by issuing a show cause notice).
This stance reiterates a position that the Government took in a Delhi High Court case (Wokeflix v Union of India & Ors.) predating the 2022 Amendment and concerned with notice and appeal requirements for significant social media intermediaries (i.e., social media intermediaries with over 50 lakh registered users in India) under the 2021 IT Rules. There, the Government submitted that intermediaries are bound to respect users' constitutional rights, and cannot (except in limited circumstances, e.g., in the case of rape or child sexual abuse material) suspend a user account without giving the user prior notice and reasonable time and opportunity to explain her stance. From a practical perspective, Rule 3(1)(n) will, therefore, also impact how significant social media intermediaries comply with the related prior notice and take-down requirements under Rule 4(8) of the 2021 IT Rules. This will likely elongate the notice and appeal process, which will feed into appeals before the Grievance Appellate Committees (discussed below), once established, or into protracted litigation.
A more amorphous addition, Rule 3(1)(m), requires intermediaries to take all reasonable measures to ensure the accessibility of their services to users, along with a reasonable expectation of due diligence, privacy and transparency, without further elaboration. While existing legislation (i.e., the Rights of Persons with Disabilities Act, 2016 read with the Rights of Persons with Disabilities Rules, 2017) requires all private entities (including, by implication, intermediaries) to follow the accessibility requirements prescribed for Indian Government websites, the specific ambit of the new obligations under the 2022 Amendment is unclear.
5. Grievance Appellate Committee(s)
The 2022 Amendment empowers the Central Government to establish a Grievance Appellate Committee (GAC) or more than one GAC. The appeal process, which is intended to hold intermediaries accountable for how they redress grievances, enables any person aggrieved by a decision made by an intermediary's Grievance Officer (GO) to appeal such a decision before the GAC within 30 days of receipt of the GO's communication. The GAC must then resolve the appeal within 30 days of its receipt. The entire appeal process (from filing to decision) will be conducted through an online dispute resolution mechanism. An intermediary is expected to comply with the GAC's order and upload a compliance report on its website. This requirement is applicable to all intermediaries and not just social media intermediaries.
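As a simple illustration of the appeal timeline described above, the sketch below computes the 30-day appeal window and the GAC's 30-day decision deadline; the function and field names are assumptions made for the example, not terms used in the 2022 Amendment.

    # Illustrative sketch (assumed names): the appeal window runs for 30 days from the
    # Grievance Officer's communication, and the GAC is to decide within 30 days of receipt.
    from datetime import date, timedelta

    def appeal_window_closes(go_communication_date: date) -> date:
        """Last date on which an aggrieved user may file an appeal before the GAC."""
        return go_communication_date + timedelta(days=30)

    def gac_decision_due(appeal_received_date: date) -> date:
        """Date by which the GAC is expected to resolve the appeal."""
        return appeal_received_date + timedelta(days=30)

    print(appeal_window_closes(date(2023, 2, 1)))   # e.g. 2023-03-03
    print(gac_decision_due(date(2023, 2, 15)))      # e.g. 2023-03-17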
Although the GAC is expected to be established within three months from the effective date (i.e., by 28 January 2023), important details such as the GAC's terms of reference and the qualifications of its members (technical, judicial or industry based) are absent from the 2022 Amendment. The Government has, however, indicated that, given the wide range of intermediaries and the consequent breadth of user grievances, it may establish one or two GACs to begin with and build on this framework over time to address different types of user appeals.