
To the Ministry of Electronics and Information Technology, Government of India

Ref.: Comments to the IT Rules 2021 and proposed amendments

We, the undersigned organizations, working in more than 10 countries and internationally to promote and protect digital rights and freedoms, submit the following comments and urge you to withdraw the amendments recently proposed to the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules 2021 (IT Rules 2021). [1] We commend the Ministry of Electronics and Information Technology (MeitY) for taking the initiative to amend the IT Rules 2021 and for commencing a process of consultation on the proposed amendments.

Amendments to the Rules are necessary in order to meaningfully protect fundamental rights. In our view, however, these new revisions add concerns to already restrictive provisions in the Rules that pose a direct threat to the rights to freedom of expression and privacy, as well as other related human rights protected under the Indian Constitution and international law.

It is our understanding that the IT Rules 2021 should indeed be amended, but to address important shortcomings in the 2021 text, which include:

  • Serious privacy concerns, mainly related to an expansion of mandatory data retention and traceability requirements, which would undermine end-to-end encryption and have a chilling effect on the rights to expression and association. Additionally, Rule 4(2) of the IT Rules 2021 allows for tracing of the ‘first originator’ of information not only by order of the Courts, but also by administrative authorities, and applies to a wide range of open-ended categories of ‘serious offences’. These requirements fail the necessity and proportionality test laid down by the Indian Supreme Court in the Puttaswamy judgement [2] for assessing whether a restriction on the right to privacy is reasonable.

  • The model of grievance redressal, oversight, and media regulation in Part III of the Rules, which allows for unprecedented and unconstitutional state control over the press and online news, fails to establish an independent regulator free from government interference, and lacks judicial oversight for the removal or blocking of content, as recommended under international freedom of expression standards.

  • The model of conditional immunity for internet intermediaries, which creates differential and extensive obligations for ‘significant social media intermediaries’ that may, at the government’s discretion, be extended to ‘any intermediary’. These provisions, in addition to imposing burdensome obligations on intermediaries, use unclear language concerning the nomination of grievance officers by companies and rely on criminal liability for non-compliance, which could result in over-compliance to the detriment of users’ rights.

  • Additionally, restrictions on online content are set forth directly by the Executive, with no legislative debate (as addressed below) and based on extremely vague and overbroad terms. The IT Rules impose a framework that oversteps the Executive’s rule-making powers and ventures to create mandates that can only be validly imposed by Parliament through legislative instruments.

These concerns have repeatedly been raised by civil society, as more fully set out here, here, here and here.

The recently announced proposed amendments introduce 4 main modifications to the 2021 version and are particularly worrisome for setting up:

  • Intermediaries’ duty to ‘ensure compliance’ and to ‘cause the user of its computer resources not to host, display, upload, modify, publish, transmit, store, update or share’ an extensive list of open-ended and vague types of information, irrespective of any specific complaint, amounting to an implicit push for proactive monitoring.

  • The adoption of due diligence obligations that are overly burdensome, impractical and virtually impossible to comply with, especially the further-shortened timeframe for the removal and blocking of content in certain contexts, exacerbating the Rules’ negative impact on free speech.

  • The creation of a government-led and government-appointed grievance appellate committee that can overturn platform content moderation decisions without judicial assessment, worsening the existing lack of judicial oversight and granting extensive powers to the Executive to control online speech through the grievance redressal mechanism.

Below, we expand on these comments:

New due diligence and redress obligations – the threat of prior censorship and unreasonable timeframes for addressing complex freedom of expression issues

As per the IT Rules that came into effect in 2021, social media companies already have to comply with onerous due diligence obligations in order to retain immunity for the content users publish on their platforms. These obligations include, among others, the obligation to prominently publish their rules and regulations, privacy policies and user agreements.

Beyond these transparency obligations, however, the proposed amendments seek to create additional duties concerning the enforcement not only of platform policies, but also of a list of types of information that users are not supposed to host, display, upload, modify, publish, transmit, store, update or share. The new provisions are problematic on many levels, since they:

  • promote overzealous content moderation practices

  • incentivize the use of automated content controls

  • refer to broad categories of information that are vague and open to abuse

The new duty to ‘ensure compliance’ with rules, regulations and policies, in practice charges intermediaries with the task of controlling online speech irrespective of specific complaints. The language used in the amendment seems to indicate the creation of an obligation of proactive monitoring that could result in the massive removal or blocking of legitimate speech. This would constitute undue restriction on the freedom of expression of users.

As per international human rights standards, any restriction on the right to freedom of expression has to pass a strict test of (i) legality, (ii) necessity and proportionality, and (iii) legitimacy. The provisions set up by the IT Rules in 3(1)(a) and (b) fail each part of this test.

First, any restrictions have to be provided by law in its strict sense; that is, they must be adopted through regular legislative processes aimed at limiting government discretion through parliamentary deliberation and public participation. Executive regulations such as the IT Rules exceed the Executive’s powers in this regard. Second, the IT Rules provisions in 3(1)(b) do not comply with the requirement of ‘sufficient precision’. [3] Terms such as ‘racially and ethnically objectionable’, ‘relating to money laundering’, ‘harmful to child’ and ‘threatens the unity of India’ are vague and may lead to overly broad interpretations and manipulation.

In practice, the restrictions on freedom of expression imposed by 3(1)(a) and (b) can also be considered illegitimate for failing to comply with the requirements of necessity and proportionality: the grounds for content control listed under 3(1)(b) extend well beyond those recognized in international human rights law (ICCPR Article 19(3)), and the restrictions are disproportionate. In considering whether a given piece of legislation meets the requirement of ‘legitimate purpose’, it is important to recall that the right to freedom of expression is broad in its scope, encompassing “even expression that may be regarded as deeply offensive.” [4] When restricting broad categories of content, lawmakers should consider the likelihood that speech that is controversial but protected will be impacted.

As clarified by General Comment 34, [5] even restrictions under Article 20(2) of the ICCPR – which requires States to prohibit ‘advocacy of national, racial or religious hatred that constitutes incitement to discrimination, hostility or violence’ – must still satisfy the cumulative conditions of legality, necessity and legitimacy.

Imposing the obligation of conducting this complex assessment on intermediaries may lead private enterprises to censor content and may bring those private enterprises into direct contravention of their responsibility under international human rights law to respect the right to freedom of expression. [6] As clarified by the UN Special Rapporteur in his 2018 report to the General Assembly, States should “avoid delegating responsibility to companies as adjudicators of content, which empowers corporate judgment over human rights values to the detriment of users.” [7]

This is especially problematic given the short timeframe introduced by 3(2)(a), which states that complaints relating to 3(1)(b) shall be acted upon expeditiously and redressed within 72 hours of reporting. Given the huge volume of content involved, this provision would not only be impractical – leading to overzealous content moderation – but virtually impossible for many intermediaries to meet, and will lead to over-censorship and self-censorship.

In addition, while certain content, in certain circumstances, may merit quick and decisive moderation, imposing broad and strict time limits on all complaints also hinders intermediaries’ ability to assess priority and urgency and to apply a more targeted and nuanced response that would allow for a better allocation of resources. This restricted timeframe may also make it difficult for the author of the content to contest the allegation or seek remedy.

The provision also appears to be an incentive for the use of automated tools that in themselves pose additional challenges. [8] As a tool for flagging content, automation, such as algorithmic filtering, has resulted in the removal of content that does not violate terms of service or laws. Examples include the removal of evidence of war crimes [9] and errors generated by algorithm-based translation. [10] Automated takedown tools often fail to understand the context in which content is shared, the underlying technologies are still at a nascent stage of development, and AI filtering tools operate with limited accuracy in regional or less widely spoken languages.

There is also the risk that proactive takedown of content may lead to large-scale mass surveillance by private companies and increased automated censorship, as platforms would err on the side of caution. This will also have an impact on free speech, which forms the spine of a democratic nation.

The push for proactive monitoring of some types of content would lead to a clear tension with the international human rights law prohibition of prior censorship. [11]

The new Grievance Appellate Committee – on who has the final word on restrictions to online speech

The IT Rules, in their current version, mandate setting up a grievance redressal mechanism that includes the nomination of a Grievance Officer in-country to receive complaints concerning violations of the IT Rules themselves or ‘any other matters pertaining to the computer resources made available by [the intermediary]’ (Rule 3(2)). As set out above, the new amendments set unrealistic deadlines for this mechanism not only to act upon, but to effectively redress, any complaints.

But in addition to this mechanism, which is to be set up by the intermediaries themselves, the proposed amendments also create an appellate instance, fully composed of members appointed by the Central Government, that can overrule intermediaries’ content moderation decisions. As per Rule 3(3)(d), every order passed by this Grievance Appellate Committee ‘shall be complied by the concerned Intermediary’.

This body’s function and constitution would allow the government to exert undue control and effectively have the final say in determining what stays online. Giving the power to control protected speech to unelected bodies that are not independent from political interference creates the potential for democratic decision-making processes and methods of accountability to be circumvented. Any laws that restrict expression should provide for sufficient transparency, oversight and remedy, so as to avoid “confer[ring] unfettered discretion for the restriction of freedom of expression on those charged with its execution.” [12]

As pointed out by the former UN Special Rapporteur on freedom of expression and opinion in his report on content moderation, “States should refrain from adopting models of regulation where government agencies, rather than judicial authorities, become the arbiters of lawful expression.” [13]

In the Press Note announcing the 2022 amendments to the IT Rules 2021, the Ministry states that ‘this is made necessary because currently there is no appellate mechanism provided by intermediaries nor is there any credible self-regulatory mechanism in place.’ This new provision, however, will lead to a lack of independent and judicial oversight over demands to remove or block content – a necessary element of a rights-respecting content governance framework.  

In ordering a stay on certain provisions of the IT Rules, Indian courts have acknowledged that there is substance to the concern that a government-controlled oversight mechanism over the media may rob the media of its independence, such that the fourth pillar of democracy may not exist at all. The court also noted that “there is a substantial basis to the petitioners’ assertion that there may be a violation of Article 19 (1)(a) (of the Constitution) in how the Rules may be coercively applied to intermediaries. Accordingly, if there is any action taken in terms of Rules 3 of the said Rules read with (Rule) 7 thereof, it will abide by petition.” The proposed amendment to Rule 3 disregards this caution against the negative impact of excessive government control on the right to freedom of expression, and is unlikely to hold up against legal scrutiny involving an assessment of the rights guaranteed by the Indian Constitution.

Conclusion and requests

Amendments to the IT Rules, 2021 are crucial to ensure that the Rules are democratic and constitutional and that they safeguard the fundamental rights of Indian citizens. The amendments currently proposed, however, seek instead to further curtail people’s rights by imposing additional due diligence requirements on intermediaries that could result in over-censorship, setting impractical timeframes for the resolution of complex rights-related grievances, and creating an appellate authority that is not independent of the Executive.

We urge MeitY and the Government of India to:

  1. Suspend the implementation of the IT Rules, 2021, and commit to reviewing them in their entirety, to ensure that the rights to freedom of expression, information, association and privacy, are protected and strengthened;

  2. Withdraw the proposed draft amendments to the IT Rules, 2021;

  3. Conduct a sustained, meaningful and participatory consultation with the relevant stakeholders and public at large.

Sincerely,

  1. Access Now

  2. ARTICLE 19

  3. Association for Progressive Communications (APC)

  4. Body & Data, Nepal

  5. Bytes for All, Bangladesh

  6. Digital Empowerment Foundation

  7. Electronic Frontier Foundation

  8. EngageMedia

  9. Foundation for Media Alternatives, Philippines

  10. Internet Freedom Foundation

  11. JCA-Net, Japan

  12. Jokkolabs Banjul, The Gambia

  13. Korean Progressive Network Jinbonet

  14. Manushya Foundation, Thailand

  15. Media Matters for Democracy

  16. Open Net Korea

  17. Software Freedom Law Centre, India

  18. Southeast Asia Freedom of Expression Network (SAFEnet)

  19. VOICE, Bangladesh

Annex: Rule-wise comments and suggestions

Rule 3(1)(a)

Comment/suggestion: The new duty of intermediaries to ‘ensure compliance’ with rules, regulations and policies must be withdrawn.

Justification: The rule charges intermediaries with the task of controlling online speech without reference to specific complaints and potentially creates an obligation of proactive monitoring that could result in the massive removal or blocking of legitimate speech, restricting users’ freedom of expression.

Rule 3(1)(b)

Comment/suggestion: The rule mandating intermediaries to cause users “not to host, display, upload, modify, publish, transmit, store, update or share any information” laid down in 3(1)(b)(i) to (x) must be withdrawn.

Justification: The rule promotes overzealous content moderation practices by intermediaries, incentivizes the use of automated content controls and refers to broad categories of information that are vague and open to abuse. It will lead to unreasonable restrictions on users’ freedom of expression and will fail the test of (i) legality, (ii) necessity and proportionality, and (iii) legitimacy set out under international human rights standards.

Rule 3(2)(a)

Comment/suggestion: The rule requiring requests for removal of information or communication links relating to sub-clauses (i) to (x) of clause (b) under sub-rule (1) of rule 3 to be acted upon expeditiously and redressed within 72 hours must be withdrawn.

Justification: The timeframe proposed by this rule is unreasonable and unrealistic, given the huge volume of content that intermediaries will have to review. This burdensome requirement would not only lead to overzealous content moderation, but be virtually impossible for many intermediaries to implement. It will also lead to over-censorship and self-censorship.

Rule 3(3)

Comment/suggestion: The rule allowing for the setting up of a grievance appellate committee by the Central Government against orders made by Grievance Officers of intermediaries, the process for appeal, and compliance by intermediaries must be withdrawn.

Justification: The Grievance Appellate Committee sought to be created by the proposed amendments is to be constituted by the Central Government, which is further empowered to appoint its members, thereby exacerbating the lack of independent oversight and allowing for unreasonable government involvement in decision-making on matters of fundamental rights of citizens.


[1] Proposed draft amendment to the IT Rules, 2021, https://www.meity.gov.in/writereaddata/files/Press%20Note%20dated%206%2…

[2] Justice K.S.Puttaswamy(Retd) vs Union Of India, (2017) 10 SCC 1, AIR 2017 SC 4161

[3] Restrictions on freedom of expression must be provided by public laws “formulated with sufficient precision to enable an individual to regulate his or her conduct accordingly.” Human Rights Committee, General Comment No 34, CCPR/C/GC/34, 12 September 2011, para 43 (hereinafter General Comment 34).

[4] General Comment 34, para 11.

[5] General Comment 34, para 50.

[6] UN Guiding Principles on Business and Human Rights.

[7] A/HRC/38/35, paragraph 68. 

[8] See Natasha Duarte and Emma Llansó, Mixed Messages? The Limits of Automated Social Media Content Analysis, November 28, 2017. Available at: https://cdt.org/insights/mixed-messages-the-limits-of-automated-social-….

[9] Asher-Schapiro, A. (2017). YouTube and Facebook Are Removing Evidence of Atrocities, Jeopardizing Cases Against War Criminals. Available at: https://theintercept.com/2017/11/02/war-crimes-youtube-facebook-syria-r….

[10] Ong, T. (2017). Facebook apologizes after wrong translation sees Palestinian man arrested for posting ‘good morning’. Available at: https://www.theverge.com/us-world/2017/10/24/16533496/facebook-apology-….

[11] See, American Convention on Human Rights, Article 13(2); see also, Eur. Ct. H.R., Case of The Sunday Times v. the United Kingdom, Judgment of April 26, 1979, Application Nº 6538/74.

[12] General Comment 34, para 25.

[13] A/HRC/38/35, paragraph 68.