On 25 February, the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, were released jointly by the ministries of information and broadcasting and of electronics and information technology. They were met with an anticipated storm of dissent, particularly over rules that would regulate online content, such as user-generated material on social media and curated entertainment and news. In this din, critics overlooked the positive elements of these guidelines.

Consider the accountability of intermediaries. A central objective of content regulation globally is the protection of vulnerable groups, especially minors, from content that is violent, sexually abusive or violative of the rights of women and children. The Supreme Court raised the issue of child sexual abuse online and violent imagery of women on the internet in the Prajwala case of 2018. A substantial portion of its order is captured in the guidelines. This includes a non-adversarial mechanism for expedited takedowns and the use of technology-enabled tools to proactively identify harmful content. This is not without precedent; Australia’s Criminal Code Amendment (Sharing of Abhorrent Violent Material) Act of 2019 places much higher obligations on digital intermediaries that are gatekeepers of user-generated content.

On curated content, the regulatory intent is right, if not the method. Controversy has often attended what over-the-top (OTT) platforms offer, be it the case of Sacred Games or A Suitable Boy. We’ve had a Tandav (‘dance of fury’) over the issue, with yea-sayers asking for controversial scenes to be excised and naysayers citing free speech. The issue of regulating such content ought to be evaluated without the baggage of such episodes, with child protection kept in mind, even as adults are given freedom to exercise informed choices. This allows for the nuancing of an otherwise binary narrative between self-regulation and prescriptive state rule-making.

Age-gating is recognized as an effective regulatory instrument under various international legal frameworks. The European Union’s Audiovisual Media Services Directive requires EU member states to use age-verification tools, parental controls and other technical measures to protect minors from certain kinds of content. This is now reflected in the laws of many EU states. Germany, for example, mandates that access to problematic content must be made difficult or impossible for children. The UK in 2014 mandated that those below the age of 18 must not be able to access restricted content. The success of a rating system depends on standardization across jurisdictions and extensive adaptation. This ensures predictability and builds consumer trust. With close to 50 platforms that offer content across genres and languages, India’s market for online curated content has expanded rapidly. Yet, it lacked industry-wide rating standards and technical measures for child protection.

Internal grievance redressal mechanisms, with an apex independent body with the power to impose fines, direct reclassification of content and even suggest modifications in select cases, would have been a suitable alternative to adversarial litigation. With the courts overburdened, consumers need other bodies to complain to. But these should not focus on blocking content and cannot be regulated under the intermediary provisions.

It is apparent that the new regulations meet the requirements of the digital ecosystem and are warranted. However, to force-fit provisions to regulate curated content into a legal framework meant for intermediaries like social media entities is not sustainable, considering that the parent Act, i.e., the Information Technology Act of 2000, and, more specifically, the provisions invoked (Sections 79 and 69A of the IT Act) do not support the formulation of such rules. The government will have to adopt appropriate processes for implementing its age-gating and soft regulatory mechanisms in the future.

In addition, the inclusion of news media under this new framework is misconceived and untenable in its entirety. The very nature of material that the guidelines attempt to regulate puts these entities beyond the purview of intermediaries to which Sections 79 and 69A are applicable. There is no scope for expanding their applicability to online news media, or, for that matter, curated entertainment content through delegated legislation by the executive. If this is not remedied, it is bound to be struck down in a court of law.

On balance, the Information Technology Rules, 2021, do address growing concerns about the safety of children on the internet. They have also recognized the need for social media platforms to be held more accountable. However, it is worth asking: Wouldn’t these frameworks have greater moral legitimacy if Parliament had contemplated and passed them? After all, they do have implications for our fundamental rights of free speech and information. An uninformed citizenry, the Supreme Court observed in Union of India vs. Association for Democratic Reforms, would make democracy a farce. India must strive to ensure due protections for the right to seek, receive and impart information and ideas through any media, as encoded in Article 19 of the Universal Declaration of Human Rights.

Vivan Sharan is a partner at Koan Advisory; he tweets @SharanVivan.

N.S. Nappinai is an advocate at the Supreme Court of India and founder of Cyber Saathi; she tweets @NSNappina.

The article originally appeared in Live Mint.