This is the first instalment of a two-part series on how the Information Technology Rules, 2021 impact Non-Profit Organisations. Click here to read part two.

Online platforms have become a major source of information and content for people around the world. With the intention of regulating the nature of content circulated on such platforms, the Information Technology (Guidelines for Intermediaries and Digital Media Ethics Code) Rules, 2021 (the “Rules”) have been enacted. The Rules, in application, extend beyond traditionally understood data intermediaries. They lay down different obligations for social media intermediaries, digital media platforms that publish news, current affairs, and online curated content, and entities that transmit such content.

But will the enforcement of the Rules have implications for non-profit organisations that disseminate this information too?

The answer is a resounding “Yes”! The question now is: in what ways will they affect non-profits? 

The non-profit community deals extensively with data. Several non-profit organisations are in the domain of collecting stories, experiences, and incidents from the ground and reporting them in multiple forms. They often create documentaries and short films for advocacy or impact reporting and disseminate them electronically. Some organisations also curate online content, publish news, and run online discussion fora. With the latest developments, these organisations too will now be covered within the ambit of the Rules. 

The Rules classify entities into two categories: intermediaries and publishers of digital media. Organisations need to identify the category they belong to in order to understand their obligations. To comprehend this decision-making process better, getting to grips with the nitty-gritty of the Rules is integral.

Who is an Intermediary?

The Information Technology Act, 2000 (the “Act”) defines an intermediary as any person who receives, stores or transmits information or data on behalf of another person. An intermediary may also provide any additional services for that data. For example, platforms such as Google Cloud that allow users to store and share data would be considered an intermediary.

With the growth of social media platforms over the years, the term intermediary has been broadened to include social media intermediaries. Now, an intermediary that primarily facilitates online interactions between two or more users and allows them to create, upload, share, disseminate, modify or access information is considered a social media intermediary. 

Furthermore, as per the Rules, any such social media intermediary with more than 50 lakh registered users will be classified as a significant social media intermediary. As a result, not only do online fora like WhatsApp, Facebook, Twitter, Clubhouse, or Signal fall within the web of the Rules, but so do productivity platforms like Slack that allow users to upload documents and hold discussions. Fora created by non-profit organisations to facilitate online networking, discussions, and interactions, like those of the Ashoka Centre for Social Impact and Philanthropy, the YUVA Active Advocacy Forum, Greenpeace India, and Reap Benefit, would be likely to fall within the definition of a social media intermediary, thus bringing their operations under the government’s regulatory scanner. When this happens, intermediaries need to comply with a laundry list of regulations laid out by the government.

What are the Compliances for Intermediaries?

Under the 2021 Rules, intermediaries are expected to observe a gamut of compliances.

Firstly, intermediaries must periodically inform users about the platform’s terms of use and include a privacy notice on their website or mobile application. Through these documents, users are cautioned against publishing or sharing information that is obscene, harmful to minors, violative of any law, a threat to the sovereignty of India, and so on.

For example, if a website allows users to post content and a user posts pornographic material, that action would be violative of both the Rules and the intermediary’s terms of use. The intermediary must then exercise its right to terminate access to the platform, or to remove the non-compliant information, or both.

Secondly, intermediaries may retain the registration data of users whose content has been blocked/removed only for 180 days from the date of cancellation or withdrawal. 

Thirdly, the organisation must appoint a Grievance Officer for the platform and publish their name and contact details on its website/mobile application. Once a complaint is filed with the Grievance Officer, they shall acknowledge the complaint within 24 hours and dispose of it within 15 days. 

However, for significant social media intermediaries, that is, those with over 50 lakh registered users, the list of compliances only increases. 

Significant social media intermediaries are expected to, among other things, appoint a Chief Compliance Officer to ensure compliance with the law, a Nodal Contact Person to coordinate with law enforcement agencies, and a Resident Grievance Officer. They are also expected to publish detailed compliance reports every month.

So, non-profit organisations falling under the category of intermediaries will now have to invest human and monetary resources towards establishing grievance redressal mechanisms and keeping track of the content on their platforms to ensure compliance with the Rules.

Implications for Curators and Publishers of Digital Media Content

All digital content that is received, stored, transmitted, edited, or processed by an intermediary or a publisher is classified as digital media. This would capture digital media disseminated by publishers of news, current affairs, or online curated content, who are collectively referred to as “Publishers”.

Several non-profit organisations and think tanks fall under the definition of a Publisher as they engage in the publication of news, analysis of social issues, and content curation. For example, the Foundation for Independent Journalism, a non-profit organisation that publishes news under The Wire, would be considered a Publisher. Other examples include the Independent and Public-Spirited Media Foundation,[1] which hosts analytical work on socially sensitive issues, or organisations like PARI, IT for Change, Maraa, and the Vidhi Centre for Legal Policy, which create short films and documentaries, and curate content and reports on social issues. The digital media content created by such organisations will be regulated by the Code of Ethics provided under the Rules.

The Code of Ethics requires non-profit Publishers that carry news and current affairs to comply with the Norms of Journalistic Conduct of the Press Council of India under the Press Council Act, 1978, and the Programme Code under section 5 of the Cable Television Networks (Regulation) Act, 1995. The Code mandates that Publishers must factor in the implications of their content on the sovereignty, integrity, and security of India, its friendly relations with foreign countries, and its multi-religious and multi-racial context prior to carrying such content. Furthermore, based on the content’s theme or message and the presence of violence, nudity, sex, language, drug and substance abuse, and horror, the publisher of online curated content would have to classify its content into age-based categories before publication.

Therefore, if a non-profit organisation, as a publisher, makes a documentary on a rape case that has explicit rape scenes, crime scene footage or violence, it will have to classify the content and prominently display the classification on screen. 

Creation of a Regulatory Mechanism

The Rules lay down a three-fold regulatory mechanism for Publishers that serves to provide a system of checks and balances in the process of content moderation. This system will be applicable to non-profits too.

  • Level I, Self-Regulating Mechanism: All Publishers shall establish a grievance redressal mechanism, appoint a grievance officer, and publish the officer’s details on their platform. The grievance officer shall dispose of complaints within 15 days. This implies that publishers of digital media must appoint a grievance redressal officer to investigate complaints against their content.
  • Level II, Self-Regulating Body: A self-regulating body is an independent body formed by Publishers or their associations. The body shall be headed by a retired judge of the Supreme Court or a High Court, or an independent eminent person from the relevant field. The body shall oversee the Publisher’s adherence to the Code of Ethics and guide such adherence. It will also hear appeals from the Level I self-regulating mechanism.
  • Level III, Oversight Mechanism: The Ministry of Information and Broadcasting will publish a charter for self-regulating bodies, establish an Inter-Departmental Committee to hear grievances, and issue advisories to publishers. The Committee will have representatives from the relevant ministries and organisations. Appeals from the Level II mechanism may be referred to this Committee, and the Ministry can also take suo motu cognizance of grievances at Level III.

The Road Ahead for Non-Profits

The burden of accountability is as acute on non-profits as on their for-profit counterparts. With philanthropy having an increasingly strong influence over civil society, the ethical responsibility to share and disseminate accurate information and representations of stories and events must be equally shouldered by non-profits. Possibly rooted in this rationale, the Rules are also meant for adoption by philanthropic actors in the ecosystem.

However, the Rules suggest that the content created by non-profit organisations will be under the scanner at all times, at times threatening the very basis of their existence. Civil society organisations often provide effective and impartial checks and balances to state and non-state actions. They help aggregate the voices of vulnerable communities and expose the under-currents of social issues. Regulation of this nature is likely to stifle the free functioning of such initiatives. It is unsurprising that the Rules have received opposition from various quarters and have even been challenged before the Delhi and Kerala High Courts.

Clearly, these Rules are not without fault. In the second part of this series, we will examine them to see if the means and the ends of social media control for non-profits are justified.

Featured image courtesy of Prateek Katyal on Unsplash.

[1] Disclaimer: The Bastion receives financial support from the Independent and Public-Spirited Media Foundation.

Nivedita is a lawyer and company secretary (LL.B & ACS). She is the Founder of Pacta, a social and impact sector exclusive law firm based in Bengaluru. Pacta advises billion-dollar philanthropies, family foundations, NGOs, CSR entities, public trusts, start-ups, social incubators/accelerators, schools, and universities. Nivedita is also a performing classical dancer and is passionate about exploring the intersection of the performing arts and the law.
Geetanjali is a lawyer with a specialization in the field of Law and Development. She is currently a legal associate at Pacta, Bengaluru. Her areas of interest include technology regulations, privacy and data protection laws, and climate change.
