In June 2020, at the peak of India’s first wave of the coronavirus, Advocate Prashant Bhushan was charged with contempt of court for tweets criticising the then Chief Justice of India, Justice Bobde. Bhushan was convicted of the offence and asked to pay a token fine of ₹1.
If the Draft Data Protection Bill (DPB), as recommended by a Joint Parliamentary Committee (JPC), had been in force at the time, Twitter might also have been held liable for the posts. Why? Because the JPC’s recommendations, filed in December 2021, call for suspending the ‘safe harbour’ protections otherwise enjoyed by social media entities. As a result, these entities would be treated as “publishers” of user content under the law.
But the concerns with the Draft DPB are not limited to the liability of social media platforms for third-party content. The inclusion of “social media platforms” within the ambit of the Draft is in and of itself troublesome. It presents peculiar challenges for policymaking, as the regulation of social media entities lies beyond the scope of a data protection legislation explicitly concerned with informational privacy.
Safe harbour provisions protect social media platforms from assuming liability for user-generated content hosted on their platform. Effectively, social media platforms stand to lose this critical immunity, although their actual knowledge or control over users’ actions could be limited. This could have potentially disturbing implications for freedom of expression as well as the innovation potential of internet companies.
Why the classifications under the Draft DPB are a cause for concern
As a part of its recommendations, the JPC has introduced significant revisions by including Clause 3(44) in the Draft DPB to define “social media platforms” as entities that “primarily or solely enables online interaction between two or more users and allows them to create, upload, share, disseminate, modify or access information using its services”. The recommendation to classify social media entities as “platforms” in the Draft is a departure from their prior classification as “intermediaries” under the previous draft Personal Data Protection Bill, 2019. Despite the difference in classification, the definition of “social media platforms” is identical to that of “social media intermediary”. The distinction between an “intermediary” and a “platform” is therefore unclear, leading to regulatory concerns.
Regulatory concerns also extend to who the regulator actually is. The Draft DPB provides expansive powers to the Centre to notify certain social media entities as “significant data fiduciaries” (Section 26(4)), granting the executive unprecedented sway over them. The specific threshold for such a classification (linked to the volume of data processed by an entity) is to be notified by the Government. Not only is this provision arbitrary, but it also stands to impinge on the powers of the Data Protection Authority (DPA), which can notify ‘significant data fiduciaries’ as per Section 26(1) of the Data Protection Bill, 2021. The DPA is the apex regulatory body concerned with the enforcement of India’s data protection legislation. Government intervention in this regard is highly undesirable, as it could undermine the independence and powers of the DPA.
The Draft DPB also risks treading choppy legal waters based on how it chooses to classify these entities. It fails to define any legitimate basis for the reclassification of “social media intermediaries” as “publishers”. This has disconcerting implications for intermediary liability, subverting the Supreme Court’s verdict in Shreya Singhal v. Union of India, where the Court read down Section 79 of the Information Technology Act, 2000. In doing so, the Court held that social media entities (the intermediaries) could not be held liable for third-party content hosted on their platforms unless they failed to act upon actual knowledge of unlawful content received through a court or government order.
Fundamental Freedoms, Arbitrary Classifications
The Intermediary Guidelines 2021 deal with the regulation of intermediaries, including social media intermediaries. Social media intermediaries like WhatsApp, Signal and many others come under the ambit of these rules, which are administered by the Ministry of Electronics and Information Technology (MeitY). These guidelines empower the State to take down the news content of publishers. Combined with the Draft, these rules subject social media to additional scrutiny. Bracketing social media platforms under ‘publishers’ may imply that they will be subject to similar government oversight.
Moreover, the recommendation to treat social media platforms as “publishers” is premised on the understanding that platforms possess the ability to “select the receiver of the content” and “exercise control over (its) access”. This means that since social media intermediaries have the ability to control access to the content that is posted on their platform, they must be held accountable for the content they allow.
The Draft fails to provide a concrete set of circumstances under which an intermediary may be notified as a publisher. The lack of such a framework may lend itself to arbitrary notification of an intermediary as a publisher. This would have damaging effects on safe harbour protections and the freedom of expression.
For instance, entities may resort to censoring user content prior to publication (“prior restraint”) for fear of being held liable for hosting said content, especially by unverified accounts. This has detrimental consequences for public participation in the democratic process.
Are all social media platforms equally significant?
Section 26 of the Draft contains a non-obstante clause that overrides the effect of any contrary provision. It classifies social media platforms as ‘significant data fiduciaries’ once their number of users crosses a threshold specified by the government in consultation with the Data Protection Authority (last notified as 50 lakh users). Such platforms would then be subject to additional obligations, including data audit requirements and the appointment of data protection officers, among other things.
Another consideration for classifying an entity as a ‘significant data fiduciary’ is whether their actions have a ‘significant impact’ on electoral democracy, public order, and state security.
‘State security’ is usually said to involve national upheaval, revolution and civil strife. ‘Public order’, on the other hand, encompasses activities of far less gravity and is an extremely low threshold. Where a provision is vague and creates an offence without clear definitions and standards, the Supreme Court has held in many cases that such a provision will be deemed unconstitutional. The rationale is that where no reasonable standards are laid down and terms are not attempted to be defined, it leaves it open to authorities to be arbitrary and such a law must be struck down.
Lastly, the Intermediary Guidelines 2021 require significant social media intermediaries to provide an option for users to voluntarily verify their identity. This, coupled with the recommendation to hold these platforms liable for user content from unverifiable accounts, reflects the intent of the government to move towards mandatory verification.
While voluntary verification increases compliance costs for intermediaries, it’s a drop in the ocean given the huge market that they have in India. It is the users who bear the brunt of such a policy. The voluntary regime is difficult to implement, especially at a local level, where the enforcers may be unaware of the policy. This means that users may experience friction from enforcers if they are unverified. Additionally, access may be restricted based on verification. For example, if there’s a government live stream of an event, an entry barrier may be that only verified accounts would be allowed to spectate.
━Rohin Garg, Internet Freedom Foundation
Voluntary verification has the potential to undermine the purpose of social media itself – free expression without any fear or threat of consequences. The provision can most affect vulnerable groups such as sexual assault survivors, minorities, whistleblowers – who use social media to speak about their experiences.
The Concern of Legitimate Regulation
The JPC recommends in the Draft that social media platforms should be treated as “publishers” of content that they host and calls for a separate statutory regulatory body, similar to the Press Council of India, to oversee such content. This could imply a separate code of conduct to be followed by social media entities, according the statutory body greater power to tighten compliance, which could lead to over-censoring of user content.
The cumulative effect of these recommendations and the vague criteria attached to the provisions of the Draft undermines the safe harbour provisions of the IT Act and amounts to an overreach of power. Rohin Garg from the Internet Freedom Foundation commented, “This is a Bill that deals with data protection, not intermediary liability. Why are such provisions being included at all? The provisions of the Bill are reminiscent of the IT Rules because they try to integrate data protection and intermediary liability”. As noted, this is a perilous move with few global precedents, thereby calling for considered debate and research on the subject.
However, instances such as the Cambridge Analytica scandal, which saw data of 87 million Facebook users sold for patently unethical targeted political advertising, alongside the rapid proliferation of fake news and hate speech, nonetheless drive home the fundamental need for regulation of such entities. As is evident, social media has immense sway over public opinion and democratic processes like elections. More importantly, platforms also contribute to influencing and defining the Overton Window – the range of policies that the public deems acceptable and desirable.
But an overarching data protection legislation cannot adequately address the challenges relating to social media regulation. Adopting an independent law, like the UK’s forthcoming Online Safety Bill, could prove to be an important instrument to govern such entities. That Bill requires social media platforms to take steps against illegal user content, and deals with the protection of children, age verification, and end-to-end encryption, among other matters.
The JPC raises legitimate concerns on the lack of accountability, the harms of illegal content, and threats to individual and collective privacy – all of which justify the need for regulation of social media entities. However, restrictions on our fundamental rights to speech and privacy cannot be arbitrary and ill-defined, as that would be unconstitutional.