Millions of Indian children used to go to school every day. They would largely study in classrooms, with textbooks, teachers, and blackboards. To ensure their safety, whether on a school bus or on campus, security guards or helpers were often hired. Some schools even chose to install security cameras to keep their students and campuses safe.

Yet, COVID-19 changed things significantly. It didn’t matter whether everyone owned a smartphone or laptop: classes were shifted online anyway, with mixed results. Gone were the days of face-to-face learning. Instead, thanks to the pandemic, millions of children are logging in to online ed-tech services like Byju’s, Unacademy, and Toppr to learn from home. Ed-tech platforms in India are booming as a result. Byju’s–valued at $8 billion (USD)–witnessed a 150% surge in activity in the first month of the lockdown, while Noida-based Edumarshal has witnessed a 250% spike over the three-month period.


Yet, as more and more of India’s children are forced to go online to learn, who are the security guards monitoring the safety of children using these new online schools? What should they be guarding in the first place? When children register with ed-tech services, they generate copious amounts of data on themselves–be it personal contact details, or specific behavioural data on how they learn. If this data is breached–which actually happened in the case of Unacademy and Vedantu–and ends up in the wrong hands, it can make minors and their parents especially vulnerable to threats, intimidation, and even long-term targeted behavioural profiling. Yet, beyond the ‘big, bad hacker’, are ed-tech companies themselves even using children’s data appropriately?

And so, as Indian students continue to depend on ed-tech platforms to learn for the foreseeable future, we also need to think about how we’re protecting their digital and personal safety. Are ed-tech companies currently responsibly using and protecting children’s data? How is the Indian state monitoring ed-tech companies’ protection of children’s data? Are our Internet laws strong enough to protect a child’s right to privacy online? Are government and private ed-tech solutions being monitored differently under the law? We answer all these questions and more, in this month’s On the Fence.


“As ed-tech companies’ data breaches reveal, several platforms may not be deploying adequate security and technology standards–course-correcting for the sake of vulnerable users is imperative.”

— Pallavi Bedi, Senior Policy Officer, The Centre for Internet & Society

“Within the contours of a data protection law, the protection of vulnerable groups, such as children, could be an area where the definition and understanding of a fiduciary relationship, or obligation to care, can be improved upon.”

— Smitha Krishna Prasad, Director, Center for Communication Governance, NLU Delhi

“Under the Draft Personal Data Protection Bill, there is some uncertainty about whether data privacy restrictions for children are also applicable to state schools or state-run ed-tech interventions, in addition to those in the private sector.”

— Rahul Narayan, Advocate-on-record, Supreme Court of India

As with other essential services, such as health care and the delivery of public resources, there has been a clamour for education resources to be provided online, and for schools and other educational institutions to adopt virtual classrooms and teaching methodologies.

For example, in 2015, the Ministry of Human Resource Development launched the ePathshala application, available on Android and iOS, which hosts educational resources and digital books.

Since then, over the last five years or so, there has also been a proliferation of private ed-tech companies such as Byju’s, Vedantu, and Unacademy in India. Their well-packaged digital lessons have caught the imagination of students, parents, and, to some extent, even schools and teachers.

However, while private ed-tech platforms have flourished, the legal regime on the protection of personal information and data collected from children has not kept pace with the changes in the education system. 

For example, on May 8, 2020, Unacademy, “India’s largest learning platform,” reported that the personal data of a staggering 22 million users had been compromised in January 2020. The platform caters to a large section of the school-going population–worryingly, however, this is not the first such case. In September 2019, Vedantu, another private ed-tech company which also targets school-going students, experienced a similar data breach that exposed the data of 68,000 of its customers.

The Stakes

It is widely accepted that children’s privacy and processing of personal information should be subjected to greater protection, as children are unable to fully comprehend the consequences of their actions. This gets amplified in the digital world, where consent forms that request informed access to their personal information are often mired in complex language that a child may not fully understand. And so, children’s privacy is exposed to greater threats.

Ed-tech companies could be privy to multiple levels of personal and sensitive personal data. In the absence of a robust data protection and privacy law, these platforms face multiple security and privacy threats. Concerns include the exposure of a student’s personally identifiable information, biometric data, academic information, geolocation, web browser history, IP addresses, and classroom activities, as well as the potential disclosure of student’s disciplinary and medical information.

Enter: The PDP Bill

In view of the several infirmities in the IT Act of 2000, the Government has proposed the new Personal Data Protection Bill, 2019 (PDP Bill). Currently being reviewed by a joint parliamentary committee, the Bill is applicable to both private sector entities, as well as government authorities. The PDP Bill is a departure from the existing data protection regulations under the IT Act, which are only applicable to private entities. 

Like the IT Act, the PDP Bill is also sector-agnostic. However, there is a significant difference between the two. Irrespective of the sector, all entities (private or government) who determine the purpose for the collection of any data (referred to as ‘data fiduciaries’ under the PDP Bill) will have to comply with the security provisions of the Bill.

So, all educational institutions–whether physical or online–will be regarded as data fiduciaries, as they determine the purposes for which educational data is collected. Therefore, they will have to comply with the statutory obligations prescribed under the Bill.

These obligations include limitations on the extent of personal data that can be collected: data can only be collected for a specific purpose necessary for the ed-tech platform. Furthermore, platforms must obtain informed consent from the user after clearly informing them of the purpose and type of data being collected, the period for which it will be retained, and the security measures being deployed. Individuals will also have the right to seek the correction and erasure of data, as well as the right to withdraw consent for any further processing of data. Separate consent must also be obtained for collecting sensitive personal data–which includes health data, financial data, genetic data, or data which could identify the caste or sexual orientation of the individual.

Why is this a significant departure? Now, the familiar “opt-in” (i.e., tick a box that says “I allow my data to be used for analysis purposes”) or “opt-out” (i.e., remove a tick from a box that says “I allow my data to be used for analysis purposes”) options listed during user registration on an ed-tech platform may no longer be considered valid mechanisms for obtaining a user’s consent.
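To make the contrast concrete, here is a minimal sketch, in TypeScript, of the difference between a single bundled opt-in/opt-out checkbox and the kind of purpose-specific, withdrawable consent record the PDP Bill appears to contemplate. The type and field names are entirely hypothetical and are not drawn from the Bill’s text or from any platform’s actual code.

```typescript
// Illustrative sketch only: field names and structure are hypothetical and are
// not drawn from the PDP Bill's text or any ed-tech platform's implementation.

// The familiar pattern: one (sometimes pre-ticked) checkbox bundling every use of data.
interface BundledConsent {
  allowDataUse: boolean; // "I allow my data to be used for analysis purposes"
}

// A purpose-specific pattern: one record per purpose, capturing what the user was
// told (retention period, etc.), who consented, and whether consent was withdrawn.
type Purpose = "account_creation" | "learning_analytics" | "demographic_research";

interface PurposeSpecificConsent {
  purpose: Purpose;
  consentGiven: boolean;        // defaults to false; never pre-ticked
  givenByGuardian: boolean;     // parental/guardian consent where the user is under 18
  noticeShownAt: Date;          // when the purpose and retention notice was displayed
  retentionPeriodDays: number;  // the retention period disclosed in that notice
  withdrawnAt?: Date;           // supports the right to withdraw consent later
}

// A registration flow would then hold one record per purpose,
// rather than inferring blanket consent from a single checkbox.
const exampleRecord: PurposeSpecificConsent = {
  purpose: "learning_analytics",
  consentGiven: false,
  givenByGuardian: true,
  noticeShownAt: new Date(),
  retentionPeriodDays: 365,
};
```

The point of the sketch is simply that, under the Bill, consent becomes granular, informed, and revocable, rather than a one-time tick at sign-up.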

How the PDP Bill May Impact Ed-tech in India, and Children’s Privacy

Now, the PDP Bill specifically recognises the need to address the issue of children’s personal data independently from the general processing of data mentioned above, allocating a separate section of the Bill to it. The Bill considers any person below the age of 18 to be a child, which is in line with the age of consent prescribed under the Indian Contract Act.

As per the Bill, any data fiduciary which seeks to process the personal data of children will have to verify the age of the child and obtain the consent of the parent or guardian. The manner of obtaining the consent of the parent or guardian will be determined by the rules to be framed by the Government. The rules will take into consideration the volume of personal data processed, how much of that personal data is likely to be that of a child, and whether it is possible that there will be any harm to the child from the processing of such personal data.

The PDP Bill has also created another class of data fiduciaries with respect to children, known as guardian data fiduciaries. Guardian data fiduciaries have been defined as those data fiduciaries that operate commercial websites or online services directed at children, or who process large volumes of children’s data. They are barred from profiling, tracking, or targeting advertising at children using their data, and from undertaking any other processing of personal data that can cause significant harm to children.

Ed-tech platforms such as Byju’s or Toppr, for example, which are primarily directed at children and therefore collect a large volume of children’s personal data, will be regarded as guardian data fiduciaries. But this may conflict with their current data processing policies.

As per Byju’s privacy policy, the user’s personal information will be used “to analyse trends, to conduct research, to administer the Application/Services and products, to learn about each user’s learning patterns and movements around the Application/Services and products and to gather demographic information and usage behaviour about its user base as a whole.” As per Toppr’s privacy policy, personal data is also used to “gather broad demographic data.”

And so, if the PDP Bill is passed in its current form, it could have a significant impact on the business models of these companies. They may be prohibited from tracking and using children’s personal data to analyse behaviour, learning patterns, and systems. Further, they may also be prohibited from collecting and analysing any trends emanating from the demographic data of children. The prospects of this fast-growing sector, whose tailored, data-driven educational insights have thrived in the absence of meaningful legal restrictions on user data, may be fundamentally affected by the onset of the PDP Bill.

In any case, with the PDP Bill still under review, there are a few steps ed-tech companies can incorporate into their operations right now to make their data collection processes more secure and transparent. The privacy policies of ed-tech platforms should be clearly available in an easily accessible manner at the sign-up stage, and be understandable for students, parents, and teachers alike. As ed-tech companies’ data breaches reveal, several platforms may not be deploying adequate security and technology standards–course-correcting for the sake of vulnerable users is imperative.


Views expressed are personal.

The Draft Personal Data Protection Bill, 2019 (Draft PDP Bill) provides us with a glimpse of what data protection laws in India may look like in the future. The Draft PDP Bill will impact any ed-tech service that collects or processes personal information, whether for the creation of an account on a website, payment for services, or the creation of a detailed profile of a student for further analysis. Its obligations include basic data protection measures that ensure that limited amounts of personal data are processed, and that processing is undertaken for a specific purpose that the user is aware of and has consented to (with some exceptions). The obligations of an ed-tech service that caters to minors will be even greater.

Now, while these obligations are applicable in relation to the processing of all personal data, there are at least two aspects of the Draft PDP Bill that stand out in relation to ed-tech providers that target minors. The first is the chapter of the Draft PDP Bill that specifically addresses the collection and processing of children’s data. The second is the conception of data controllers as ‘data fiduciaries’ under the Draft PDP Bill.

The impact of the latter is less clear from the Draft PDP Bill, but it can be better understood with a more general understanding of fiduciary relationships and how they increase an ed-tech provider’s responsibility towards the students that use its services.

Data Fiduciaries in the Draft PDP Bill

The Draft PDP Bill, and an earlier version–the draft Personal Data Protection Bill, 2018–both adopt relatively new nomenclature for stakeholders in the context of personal data protection laws. Typically, data protection laws have identified three stakeholders: (i) data subjects, or the individuals whose personal data is processed, (ii) data controllers, or the service providers who collect and process, or control the processing of such personal data, and (iii) data processors, or third parties who process certain personal data on behalf of the data controller. 

The Draft PDP Bill has also drawn from the concept of fiduciary relationships–or relationships of trust between the user and the more powerful data controller–to define data subjects as data principals, and data controllers as data fiduciaries.

However, the Draft PDP Bill does not seem to first consider or explain the nature of the fiduciary obligations that any data fiduciary is supposed to undertake. These obligations are generally in the nature of a duty of care towards the principal. In this case, it is not clear whether ordinary compliance with the provisions of the law is enough, or whether this shift in nomenclature signifies that data controllers will have to act in a fiduciary capacity over and above compliance with the law. Such questions are made more complicated by the history of fiduciary law in India.

So, how do all of these understandings of fiduciaries affect the processing of children’s data?

The (Multiple) Possible Data Fiduciaries Concerning Children’s Data

Under the Draft PDP Bill, every data fiduciary that processes the personal data of a child (i.e., anyone below the age of 18) must do so in a manner that is protective of the child’s rights and best interests. Once the Draft PDP Bill is passed, the Data Protection Authority (DPA) (the proposed overarching data regulator for the country) is also expected to create regulations that guide how these data fiduciaries verify the age of the data principal, and in the cases of children, obtain the consent of the parents for the collection and processing of personal data. These obligations will apply to any ed-tech service provider that collects and processes the personal data of children (whether a private company or a government agency). 

However, ed-tech companies could be subject to even more fiduciary obligations under the Draft PDP Bill, depending on how they are classified.

For example, in addition to the primary obligations mentioned above, the Draft PDP Bill also places additional obligations on ‘guardian data fiduciaries’ (GDF). The DPA is responsible for classifying a data fiduciary as a GDF if it (i) operates commercial websites or online services directed at children, or (ii) processes large volumes of children’s data. In this context, any ed-tech providers who primarily cater to K-12 education, or whose users are largely made up of children, may be considered GDFs. 

What’s important to note here is that while the regulations guiding age verification, as well as the additional obligations applicable to GDFs, are tied to the risk of harm to the child and their data, the classification of such data fiduciaries itself is not.

Simply put: guardian data fiduciaries are not defined by how much risk they put the user in by virtue of their data processing.

This is relevant in the context of India’s ed-tech companies, because of the nature of their data processing in the first place.

For example, if an ed-tech provider is categorised as a GDF, they will be “barred from profiling, tracking or behaviourally monitoring” children, and will also be barred from “any other processing of personal data that can cause significant harm to the child.” 

This prohibition could potentially restrict the solutions and services that ed-tech providers are able to offer, as they are largely based on the tracking of a user’s performance.

The Draft PDP Bill does provide that this prohibition may be modified where the data fiduciary offers counselling or child protection services to a child–indicating that the aim is to ensure that the child’s rights are protected. In this context, it could be argued that educational services are also in the interest of the child, and should be allowed certain limited modifications to this prohibition as well. 

However, while the broader goal of most ed-tech services may be to offer tools and opportunities for education–which is assumed to be in the interest of the child–a more detailed evaluation of the benefits and risks of different ed-tech solutions may be required before going down that road.

However, GDFs are not the only kind of fiduciary that ed-tech companies can be classified as!

Separately, the larger ed-tech providers, or those dealing with sensitive personal information (whether of children or not), could also be considered significant data fiduciaries (SDFs). Why might they be classified as SDFs? Because the risk of harm by means of processing personal data is one of the factors that the DPA is expected to take into consideration for the classification of SDFs as well.

SDFs have additional obligations to improve transparency and accountability measures, under the Draft PDP Bill. For instance, they would need to conduct a data protection impact assessment before using any new technology or undertaking large scale profiling, maintain detailed records of their data management and security practices, and ensure that regular independent audits of these practices are conducted.

Moving Ahead

It is easy to assume that ed-tech services which cater to children have heightened responsibilities to ensure that their rights, especially their right to privacy, are well protected. To begin with, while the Draft PDP Bill itself does not guarantee that a guardian data fiduciary will be treated as a significant data fiduciary, many of these obligations might go a long way towards ensuring that ed-tech providers take adequate measures to protect children’s data and rights. Further, within the broader contours of a data protection law, the protection of vulnerable groups such as children could be an area where the definition and understanding of a fiduciary relationship, or obligation to care, can be improved upon.

A more detailed discussion of these issues, both by the drafters of the Draft PDP Bill and by the legislators constituting the Joint Parliamentary Committee that is currently examining it, may help service providers, as well as users, better understand the protections the law offers.


Views expressed are personal.

To paraphrase Jane Austen, it is a truth universally acknowledged that an innovator in possession of a good app must be in want of Data. Generally speaking, the better the quality of data collected, the more effective the app. 

However, excessive collection, processing, and sharing of data violates the ancient human right to privacy. Accordingly, data processing–whether by the State or by private entities–has been at the centre of technological policy debate for over 20 years.

Now, most nations are attempting to balance privacy rights with the insatiable maw of data collection in the Internet age. At the heart of this balance is the requirement of informed consent of the data principal, without which data processing is not permitted.

Now, ed-tech deals with the data of children, who are unable to provide informed “consent” for the purposes of the law, and who are generally recognized as being particularly vulnerable to data fraud and profiling.

Thus, in nearly every regime of data protection, the collection and processing of children’s data are subject to greater and more intense regulation than that of other data principals. 

In India, the draft Personal Data Protection Bill (Draft PDP Bill) is no exception, as it creates a special category of ‘Guardian Data Fiduciaries’ and imposes other special restrictions on fiduciaries dealing with the data of children, in particular to stop profiling (Section 16). This operates in addition to the requirements of informed consent for all data processing, imposed in Section 11 of the Draft PDP Bill.

However, there is some uncertainty about whether such restrictions are also applicable to state schools or state-run ed-tech interventions, in addition to those in the private sector. Section 12 of the Draft PDP Bill provides that when the state performs any legally authorised function, or provides a service or benefit to the data principal, data can be collected without the consent requirements of Section 11. It also allows for data processing by the state without consent, in order to provide assistance or services during any disaster.

A plain reading of the Draft PDP Bill reveals that while Section 12 is an exception to Section 11, it has not been declared to be an exception to Section 16, which deals with children’s data. Thus, Section 16 continues to apply to government ed-tech platforms and schools, while the requirement for informed consent under Section 11 does not.

As the Draft PDP Bill makes its way through Parliament, one hopes more clarity emerges on this vital aspect, and that vigorous protections are retained for children enrolled in the state education system.

On Closer Examination: The MHRD’s DIKSHA

Schools, school boards, and HRD ministries all over the country are now scrambling to adapt to education in the digital age. As a result, they are encouraging the use of platforms like DIKSHA, to ensure that the entire school year is not wasted. A closer inspection of the state’s ed-tech platforms reveals the long-term data hurdles that may emerge as millions of students begin to use them.


In 2017, the MHRD introduced DIKSHA primarily for teachers, to enable, accelerate, and amplify solutions in the realm of teacher education. It was intended to “aid teachers to learn and train themselves for which assessment resources will be available.” Now, however, the focus of the app is shifting from teachers to students, who are being encouraged to enrol in ever-greater numbers for their online lessons.


In the absence of data protection laws, how well is student data protected on DIKSHA? A perusal of the Privacy Policy and Terms of Use of the DIKSHA App is useful.

Guest users of DIKSHA who don’t register on the platform simply reveal information such as the user type (teacher or student), board, medium, and class. Registered users, however, reveal much more information, such as usernames, email addresses, contact numbers, subjects, boards, learning mediums, districts, and states. For minors registering, the email address or phone number of parents/guardians is also required. The IP address is collected once when the app is installed to determine the state and district of the user, after which it is permanently deleted. The data on the state is then stored as part of the device profile. 

For unregistered users, personal details are only stored on the user’s device, and not on the system; for registered users, details are stored on the system. DIKSHA assures that the device profile is handled securely. The app collects personal data and telemetry data relating to usage, and uses it to deliver services to the student or user.
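As a rough illustration of that split, the sketch below (in TypeScript) paraphrases the data categories described in DIKSHA’s published Privacy Policy: the guest profile holds only coarse, non-identifying preferences kept on the device, while the registered profile adds identifying details stored on the system. The type and field names are assumptions for illustration; the platform’s actual schema is not public.

```typescript
// Illustrative sketch only: paraphrases the data categories described in DIKSHA's
// Privacy Policy; the type and field names are assumptions, not the real schema.

// Guest user: coarse preferences only, stored locally on the device.
interface GuestProfile {
  userType: "teacher" | "student";
  board: string;
  medium: string;
  class: string;
}

// Registered user: identifying details, stored on the system.
interface RegisteredProfile extends GuestProfile {
  username: string;
  email?: string;
  contactNumber?: string;
  district: string;
  state: string;            // derived once from the IP address, which is then deleted
  guardianContact?: string; // email or phone of a parent/guardian, required for minors
}
```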

The Privacy Policy assures that all reasonable measures under the IT Act, 2000, and the Reasonable Security Practices and Sensitive Personal Data Rules, 2011, are followed by the app. Data is primarily stored in electronic form, and the administrators of DIKSHA may enter into agreements with third parties for such storage. Users also have the right to demand the deletion of their information. DIKSHA permits students to access, view, and use content without registering themselves.

What does all of this mean?

DIKSHA’s policies reveal an adherence to data minimisation and purpose limitation, even if they lack rigour with regard to the non-sharing and deletion of data. Minors have also been offered some protection and choice in terms of what they share.

The issue with DIKSHA, though, and indeed with other government ed-tech apps, is that the circumstances of the pandemic may make it effectively mandatory for students to register on the platform, even though the clear design and purpose of the app was teacher-centric.

In other words, while DIKSHA may be perfectly adequate to protect the privacy of adult teachers who voluntarily use it, it may require tweaks to incorporate best practices on the data protection and privacy of minors, who are enrolling in ever-increasing numbers under a government mandate. Such tweaks or modifications should be made so as to protect “privacy by design,” which is the standard all data processors have to conform to under the Draft PDP Bill.

The MHRD and other relevant ministries should consider modifying DIKSHA to account for how the app is being used now, if this app and others are to become, by default, repositories of student data. A study of the worrying breaches of data privacy in private ed-tech apps, to identify possible loopholes, is a start. The danger of behavioural profiling of students at the hands of the state and other entities should also be taken into account.

Moving Ahead

There is no rational reason to demand anything less than the highest standards of protection of the data of minors, whether from the government, or from the private sector. Considering the data breaches of ed-tech apps in the past, it is best always to exercise caution as a user. Guardians should ask themselves whether it is necessary to register the minor with DIKSHA to take full advantage of the app, if it has not been expressly mandated. 

Ed-tech apps have revolutionised education in ways scarcely imaginable earlier. But, we are entering this “brave new world” without a real trial run. We must cross the river by feeling the stones.


Views expressed are personal.

Pallavi Bedi
Pallavi is a Senior Policy Officer at The Centre for Internet and Society (CIS), Bengaluru, where she conducts research on data protection and privacy. She is a lawyer by training, with undergraduate and postgraduate degrees from the National Law Institute University, Bhopal, and the Fletcher School of Law, Tufts University, respectively. Prior to joining CIS, she worked with a corporate law firm and the United Nations Development Programme, New Delhi.
Smitha Krishna Prasad
Smitha Krishna Prasad is a Director at the Centre for Communication Governance (CCG) at the National Law University, Delhi, where she heads the Centre's programmatic portfolio. Her research interests relate to international and domestic laws on privacy, data protection, and surveillance, with a focus on sectors where the intersection of law, policy, and technology have an increased impact on the right to privacy.
Rahul Narayan
Rahul Narayan is an advocate-on-record in the Supreme Court of India, practicing in the fields of commercial, technological, and constitutional laws. Drawing on his involvement in the Right to Privacy and Aadhaar cases in the Supreme Court, Rahul works and regularly writes on issues relating to digital rights, privacy, and data protection. He completed the BCL from the University of Oxford in 2005.
