Children’s Data Under the DPDP Act, 2023 and DPDP Rules, 2025: The New Compliance Frontier for EdTech, Gaming, Social Media, and Consumer Platforms

Introduction
The protection of children’s personal data has emerged as one of the most sensitive and heavily regulated areas under global privacy laws. India has now entered this space decisively with the Digital Personal Data Protection Act, 2023 (DPDP Act) and the Digital Personal Data Protection Rules, 2025. Together, they create one of the most stringent frameworks in the world for processing children’s data.
While the DPDP Act establishes the legal foundations (verifiable consent, and prohibitions on tracking, targeted advertising, and behavioural monitoring), Rules 10, 11 and 12 operationalise these mandates with precise verification systems, permitted exemptions, and special safeguards. For India’s rapidly expanding digital ecosystem, particularly EdTech platforms, gaming companies, social media platforms, OTT services, online learning platforms, and child-focused consumer apps, these provisions mark a significant regulatory shift.
The stakes are high. The DPDP Act empowers the Government and the Data Protection Board (DPB) with strong enforcement powers, and the penalty schedule prescribes fines of up to ₹200 crore for violations relating to children’s data. For industry players, compliance is no longer optional; it is a strategic, operational, and ethical necessity.
Who Is a Child Under the DPDP Act?
The DPDP Act defines a child as a person who has not completed 18 years of age. This is broader than several global privacy regimes (such as COPPA in the US, which defines a child as under 13), and aligns with India’s legal system, which treats 18 as the age of majority.
This has enormous implications because India has one of the world’s largest minor populations, many of whom participate actively in online learning, gaming, social interaction, streaming, and mobile applications.
Core Obligations Under the DPDP Act: A High-Level View
Under Section 9 of the DPDP Act
1. Verifiable parental consent is mandatory before processing any personal data of a child.
2. Data fiduciaries must not:
- track or monitor children’s behaviour,
- conduct targeted advertising directed at children, and
- process data in a manner likely to cause “harm.”
3. Additional obligations apply to “risk-prone” platforms such as social media intermediaries.
Together, these obligations reflect a shift from consent-only models to a risk-based, protection-first model for minors.
DPDP Rules, 2025: The Operational Framework
Rule 10 – Verifiable Parental Consent Mechanisms
Rule 10 introduces a robust and technically detailed system for obtaining verifiable parental consent, going significantly beyond mere declarations or checkboxes.
A Data Fiduciary must:
- Reliably identify an adult claiming to be the parent.
- Use authoritative identity credentials, including Aadhaar-linked Digital Locker tokens.
- Support multiple verification methods, enabling flexibility but ensuring reliability.
- Maintain audit trails and verification logs for regulatory examination.
Illustrations provided in the Rule demonstrate:
- When a parent is already a registered user,
- When they are not,
- When verification must be done through virtual tokens.
Operational Impact: Platforms must redesign onboarding flows entirely; simple “I am above 18” screens are no longer sufficient. Mobile apps must build age gates, verification APIs, parent dashboards, and withdrawal-of-consent mechanisms.
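To make the redesign concrete, below is a minimal Python sketch of one way a verifiable-consent flow with an audit trail could be structured. It is illustrative only: names such as `ParentalConsentService` and `identity_verifier` are assumptions, and the pluggable verifier stands in for whatever authoritative credential check (for example, a Digital Locker virtual-token validation) the platform adopts.

```python
import hashlib
import uuid
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ConsentRecord:
    """Audit-trail entry for one verifiable parental consent event."""
    child_account_id: str
    parent_identity_hash: str   # hashed credential reference, never raw identity data
    verification_method: str    # e.g. "digilocker_virtual_token", "registered_user"
    granted_at: datetime
    record_id: str = field(default_factory=lambda: str(uuid.uuid4()))
    withdrawn_at: datetime | None = None

class ParentalConsentService:
    def __init__(self, identity_verifier):
        # identity_verifier is a pluggable adapter for an authoritative
        # credential check (hypothetical; supplied by the platform).
        self._verify = identity_verifier
        self._records: dict[str, ConsentRecord] = {}

    def obtain_consent(self, child_account_id: str, credential: str, method: str) -> ConsentRecord:
        if not self._verify(credential):
            raise PermissionError("Adult identity/age could not be reliably verified")
        record = ConsentRecord(
            child_account_id=child_account_id,
            # Store only a hash so the audit log itself holds no raw identity data.
            parent_identity_hash=hashlib.sha256(credential.encode()).hexdigest(),
            verification_method=method,
            granted_at=datetime.now(timezone.utc),
        )
        self._records[child_account_id] = record
        return record

    def withdraw_consent(self, child_account_id: str) -> None:
        # Withdrawal must be as easy as the original grant, and must be logged.
        record = self._records[child_account_id]
        record.withdrawn_at = datetime.now(timezone.utc)

    def has_valid_consent(self, child_account_id: str) -> bool:
        record = self._records.get(child_account_id)
        return record is not None and record.withdrawn_at is None
```

A real deployment would persist `ConsentRecord` entries in durable, tamper-evident storage so that verification logs can be produced during a regulatory examination.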
Rule 11 – Data Processing for Children With Disabilities
Rule 11 recognises that certain children, and adults with severe disabilities, require lawful guardians to provide consent on their behalf. Key obligations for Data Fiduciaries include:
- Verifying guardianship through certificates issued by courts, district authorities, or local-level committees under disability laws;
- Using only authorised identity and disability credentials.
Impact: Platforms must train customer service teams to recognise and validate guardianship documents. EdTech and healthcare apps must update processes to handle such requests with sensitivity and legal accuracy.
Rule 12 – Exemptions for Essential Child Services
Rule 12 provides limited and conditional exemptions to protect essential activities without compromising safety. Two categories exist:
(A) Entity-Based Exemptions (Fourth Schedule, Part A). These include:
- Hospitals and healthcare establishments,
- Schools, educational institutions,
- Crèches, day-care centres,
- Transport services for children.
These entities may process children’s data without parental consent only when strictly necessary for health, safety, or essential services.
(B) Purpose-Based Exemptions (Fourth Schedule, Part B). These allow processing for:
- Real-time location tracking (e.g., school buses),
- Age verification for restricting harmful content,
- Safety filtering,
- Subsidy or welfare delivery.
Interpretation: These exemptions are narrow, purpose-bound, and do not provide blanket immunity. Organisations cannot use these exemptions for analytics, profiling, or monetisation.
Prohibition of Behavioural Tracking and Targeted Advertising
The DPDP Act places sweeping prohibitions on:
- behavioural monitoring,
- tracking,
- targeted advertising.
This impacts:
- gaming companies using behavioural economics and nudges,
- social media recommendation engines,
- EdTech platforms with engagement analytics,
- content platforms analysing viewing patterns.
Anything that profiles behaviour, predicts choices, or tailors ads based on activity is prohibited for children. Companies must configure tracking systems to exclude users under 18 from profiling mechanisms; a minimal segmentation sketch follows the list below.
This may require:
- complete re-engineering of algorithms,
- dual-mode recommendation systems,
- user-segmentation by age,
- location-based or identity-based age verification.
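As an illustration of the dual-mode approach, here is a minimal Python sketch that gates profiling on age. `User`, `profiling_allowed`, and `behavioural_model` are hypothetical names, and a production system would rely on a verified age-assurance signal rather than self-declaration.

```python
from dataclasses import dataclass

ADULT_AGE = 18  # the DPDP Act treats anyone under 18 as a child

@dataclass
class User:
    user_id: str
    age: int  # must come from a verified age-assurance signal, not self-declaration

def profiling_allowed(user: User) -> bool:
    """Single gate that every behavioural-analytics and ad pipeline must pass."""
    return user.age >= ADULT_AGE

def recommend(user: User, catalogue: list[str], behavioural_model=None) -> list[str]:
    # Dual-mode recommendations: personalised ranking for adults,
    # a non-profiled ordering (curated or chronological) for children.
    if behavioural_model is not None and profiling_allowed(user):
        return behavioural_model.rank(user, catalogue)  # hypothetical model API
    return sorted(catalogue)  # stand-in for any non-personalised ordering
```

Centralising the age check in one function, rather than scattering it across pipelines, makes it far easier to demonstrate to a regulator that no under-18 user reaches a profiling code path.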
Enforcement: Penalties and Compliance Risk
Violations involving children’s data are among the most heavily penalised under the DPDP Act’s Schedule. Penalties may reach ₹200 crore for failure to observe obligations relating to children’s personal data.
In addition:
- The Data Protection Board (DPB) may investigate platforms,
- The Board may order urgent remedial measures,
- In extreme cases, the Government may order blocking of non-compliant platforms under Section 37 of the Act,
- Repeat violations significantly raise penalty risk.
For platforms serving large child populations (gaming, EdTech, streaming), the risk is both regulatory and reputational.
Impact on Key Sectors
EdTech Companies
EdTech platforms must:
- Implement reliable parental verification,
- Design child-friendly privacy notices,
- Remove personalised ads,
- Provide data erasure dashboards,
- Establish parental control centres.
India is among the world’s largest online education markets; compliance will be resource-intensive.
Gaming Companies
Gaming entities face major operational challenges:
- Age verification must be built into account creation,
- Behavioural monitoring (e.g., addiction detection) must be redesigned,
- Microtransactions and in-game purchases must have parental visibility,
- Reward-based or targeted incentives may be prohibited.
Social Media Platforms
Social platforms must:
- Prevent children from being targeted with personalised content,
- Stop behavioural profiling,
- Build robust complaint-handling mechanisms,
- Implement age assurance systems with high accuracy.
User-generated content platforms must adapt moderation and reporting workflows to handle parental complaints swiftly.
Consumer Platforms & Retail Apps
E-commerce and consumer apps must:
- Implement age gating,
- Avoid behavioural data analysis for minors,
- Prevent cross-use of child data for marketing.
Even loyalty programmes and teen-oriented shopping platforms must overhaul data flows.
Special Considerations for Significant Data Fiduciaries (SDFs)
Platforms likely to be classified as Significant Data Fiduciaries (SDFs), that is, those with high impact, high volume, elevated risk, or children-focused services, will face additional regulatory scrutiny.
Under Rule 13, SDFs must:
- Conduct annual DPIAs,
- Undergo independent data audits,
- Assess algorithmic risk to children,
- Report audit findings to the DPB.
This could apply to:
- large EdTech platforms,
- major gaming companies,
- social networks,
- streaming platforms.
SDFs must maintain detailed documentation for every child-related data flow.
Technical, Legal, and Governance Measures Required for Compliance
Companies must build end-to-end children’s data governance frameworks.
1. Technical Measures
- Robust age-verification mechanisms,
- Identity-backed parental authentication,
- Re-engineered consent flows,
- High-quality logging and audit trails,
- Automated triggers for consent withdrawal (sketched after this list),
- Removal of behavioural trackers for children,
- Real-time segmentation of under-18 users.
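The “automated triggers” item could, for instance, be realised with a simple publish-subscribe fan-out so that every subsystem cleans up when consent is withdrawn. This is a hedged sketch: `ConsentWithdrawalBus` and the subscriber examples are illustrative placeholders, not a prescribed architecture.

```python
from typing import Callable

class ConsentWithdrawalBus:
    """Fan out a consent-withdrawal event to every subsystem that
    processes the child's data (analytics, ads, third-party processors)."""

    def __init__(self):
        self._handlers: list[Callable[[str], None]] = []

    def subscribe(self, handler: Callable[[str], None]) -> None:
        self._handlers.append(handler)

    def withdraw(self, child_account_id: str) -> None:
        for handler in self._handlers:
            handler(child_account_id)

# Hypothetical subscribers; each subsystem registers its own cleanup.
bus = ConsentWithdrawalBus()
bus.subscribe(lambda cid: print(f"analytics: purge events for {cid}"))
bus.subscribe(lambda cid: print(f"ads: remove {cid} from all audiences"))
bus.subscribe(lambda cid: print(f"processor-api: notify vendors about {cid}"))
bus.withdraw("child-123")
```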
2. Legal Measures
- Updating privacy policies and notices under Rule 3,
- Modifying terms of service,
- Negotiating new contracts with third-party processors,
- Ensuring foreign vendors comply with DPDP obligations,
- Establishing SOPs for handling personal data breaches under Rule 7,
- Aligning children’s data practices with retention rules under Rule 8.
3. Governance Measures
- Appointing trained data-handling personnel,
- Updating internal data governance charters,
- Conducting DPIAs specifically for child data processing,
- Creating child-safety committees,
- Running staff training modules,
- Maintaining readiness for DPB inquiries.
Compliance Checklist for Companies Processing Children’s Data
1. Age-Gating: Mandatory age verification at the account creation level.
2. Verifiable Parental Consent: Use of digital locker tokens, document verification, or consent manager mechanisms.
3. No Tracking or Targeted Ads: Disable personalised advertising and behavioural analytics.
4. Data Minimisation: Collect only essential information required for the service.
5. Risk Assessment: Conduct and document regular DPIAs.
6. Retention Limits: Delete data upon purpose fulfilment, with 48-hour advance notice per Rule 8 (a minimal scheduler sketch follows this checklist).
7. Parental Rights: Provide portals for access, correction, and erasure requests.
8. Vendor Controls: Contractually bind third-party processors to child-protection standards.
9. Incident Response: Ensure breach notifications are sent promptly to parents and the DPB.
10. Governance Documentation: Maintain SOPs, policy documents, risk registers, and audit reports.
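As flagged in item 6, the following minimal Python sketch shows how a retention sweep with the 48-hour advance notice might be automated. The `run_retention_sweep` function and the dataset schema are illustrative assumptions; the exact triggers and notice content should be taken from the notified text of Rule 8.

```python
from datetime import datetime, timedelta, timezone

NOTICE_WINDOW = timedelta(hours=48)  # advance intimation window under Rule 8

def run_retention_sweep(datasets, now=None, notify=print, erase=print):
    """Periodic sweep: issue the 48-hour notice, then erase once due.

    Each dataset is a dict with 'user_id', 'erase_at' (an aware datetime
    fixed when the processing purpose was fulfilled) and a 'notified' flag.
    """
    now = now or datetime.now(timezone.utc)
    for ds in datasets:
        if not ds["notified"] and now >= ds["erase_at"] - NOTICE_WINDOW:
            notify(f"Rule 8 notice: data for {ds['user_id']} erases at {ds['erase_at']}")
            ds["notified"] = True
        elif ds["notified"] and now >= ds["erase_at"]:
            erase(f"erasing personal data for {ds['user_id']}")

# Example: invoked from a daily scheduled job
queue = [{"user_id": "child-123",
          "erase_at": datetime(2026, 1, 10, tzinfo=timezone.utc),
          "notified": False}]
run_retention_sweep(queue)
```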
Strategic Recommendations for Businesses
1. Re-engineer user onboarding: Platforms must implement robust age-assurance systems and parent-child account linking.
2. Separate child and adult data pipelines: Systems for minors must be isolated to prevent accidental targeting or tracking.
3. Develop child-safe UX frameworks: Design interfaces that minimise data sharing and maximise clarity.
4. Adopt least intrusive data models: Use anonymous or pseudonymous processing wherever possible.
5. Build proactive parental dashboards: Empower parents to control visibility, approvals, and data deletion.
6. Use AI responsibly: Ensure algorithms used for moderation or engagement do not inadvertently profile children.
7. Stay prepared for DPB inquiries: Maintain documentation, logs, and consent verification trails.
Conclusion
The DPDP Act and the DPDP Rules, 2025, together constitute a landmark shift in how children’s data must be processed in India. With strict prohibitions, verifiable parental consent requirements, narrow exemptions, and high penalties, the framework sends a clear message: child safety is non-negotiable in the Indian digital ecosystem.
For EdTech companies, gaming platforms, social media intermediaries, OTT services, and digital consumer brands, compliance is not just a legal requirement; it is a reputational, ethical, and business imperative. Companies that move early to embed robust child-protection measures into their operations will earn trust, reduce regulatory exposure, and differentiate themselves in an increasingly competitive digital market.
Contributed by – Aurelia Menezes