Children & Technology — Model Policy
Status: Draft
Last updated: 2026-04-05
Maintainers: Open Digital Policies community
Related domains: AI Adoption, Algorithmic Accountability, Digital Accessibility, Smart Cities & Privacy
Key sources: UN Convention on the Rights of the Child (UNCRC), UK Children’s Code (Age Appropriate Design Code), US COPPA, EU DSA, US KOSA (proposed)
Overview
Children are not small adults. They are among the most targeted groups in the digital economy — for manipulation, surveillance, algorithmic amplification of harmful content, and commercial exploitation. Digital platforms are designed by adults, for adults, and optimised for engagement metrics that treat children’s attention as a monetisable resource.
The evidence base is substantial: excessive social media use correlates with depression and anxiety in adolescents, particularly girls; algorithmic recommendation systems can rapidly route vulnerable young people toward content promoting eating disorders, self-harm, or extremism; data broker ecosystems built on children’s data are used for targeted advertising and profiling; and educational technology frequently collects far more data than is necessary for learning.
This is not an argument against children’s access to digital technology — quite the opposite. Children have rights to information, to communication, to cultural participation, and to education that digital technology can meaningfully support. The goal of this policy model is to ensure that children can participate in digital life in ways that support their development, protect their rights, and respect their dignity — rather than exploiting their vulnerabilities for commercial gain.
The Core Tension
We want children to benefit from digital technology — for learning, communication, creativity, and participation — without exposing them to design patterns engineered to exploit developmental vulnerabilities, without building surveillance profiles that follow them into adulthood, and without amplifying content that causes demonstrable harm.
Scope
- Municipal / local government (educational technology, public digital services)
- Regional / state / provincial government
- National government
- Public sector procurement
- Regulated platforms and digital services accessible to children (private sector)
- Other: _______
Definition of child: This policy applies to any person under the age of 18. Where platforms cannot reliably verify age, protections apply by default.
Pillar 1: Principles
Foundational Values
1. **Children’s Best Interests as the Primary Consideration.** Article 3 of the UNCRC requires that the best interests of the child be the primary consideration in all actions affecting children. This applies to digital platform design, data practices, algorithmic systems, and content governance. “Best interests” means what is genuinely good for children’s development, wellbeing, and rights — not what maximises their engagement time, which is the commercial interest of platforms.
2. **Children Are Rights-Holders, Not Products.** Children have rights under the UNCRC — including rights to privacy (Article 16), protection from exploitation (Article 36), access to information (Article 17), education (Article 28), and participation (Article 12). Digital policy for children must be grounded in these rights, not only in safety restrictions. Overprotective policy that denies children access to digital information and communication also violates their rights.
3. **Design Must Not Exploit Developmental Vulnerabilities.** Adolescents have developing prefrontal cortices — they are biologically more impulsive, more sensitive to social reward, more susceptible to peer influence, and more vulnerable to anxiety and depression than adults. Platform design that uses variable reward mechanisms, social comparison features, infinite scroll, and engagement notifications exploits these developmental characteristics for commercial gain. This is not neutral product design; it is targeted manipulation.
4. **Data Collected From Children Requires Heightened Justification.** The data broker ecosystem that profiles adults is harmful; the same ecosystem applied to children is especially harmful. Data collected from or about children in educational contexts may define how those children are treated by institutions for decades. Children’s data requires: minimal collection, strict purpose limitation, default deletion at adulthood, and a presumption against commercial use.
5. **Parental Oversight Is Not a Substitute for Platform Accountability.** Placing full responsibility on parents to protect children from harmful design is both unrealistic (parents cannot monitor every interaction) and unfair (it removes accountability from the actors who created the harm). Platform design, algorithmic systems, and data practices are the primary point of intervention — parental tools are a supplement, not a substitute.
6. **Young People Must Have Agency.** Children — particularly adolescents — have rights to participate in decisions that affect them (UNCRC Article 12). Digital policy for children should not be made entirely by adults about children. Young people should be involved in the design of protections, the development of policy, and the oversight of platforms that serve them.
7. **Inclusion and Access Must Not Be Sacrificed.** Children from low-income households, children with disabilities, children in rural areas, and children from marginalised communities often have fewer digital opportunities, not more. Any restrictions on children’s digital access must be designed so that they do not further disadvantage already marginalised children. Accessibility is not optional.
Equity Considerations
- Girls and young women — Current evidence suggests that social media’s mental health harms are disproportionately experienced by adolescent girls, particularly through social comparison, appearance-based harassment, and eating disorder content amplification.
- LGBTQ+ young people — LGBTQ+ youth use digital platforms at high rates for community, identity exploration, and access to information not available in their local environments. Restrictions that reduce this access harm a particularly vulnerable group. Policy must protect LGBTQ+ young people from harassment while preserving their access to affirmative communities.
- Children with disabilities — Educational and social technologies often exclude children with disabilities. Accessibility must be a mandatory requirement in all children’s technology.
- Children from low-income households — Technology policy must not create systems where protection from harmful design is a premium feature available only to well-resourced families. Protections must be defaults, not opt-ins.
- Children in state care — Children in foster care and residential care are among the most surveilled and most vulnerable. Their digital data deserves heightened protection, and digital services for them must be designed with particular care.
Environmental Considerations
Devices marketed to children — tablets, laptops, educational technology hardware — are often designed with short product lives and limited repairability, generating e-waste at scale. Educational technology procurement should include device longevity requirements. See Right to Repair model.
Pillar 2: Standards
Mandatory Standards
Standard 1: Prohibited Practices in Services Accessible to Children

The following practices are prohibited in digital services that are likely to be accessed by children, regardless of whether those services are formally directed at children:
(a) Targeted advertising based on profiling of children under 18;
(b) Variable reward mechanisms (including “like” counts, streaks, social comparison features, and notifications designed to maximise session return) in services used primarily by children under 16;
(c) Collecting data from children for the purpose of building commercial profiles, sale to data brokers, or use in advertising systems — regardless of parental consent;
(d) Designing or deploying algorithms that recommend or amplify content related to self-harm, eating disorders, suicide, or targeted harassment to users identified or reasonably identifiable as minors;
(e) Using children’s biometric data (including facial recognition) in educational settings without explicit informed consent from the child (where of sufficient maturity) and parent, and with a genuine non-biometric alternative available;
(f) Deploying dark patterns — design choices intended to circumvent users’ intentions — in services used by children.
Rationale: UK Age Appropriate Design Code (Children’s Code) Standard 5 (“detrimental use of data”) prohibits using children’s personal data in ways shown to be detrimental to their wellbeing. The California Age-Appropriate Design Code Act (AADC, 2022) extends similar requirements to any online service likely to be accessed by children. The EU DSA (2022) Article 28 prohibits targeting minors with advertising based on profiling. COPPA (US, 1998; Rule amended 2024) prohibits collection of personal data from children under 13 without verifiable parental consent.
Reference: UK Children’s Code (ICO); California AADC; EU DSA Article 28; COPPA Rule 2024
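To make Standard 1 reviewable in practice, a regulator or procuring body could encode the prohibitions as a screening checklist. The sketch below is illustrative only: the `ServiceProfile` fields and violation labels are hypothetical, not drawn from any statute, and prohibition (d) is omitted because harm amplification requires an algorithmic audit (see Standard 5) rather than a self-declared flag.

```python
from dataclasses import dataclass

@dataclass
class ServiceProfile:
    """Self-declared or audited characteristics of a digital service.

    Hypothetical fields for illustration; not drawn from any statute.
    """
    likely_accessed_by_children: bool
    profiled_ads_to_minors: bool             # Standard 1(a)
    variable_reward_mechanisms: bool         # streaks, like counts, return-bait notifications
    used_primarily_by_under_16s: bool
    commercial_profiling_of_children: bool   # Standard 1(c)
    biometrics_in_education: bool
    biometric_consent_and_alternative: bool  # consent obtained AND non-biometric option offered
    dark_patterns: bool

def standard_1_violations(s: ServiceProfile) -> list[str]:
    """Flag apparent breaches of Standard 1 (excluding (d), which needs an algorithmic audit)."""
    if not s.likely_accessed_by_children:
        return []  # the standard applies only to services likely to be accessed by children
    violations = []
    if s.profiled_ads_to_minors:
        violations.append("(a) targeted advertising based on profiling of under-18s")
    if s.variable_reward_mechanisms and s.used_primarily_by_under_16s:
        violations.append("(b) variable reward mechanisms in a service used primarily by under-16s")
    if s.commercial_profiling_of_children:
        violations.append("(c) commercial profiling of children's data")
    if s.biometrics_in_education and not s.biometric_consent_and_alternative:
        violations.append("(e) biometrics in education without consent and a non-biometric alternative")
    if s.dark_patterns:
        violations.append("(f) dark patterns in a service used by children")
    return violations
```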
Standard 2: Age Assurance Without Surveillance

Where digital services are restricted to adults or require differentiated protections based on age, age assurance mechanisms must:
(a) Use the minimum data necessary to establish the age category — not collect identity documents for storage;
(b) Not create a centralised database of age verification data that could be breached or misused;
(c) Be technically accessible to users with disabilities;
(d) Not require children to provide biometric data as the primary means of age verification;
(e) Not use age assurance data for any purpose other than applying the relevant age-differentiated protections.
Blanket age verification requirements that cannot be implemented without creating surveillance infrastructure are not acceptable policy outcomes.
Rationale: The UK Online Safety Act (2023) requires Ofcom to develop age verification codes. The practical implementation of age verification without creating new surveillance harms is an active technical and policy challenge. This standard sets the principles for acceptable age assurance rather than mandating specific technical approaches.
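As a complement to these principles, the following sketch shows the minimal shape such an exchange might take from the service's side: the service receives only an age category from an external attester, stores no identity documents, and defaults to the most protective category when assurance fails (consistent with the scope note that protections apply by default where age cannot be reliably verified). The type names are hypothetical, and a real deployment would rest on a vetted cryptographic attestation protocol that this sketch does not implement.

```python
from dataclasses import dataclass
from enum import Enum

class AgeCategory(Enum):
    UNDER_13 = "under_13"
    FROM_13_TO_15 = "13_15"
    FROM_16_TO_17 = "16_17"
    ADULT = "18_plus"

@dataclass(frozen=True)
class AgeAttestation:
    """The only fact the service receives from the attester: an age category.

    No birthdate, identity document, or biometric template is transferred
    or stored (Standard 2(a), (b), (d)), and the attestation must not be
    reused for any other purpose (Standard 2(e)).
    """
    category: AgeCategory

def applicable_category(attestation: AgeAttestation | None) -> AgeCategory:
    """Age category whose protections the service should apply.

    Where age cannot be reliably established, protections apply by
    default, so a missing or failed attestation maps to the most
    protective category.
    """
    if attestation is None:
        return AgeCategory.UNDER_13
    return attestation.category
```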
Standard 3: Data Minimisation and Deletion for Children’s Data

Digital services that collect data from or about children shall:
(a) Collect only data strictly necessary for the service function — with commercial data collection presumptively prohibited;
(b) Retain children’s data for no longer than [12] months after the end of the service relationship, unless the child (if of sufficient maturity) or their parent specifically requests longer retention;
(c) Delete or anonymise all data from educational technology platforms within [90] days of a student leaving the school or district;
(d) Not sell or transfer children’s data to third parties for commercial purposes;
(e) Provide a simple, accessible mechanism for children and their parents to request deletion of all data held about the child;
(f) Present a privacy notice in age-appropriate, plain language — not solely in adult legal terms.
Rationale: California’s Student Online Personal Information Protection Act (SOPIPA, 2014) prohibits using student data for advertising and requires deletion on request. New Mexico’s Student Data Privacy Act (2019) is the most comprehensive state framework. COPPA’s data minimisation provisions have been significantly strengthened in the 2024 Rule.
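A minimal sketch of how a service might encode Standard 3's retention clocks follows. The 365-day and 90-day constants mirror the bracketed template defaults in (b) and (c) and would be replaced by whatever values a jurisdiction adopts; the function name is illustrative.

```python
from datetime import date, timedelta

# Bracketed template defaults from Standard 3(b) and (c); replace with adopted values.
POST_SERVICE_RETENTION = timedelta(days=365)  # [12] months after the service relationship ends
EDTECH_DELETION_WINDOW = timedelta(days=90)   # [90] days after a student leaves the school

def deletion_due(relationship_ended: date, is_edtech: bool,
                 longer_retention_requested: bool = False) -> date | None:
    """Date by which the child's data must be deleted or anonymised.

    Returns None only where the child (if of sufficient maturity) or a
    parent has specifically requested longer retention (Standard 3(b)).
    """
    if longer_retention_requested:
        return None
    window = EDTECH_DELETION_WINDOW if is_edtech else POST_SERVICE_RETENTION
    return relationship_ended + window
```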
Standard 4: Educational Technology Procurement

Public bodies procuring educational technology — including learning management systems, assessment tools, communication platforms, and classroom devices — shall require:
(a) A data flow map showing all data collected, processed, and shared, and for what purposes;
(b) Written certification that student data will not be used for targeted advertising, sold to data brokers, or used to build commercial profiles;
(c) Conformance with applicable accessibility standards (WCAG 2.2 AA minimum);
(d) A data processing agreement that provides for deletion of student data within [90] days of contract termination;
(e) Disclosure of all subprocessors (third-party services with access to student data);
(f) Compliance with applicable student privacy laws (SOPIPA, FERPA, local equivalents) as a contract condition.
Rationale: EdTech vendors routinely collect more data than necessary and share it with undisclosed third parties. The 2022 OECD report on digital learning and Human Rights Watch’s investigation into EdTech data practices (conducted during 2021 remote learning, published 2022) found systematic data sharing that violated children’s privacy across dozens of countries.
Reference: HRW: “How Dare They Peep into My Private Life” (2022); SOPIPA (California)
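Standard 4(a)'s data flow map lends itself to a machine-readable schema that procurement staff can screen automatically. The sketch below is one hypothetical shape for such a schema; the field names and red-flag rules are illustrative, not drawn from any named procurement framework.

```python
from dataclasses import dataclass, field

@dataclass
class DataFlow:
    """One row of a vendor's data flow map (Standard 4(a)). Illustrative fields."""
    data_element: str                 # e.g. "student name", "assessment score"
    purpose: str                      # must map to a genuine service function
    shared_with: list[str] = field(default_factory=list)  # subprocessors (Standard 4(e))
    retention_days: int = 90          # contractual deletion window (Standard 4(d))

PROHIBITED_PURPOSES = {"advertising", "profiling", "resale", "data broker transfer"}

def procurement_red_flags(flows: list[DataFlow]) -> list[str]:
    """Screen a vendor's declared flows for conditions that would bar contract award."""
    flags = []
    for f in flows:
        if f.purpose.lower() in PROHIBITED_PURPOSES:
            flags.append(f"{f.data_element}: prohibited commercial purpose '{f.purpose}'")
        if f.retention_days > 90:     # [90]-day template default
            flags.append(f"{f.data_element}: retention of {f.retention_days} days exceeds window")
    return flags
```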
Standard 5: Algorithmic Systems in Children’s Services

Any algorithmic recommendation or content curation system used in a service primarily serving users under 18 shall:
(a) Not optimise for engagement or session time as a primary metric — metrics must include wellbeing indicators;
(b) Be assessed for evidence of harm amplification (including self-harm, eating disorder, harassment, and extremism content) before deployment and annually thereafter;
(c) Default to chronological or editorially curated content feeds rather than engagement-optimised algorithmic feeds for users under 16, with explicit opt-in to algorithmic feeds for 16–17 year olds;
(d) Apply additional filtering for content promoting self-harm, suicide, eating disorders, and targeted harassment when serving users identified as minors;
(e) Be subject to audit under the Algorithmic Accountability framework.
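The age-differentiated defaults in Standard 5(c) reduce to a small decision rule, sketched below under the assumption that an age category is already known (see Standard 2). The names are hypothetical.

```python
from enum import Enum

class Feed(Enum):
    CHRONOLOGICAL = "chronological"
    EDITORIAL = "editorial"
    ALGORITHMIC = "algorithmic"

def default_feed(age: int, opted_into_algorithmic: bool = False) -> Feed:
    """Feed selection under Standard 5(c).

    Under-16s never receive an engagement-optimised algorithmic feed;
    16-17 year olds receive one only after an explicit opt-in; adults
    are outside the scope of this standard.
    """
    if age < 16:
        return Feed.CHRONOLOGICAL
    if age < 18:
        return Feed.ALGORITHMIC if opted_into_algorithmic else Feed.CHRONOLOGICAL
    return Feed.ALGORITHMIC
```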
Aspirational Standards
Aspirational Standard 1: Children’s Digital Rights Commissioner

Jurisdictions should consider establishing a Children’s Digital Rights Commissioner — an independent office with authority to investigate platform practices, commission research, and issue guidance on children’s digital rights. Ireland’s proposed Digital Safety Commissioner and Australia’s eSafety Commissioner provide reference models.
Aspirational Standard 2: Youth Advisory Boards for Platform Governance

Platforms serving children should establish youth advisory boards with actual decision-making power over design choices affecting young users — not merely consultation. Young people should have a genuine role in platform governance, not just product testing.
Standards Cross-Reference
| Standard | Body | Notes |
|---|---|---|
| UN Convention on the Rights of the Child | UN | 1989; ratified by 196 states; the foundational normative framework |
| UK Age Appropriate Design Code (Children’s Code) | ICO (UK) | 2021; 15 standards; most comprehensive design code globally |
| California AADC | California Legislature | 2022; design code approach applied to US context |
| EU Digital Services Act Article 28 | EU | 2022; advertising prohibition; risk assessment for minors |
| COPPA (Rule 2024) | FTC (US) | 2024 revision; strengthened data minimisation |
| EU GDPR Article 8 | EU | 2016/679; age of consent for data processing (16, or 13 with member state opt-down) |
| SOPIPA | California | 2014; student data protection; widely copied by other US states |
Pillar 3: Implementation
Procurement Requirements
Educational Technology Procurement Clause

All contracts for educational technology shall include: (a) a prohibition on use of student data for any commercial purpose; (b) a data deletion timeline of [90] days post-contract; (c) an accessibility conformance requirement; (d) a right to audit data flows; (e) a notification obligation within [72] hours of any data breach affecting student data. Failure to comply with any of these conditions shall constitute a material breach giving the procuring body a right to terminate without penalty.
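For contract administration, the clause's conditions can be tracked as a simple compliance record in which failure of any single condition constitutes material breach. This is an illustrative sketch; the field names are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class EdTechContractStatus:
    """One boolean per condition in the procurement clause above. Illustrative."""
    no_commercial_use_of_student_data: bool   # (a)
    deletion_within_window: bool              # (b) [90] days post-contract
    accessibility_conformant: bool            # (c)
    audit_right_honoured: bool                # (d)
    breach_notified_in_time: bool             # (e) within [72] hours

def material_breach(c: EdTechContractStatus) -> bool:
    """Failure of any single condition is a material breach permitting termination."""
    return not all((c.no_commercial_use_of_student_data, c.deletion_within_window,
                    c.accessibility_conformant, c.audit_right_honoured,
                    c.breach_notified_in_time))
```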
Transition and Timeline
| Milestone | Timeframe from adoption |
|---|---|
| Audit of existing educational technology contracts | 6 months |
| Prohibited practices cease in procured services | 12 months |
| Data minimisation requirements in all new EdTech procurement | Immediate |
| Existing EdTech contracts renegotiated or terminated | 24 months |
| Algorithmic harm assessments for platforms serving primarily minors | 18 months |
Reporting and Transparency
Platform Transparency Report

Digital platforms that primarily serve users under 18, or that have more than [500,000] minor users, shall publish an annual Children’s Safety and Wellbeing Report including: (a) the number of minor users and age distribution; (b) content moderation actions for content harmful to minors; (c) algorithmic audit results; (d) complaints received from minors and their outcomes; (e) data deletion requests received and fulfilled; (f) changes to design features affecting minor users during the year.
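The report's required contents (a) through (f) can be expressed as a fixed schema, which makes year-on-year comparison and cross-platform aggregation straightforward. The sketch below is one hypothetical encoding; the field names are illustrative.

```python
from dataclasses import dataclass

@dataclass
class ChildrensSafetyReport:
    """Annual report fields mirroring items (a) through (f) above. Illustrative names."""
    year: int
    minor_users: int                            # (a)
    age_distribution: dict[str, int]            # (a) e.g. {"under_13": 0, "13_15": 0}
    moderation_actions_minor_harm: int          # (b)
    algorithmic_audit_summary: str              # (c)
    complaints_from_minors: int                 # (d)
    complaints_upheld: int                      # (d)
    deletion_requests_received: int             # (e)
    deletion_requests_fulfilled: int            # (e)
    design_changes_affecting_minors: list[str]  # (f)
```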
Enforcement
Enforcement Clause

The designated authority may: (a) investigate complaints about digital services’ compliance with this policy; (b) conduct proactive audits of algorithmic systems in services used primarily by minors; (c) impose administrative penalties scaled to global turnover; (d) require design changes as a condition of continued operation; (e) receive complaints from parents, children, civil society organisations, and educators. Children’s digital rights organisations shall have standing to bring collective complaints.
Pillar 4: Governance
Oversight Body
Children’s Digital Rights Authority

The designated regulatory body shall have: a Children’s Technology team with expertise in child development, digital design, and data systems; authority to conduct unannounced audits of platform design and algorithmic systems; power to order algorithmic changes and design modifications; and a dedicated child-friendly complaints mechanism. The body shall publish an annual Children’s Digital Rights report.
Community Representation
Young People’s Panel

The oversight body shall maintain a Young People’s Panel of 15–21 year olds with: the right to raise issues directly with the oversight body; access to non-confidential platform audit information; a formal response to any issues raised; and compensation for their time. Panel members shall be recruited to reflect demographic diversity.
Audit and Review
Biennial Independent Review

This policy shall be reviewed every two years given the pace of platform evolution. Reviews must include: direct consultation with young people; assessment of evidence on harms and benefits; evaluation of whether prohibited practices are adequately defined; and assessment of enforcement effectiveness.
Real-World Examples
United Kingdom — Age Appropriate Design Code (Children’s Code)
Enacted: 2020 (statutory code under the Data Protection Act 2018)
In force: September 2021
Regulator: Information Commissioner’s Office (ICO)
Link: https://ico.org.uk/for-organisations/uk-gdpr-guidance-and-resources/childrens-information/childrens-code-guidance-and-resources/
Summary: Fifteen standards applicable to any online service “likely to be accessed by children.” Standards include: best interests of the child; data minimisation; default settings protecting children; no nudge techniques; no profiling by default; parental controls without surveillance. Has significantly influenced global platform design — Google, YouTube, TikTok, and others made design changes in response.
Community critique: Enforcement actions have been slow; the Code applies to UK users but platforms serve global audiences with different protections.
California — Age-Appropriate Design Code Act (AB 2273)
Enacted: 2022
Status: Implementation challenged in court; a US District Court partially blocked enforcement (2023); case ongoing
Link: https://leginfo.legislature.ca.gov/faces/billNavClient.xhtml?bill_id=202120220AB2273
Summary: Modelled directly on the UK Children’s Code. Requires businesses to consider the best interests of children, prohibits dark patterns, limits data use, and mandates data protection impact assessments. Legal challenges from NetChoice argued First Amendment concerns with the algorithmic restriction provisions. The litigation illustrates the tension between children’s protection and platform speech rights that any effective policy must navigate.
European Union — Digital Services Act (DSA) Article 28
In force: February 2024
Regulator: European Commission (for Very Large Online Platforms); national Digital Services Coordinators
Link: https://digital-strategy.ec.europa.eu/en/policies/digital-services-act-package
Summary: Prohibits online platforms from targeting minors with advertising based on profiling. Requires annual systemic risk assessments for very large online platforms, including risks to minors, and imposes specific mitigation requirements for identified risks. The DSA’s enforcement architecture — with the Commission having direct authority over the largest platforms — provides stronger enforcement than delegated national frameworks.
Community critique: The advertising prohibition is narrower than the UK Code’s approach; risk assessment obligations depend on platforms self-reporting their risks.
Australia — eSafety Commissioner
Established: 2015 (powers expanded by the Online Safety Act 2021)
Link: https://www.esafety.gov.au
Summary: Australia’s eSafety Commissioner has broad powers to: require takedown of harmful online content; investigate complaints; conduct research; develop industry codes; and issue public warnings about non-compliant platforms. The Commissioner has issued compliance notices to Twitter/X, Meta, and Google. Australia’s 2024 legislation banning children under 16 from social media is the world’s most restrictive age-based platform ban — its practical implementation and effects are under active scrutiny.
Ireland — Data Protection Commission (DPC) Children’s Investigations
Active: 2023–present
Summary: The Irish DPC, as lead EU supervisory authority for many tech giants headquartered in Ireland, has conducted landmark investigations into children’s data practices. A €345 million fine against TikTok (2023) for failing to protect children’s data is among the largest GDPR penalties for children’s data violations. The DPC’s investigations demonstrate that GDPR enforcement, when applied to children’s data, can produce significant change — and significant resistance from platforms.
Gaps and Known Weaknesses
- Age verification implementation — Requiring age assurance without creating surveillance infrastructure is technically unsolved at scale. This model sets the principles but the technical standards are still developing.
- End-to-end encryption tension — Strong encryption protects children from predators and protects LGBTQ+ youth from exposure; it also prevents detection of child sexual abuse material (CSAM). This tension is genuine and not resolved in this model.
- Global platform jurisdiction — Platforms serving children globally are subject to different protections in different countries. A child in one jurisdiction may have far weaker protections than a child in another. International coordination is essential but underdeveloped.
- Educational technology market concentration — A small number of vendors (notably Google and Microsoft) provide educational technology to the majority of schools globally. Their market power makes procurement conditions difficult to enforce in practice.
- Game design and gambling mechanics — Loot boxes, in-app purchases targeting children, and gambling-like game mechanics are addressed incompletely in current frameworks. This model’s prohibition on variable reward mechanisms addresses the most harmful design patterns but full treatment of gaming is needed.
Cross-Domain Dependencies
| Related Domain | Relationship |
|---|---|
| AI Adoption | AI systems in educational settings and content recommendation for children require heightened governance |
| Algorithmic Accountability | Algorithms amplifying harmful content to minors require independent audit and bias assessment |
| Digital Accessibility | Children’s technology must be accessible to children with disabilities |
| Smart Cities & Privacy | Smart city surveillance affects children in public spaces; biometric data in schools |
| Right to Repair | Educational technology hardware longevity; device repairability in schools |
Glossary
Age Appropriate Design: Design principles requiring that digital services accessible to children consider children’s best interests, with privacy-protective defaults, no exploitative design patterns, and age-appropriate content and data practices.
Dark Patterns: Design choices that manipulate users into taking actions contrary to their interests — including hidden unsubscribe options, confusing cancellation flows, pre-ticked boxes, and deceptive urgency prompts. Particularly harmful when applied to children.
Variable Reward Mechanisms: Design features that provide unpredictable positive reinforcement (e.g., “like” notifications, slot machine-style rewards) — exploiting dopamine response in ways analogous to gambling. Documented to be more addictive than predictable rewards.
COPPA (Children’s Online Privacy Protection Act): US federal law (1998, amended 2024) requiring parental consent before collecting personal data from children under 13. The most established children’s privacy framework globally, though limited to data collection by services directed at children.
GDPR Age of Digital Consent: Under GDPR Article 8, children may consent to data processing for information society services from age 16 (or as low as 13 with member state opt-down). Below the age of consent, parental consent is required.
SOPIPA (Student Online Personal Information Protection Act): California law (2014) prohibiting operators of websites, online services, online applications, or mobile apps directed primarily to K-12 students from using student information for commercial purposes or selling it to third parties.
FERPA (Family Educational Rights and Privacy Act): US federal law protecting the privacy of student education records. Applies to schools and districts receiving federal funding.
Contributing to This Policy Model
This model is maintained in the open. Priority contribution needs:
- Age verification technical standards — approaches that achieve age assurance without creating surveillance infrastructure
- Gaming and gambling mechanics — comprehensive treatment of monetisation design targeting children
- Global South examples — children’s digital rights frameworks from African, Asian, and Latin American contexts
- LGBTQ+ youth protections — model language that protects LGBTQ+ youth from harassment while preserving access to affirming communities
- Neurodiversity and cognitive accessibility — specific provisions for children with autism, ADHD, and learning differences
Open an Issue to propose changes or additions. See CONTRIBUTING.md for the contribution process.
All substantive changes go through a minimum 14-day public comment period before merging.
Changelog
| Version | Date | Summary of changes |
|---|---|---|
| 0.1 | 2026-04-05 | Initial draft — four pillars, real-world examples from UK, California, EU, Australia, Ireland |
This policy model is provided for educational and advocacy purposes. It requires adaptation by qualified legal practitioners before formal adoption. It is not legal advice.