ICO fines Reddit £14.47m for children’s privacy failures


🛡️ Regulatory Alert · February 2026

ICO fines Reddit £14.47m
& raises the bar on children’s age assurance

The UK ICO found Reddit unlawfully processed children’s personal information and failed to implement robust age assurance. The decision underscores that self-declaration is not enough where children are at risk.

🇬🇧 24 Feb 2026 · Primary source: ICO enforcement news
🧒 Age assurance: robust verification expected for services likely accessed by children
⚠️ What product and privacy teams should do now

📅 February 2026 · ✍️ DPO Advisors · ⏱️ 7 min read
Tags: ICO · Children · Age assurance
⚠️

Action required. If your service is likely to be accessed by children, treat age assurance as a core compliance control. Review lawful bases, DPIAs, and default settings for minors.

What the ICO decided

On 24 February 2026, the UK Information Commissioner’s Office announced a £14.47m fine against Reddit. The ICO found that Reddit failed to process children’s personal information lawfully and did not apply robust age assurance, despite its minimum-age terms. The ICO also emphasized that relying on self-declaration is risky where children can easily bypass such controls.

Issue
No robust age assurance meant Reddit lacked a lawful basis to process data of children under 13 (as framed by the ICO).

DPIA
The ICO states Reddit failed to carry out a DPIA assessing and mitigating risks to children before January 2025.

Controls
Age assurance introduced in July 2025 included maturity gating and age prompts at account creation.

Regulator view
Self-declaration alone is not sufficient when children may be at risk.

Signal
The ICO is focusing on online services that primarily rely on self-declaration as an age measure.

Trend
Children’s privacy enforcement continues as part of broader supervisory work and cross-regulator coordination (including with Ofcom).

🔍 What “good” looks like for children’s privacy controls

Translate “children’s privacy” into product requirements you can test and evidence.

🧾 DPIA first: Run a child-focused DPIA before launch or significant change. Identify harms, mitigations, and measurable acceptance criteria.

🔒 Age assurance aligned to risk: Select age assurance methods proportionate to the content and features. Avoid “policy-only” minimum age gates.

📊 Likely supervisory focus (qualitative)

  • Robust age assurance (High): Verification that is hard to bypass
  • Child-focused DPIA (High): Risk assessment before launch or change
  • Privacy-by-default settings (Med-High): High privacy for minors by default
  • Transparency and user rights (Medium): Clear notices and accessible controls


Age assurance and children’s data: operational requirements

Children’s privacy compliance is not a single feature toggle. It is a set of end-to-end controls: how you detect age, what defaults apply, how risky features are restricted, and how you evidence decisions and outcomes.

🔑 Core principle: if children are likely users, you must be able to demonstrate that your controls prevent exposure and reduce data use — not just that your terms prohibit minors.

📱 A practical age assurance flow

🧒 Age signal (verification layer) → ⚙️ defaults (high privacy) → 🛡️ enforcement (restriction + logs)

  • 🔒
    Choose an assurance method that matches the harm. The higher the risk, the stronger and harder-to-bypass the method should be.
  • 🧾
    Do the DPIA early. Include children-specific harms, mitigations, and test cases for bypass attempts.
  • ⚙️
    Set privacy-by-default. Limit profiling, sharing, and discoverability for minors by default.
  • 🧪
    Test controls like security. Red-team age gates and maturity restrictions across devices and channels.
  • 📋
    Keep evidence. Logs, decisions, and QA outcomes should be available for regulator review.
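The flow above can be sketched in code. This is a minimal illustration under assumptions, not a compliance implementation: the age thresholds, field names, and the `decide` logic are hypothetical choices for demonstration only.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

CHILD_AGE = 13   # minimum age in the service's terms (illustrative)
MINOR_AGE = 18   # threshold for high-privacy defaults (illustrative)

def privacy_defaults(age: int) -> dict:
    """High-privacy defaults for minors: limit profiling, ads, and discoverability."""
    minor = age < MINOR_AGE
    return {
        "profiling": not minor,
        "public_profile": not minor,
        "targeted_ads": not minor,
        "dm_from_strangers": not minor,
    }

@dataclass
class Decision:
    user_id: str
    estimated_age: int
    verified: bool
    allowed: bool
    defaults: dict
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

def decide(user_id: str, estimated_age: int, verified: bool) -> Decision:
    if estimated_age < CHILD_AGE:
        allowed = False          # below minimum age: refuse the account
    elif estimated_age < MINOR_AGE:
        allowed = verified       # teens: require a verified age signal
    else:
        allowed = True           # adults: self-declaration may suffice
    return Decision(user_id, estimated_age, verified, allowed,
                    privacy_defaults(estimated_age))

# Every decision is appended to an audit log: the "evidence" half of the
# control, kept available for regulator review.
audit_log: list[Decision] = []
audit_log.append(decide("u-123", estimated_age=15, verified=True))
```

Note the design choice: the age signal only gates access, while the defaults derive from the estimated age regardless, so a verified teen still gets high-privacy settings.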

Four concrete actions to take now

Use this enforcement outcome as a benchmark. If your platform is likely to be accessed by children, align legal basis analysis, product controls, and evidence to regulator expectations.

ACTION 01 · 🧭 Run a child-focused DPIA
Document risks, mitigations, and measurable acceptance criteria for age assurance and risky features.

ACTION 02 · 🔐 Upgrade age assurance
Move beyond self-declaration where risk is high. Ensure bypass resistance and a clear user journey.

ACTION 03 · ⚙️ Set minors’ defaults
Reduce data use by default: limit visibility, messaging, recommendations, and targeted ads for minors.

ACTION 04 · 📎 Build an evidence pack
Prepare documentation and logs: DPIA, policies, control designs, QA tests, and monitoring metrics.
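One way to make an evidence pack regulator-ready is a manifest that records a content hash and timestamp per artefact, so each document’s integrity can be demonstrated later. A minimal sketch; the artefact names and contents are hypothetical.

```python
import hashlib
import json
from datetime import datetime, timezone

def manifest_entry(name: str, content: bytes) -> dict:
    """Record an artefact with a SHA-256 hash so its integrity can be shown later."""
    return {
        "artefact": name,
        "sha256": hashlib.sha256(content).hexdigest(),
        "recorded_at": datetime.now(timezone.utc).isoformat(),
    }

# Hypothetical artefacts; in practice these would be read from files.
pack = [
    manifest_entry("dpia-children-2026.pdf", b"%PDF example content"),
    manifest_entry("age-gate-qa-results.json",
                   b'{"bypass_attempts": 40, "passed": true}'),
]
print(json.dumps(pack, indent=2))
```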

⚠️ Three lessons for privacy teams

Lesson 1
Minimum age terms are not a control. Regulators expect effective enforcement.
Lesson 2
DPIAs must be timely and practical, especially for children’s risk processing.
Lesson 3
Self-declaration is fragile. Where harm is high, treat robust age assurance as foundational.
🛡️ Need a children’s privacy readiness review?

DPO Advisors can help you assess age assurance options, update DPIAs, and translate the Children’s Code expectations into testable product requirements.

Talk to our experts →