As we predicted in Lewis Silkin's data, privacy & cyber team's watch outs for 2026: "Children's data remains a global priority. There is a clear expectation that platforms popular with children will demonstrate end-to-end accountability by mapping child journeys, evidencing proportional age-assurance measures and aligning content safety controls with UK GDPR duties, OSA and DSA obligations".

The ICO certainly agrees and has kicked off 2026 with a clear message to online platforms: get your house in order when it comes to children's data, or face the consequences. First there was the MediaLab fine earlier this month – for more information see The ICO steps up on protecting children online – and now, on 24 February 2026, the ICO announced a £14.47 million fine against Reddit for failing to protect children's personal information.

Why was Reddit fined?

Despite Reddit's policy prohibiting users under the age of 13 from accessing its platform, the company failed to implement any age verification measures until July 2025. Even when measures were finally introduced, Reddit relied on self-declaration of age during the account creation process and when users accessed mature content. The ICO had warned Reddit that self-declaration is "easy to bypass", leaving children at risk.

The ICO found two key failures: 

  1. First, Reddit had no robust age assurance mechanisms, which meant it had no "lawful basis" for processing the personal data of children under the age of 13, increasing their risk of being exposed to inappropriate and harmful content.
  2. Second, the company failed to carry out a data protection impact assessment focused on risks to children before January 2025, even though teenagers were permitted to use the service.

The result? Children were potentially exposed to inappropriate content, and the ICO was not impressed. The ICO signalled that platforms must do better and urged the wider industry to reflect on its age assurance practices and make any necessary improvements as a matter of urgency.

What's the bigger picture?

Information Commissioner John Edwards made his position crystal clear: simply asking users to declare their own age is not good enough. The ICO is now actively monitoring platforms that rely primarily on self-declaration methods.

This fine is part of a broader push under the ICO's Children's Code strategy, which reported "strong progress" in December 2025. The regulator is clearly willing to use its enforcement powers where platforms fall short.

What should organisations do?

If your organisation operates an online service likely to be accessed by children, now is the time to review your approach to age assurance. The ICO expects organisations to match the rigour of their age verification methods to the level of risk on their platform. Self-declaration alone is unlikely to cut it where children may be at risk.

With two significant fines in quick succession and the ICO promising continued focus in this area, 2026 looks set to be the year of children's data protection enforcement.

"Children under 13 had their personal information collected and used in ways they could not understand, consent to or control. That left them potentially exposed to content they should not have seen. This is unacceptable and has resulted in today's fine. Let me be clear. Companies operating online services likely to be accessed by children have a responsibility to protect those children by ensuring they're not exposed to risks through the way their data is used. To do this, they need to be confident they know the age of their users and have appropriate, effective age assurance measures in place."

– John Edwards, UK Information Commissioner