In a two-pronged approach covering online safety and data protection laws, both Ofcom and the ICO have written to major tech platforms to ask them to enforce their minimum age rules with highly effective age checks.
Ofcom
Ofcom is requiring tech companies to take action in four areas:
- Effective minimum-age policies. Ofcom research shows that minimum age policies of 13, although widespread, are still not being properly enforced by tech companies, with 72% of children aged 8-12 accessing their sites and apps. Although the Online Safety Act 2023 does not explicitly require this, Ofcom is calling on platforms to enforce these policies using highly effective age assurance.
- Failsafe grooming protections. This means strict controls to stop strangers contacting children on their platforms, including the use of highly effective age assurance to check users' ages.
- Safer feeds for children. Ofcom says that algorithms are children's main pathway to harm online. Ofcom is issuing information requests to large platforms so it can assess these systems. Due to the complexities and technical nature of this work, it will take time to assess the responses. However, Ofcom says that it will not hesitate to take enforcement action if it identifies failings in how companies promote content to children.
- An end to product testing on children. New AI tools are launched regularly and widely used by children, without parents knowing whether or how they have been tested for safety. Ofcom expects platforms to notify it that they have, as required by law, assessed the risks of significant updates before deploying them.
Ofcom has set a deadline of 30 April for the tech platforms to report back on the action they will take, and is urging them to publish their responses. In May, Ofcom will report on how the companies have responded and announce any next steps for regulatory action. At the same time, it will release new research on how far children's online experiences have changed during the first year of the Online Safety Act being in force.
ICO
Separately, the ICO has published an open letter to social media and video‑sharing platforms operating in the UK, calling on them to strengthen age assurance measures so that young children cannot access services that are not designed for them. The open letter sets out the ICO's expectation that platforms with a minimum age must move beyond relying on children to self-declare their ages, a check they can easily bypass. Instead, platforms should make use of the viable technology that is now readily available to enforce their own minimum ages and prevent these children from accessing their services.
The ICO has also written directly to certain larger platforms, asking them to demonstrate how their age assurance measures meet these expectations. This forms part of its Children's Code strategy to improve children's privacy standards across social media and video-sharing platforms, but the ICO wants companies to go further on age assurance. It recently fined Reddit £14.47 million and MediaLab (owner of Imgur) £247,590 for failing to implement age assurance measures and for unlawfully processing children's personal information in a way that potentially exposed children to inappropriate, harmful content.
In addition, the ICO remains concerned about how social media and video‑sharing platforms process children's data to generate recommendations, especially when this leads to harmful content or increases the risk of addiction to platforms.
We recently published an article on the European Commission's investigation into Shein, which, among other things, is examining Shein's recommendation systems. The upcoming Digital Fairness Act will also seek to address some of these concerns. Meanwhile, the ICO has two ongoing investigations into this issue.
Next steps
Both regulators will publish an updated joint statement in March 2026, outlining the main areas of interaction between online safety and data protection as they relate to age assurance. We also await the outcome of the UK government's consultation on children's online wellbeing. This week, MPs rejected amendments to the Children's Wellbeing and Schools Bill, including one that would have required all regulated user-to-user services to implement highly effective age assurance measures to prevent under-16s from becoming users. However, the Commons supported the UK government's own amendment, which would allow it to act more swiftly on a social media ban or restriction following the outcome of the consultation in the summer.
In the meantime, we advise that you take immediate steps to review and strengthen your age assurance practices. The regulators have made clear that simply relying on children to self-declare their ages is no longer acceptable.
