UK government commits to introducing social media restrictions for children – regardless of consultation outcome

The Children's Wellbeing and Schools Act has received Royal Assent. It includes a new power enabling the UK government to act quickly in response to findings from the ongoing Children's Digital Wellbeing consultation.

On 27 April, Minister Olivia Bailey told the House of Commons that the government had listened carefully to concerns raised across both Houses about the need to act swiftly once the consultation concludes. Accordingly, the Act provides that the Secretary of State "must", not merely "may", act following the consultation.

The government says it is consulting on the mechanism, but it has made clear that it intends to impose some form of age- or functionality-based restriction for children under 16. Significantly, it also suggests that changes will be made regardless of the consultation's outcome (although it may already have a sense of the direction of the responses received to date).

Any consideration of restrictions such as curfews would be in addition to these measures, not instead of them. The government's stated focus is on "addictive features", harmful algorithmically driven content, and functionality such as stranger pairing, which it says can be particularly damaging to children's safety and privacy.

The Act requires the government to publish a progress report within three months of Royal Assent, reflecting its intention to respond quickly once the consultation has concluded. Following that report, it will have 12 months to lay regulations, although it says it intends to do so by the end of 2026.

In exceptional circumstances, the government may be able to extend that timeline by a further six months. It says it would do so only in serious, unforeseen circumstances and would need to return to Parliament to explain why an extension was required. Reflecting concerns about harmful and addictive design, the Act also specifies that the Secretary of State must have due regard to such features when deciding how to exercise the power and when making future regulations.

The wider context

At EU level, the European Commission has recently found that some social media platforms have breached the Digital Services Act by failing to diligently identify, assess and mitigate the risks of minors under 13 accessing their services. It used its guidelines on the protection of minors to assess compliance with DSA requirements to ensure a high level of privacy, safety and security for minors. It has also developed an EU age verification app, which it says can serve as a reference framework for age verification using a user-friendly and privacy-preserving method. For more information about age assurance, see our article here.

The Commission is continuing to investigate other potential breaches of the DSA, including whether platforms have properly assessed and mitigated risks such as exploiting the weaknesses and inexperience of minors and driving addictive behaviour through design.

In a related development on protecting children online, the Crime and Policing Act 2026 has also recently received Royal Assent. Among other things, it criminalises the making, adapting and supplying (or offering to supply) of so-called nudification tools. It also strengthens the law on non-consensual intimate image abuse by creating a new offence of "screenshotting" an intimate image without consent, enabling courts to make deletion orders for non-consensual intimate images, and placing new duties on online platforms to ensure such images are taken down within 48 hours.
