At the very heart of the debate is making sure that age assurance is fit for purpose: it should protect children from harmful content while preserving free expression, privacy and anonymity for users wishing to access lawful content. Age assurance is effectively how services decide if a user is a child and if they should apply enhanced protections.
Regulators, governments and industry alike have acknowledged that age assurance technology is still developing, but it is clearly no longer acceptable for global businesses operating digital products or services that may be accessed by children simply to ask users to declare their own age.
We look at the key legal and regulatory developments shaping age assurance across the UK and EU and suggest practical steps for businesses to meet their current and forthcoming obligations.
- UK developments – regulatory bar is rising
- EU developments – towards a harmonised framework
- Practical recommendations
UK developments: the regulatory bar is rising
The UK’s regulatory framework centres on the ICO’s Age Appropriate Design Code (Children’s Code) and the Online Safety Act 2023 (OSA) enforced by Ofcom. The UK GDPR and the Data (Use and Access) Act 2025 further strengthen protections for children online.
Ofcom and the Online Safety Act 2023
The OSA requires service providers to implement age assurance to ensure that children are not normally able to encounter harmful content. The age assurance deployed must be "highly effective" at correctly determining whether a user is a child – a standard with significant implications for the technologies businesses choose to deploy (see our article here).
In 2025, Ofcom required services in scope of the OSA to complete children’s access assessments, children’s risk assessments and to have measures in place to ensure that children could not access pornography and harmful material.
Ofcom's enforcement position has hardened considerably. It has extended its age assurance enforcement programme to all platforms allowing users to share pornographic material and has launched a dedicated monitoring and impact programme for the largest platforms. Ofcom has also issued its first OSA financial penalty – a £1 million fine for an adult website provider for failing to have robust age checks in place.
Perhaps most significantly, on 12 March 2026, Ofcom wrote to major technology platforms requiring them to enforce their minimum age policies using highly effective age assurance, giving the platforms until 30 April 2026 to report on the specific actions they intend to take.
Businesses should regard this not as a routine piece of regulatory correspondence but as a clear precursor to enforcement action against platforms that fail to act.
On 26 March 2026, Ofcom and the ICO issued a joint statement on the interaction between online safety and data protection as they relate to age assurance (see below).
The regulators are collaborating and clarifying their enforcement approach, reducing any potential wiggle room for businesses. If you are in scope, you need to act now. Beyond monetary penalties, the reputational damage from non-compliance may be significant.
The ICO's Children's Code – and, finally, fines
The ICO's Children's Code translates data protection duties into practical design principles and privacy features. Its core requirement is appropriate age assurance.
After the Children’s Code took effect, the ICO initially adopted a collaborative approach, working with platforms to drive compliance. There have been success stories where the ICO’s intervention drove meaningful behavioural change, as well as international collaboration, e.g. a common approach to age assurance agreed with ten national data protection regulators. The ICO also issued an updated Opinion on Age Assurance explaining how services and age assurance providers can use the technology in compliance with data protection law in a risk-based and proportionate way.
On 1 December 2025, the ICO published its Children's Code Strategy progress update, reporting on its review of age assurance practices across 17 platforms popular with children in the UK. It also announced a targeted monitoring programme focused on platforms relying primarily on self-declaration as their sole age assurance mechanism.
2026 heralded a new phase: the time for talking is over and the ICO is ready to enforce. The UK Government launched a consultation on whether to ban social media for under 16s, and the Prime Minister met senior leaders from major social media companies to demand they "step up and take responsibility" for children’s online safety, signalling that, if they fail to do so, new powers will be used to address parents’ concerns and strengthen protections for children.
The ICO is also actively enforcing: MediaLab/Imgur was fined £247,590 for failing to implement any age checks, and Reddit was fined £14.47 million for failing to implement age assurance measures and unlawfully processing children's personal information. The Reddit decision is particularly significant: despite maintaining a policy prohibiting users under 13 from using the platform, Reddit implemented no age verification measures until July 2025 and, even then, relied only on self-declaration.
On 12 March 2026, the ICO issued an open letter to social media and video-sharing platforms, calling on them to strengthen age assurance measures and move beyond self-declaration. On 25 March the ICO and Ofcom published their joint statement on age assurance (see below) and on 7 April the ICO launched the Switched on to privacy campaign to help parents discuss protecting children’s personal information online, including age settings. It is clear the ICO is taking a holistic approach to protecting children online.
Joint statement from Ofcom and the ICO
The Ofcom/ICO joint statement resolves much of the uncertainty around how service providers should reconcile their OSA duties with data protection obligations. It deserves careful analysis by any business deploying or considering age assurance technology.
Unsurprisingly, the statement confirms, in unequivocal terms, that self-declaration alone is not considered effective for verifying age, nor for restricting underage access. This co-ordinated position signals that businesses cannot play one regulator off against the other, e.g. by arguing that data protection concerns justify a lighter-touch approach to age assurance.
Both regulators set out a shared flexible, technology-neutral approach, confirming that services may choose the most appropriate age assurance method for their context, as long as it is effective and proportionate to the risks involved. They use four criteria to assess highly effective age assurance (HEAA): technical accuracy, reliability, robustness and fairness, alongside broader considerations of accessibility and interoperability.
Helpfully, the statement says that open banking verification, photo ID matching, facial age estimation, mobile network operator age checks and digital identity wallets can meet the HEAA standard. By contrast, the regulators confirm that self-declaration, debit card verification and general contractual restrictions in terms of service are not acceptable.
Where a service cannot establish a user's age with a level of certainty appropriate to the risks arising from the data processing, the regulators expect the Children's Code standards to apply to all users as a default baseline. This creates a powerful incentive for businesses to invest in robust age assurance, as the alternative is to apply child-safe defaults to the entire user base.
The statement also emphasises the importance of conducting data protection impact assessments (DPIAs) before processing data, and of keeping them under review. If you haven’t looked at your DPIA recently, now is the time to revisit it to make sure it addresses the evolving risks and complies with the recent regulatory statements.
EU developments: towards a harmonised framework
The EU is building a harmonised framework for age assurance driven by the Digital Services Act (DSA), the GDPR, the European Data Protection Board’s (EDPB) Statement on Age Assurance and the European Commission's age verification blueprint. Its approach differs from the UK in certain respects, most notably in its emphasis on privacy-preserving technical architecture. However, the direction of travel is strikingly consistent: service providers must do more to identify and protect children.
GDPR and the EDPB Statement on Age Assurance
Under Article 8 of the GDPR, where consent is relied upon, processing the personal data of children under 16 (or as low as 13, depending on the Member State) by information society services requires parental consent, implicitly requiring service providers to make reasonable efforts to assess users' ages.
On 11 February 2025, the EDPB adopted a statement on age assurance, setting out high-level principles derived from the GDPR, including lawfulness, data minimisation, risk-based approaches and data protection by design and default. The EDPB's approach complements the ICO's Opinion on age assurance, though the ICO provides more detailed guidance on particular age assurance methods and examples of implementation through published case studies.
The DSA and the protection of minors
The DSA places binding obligations on online platforms to mitigate risks to children, including exposure to harmful or inappropriate content. Article 28 of the DSA requires online platforms to take appropriate and proportionate measures to ensure a high level of safety, privacy and security of minors and prohibits the targeting of minors with personalised advertising.
On 14 July 2025, the EU Commission published its DSA guidelines on protecting children online, clearly setting out its expectations. The guidelines recommend age verification for adult content platforms and other platforms posing high risks to minors and specify that age assurance methods should be accurate, reliable, robust, non-intrusive and non-discriminatory. The EU Commission also issued a prototype of an age verification app, known as the age verification blueprint (see below), work on which is ongoing.
For businesses operating across both the UK and EU, there is significant alignment between the two regimes. A provider complying with Ofcom’s guidance under the OSA is likely to meet most DSA requirements though nuances exist, e.g. EU guidance covers loot boxes, while the OSA does not. For more details see our legislative comparison table here.
The EU Age Verification Blueprint
The EU Commission’s age verification blueprint outlines a common technical framework for privacy-preserving age checks. Its core principle is that a user's age status should be verified without disclosing the user's identity to the service provider. In other words, no underlying identity data is transmitted to websites or platforms and the system is designed in accordance with data minimisation and privacy by default principles.
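To make the core principle concrete, the flow can be illustrated in miniature: a trusted issuer checks a user's identity documents privately and then hands the user a signed attestation carrying only an age status, which the platform verifies without ever seeing identity data. This is a hypothetical sketch of the general pattern, not the Commission's actual protocol; real deployments use asymmetric digital credentials rather than the shared-key signature used here to keep the example self-contained.

```python
import hmac
import hashlib
import json
import secrets

# Held by the hypothetical age-verification issuer only. In the real
# blueprint the issuer signs with a private key and platforms verify
# with a public key; HMAC is a simplification for this sketch.
ISSUER_KEY = secrets.token_bytes(32)

def issue_age_token(is_over_18: bool) -> dict:
    """Issuer checks ID privately, then emits a token carrying ONLY the
    age status plus a one-time nonce - no name, date of birth or
    document number ever leaves the issuer."""
    claim = {"over_18": is_over_18, "nonce": secrets.token_hex(8)}
    payload = json.dumps(claim, sort_keys=True).encode()
    tag = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return {"claim": claim, "tag": tag}

def service_accepts(token: dict) -> bool:
    """Platform verifies the attestation is genuine and untampered,
    and reads only the boolean age status - data minimisation by design."""
    payload = json.dumps(token["claim"], sort_keys=True).encode()
    expected = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, token["tag"]) and token["claim"]["over_18"]
```

The design choice the blueprint formalises is visible even in this toy version: the platform's decision depends solely on a verified boolean, so a data breach at the platform cannot expose identity documents it never received.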
A second version of the blueprint was published in October 2025, adding onboarding using passports and ID cards and support for the Digital Credentials API. Pilot testing is underway in Denmark, France, Greece, Italy, Spain, Cyprus and Ireland. On 15 April 2026, the EU Commission announced its free age verification app was “technically ready” and would be “soon available” to citizens to use, stating “there are no more excuses” and the EU is ready to enforce and hold accountable online platforms that do not protect children.
The blueprint is also closely linked to the EU Digital Identity Wallet, which is due to be rolled out by the end of 2026. It could offer a single, interoperable age verification solution, but the timeline for full deployment is uncertain and businesses should not delay compliance action in anticipation of the wallet's arrival.
National regulatory activity across the EU Member States
Notwithstanding the EU's push towards harmonisation, several Member States have moved ahead with national age assurance requirements that in some cases go beyond the DSA and GDPR. For example, in December 2023 the Spanish DPA published guidance on age verification and the protection of minors from inappropriate content, and in October 2024 German regulators issued joint guidance proposing nine principles for building an age verification system.
France has been particularly proactive, with the CNIL publishing an analysis of online age verification in September 2022, and more recently adopting the SREN Law requiring providers of adult content to verify that users are over 18. Arcom’s mandatory technical standard for age verification systems for pornographic sites became applicable in January 2025, with penalties of up to €150,000 or 2% of worldwide annual turnover, whichever is higher, for non-compliance.
A growing number of EU Member States are pursuing minimum age requirements for social media. The European Parliament's IMCO Committee has urged an EU-wide "digital minimum age" of 16, absent parental consent, for social media, video-sharing platforms and AI companions. Proposed national minimum ages include Austria (14), Denmark (15), France (15), Spain (16) and Greece (15).
Looking ahead, the forthcoming Digital Fairness Act, expected in 2026, will address manipulative design, dark patterns, loot boxes and the protection of vulnerable groups, including children. Businesses should anticipate further obligations regarding platform design features that may exploit or harm younger users.
What should I be doing now?
- Audit your current age assurance mechanisms. As should now be clear, self-declaration alone is insufficient where children are likely to access your service.
- Close the gap between policy and practice. The ICO’s Reddit decision shows that a minimum age policy must be meaningfully enforced. If your terms of service state a minimum age, you must deploy age assurance mechanisms that are reasonably capable of preventing underage access. We would advise assessing whether your current measures would withstand this level of regulatory scrutiny.
- Conduct or refresh your DPIA. Under the Children’s Code, if you offer an online service likely to be accessed by children, you must do a DPIA. The Ofcom/ICO joint statement makes clear that businesses must conduct DPIAs before they process data – and keep them under review. Your DPIA should explicitly address risks to children and document the mitigation measures you have adopted.
- Map your obligations across jurisdictions. Global businesses must track both UK (OSA, UK GDPR and Children's Code) and EU (DSA and GDPR) requirements, as well as applicable national laws in key markets with active age verification regimes, e.g. France, Australia and the US states.
- Vet your age assurance providers. If you are using a third-party age assurance solution, check that it meets the HEAA standard. Consider using ICO-approved certification schemes, such as the Age Check Certification Scheme, to identify providers that meet UK data protection standards.
What should I be doing in the next 6 to 12 months?
- Track the UK’s and EU Member States’ proposed action on children’s online wellbeing. The outcome of these consultations/actions may include a statutory minimum age for social media access. Businesses should begin internal discussions now about how they would implement such a requirement.
- Anticipate the EU Digital Fairness Act. As mentioned above, this is likely to impose additional requirements on platform design, dark patterns and features that exploit children. Include product and design teams in compliance planning now, not after the legislation is adopted.
- Evaluate the EU age verification blueprint. If you operate in the EU the blueprint and the forthcoming EU Digital Identity Wallet may offer an interoperable, privacy-preserving solution that reduces the complexity of multi-jurisdictional compliance. Consider engaging with the pilot programmes and technical specifications now to assess feasibility.
- Build cross-functional governance. Compliance is not solely a matter for legal and compliance teams. Children's data remains a global priority, with a clear expectation that platforms will demonstrate end-to-end accountability, i.e. by mapping child journeys, evidencing proportionate age assurance measures and aligning content safety controls with legislative obligations and duties. This requires collaboration across legal, compliance, design, product, engineering, audit and safety teams.
How do I deal with the online safety versus data protection debate?
A key challenge is the tension between verifying users' ages (an online safety obligation) and minimising the collection and processing of personal data (a data protection obligation). Age assurance processing must be necessary, proportionate and lawful.
The Ofcom/ICO joint statement provides helpful guidance on how to navigate this tension in practice. Businesses should collect only the information strictly necessary to confirm a user's age or age range, be transparent about how they use age assurance data and provide clear privacy notices. Users must be able to challenge inaccurate decisions.
The privacy-preserving methods favoured by both UK and EU regulators, particularly the EU's emphasis on solutions where no underlying identity data is transmitted to service providers, represent the regulatory ideal. Businesses that adopt these approaches now will be best positioned to meet the requirements of multiple jurisdictions simultaneously.
Conclusion
The direction of travel globally is clear: regulatory scrutiny of protecting children online is intensifying and regulators intend to hold services to account. Inaction is no longer a defensible strategy.
If you are subject to UK and/or EU age assurance requirements you should treat these developments as a clear signal that compliance cannot be a last-minute tick box exercise. Minimum age policies must be backed by meaningful technology, DPIAs must be current, thorough and kept under review, compliance maps must be multi-jurisdictional and governance must be cross-functional.
If you have any questions, or we can help you navigate this complex, multi-jurisdictional landscape, please do get in touch with your usual LS contact.
