Organisations offering online services likely to be accessed by children in the UK must meet clear duties aimed at protecting minors online. The ICO's Children's Code (Age Appropriate Design Code) translates these duties into practical "design principles and privacy features" (see our earlier article for more detail). A key requirement under the Children's Code is appropriate age assurance, i.e. tools and approaches that estimate or verify a user's age so that safeguards can be tailored, or access restricted, where necessary. Alongside this, the Online Safety Act 2023 imposes a "legal duty to protect children online", including by "using effective age assurance" methods. Together, these regimes place the child's best interests at the forefront of online services.

The ICO's probe into Imgur 

Against this regulatory backdrop, the ICO launched a review on 3 March 2025 into how social media and video sharing platforms were using children's personal data. As part of this, the ICO examined Imgur, an image sharing and hosting platform owned by MediaLab.AI Inc (MediaLab). The review identified several concerns with Imgur's approach to age verification. 

By 10 September 2025, the ICO had issued MediaLab with a notice of intent to impose a penalty. Shortly after, on 30 September 2025, Imgur blocked UK access to its platform. The ICO described this as a "commercial decision" by MediaLab, although the timing clearly reflected the seriousness of the regulator's findings. On 5 February 2026, the ICO confirmed that MediaLab had failed to protect children's personal data between September 2021 and September 2025 and imposed a fine of £247,590. 

Despite Imgur's own terms and conditions stating that children under 13 years could only use the platform with parental supervision, MediaLab had no safeguards in place to determine a user's age or to obtain parental consent when children accessed the platform. Specifically, the ICO found that MediaLab had:

  1. Failed to implement age verification measures, leaving it unable to identify which users were minors.
  2. Processed personal data of "children under 13 years without parental consent" or without "any other lawful basis".
  3. Failed to conduct a Data Protection Impact Assessment to "identify and reduce the privacy risks to children" using the platform.

In practice, the absence of age assurance meant that children using the site were at risk of encountering harmful content including material related to "eating disorders, homophobia, antisemitism and images of a sexual or violent nature". 

When determining the penalty, the ICO considered the number of children affected, the degree of harm, the duration of the breach, and MediaLab's global turnover. The regulator also acknowledged MediaLab's cooperation and commitments to address its compliance gaps, but noted that failure to implement the necessary measures could lead to further regulatory action.

The timing and outcome of the investigation naturally give rise to further reflections. The breaches date back to 2021, yet meaningful enforcement action has only materialised now, more than a year after the ICO began its investigation. The relatively modest level of the fine, particularly when set against Imgur's global scale and its millions of users, may raise eyebrows. MediaLab's cooperation will undoubtedly have influenced the final figure, but whether the company will appeal the penalty remains to be seen.

Key takeaways 

With age assurance becoming increasingly central to digital regulation, particularly in light of the government's recent consultation exploring a ban on social media use by children following Australia's approach, scrutiny of online platforms is only likely to increase. This is especially true given the ICO's strengthened governance and enforcement powers under the Data (Use and Access) Act 2025. Even where financial penalties appear modest, the reputational and operational consequences for online services can be significant. This latest decision serves as a clear reminder that compliance cannot be an afterthought.

Organisations are encouraged to:

  • Review age assurance mechanisms and parental consent processes to verify they are robust.
  • Conduct or update Data Protection Impact Assessments, explicitly addressing risks to children and documenting mitigation measures.
  • Ensure terms, policies and technical safeguards align, so that what is written in user-facing documents is genuinely supported by operational practice.
Commenting on the enforcement action, the ICO said:

“MediaLab failed in its legal duties to protect children, putting them at unnecessary risk. For years, it allowed children to use Imgur without any effective age checks, while collecting and processing their data, which in turn exposed them to harmful and inappropriate content. Age checks help organisations keep children's personal information safe and ensure it is not used in ways that may harm them, such as by recommending age-inappropriate content. This fine is part of our wider work to drive improvements in how digital platforms use children's personal data. Ignoring the fact that children use these services, while processing their data unlawfully, is not acceptable. Companies that choose to ignore this can expect to face similar enforcement action.”