In a week when addictive design has been on the news agenda, the UK government has announced a trial of social media bans, digital curfews, and time limits on apps in the homes of 300 teenagers. This sits alongside the digital wellbeing consultation, which ends on 26 May 2026.

As well as these government initiatives, the House of Commons Science, Innovation and Technology Select Committee held a follow-up session after publishing its report into social media, misinformation, and harmful algorithms in July 2025. 

First, the Committee considered online misinformation, including recent issues such as fake imagery during the Iranian conflict and deepfakes that misrepresent political figures. It challenged the effectiveness of tools such as "community notes" and AI-assisted drafting.

It also considered the broader landscape of online harms and the impact of the Online Safety Act, especially AI-generated child sexual abuse videos; the persistence of misogynistic and hateful content from banned individuals; and the exploitation of livestreaming 'gift' systems to target and reward minors. 

The Committee also assessed corporate governance and accountability. It is concerned that many moderation policies appear to be failing to keep pace with the scale of the harm. In addition, even though APIs are meant to be open, many in the academic community report that they still cannot effectively scrutinise algorithms.

Finally, the Committee scrutinised the proposed social media age limits and the role of parental controls. While some platforms expressed neutrality on a ban, they also acknowledged that current age assurance remains imperfect. The Committee remains unconvinced that "safety by design" is being properly implemented.

The Committee is also carrying out a new inquiry into neuroscience and digital childhoods, examining the impact of digital devices on brain development; physical impacts; the differences between devices and uses; and the differing impacts on those of different ages and from different backgrounds.

At the European level, we are waiting for the first draft of the Digital Fairness Act, and further afield we've seen lawsuits in the US over the addictive design of certain tech platforms.

We also continue to see regulatory interest in online choice architecture from a consumer law perspective.

It's clear that regulators and governments are keen to make platforms safer for children more generally, and for women and girls more specifically. As a result, platforms need to review their customer journeys to balance user safety with commercial imperatives.

The move to combat addictive design features signals increased liability for social media, gaming and AI platforms.
