Research shows that 97% of young people go online every day and 78% of 13- to 17-year-olds check their devices at least hourly. Around 25% display problematic smartphone use that mirrors addiction.

Surveys also show that 90% of people in the EU believe action to protect children online is a matter of urgency, particularly in relation to social media's negative impact on mental health (93%), cyberbullying (92%) and the need for effective ways to restrict access to age-inappropriate content (92%).

With this in mind, Australia is introducing new rules from 10 December. Ten social media firms will be required to take reasonable measures to stop children under 16 from having accounts on their platforms. According to the BBC, the Australian government is under pressure to extend the ban to gaming platforms as well.

European Parliament report

MEPs have adopted a non-legislative report expressing deep concern over the physical and mental health risks minors face online.

Minimum age for social media platforms

To help parents manage their children's digital presence and encourage age-appropriate online engagement, the European Parliament proposes a harmonised EU digital minimum age of 16 for access to social media, video-sharing platforms and AI companions, while allowing 13- to 16-year-olds access with parental consent.

Expressing support for the Commission's work to develop an EU age verification app and the European digital identity (eID) wallet, MEPs emphasise that age assurance systems must be accurate and preserve minors' privacy. They have also said that those systems do not relieve platforms of their responsibility to ensure their products are safe and age-appropriate by design.

To incentivise better compliance with the EU's Digital Services Act (DSA) and other relevant laws, MEPs suggest that senior managers could be made personally liable in cases of serious and persistent non-compliance, especially with regard to the protection of minors and age verification.

Stronger action by the Commission

Parliament is also calling for:

  • a ban on the most harmful addictive practices and default disabling of other addictive features for minors (including infinite scrolling, autoplay, pull-to-refresh, reward loops, harmful gamification);
  • blocking access to sites that do not comply with EU rules;
  • action to tackle persuasive technologies, such as targeted ads, influencer marketing, addictive design, and dark patterns under the forthcoming Digital Fairness Act;
  • a ban on engagement-based recommendation systems for minors;
  • applying DSA rules to online video platforms and outlawing loot boxes and other randomised gaming features (in-app currencies, fortune wheels, pay-to-progress);
  • protecting minors from commercial exploitation, including by prohibiting platforms from offering financial incentives for kidfluencing (children acting as influencers);
  • urgent action to address the ethical and legal challenges posed by generative AI tools, including deepfakes, companionship chatbots, AI agents and AI-powered nudification apps (that create non-consensual manipulated images).

Other jurisdictions

Several other jurisdictions, including Denmark and Malaysia, are watching Australia's impending ban for under-16s. The UK has instead gone down the route of trying to limit harmful content; it remains to be seen whether a social media ban for young people would be more effective.
