The UK’s world-leading video game industry is forecast to continue growing despite challenges in the global games market. In recent years, this growth has been accompanied by new legislation regulating online platforms, including online games companies.

The lay of the land

The UK’s Online Safety Act 2023 (OSA) regulates game studios which operate multiplayer or social games, friends functionalities, online chat forums or any other form of user-to-user service. Its aim is to protect all online users, including video game players, from exposure to illegal and harmful user-generated content online.

The OSA reflects the UK Government’s commitment to making the internet a safer place for all users, with similar commitments coming from Europe and made clear in the EU’s Digital Services Act and forthcoming Digital Fairness Act. With many video game companies now offering products and platforms which support the creation and dissemination of user-generated content (UGC), OSA compliance continues to be a hot topic of discussion. But how does the OSA affect the games industry in practice?

Compliance burdens

The OSA imposes several compliance requirements which are regulated, investigated and enforced by Ofcom, the regulator for online safety.

  • Risk assessments: In-scope game studios need to carry out regular OSA risk assessments to identify the illegal harms which may arise in the context of their games – for example, whether chat rooms could support dissemination of hateful content, or if any illegal in-game behaviour might encourage real-world illegal harms.
  • Child risk assessments: Additional – and more rigorous – risk assessments are required wherever children are a likely game audience. Ofcom issued guidance on best practice in 2025.
  • Content moderation: All in-scope platforms must actively monitor and remove illegal content, which may include human, manual or automated content moderation tools.
  • In-game reporting systems & complaints procedures: Players must be able to easily report illegal and harmful content within games, with straightforward complaint procedures.
  • Updated Terms of Service: Games’ EULAs and Terms of Service must specify how users are protected from illegal content, explain the relevant complaint-handling policies and set out users’ new breach-of-contract rights. Where relevant, Terms of Service must also clarify how children are prevented from encountering illegal or harmful content, including details of any proactive technology used to do so.
  • Reporting and record-keeping: Operators of online games must keep records of their risk assessments and outcomes and report findings to Ofcom.

Ofcom’s powers

Ofcom has the power to impose fines of up to £18m or 10% of annual global revenue (whichever is greater) on offending companies, as well as to conduct investigations, implement business disruption measures and bring criminal proceedings against senior managers and directors. It has already opened several investigations into suspected breaches of the Online Safety Act, and has issued fines for failures such as not cooperating with the regulator, not carrying out risk assessments and not deploying highly effective age assurance.

Vulnerabilities & key compliance challenges

While single-player, offline games fall outside the OSA, most game studios now develop and publish games featuring an online or user-to-user element, ranging from live text, voice chat and livestreaming functionalities to full virtual reality and metaverse environments.

With research showing that online games are reaching an ever-younger audience, the OSA’s focus on children’s safety in the online environment is a key challenge for providers whose games are likely to be played by children. Effective content moderation systems, child-appropriate complaint-handling procedures and stringent age verification tools are all capabilities which game companies should now be considering.

What can I do now?

Ofcom and the ICO recently wrote to various tech platforms asking them to take certain steps to improve their age assurance and protect children, and the government is consulting on possible restrictions on children’s use of social media as well as improved age verification. Reviewing your age assurance measures is therefore a very good place to start.

As well as the Online Safety Act, companies should be watching out for the EU’s Digital Fairness Act, which is likely to cover other aspects of child safety online, such as regulating loot boxes and virtual currencies, as well as further regulation of so-called dark patterns, where online interfaces manipulate consumers into a particular course of action.
