December has been another busy month for online safety developments, with both Ofcom and the ICO taking action.
Ofcom fines file-sharing service £20,000 under section 132 of the Online Safety Act 2023
Ofcom has imposed a penalty of £20,000 on the provider of the Im.ge service for its failure to comply with two statutory information requests. Ofcom said that the fine was set in line with its Penalty Guidelines. In addition, it is requiring the Im.ge service to take immediate steps to comply with section 102(8) of the Online Safety Act by providing the information requested in the two statutory information requests, which included a copy of the record of the illegal content risk assessment for the service and information about its qualifying worldwide revenue. If the provider still fails to comply, Ofcom will impose a daily penalty of £100. Ofcom's investigations into the Im.ge service's compliance with its duties to complete, and keep a record of, a suitable and sufficient illegal content risk assessment, and its compliance with the illegal content duties in respect of child sexual abuse material, remain ongoing. It will provide updates on these matters in due course.
AVS changes its age assurance processes
Ofcom also confirmed that AVS Group Ltd, which it recently fined £1 million for not having robust age checks in place on 18 adult websites, has now introduced a new age assurance process on all sites that were the subject of Ofcom's investigation.
Guidance on how tech firms should treat bereaved parents
In accordance with its Roadmap, Ofcom is seeking views on proposed industry guidance on how tech firms should respond to requests from bereaved parents about their child's use of a service in the event of the child's death. Ofcom has spoken to bereaved parents about the significant barriers they have faced when seeking answers from tech firms about their children's use of their sites and apps before their death. The Online Safety Act places requirements on tech firms designed to increase transparency and reduce uncertainty for bereaved parents making requests in these difficult circumstances, and it also requires Ofcom to produce industry guidance to help services comply with these duties. The draft guidance expects services, among other things, to provide clear and accessible information in their terms of service about their information disclosure policy, including how bereaved parents can make requests and the evidence required; respond to bereaved parents in a timely manner; offer a support function, which might be a helpline, for bereaved parents; and operate a simple to use, easy to access and transparent complaints function, with a nominated team or individual responsible for handling concerns and a clear response timeframe. The consultation closes on 23 March 2026.
Preserving information about a deceased child's online activity
Separately, Ofcom has issued its final Guidance for online platforms, setting out what information they are required to retain about a child's online activity when Ofcom issues a Data Preservation Notice. Ofcom issues these notices at the request of a coroner investigating the death of a child; they require the recipient to retain information relating to the child's online activity from the point the notice is received, so that the information is available to the coroner if requested. Ofcom has already issued a number of these notices since the provisions came into force on 30 September. Some online platforms automatically delete a user's data if their account is inactive for a certain period. To enable swift action after a child's death, Ofcom has set out the types of information coroners may wish to provide to it about the child, if known at the time, and the types of information it will generally require platforms to preserve. Ofcom has also updated its guidance on its information gathering powers to reflect this.
ICO to monitor how mobile games protect children's online privacy
Earlier this month, the ICO announced that it would be scrutinising how popular mobile games played by children in the UK protect their online privacy. Around 90% of UK children play games on digital devices, so the ICO is launching a monitoring programme targeting ten popular mobile games. The review will assess the games' default privacy settings, geolocation controls and targeted advertising practices, as well as any other privacy issues identified during the review. The ICO says that this new focus on mobile games follows significant progress in improving children's privacy standards across social media and video-sharing platforms through its Children's code strategy. We will be writing more about this topic in the new year.
UK government announces ban on nudification apps
As part of its strategy to reduce violence against women and girls, the UK government has highlighted the rise of nudification apps. It has said that it will ban nudification apps and other tools designed to create synthetic non-consensual intimate images, to stop women's and girls' images being manipulated and exploited without their consent. The ban targets the firms and individuals providing and supplying such tools. We are waiting for more detail on when and how this will be done.
Ofcom has received some criticism for its slow implementation of the Act, but it is now getting into its stride, imposing more fines, prompting platforms to change their practices and issuing new guidance. Platforms and service providers should note that the ICO is monitoring them as well. If you come within the scope of the Act, or need advice about whether you do, please contact the team.
