Yesterday we published an update on online safety developments, but there's more to say as the announcements just keep coming.

Protecting women and girls

Ofcom has said that it will be fast-tracking its decision on proposed new requirements for tech firms to use technology to block illegal intimate images at source. Last year, it consulted on a range of additional online protections designed to push tech platforms to go further in tackling illegal content online. As part of these measures, it proposed a new requirement for sites and apps to use a proactive technology – known as "hash matching" – to detect intimate images that are shared without consent, such as explicit deepfakes.

Ofcom says that, given the urgent need for better online protections for women and girls, who are disproportionately affected by non-consensual intimate image abuse, it has decided to accelerate its timeline. In May, it will announce its final decision on its proposals that tech firms should expand their use of proactive technology to prevent illegal intimate images from reaching users. Subject to the parliamentary process, Ofcom would expect any new Illegal Harms Code measures to come into effect by the summer.

Its decisions on the remaining additional protections, which include proposed new protocols for tech firms to respond to spikes in illegal content during a crisis, will follow this autumn, as planned.

Update on investigations – big fines for lack of age checks!

Ofcom has fined pornography provider 8579 LLC £1.35 million for failing to put age checks in place, plus £50,000 for failing to respond to an information request. 8579 LLC must implement highly effective age assurance immediately or face a daily penalty of £1,000.

It has also fined Kick Online Entertainment SA £800,000 for failing to put in place age checks to protect children from pornographic content, plus £30,000 for failing to comply with an information request.

Provisional notices of contravention issued

Ofcom has also issued 4chan Community Support LLC with a provisional notice of contravention. It says there are reasonable grounds for believing that the provider has contravened its duties under sections 9, 10 and 12 of the Online Safety Act, in particular its duty to conduct a suitable and sufficient illegal content risk assessment, its duties in relation to its Terms of Service, and its duty to protect children from encountering pornographic content through the use of highly effective age assurance. Ofcom will provide updates on the investigation in due course.

Ofcom has also issued the provider of the Im.ge service with a provisional notice of contravention, again on the basis that it believes the provider has not carried out a suitable and sufficient illegal content risk assessment. Ofcom's investigation continues to examine whether there are reasonable grounds to believe that the provider of Im.ge has failed, or is failing, to comply with its duty to protect users from illegal content. It will provide an update in due course.

Ofcom is closing its investigation into the provider of Nippybox. The service has become unavailable to UK users and, to Ofcom's knowledge, more widely. Having assessed the impact on UK users, it has decided to close the investigation, although it may re-open it if appropriate.

It is crucial that organisations carry out risk assessments, respond to information requests promptly and fully, and make sure that they have robust age gating in place where appropriate.

Ofcom publishes final guidance for the online safety super-complaints regime

The super-complaints regime enables eligible organisations to bring systemic issues to Ofcom's attention about features of regulated online services, or the conduct of those services, that may create a material risk of significant online harm or adversely affect the right to freedom of expression. The legislation sets out criteria determining which organisations are eligible to make a complaint and how Ofcom will decide whether a complaint is admissible under the Online Safety Act. In summary, an organisation must demonstrate that it represents people in the UK — whether users of regulated online services, the public generally, or a specific group of people. An eligible organisation must also be capable of being relied on to act independently of regulated online services, must routinely contribute significantly, as an expert, to public discussions about online safety matters, and must be capable of being relied on to have regard to Ofcom's guidance about making super-complaints. Ofcom has now published its final guidance for organisations wishing to submit a super-complaint.
