Ofcom has launched an investigation into whether the provider of two online image boards has failed to comply with duties to protect people in the UK from illegal content. Due to the nature of these sites, Ofcom has decided not to name them or their provider.
Background
It is illegal in the UK to share non-consensual intimate images (NCII) or child sexual abuse material (CSAM). Under the Online Safety Act 2023, providers of 'user-to-user' services are required to assess and mitigate the risk of UK users encountering this type of content on their platforms. This is something that has a disproportionate impact on women and girls, and making sure sites and apps tackle this is one of Ofcom's highest priorities.
When the new duties on tech firms came into force last year, Ofcom started an enforcement programme against services that are used to distribute CSAM. As a result, some have deployed automated tools to detect and swiftly remove such content, while others have withdrawn from the UK.
Ofcom says that in total, it has launched investigations into nearly 100 platforms. It has issued nearly a dozen fines for non-compliance, including against a nudification site which has withdrawn from the UK.
It recently announced that it will be fast-tracking its decision on proposed new requirements for tech firms to use technology to block non-consensual intimate images at source, bringing it forward to May.
Investigation
Ofcom has now opened a formal investigation to establish whether the provider of these sites has failed to comply with the following duties under the Act:
- to conduct a suitable and sufficient illegal content risk assessment;
- to use proportionate measures to prevent individuals encountering priority illegal content – including NCII and CSAM;
- to use proportionate systems and processes to minimise the length of time priority illegal content is present;
- to swiftly take down illegal content when it becomes aware of it;
- to specify in its terms of service how individuals are to be protected from illegal content; and
- to operate content reporting and complaints procedures in relation to illegal content.
Ofcom says that it will provide an update on this investigation as soon as possible.
The investigation comes against the background of further action from the UK government, which has urged tech companies to go "further and faster" in implementing safety measures to protect women and girls online. In addition, Ofcom has issued information notices to more than 40 firms operating some of the largest and riskiest sites and apps in the world, formally requesting more than 70 risk assessments from them. Failure to provide a sufficient response, on time, could result in enforcement action. Ofcom says it will use the responses to identify gaps in risk assessments and drive improvements.
At EU level, the European Commission has preliminarily found Pornhub, Stripchat, XNXX and XVideos in breach of the Digital Services Act (DSA) for failing to protect minors from being exposed to pornographic content on their services. The Commission's preliminary findings indicate that the four platforms did not diligently identify and assess the risks they pose to minors accessing their services, nor did they implement effective measures to prevent minors from accessing them, thereby failing to protect minors' rights and wellbeing. If the Commission's views are ultimately confirmed, it may issue a non-compliance decision, which can trigger a fine proportionate to the infringement of up to 6% of the provider's total worldwide annual turnover. The Commission can also impose periodic penalty payments to compel a platform to comply.
The stakes could be about to get much higher
The UK government has recently announced that it has tabled an amendment to the Crime and Policing Bill which, if passed, would mean that tech executives could be held personally liable where platforms fail to comply with Ofcom's enforcement decisions to remove people's intimate images shared without consent. Senior executives who commit the offence without a reasonable excuse could face imprisonment, a fine, or both.
It is important that platforms carry out the necessary risk assessments, have robust content reporting procedures and make sure that any untoward content is swiftly taken down. If you need help reviewing your systems and processes to check their compliance with the Online Safety Act, do contact us.
