25 November is the International Day for the Elimination of Violence against Women. It was timely, then, that, following consultation, Ofcom published its final guidance on protecting women and girls online. The guidance deals with content and activity for which service providers have duties under the Online Safety Act 2023 and which disproportionately affects women and girls. It focuses on four key areas of harm and sets out various measures for tech companies, with some changes made in response to consultation feedback.
What does a tech company have to do?
Ofcom says that tech companies should ensure that their governance and accountability processes address online gender-based harms, conduct risk assessments focused on harms to women and girls, and be transparent about women and girls' online safety.
They also need to conduct abusability evaluations and product testing, set safer default settings and reduce the circulation of content depicting, promoting or encouraging online gender-based harms.
The final three actions are giving users better control over their experiences, enabling users who experience online gender-based harms to make reports, and responding appropriately when online gender-based harms happen.
For each action, Ofcom sets out what a baseline of safety looks like (foundational steps) for service providers to meet their duties to protect UK users. It also highlights additional good practice steps to illustrate how providers can build on the foundational steps to create safer experiences for women and girls, give their users more autonomy, and provide assurance that users can seek appropriate redress for any harm that does occur.
The four areas of harm and the measures to address them
Misogynistic abuse and sexual violence
Tech firms should consider:
- introducing "prompts" asking users to reconsider before posting harmful content;
- imposing "timeouts" for users who repeatedly attempt to abuse a platform or functionality to target victims;
- promoting diverse content and perspectives through their recommender "for you" systems to help prevent toxic echo chambers; and
- de-monetising posts or videos which promote misogynistic abuse and sexual violence.
Pile-ons and coordinated harassment
Tech firms should consider:
- setting volume limits on posts ("rate limiting") to help prevent mass-posting of abuse in pile-ons (sketched in the code after this list);
- allowing users to quickly block or mute multiple accounts at once; and
- introducing more sophisticated tools for users to make multiple reports and track their progress.
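For readers unfamiliar with the mechanism, here is a minimal sketch of the kind of volume limit the guidance contemplates, implemented as a token bucket. Everything in it is an assumption for illustration (the PostRateLimiter name, the five-posts-per-minute threshold, the in-memory storage); the guidance does not prescribe any particular implementation, and a production service would persist state and tune limits per feature.

```python
import time

class PostRateLimiter:
    """Hypothetical token-bucket limiter: each user may post in bursts of up
    to `burst` posts, with tokens refilled at `rate_per_min` posts per minute."""

    def __init__(self, rate_per_min: float = 5.0, burst: int = 5) -> None:
        self.rate = rate_per_min / 60.0  # tokens added per second
        self.burst = burst
        self._buckets: dict[str, tuple[float, float]] = {}  # user -> (tokens, last seen)

    def allow_post(self, user_id: str) -> bool:
        now = time.monotonic()
        tokens, last = self._buckets.get(user_id, (float(self.burst), now))
        # Refill tokens for the time elapsed since this user last posted.
        tokens = min(float(self.burst), tokens + (now - last) * self.rate)
        if tokens < 1.0:
            self._buckets[user_id] = (tokens, now)
            return False  # over the limit: reject, queue or delay the post
        self._buckets[user_id] = (tokens - 1.0, now)
        return True

limiter = PostRateLimiter()
for i in range(7):
    print(i, limiter.allow_post("user-123"))  # True for the first 5 posts, then False
```

The design point a token bucket captures is that ordinary users posting at a normal pace are never affected, while a sudden burst of posts, the pattern characteristic of a pile-on, is throttled immediately.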
Stalking and coercive control
The guidance suggests that tech firms should consider:
- bundling safety features to make it easier to set accounts to private;
- introducing enhanced visibility restrictions to control who can see past and present content;
- ensuring stronger account security; and
- removing geolocation by default.
Image-based sexual abuse
Under the guidance, tech firms should consider:
- using automated technology (hash-matching) to detect and remove non-consensual intimate images (illustrated after this list);
- blurring nudity by default, with an option for adults to override; and
- signposting users to supportive information, including how to report a potential crime.
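For illustration only, here is a minimal Python sketch of the hash-matching idea, under stated assumptions: real deployments use perceptual hashes (such as PDQ or PhotoDNA) matched against curated industry hash lists (for example, those shared via StopNCII), whereas the cryptographic hash below only catches byte-identical copies. The hash set and function names are hypothetical.

```python
import hashlib

# Hypothetical set of hashes of known non-consensual intimate images,
# e.g. sourced from an industry hash-sharing scheme such as StopNCII.
KNOWN_ABUSE_HASHES: set[str] = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def image_hash(data: bytes) -> str:
    # Real systems use perceptual hashes that survive resizing and
    # re-encoding; SHA-256 here only matches byte-identical copies
    # and simply keeps the sketch self-contained.
    return hashlib.sha256(data).hexdigest()

def should_block_upload(data: bytes) -> bool:
    """Return True if the uploaded image matches a known-abuse hash."""
    return image_hash(data) in KNOWN_ABUSE_HASHES

print(should_block_upload(b"test"))  # True: the hash above is sha256(b"test")
```

The appeal of this design is that matching happens against hashes rather than the images themselves, so services can detect known material without holding or sharing the underlying abusive content.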
More broadly, Ofcom expects tech firms to subject new services or features to "abusability" testing before they roll them out, to identify from the outset how they might be misused by perpetrators. Moderation teams should also receive specialised training on online gender-based harms. Ofcom also says that companies should consult experts when designing policies and safety features that work effectively for women and girls, while continually listening and learning from survivors' and victims' real-life experiences, for example by running user surveys.
Technology and harms evolve rapidly, and Ofcom expects providers to review regularly what they may need to do in response to changing threats and risks from online gender-based harms. It also expects services with the highest risk and largest reach to do more to ensure safer experiences for women and girls.
Ofcom has set out five areas of action for itself and, among other things, will report in summer 2027 on progress. If progress is inadequate, it will consider making formal recommendations to the government about where the Online Safety Act may need to be strengthened. In the meantime, it will enforce and update its illegal harms guidance to cover the new priority offence of cyberflashing and to introduce hash-matching measures, and it has written an open letter to tech companies setting out its expectation that they will comply with the guidance. It will also continue to enforce the Online Safety Act more broadly and to pursue its research and engagement programme.
Commentators have expressed disappointment that the guidance is not binding and have pointed out that whether it works will depend on the attitude of the platform concerned. Others have suggested that the government should put the guidance on a statutory footing. However, as Ofcom will not move to recommend new laws unless and until it deems progress inadequate following its 2027 report, additional legislation to protect women and girls is unlikely for at least another two years.
