Ofcom Releases New Online Safety Guidance for Tech Firms: Protecting Women and Girls

Today, Ofcom published new draft guidance to tackle online harms against women and girls. Drawing on insights from victims, survivors, advocacy groups, and safety experts, this ambitious guidance outlines practical steps tech firms can take to design online platforms that prioritise user safety.

The internet can be a hostile space, especially for women and girls: it can silence voices, embolden abusers, and foster misogynistic communities. Ofcom’s measures aim to combat these threats by enforcing the UK’s online safety laws, which require platforms to assess and mitigate gender-based online harms, including coercive control, stalking, and intimate image abuse, as well as to protect children from exposure to harmful content.

Key Areas of Concern

The guidance focuses on four critical issues that disproportionately impact women and girls online:

  1. Online Misogyny – Addressing content that promotes or normalises misogynistic behaviours, including the encouragement of sexual violence.
  2. Pile-ons and Online Harassment – Combatting targeted abuse and threats, particularly against women in public life, such as journalists and politicians.
  3. Online Domestic Abuse – Recognising the role of technology in coercive and controlling behaviours within intimate relationships.
  4. Intimate Image Abuse – Tackling the non-consensual sharing of intimate images, cyberflashing, and AI-generated explicit content.

A Safety-by-Design Approach

To mitigate these harms, Ofcom’s draft guidance sets out ten areas where tech firms should take proactive measures, emphasising a ‘safety-by-design’ approach. Key recommendations include:

  • ‘Abusability’ Testing – Identifying vulnerabilities in services that could be exploited by malicious users.
  • Preventing Intimate Image Abuse – Leveraging tools like StopNCII.org to detect and remove non-consensual intimate imagery.
  • User Prompts – Encouraging reconsideration before posting harmful content, such as misogynistic material or gender-based abuse.
  • Enhanced Account Controls – Implementing bundled settings to help users protect themselves from online pile-ons.
  • Visibility Settings – Allowing users to modify or delete past content to safeguard their digital footprint.
  • Strengthened Security Measures – Implementing multi-factor authentication to prevent unauthorised access and surveillance.
  • Geolocation Restrictions – Disabling location sharing by default to reduce risks such as stalking.
  • Moderation Team Training – Ensuring content moderators are equipped to handle online domestic abuse cases effectively.
  • Accessible Reporting Tools – Providing clear and supportive pathways for users to report harm.
  • User Surveys and Transparency – Gathering feedback to understand risks and reporting data on harmful content prevalence and outcomes.

Recognising StopNCII

One notable inclusion in Ofcom’s guidance is the recognition of StopNCII.org as an example of ‘good practice’ for online providers. Our tool, which helps prevent the sharing of non-consensual intimate imagery, has been commended for its groundbreaking ‘technology for good’ approach.

David Wright CBE, CEO of SWGfL, said: “We are delighted to see that StopNCII.org has rightly been highlighted as an example of ‘good practice’ within Ofcom’s latest guidance on women and girls’ online safety. We hope the StopNCII initiative sets a positive and forward-thinking example of how service providers can make their platforms safer and better protected against intimate image abuse material online.

“We feel this is only the start, though – it is time to set the clear expectation that NCII content should be made illegal and that the implementation of StopNCII should be mandatory for any service provider that allows the uploading or sharing of media-based content. StopNCII’s hashing technology allows us to tackle intimate image abuse at scale with privacy and security for the victim at its core – it is time for this to be a mandatory requirement.”
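The privacy property behind the hashing approach described above can be illustrated with a simplified sketch. This is not StopNCII’s actual implementation – StopNCII uses perceptual hashing so that visually similar images also match, whereas the cryptographic SHA-256 hash used here as a stand-in only matches exact copies – but it shows the core idea: only a hash, never the image itself, leaves the victim’s device, and participating platforms compare uploads against the shared hash list.

```python
import hashlib

def hash_image(image_bytes: bytes) -> str:
    # Computed locally on the victim's device; the image is never uploaded.
    # (SHA-256 is a simplified stand-in for the perceptual hashing StopNCII
    # actually uses, which also matches near-duplicate images.)
    return hashlib.sha256(image_bytes).hexdigest()

# The victim submits only the hash to the shared database.
ncii_hash_db = {hash_image(b"victim-image-bytes")}

def should_block_upload(upload_bytes: bytes) -> bool:
    # A participating platform checks incoming uploads against the hash list
    # and blocks any match before the content is ever published.
    return hash_image(upload_bytes) in ncii_hash_db

print(should_block_upload(b"victim-image-bytes"))  # True: known image blocked
print(should_block_upload(b"unrelated-image"))     # False: unaffected content
```

The design choice matters: because matching happens on hashes rather than images, neither the database operator nor the platforms ever hold the intimate imagery itself.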

Kate Worthington, Senior Practitioner at the Revenge Porn Helpline, said: “Our increasing caseload on the helpline is only the tip of the iceberg – we are in the midst of a problem we are struggling to contain. NCII content will inevitably continue to grow as technology evolves, and it is now time for this content to be made illegal in the same way as CSAM if we ever hope to provide the best protections for victims.”

A Call for Feedback

Ofcom is now inviting feedback on the draft guidance, with a deadline of 23 May 2025. This consultation period provides an opportunity for further refinement of the measures before final guidance is issued later this year. Additionally, tech firms will be expected to continuously assess emerging threats, with Ofcom set to evaluate their progress 18 months after the final guidance takes effect.
