Response to Ofcom’s Online Safety Act Codes of Practice

SWGfL welcomes the new Ofcom codes of practice and guidance under the UK’s Online Safety Act, which come into effect in March 2025. These codes mark a significant step forward in creating a safer online environment, particularly through enhanced protections for women and girls and stronger mechanisms to tackle non-consensual intimate image abuse (NCII).

Key Advancements for Women and Girls

Ofcom’s recognition of the disproportionate impact of online harms on women and girls is a critical milestone. The codes require platforms to take down NCII once it is identified and offer clearer guidance on identifying and removing harmful content, including cyberflashing and posts linked to coerced sexual exploitation by criminal gangs.

This is particularly relevant to our work at the Revenge Porn Helpline, where we’ve seen the devastating impact of intimate image abuse: women are the victims in 95% of the content we report. Over the last four years, our caseload has risen more than tenfold, from 1,500 cases in 2019 to nearly 19,000 cases in 2023. This dramatic increase highlights the urgency of tackling NCII and demonstrates the scale of the issue facing victims today.

Survivors often express frustration about delays in removing harmful content or the lack of accountability from platforms. As one survivor shared:

“When my images were shared, I felt completely powerless. Without the help of the Revenge Porn Helpline, I don’t think I could have faced it. Knowing they understood the process and could get images removed gave me hope.”

The introduction of user tools, such as the ability to block and mute accounts that harass or stalk individuals, further demonstrates a commitment to safety by design. Measures to restrict direct messaging from non-connected accounts also set important benchmarks for protecting vulnerable users, particularly children.

A Spotlight on Hash-Matching Technology

We welcome Ofcom’s inclusion of automated tools like hash-matching in its consultation on NCII prevention. From our experience with StopNCII.org, hashing technology has proven to be a groundbreaking solution, preventing the spread of intimate images across multiple platforms while safeguarding user privacy.

Ofcom’s guidance recognises that “proactive systems to identify and prevent harmful content are critical to reducing harm at scale” (Paragraph 3.2).

We will engage extensively in the Spring 2025 consultation to ensure hash-matching tools like those implemented by StopNCII.org are fully integrated into the final codes.

Accountability and Enforcement

The introduction of enforcement powers, including fines and court orders for platforms that fail to meet their obligations, is a long-overdue step. As Ofcom highlights:

“Transparency reporting will help hold platforms to account and ensure victims understand what action has been taken” (Paragraph 6.4).

For victims of online abuse, accountability has often been a missing piece. These new enforcement measures signal a cultural shift towards prioritising user safety and ensuring tech firms meet their responsibilities. However, the Ofcom regime is not designed to support individuals directly; it aims at systemic change at the platform level rather than responses to individual reports. Unfortunately, this means enforcement action against non-compliant platforms will inevitably take time.

Challenges and Recommendations

While the advancements outlined by Ofcom are promising, challenges remain in ensuring effective implementation:

  1. Algorithmic Testing: Ofcom’s codes stress the need for algorithms to effectively detect and remove harmful content. However, as noted in the guidance, unintended consequences—such as over-blocking legitimate content—must be avoided.
  2. Training and Moderation Resources: Effective enforcement will hinge on the quality of moderation. Moderation teams must receive comprehensive training on nuanced issues such as NCII, coercion, and exploitation.
  3. Support for Smaller Platforms: Smaller platforms often lack the resources to implement robust safety measures. Tailored guidance and scalable tools are essential to prevent these spaces from being exploited.

These challenges underscore the importance of ongoing collaboration between platforms, regulators, and organisations like SWGfL.

Looking Ahead

The promise of additional protections in 2025, including further measures for intimate image abuse and the use of AI to tackle illegal harms, is encouraging. To ensure success, these measures must be matched by robust consultation, transparency, and a commitment to prioritising user safety.

David Wright CBE (CEO of SWGfL) commented:

“At SWGfL, we applaud Ofcom’s focus on protecting women and girls. As we continue supporting these efforts, we’ll contribute to upcoming consultations to ensure solutions like StopNCII.org are included in the codes to identify and prevent NCII. Everyone should benefit from technology free from harm, and the Online Safety Act is the UK’s defining opportunity to empower individuals to feel safer online.”
