A Safer Future for Women and Girls: Why Evidence Must Shape the Next Phase of NCII Prevention

Ofcom’s Online Safety in 2025 review has arrived at exactly the right time, and its message is clear. Intimate image abuse is growing, evolving, and urgently needs a coordinated, system-wide response. The review highlights the responsibility on platforms to tackle this harm, signals that progress will be publicly assessed in 2027, and raises the alarm on fast-emerging risks like synthetic intimate imagery and nudification tools.

The takeaway is simple: we now have enough evidence to move from isolated efforts to a truly collective approach. The tools exist, the infrastructure exists, and survivors deserve a system where every platform plays its part.

A Crucial Moment for Online Safety and VAWG

The timing of Ofcom’s review couldn’t be more important. As the consultation on the Online Safety Act’s Additional Measures draws to a close, the Government is also preparing to publish its next Violence Against Women and Girls (VAWG) strategy. This week’s briefing at Downing Street, which our CEO, David Wright CBE, attended, made it clear that online abuse, including non-consensual intimate image (NCII) abuse, is recognised as both widespread and deeply harmful.

Ofcom’s latest evidence backs this up: around 100,000 services fall within the scope of the Online Safety Act. That scale alone demonstrates that the challenge cannot be solved through isolated or fragmented approaches. A collective, shared infrastructure will be essential if meaningful protection is to be delivered.

The review confirms that services must meet their duties around preventing NCII abuse and that Ofcom will report publicly on progress. It also highlights the rapid rise of AI-generated intimate content. This is a strong signal: the regulator understands the scale of the problem and the urgency with which it needs to be addressed.

What We Can Learn from Enforcement So Far

One of the strongest examples in the review is age assurance. After Ofcom began enforcement this year, every major pornography service introduced age checks, and visits to those websites dropped by around a third.

The same level of regulatory clarity will be essential for non-consensual intimate imagery. Just as age assurance accelerated once expectations were backed by enforcement, NCII prevention will rely on consistent signals that platforms will be assessed on their performance and held accountable where they fall short.

Compliance will improve when requirements are clear, survivor-centred tools are readily available, and platforms know that enforcement is credible. But protecting victims will take more than compliance alone; it will require collaboration across the entire tech ecosystem.

Hashing Is a Proven, Privacy-Preserving Solution

One of the most effective tools we have for preventing the repeated sharing of intimate images is perceptual hashing. Through StopNCII.org, survivors can generate a secure digital fingerprint of their content without ever uploading the image itself. Platforms can then use this fingerprint to detect and block any attempt to repost the content across participating platforms.

This gives users dignity and control, and it gives platforms a clear, privacy-preserving way to act quickly. Many major tech companies and specialist services already use StopNCII.org, and the system is ready to support a much wider rollout.
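To make the idea concrete, here is a minimal, illustrative sketch of perceptual hashing in Python. It uses a toy "average hash" over a tiny grayscale grid; StopNCII.org's production system uses a far more robust algorithm, but the principle is the same: only the fingerprint is shared, never the image, and small edits to an image leave most hash bits unchanged.

```python
# Toy "average hash": each bit records whether a pixel is brighter
# than the image's mean brightness. This is NOT StopNCII.org's real
# algorithm; it only illustrates the general principle.

def average_hash(pixels):
    """Hash a grayscale image given as a 2D list of 0-255 values."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return "".join("1" if p > mean else "0" for p in flat)

def hamming_distance(h1, h2):
    """Count differing bits; a low distance suggests the same image."""
    return sum(a != b for a, b in zip(h1, h2))

# A tiny 4x4 "image" and a uniformly brightened copy of it.
original = [[10, 200, 30, 220],
            [15, 210, 25, 230],
            [12, 205, 35, 215],
            [18, 195, 28, 225]]
brightened = [[p + 5 for p in row] for row in original]

h1 = average_hash(original)
h2 = average_hash(brightened)
print(hamming_distance(h1, h2))  # → 0: the edit did not change the hash
```

Because only the bit string leaves the device, a platform can recognise a re-uploaded copy without ever receiving or storing the original image.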

Why Collective Action Matters Now More Than Ever

NCII abuse does not stay on one platform. The same image can appear on dozens of services within hours. With AI tools creating synthetic intimate images at unprecedented speed, the challenge is only growing. Relying on each platform to act alone will never be enough.

The next phase must be genuinely collective. When any platform identifies NCII material and removes it, the perceptual hash should be added to the shared StopNCII.org dataset so other platforms can block it before it resurfaces. This approach strengthens responses to both real and synthetic content and ensures that one platform’s safety standards are not undermined by another’s failings. It also supports Baroness Owen’s recent amendments to the Crime and Policing Bill, which call for a statutory NCII register: a regulated source of verified NCII hashes that platforms and internet service providers must use to block access and prevent further distribution of content.
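The blocking step described above amounts to checking each new upload's hash against a shared dataset of verified hashes. The sketch below shows that lookup; the function names, the short 16-bit hashes, and the distance threshold are illustrative assumptions, not StopNCII.org's actual API, and real systems use much longer hashes with carefully tuned thresholds.

```python
# Hypothetical sketch of checking an upload against a shared hash
# dataset. Function names, the 16-bit hashes, and the distance
# threshold are illustrative assumptions, not StopNCII.org's API.

def hamming_distance(a, b):
    """Count differing bits between two equal-length hash strings."""
    return sum(x != y for x, y in zip(a, b))

def should_block(upload_hash, shared_hashes, max_distance=3):
    """Block an upload if it lies within `max_distance` bits of any
    verified NCII hash in the shared dataset."""
    return any(hamming_distance(upload_hash, h) <= max_distance
               for h in shared_hashes)

# Shared dataset of verified hashes (illustrative values only).
shared = {"1100110011001100", "0000111100001111"}

print(should_block("1100110011001101", shared))  # True  (1 bit differs)
print(should_block("1111000011110000", shared))  # False (no close match)
```

Using a distance threshold rather than exact equality is what lets the check survive re-compression or minor edits to the image, which is the practical advantage of perceptual hashes over cryptographic ones.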

Recent measures in the United States, through the Take It Down Act, require platforms to remove reported content within 48 hours. This is an important step, but experience shows that removal alone is not enough. Without shared hashing, the same content often remains active on other services, leaving victims exposed long after action has been taken in one place. True protection comes from shared action, not isolated effort.

Readiness to Scale StopNCII.org

We are preparing for the next stage of adoption. SWGfL already operates large-scale digital services such as ProjectEVOLVE and 360Safe, which support tens of thousands of organisations. We are now expanding StopNCII.org to welcome thousands more providers and join the existing partners who have already integrated.

What we need now is collective commitment, from platforms, regulators, and government, to make NCII prevention a shared responsibility.

Looking Ahead to the VAWG Strategy

The upcoming VAWG strategy is a major opportunity to cement this vision into national policy. NCII abuse is a form of gendered violence, and tackling it requires ambition and coordination.

A clear commitment to supporting shared hashing infrastructure, encouraging platform adoption, and enabling coordinated disruption of verified NCII content would signal that the United Kingdom intends not only to respond to this harm, but to lead the global effort to prevent it.

By recognising the role of perceptual hashing, supporting shared datasets like StopNCII.org, and embedding survivor-centred tools into the strategy, the Government can set a global standard for how countries respond to this harm.

What we need now is a united commitment to move beyond compliance and towards collaboration. If platforms, regulators, and government work together, we can build a system where women and girls are genuinely protected, and no one ever has to face this harm alone.

The evidence is clear and the technology is ready. With shared resolve, we can build a safer digital environment for women and girls and ensure that non-consensual intimate image abuse is met with a unified and effective response.
