Ofcom’s Online Safety in 2025 review has arrived at exactly the right time, and its message is clear. Intimate image abuse is growing, evolving, and urgently needs a coordinated, system-wide response. The review highlights the responsibility on platforms to tackle this harm, signals that progress will be publicly assessed in 2027, and raises the alarm on fast-emerging risks like synthetic intimate imagery and nudification tools.
The takeaway is simple: we now have enough evidence to move from isolated efforts to a truly collective approach. The tools exist, the infrastructure exists, and survivors deserve a system where every platform plays its part.
A Crucial Moment for Online Safety and VAWG
The timing of Ofcom’s review couldn’t be more important. As the consultation on the Online Safety Act’s Additional Measures draws to a close, the Government is also preparing to publish its next Violence Against Women and Girls (VAWG) strategy. This week’s briefing at Downing Street (which CEO David Wright CBE attended) made it clear that online abuse, including non-consensual intimate image (NCII) abuse, is recognised as both widespread and deeply harmful.
Ofcom’s latest evidence backs this up. The review confirms that services must meet their duties around preventing NCII abuse and that Ofcom will report publicly on progress. It also highlights the rapid rise of AI-generated intimate content. This is a strong signal: the regulator understands the scale of the problem and the urgency with which it needs to be addressed.
What We Can Learn from Enforcement So Far
One of the strongest examples in the review is age assurance. After Ofcom began enforcement this year, every major pornography service introduced age checks, and visits to those sites dropped by around a third.
The same must apply to NCII. Compliance will improve when requirements are clear, survivor-centred tools are readily available, and platforms know that enforcement is credible. But protecting victims will take more than compliance alone; it will require collaboration across the entire tech ecosystem.
Hashing Is a Proven, Privacy-Preserving Solution
One of the most effective tools we have for preventing the repeated sharing of intimate images is perceptual hashing. Through StopNCII.org, survivors can generate a secure digital fingerprint of their content without ever uploading the image itself. Participating platforms can then use this fingerprint to detect and block any attempt to repost the content.
This gives users dignity and control, and it gives platforms a clear, privacy-preserving way to act quickly. Many major tech companies and specialist services already use StopNCII.org, and the system is ready to support a much wider rollout.
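For readers curious how this works in principle, the sketch below illustrates the general idea of perceptual hashing, not StopNCII.org’s actual algorithm, which is not published here. It uses a simple "average hash": each pixel becomes one bit depending on whether it is brighter than the image’s mean, so a re-encoded or lightly altered copy produces a nearly identical fingerprint, and only that fingerprint ever leaves the device.

```python
# Minimal illustration of perceptual hashing (an "average hash").
# This is NOT StopNCII.org's real algorithm; it is a teaching sketch.

def average_hash(pixels):
    """Hash a grayscale image given as a list of rows of 0-255 ints.

    Each pixel becomes one bit: 1 if brighter than the mean.
    The image itself never needs to leave the user's device;
    only this short fingerprint is shared.
    """
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return tuple(1 if p > mean else 0 for p in flat)

def hamming(h1, h2):
    """Count the differing bits between two hashes."""
    return sum(a != b for a, b in zip(h1, h2))

def is_match(h1, h2, threshold=2):
    """Treat hashes within `threshold` bits as the same image."""
    return hamming(h1, h2) <= threshold

# A tiny 4x4 "image" and a slightly re-encoded copy of it.
original = [[10, 200, 30, 220],
            [15, 210, 25, 215],
            [12, 205, 35, 225],
            [18, 198, 28, 218]]
recompressed = [[12, 198, 32, 219],
                [16, 208, 27, 214],
                [11, 207, 33, 224],
                [19, 200, 29, 216]]

h_orig = average_hash(original)
h_copy = average_hash(recompressed)
print(is_match(h_orig, h_copy))  # True: the near-duplicate still matches
```

Real-world systems use far more robust perceptual hashes, but the privacy property is the same: platforms compare fingerprints, never the images themselves.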
With an estimated 100,000 services falling under the Online Safety Act, a shared approach is essential.
Why Collective Action Matters Now More Than Ever
NCII abuse does not stay on one platform. The same image can appear on dozens of services within hours. With AI tools creating synthetic intimate images at unprecedented speed, the challenge is only growing. Relying on each platform to act alone will never be enough.
The next phase must be genuinely collective. When any platform identifies NCII material and removes it, the perceptual hash should be added to the shared StopNCII.org dataset so other platforms can block it before it resurfaces. This approach strengthens responses to both real and synthetic content and ensures that one platform’s safety standards don’t rely on another platform’s failings. It also supports Baroness Owen’s recent amendments to the Crime and Policing Bill, which call for a statutory NCII register: a regulated source of verified NCII hashes that platforms and internet service providers must use to block access and prevent further distribution of content.
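The collective workflow described above can be sketched as a shared hash registry. The names and structure here are illustrative assumptions, not StopNCII.org’s real API, and a production system would use perceptual-distance matching rather than the exact set lookup shown.

```python
# Illustrative sketch of a shared hash registry (hypothetical names;
# not StopNCII.org's real API). When one platform removes NCII
# material, it contributes the hash; every participating platform
# can then block re-uploads of the same content.

class SharedHashRegistry:
    def __init__(self):
        self._hashes = set()

    def contribute(self, image_hash):
        """A platform reports the hash of content it has removed."""
        self._hashes.add(image_hash)

    def is_blocked(self, image_hash):
        """Any platform checks an upload against the shared dataset."""
        return image_hash in self._hashes

registry = SharedHashRegistry()

# Platform A removes an abusive image and shares its hash...
registry.contribute("a3f9c1")

# ...so Platform B can block the same content before it resurfaces.
print(registry.is_blocked("a3f9c1"))  # True: hash was contributed
print(registry.is_blocked("ffffff"))  # False: never reported
```

The design point is that blocking no longer depends on each platform independently rediscovering the same content; one removal protects every participant.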
We’re already seeing this principle recognised internationally. In the United States, the Take It Down Act requires content to be removed within 48 hours. That is a vital step, but without hash-sharing the same content can remain online elsewhere. True protection comes from shared action, not isolated effort.
Readiness to Scale StopNCII.org
We are preparing for the next stage of adoption. SWGfL already operates large-scale digital services such as ProjectEVOLVE and 360Safe, which support tens of thousands of organisations. We are now expanding StopNCII.org to welcome thousands more providers and join the existing partners who have already integrated.
What we need now is collective commitment, from platforms, regulators, and government, to make NCII prevention a shared responsibility.
Looking Ahead to the VAWG Strategy
The upcoming VAWG strategy is a major opportunity to cement this vision into national policy. NCII abuse is a form of gendered violence, and tackling it requires ambition and coordination.
By recognising the role of perceptual hashing, supporting shared datasets like StopNCII.org, and embedding survivor-centred tools into the strategy, the Government can set a global standard for how countries respond to this harm.
What we need now is a united commitment to move beyond compliance and towards collaboration. If platforms, regulators, and government work together, we can build a system where women and girls are genuinely protected, and no one ever has to face this harm alone.