We welcome the government’s latest announcement of stronger action to tackle violence against women and girls. These long-anticipated measures, alongside enforcement from Ofcom, place increased responsibility on industry to address the growing prevalence of non-consensual intimate image abuse, introducing both protective and preventative measures.
While the amendments to the Crime and Policing Bill are welcome, and have long been advocated for by Baroness Owen with support from SWGfL, there are concerns that some measures may not go far enough in practice.
Government Announcement
Tech firms will be required to take down abusive images within 48 hours under new laws designed to protect women and girls, with ministers putting platforms on notice to detect and remove intimate images shared without consent. Through amendments to the Crime and Policing Bill, companies will be legally obliged to remove such content within 48 hours of it being flagged, or face fines of up to 10% of their global revenue or have their services blocked in the UK.
The government has also set out plans to ensure victims only need to report an image once, with content removed across multiple platforms and prevented from being re-uploaded. Regulators are considering treating this material with the same severity as child sexual abuse and terrorist content, enabling it to be digitally marked and automatically taken down if it reappears.
Further guidance will be issued to internet service providers on blocking access to sites hosting this content, as part of wider efforts to tackle a growing trend of intimate image abuse and return control to victims.
The introduction of a 48-hour takedown is a particularly welcome step, and we recognise Baroness Owen’s advocacy, which has been vital in bringing this measure forward. Faster removal provides victims with greater protection and reassurance. However, measures such as guidance for internet service providers must go further: guidance alone is insufficient, and concrete action will be needed to ensure consistent outcomes.
Proposals for internet service providers to block access to sites hosting illegal content aim to target rogue websites that fall outside the scope of the Online Safety Act. However, questions remain about how effective this approach will be in practice. Stronger, more coordinated measures, such as a non-consensual intimate image (NCII) register, clearer systems for identifying and removing content, and a fully enforceable code of practice, would provide clarity for providers, ensure greater accountability across platforms and infrastructure, and close existing gaps in protection for victims.
Ofcom Enforcing Stronger Measures: Prioritising Hash-Matching Technology
Ofcom has also announced it will fast-track decisions on new requirements for tech companies to use technology to block illegal intimate images at source. The regulator has been consulting on additional online safety measures, including the use of “hash matching” tools to proactively detect and prevent non-consensual intimate images, such as explicit deepfakes, from being shared.
Citing the urgent need to strengthen protections for women and girls, who are disproportionately affected by this abuse, Ofcom plans to publish its final decision on these proposals in May, with new measures expected to come into force as early as this summer, subject to parliamentary approval. Further measures, including requirements for platforms to respond to spikes in illegal content during crises, are expected later in the year.
SWGfL Response
At SWGfL, we have long called for stronger action to tackle non-consensual intimate image abuse, informed by over a decade of supporting UK adult victims through the Revenge Porn Helpline. Our frontline experience, alongside our data-driven global analysis of the scale of NCII abuse, continues to demonstrate that this harm disproportionately affects women and girls and must remain central to the government’s Violence Against Women and Girls strategy.
The introduction of a 48-hour takedown requirement and Ofcom’s move to accelerate hash-matching technology represent a significant shift towards systemic prevention. Importantly, the capability to deliver this already exists. Through StopNCII.org, we have demonstrated that hashing can prevent intimate images from being shared at scale, protecting victims before further harm occurs. In this respect, the direction now being set reflects regulatory alignment with established, operational best practice.
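For readers unfamiliar with the approach, the sketch below illustrates the general idea behind hash matching: an image is reduced to a compact fingerprint (a hash), only that fingerprint is added to a register, and any later upload whose fingerprint matches a registered entry can be blocked before it spreads. This is a minimal conceptual illustration only; the hash algorithm (a simple difference hash), the function names, the register and the matching threshold are assumptions made for demonstration, and do not represent the actual implementation used by StopNCII.org or any platform.

```python
# Conceptual sketch of hash matching against an NCII register (illustrative only).
from PIL import Image  # pip install Pillow


def dhash(image_path: str, hash_size: int = 8) -> int:
    """Compute a simple difference hash (dHash): a compact fingerprint
    that stays similar for visually similar images."""
    img = Image.open(image_path).convert("L").resize((hash_size + 1, hash_size))
    pixels = list(img.getdata())
    bits = 0
    for row in range(hash_size):
        for col in range(hash_size):
            left = pixels[row * (hash_size + 1) + col]
            right = pixels[row * (hash_size + 1) + col + 1]
            bits = (bits << 1) | (1 if left > right else 0)
    return bits


def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two fingerprints."""
    return bin(a ^ b).count("1")


# Hypothetical register holding only fingerprints, never the images themselves.
ncii_register: set[int] = set()


def submit_to_register(image_path: str) -> None:
    """Victim-side step: hash the image locally and share only the fingerprint."""
    ncii_register.add(dhash(image_path))


def should_block_upload(image_path: str, threshold: int = 5) -> bool:
    """Platform-side step: block an upload whose fingerprint is close
    to any fingerprint held in the register."""
    candidate = dhash(image_path)
    return any(hamming_distance(candidate, h) <= threshold for h in ncii_register)
```

A perceptual hash is sketched here rather than a cryptographic one so that re-encoded or resized copies can still be matched; in the model described above, only the fingerprint, never the image itself, leaves the victim’s device.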
Alongside enforcement measures, clarity will be critical. Where proposals relating to internet infrastructure providers are framed as guidance, there is a risk that outcomes will vary in practice. Providers have consistently indicated that clear regulatory expectations enable consistent action. Stronger, coordinated mechanisms, such as an NCII register, would provide that clarity, ensuring content can be reliably identified, categorised and acted upon across platforms, hosting services, domain registrars and access providers. We also need Ofcom to enforce these measures through a code of practice, so that platforms cannot avoid what currently sits only in guidance.
The true measure of these reforms will not be their announcement, but their implementation. Victims and survivors must remain at the centre, and enforcement supported by clear technical mechanisms will determine whether these changes close the remaining protection gaps.
David Wright CBE, CEO of SWGfL, said:
“We welcome the government’s commitment to a 48-hour takedown of non-consensual intimate images and Ofcom’s move to accelerate hash-based prevention measures. We pay tribute to Baroness Owen, whose tireless advocacy and amendments on the register and takedown have been vital. Drawing on over a decade of supporting UK victims through the Revenge Porn Helpline, we know that women are disproportionately affected, and these steps align with the Violence Against Women and Girls strategy.
However, we anticipate that guidance may be proposed for internet infrastructure providers, including ISPs, and guidance alone is insufficient. A binding NCII register would ensure both accountability and clarity, empowering platforms and infrastructure alike. With evidence and collective action, we can close the protection gap and build a safer future.”
These measures are significant, but their success will ultimately be judged on the protection and outcomes delivered for victims and survivors.