Two years have passed since the Online Safety Act became law. The legislation marked a historic turning point in the United Kingdom’s approach to online safety, establishing a regulatory framework designed to reduce harm, improve accountability and require platforms to take greater responsibility for their users.
In the time since, Ofcom has moved from drafting to delivery, consulting on a series of Codes of Practice and Additional Safety Measures that will determine how the Act operates in practice. Within this framework, the recognition of non-consensual intimate image abuse (NCII) has emerged as one of the most significant and fast-developing areas of progress.
From victim-centred reporting tools to international collaboration and technological innovation, the landscape is evolving. Yet the central challenge remains: how to turn the promise of legislation into consistent, proactive protection for those at risk of image-based abuse.
Rising Demand and Evolving Harms
Data from the Revenge Porn Helpline shows that the scale of NCII continues to grow at an alarming rate. Reports have risen from 1,685 in 2019 to more than 22,000 in 2024, representing a thirteen-fold increase in five years. Over the same period, more than 430,000 intimate images have been reported and removed with the Helpline’s support.
This surge reflects both increased awareness and a continuing rise in perpetration. Modelling suggests that if current trends persist, more than 46,000 individuals could require support annually by 2027. Yet this figure still represents only a fraction of those affected. Our analysis indicates that approximately 1.42 per cent of adult women in the United Kingdom experience NCII abuse each year. For every victim supported, dozens more remain silent, highlighting the scale of under-reporting and the ongoing need for accessible, trusted support.
The emergence of synthetic or artificially generated NCII has further complicated this picture, with the Revenge Porn Helpline recording a 26 per cent increase in such cases between 2023 and 2024.
Building on a Stronger Legal Foundation
The Online Safety Act provides an essential foundation by placing statutory duties on user-to-user and search services to mitigate illegal content and protect users from harm. Ofcom’s Additional Safety Measures consultation, published earlier this year, represents an important next step in translating those duties into practice.
For the first time, the regulator has proposed explicit requirements for the proactive prevention of NCII, building upon the Illegal Harms Codes that were first introduced in 2024. The consultation also proposes that NCII content should be formally recognised as illegal material within the meaning of Schedule 7 of the Act. Such a clarification would close a long-standing gap between NCII and child sexual abuse material, enabling platforms and access providers to adopt more consistent and timely responses to blocking and removing content.
SWGfL has welcomed Ofcom’s leadership in this area, particularly its focus on prevention, proportionality and victim protection. Throughout our consultation response, we have encouraged Ofcom to ensure parity between NCII and CSAM, and to continue engaging with civil society and victim services as its Codes move toward implementation.
Technology, Collaboration and Global Reach
The past two years have also seen tangible progress in the development and adoption of proactive safety technology. StopNCII.org, launched by SWGfL in 2021, remains the world’s first privacy-preserving, device-side hashing tool designed to prevent the redistribution of intimate images without consent.
To date, the platform has supported more than 740,000 individual cases and generated 1.8 million unique hashes, preventing almost 38,000 verified re-uploads of abusive content across participating platforms. This model has proven both scalable and effective, demonstrating that privacy and prevention can co-exist.
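The device-side model described above can be illustrated with a short sketch. This is a simplified, hypothetical illustration only: StopNCII.org uses perceptual hashing that tolerates re-encoding and resizing, whereas the stand-in below uses a plain cryptographic digest, and the class and function names are invented for this example. The key property it demonstrates is that the image itself never leaves the device; only the hash is shared and matched.

```python
import hashlib


def device_side_hash(image_bytes: bytes) -> str:
    """Compute a hash on the user's own device.

    The image never leaves the device; only this digest is submitted.
    Real deployments use perceptual hashes robust to re-encoding;
    SHA-256 here is a simplified stand-in.
    """
    return hashlib.sha256(image_bytes).hexdigest()


class HashMatcher:
    """Hypothetical platform-side store of case hashes.

    The platform holds no imagery, only opaque digests, so privacy
    and proactive prevention can co-exist.
    """

    def __init__(self) -> None:
        self._case_hashes: set[str] = set()

    def register_case(self, digest: str) -> None:
        # A victim (or trusted NGO) submits the digest of an intimate image.
        self._case_hashes.add(digest)

    def should_block(self, uploaded_bytes: bytes) -> bool:
        # At upload time, hash the candidate content and check for a match.
        return device_side_hash(uploaded_bytes) in self._case_hashes


matcher = HashMatcher()
matcher.register_case(device_side_hash(b"original-image-bytes"))
print(matcher.should_block(b"original-image-bytes"))   # → True
print(matcher.should_block(b"unrelated-image-bytes"))  # → False
```

In practice the matching side sits inside each participating platform’s moderation pipeline, and a perceptual hash is used so that minor edits to an image still produce a match.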
Leading technology companies have increasingly recognised the importance of this approach. Companies including Meta, TikTok and Microsoft continue to use StopNCII.org hashes within their detection and moderation systems, while other major platforms, including Google, have signalled their intent to integrate the StopNCII standard into their own tools. Ofcom has also referenced the model in its consultation materials, recognising it as best practice for tackling intimate image abuse.
Future Systems to Tackle NCII
Looking ahead, SWGfL, supported by the Foreign, Commonwealth and Development Office, is set to introduce a groundbreaking new framework that will represent the next phase in the evolution of NCII prevention.
This infrastructure will also enable trusted international NGOs to participate, expanding protection beyond the United Kingdom and ensuring that NCII can be addressed wherever it appears. It will create the first truly global mechanism for reporting, verifying and removing intimate image abuse content while maintaining the privacy and dignity of survivors.
The model will extend protection to those affected by synthetic NCII. In doing so, it will address one of the most urgent and complex emerging harms within the online safety landscape.
Rights, Remedies and Redress
While Ofcom’s regulatory framework represents significant progress, it also marks a shift in how individuals are able to challenge platform decisions. The repeal of the Video Sharing Platform (VSP) regulations has effectively removed the previous right to impartial dispute resolution, meaning that individuals now have limited routes to contest moderation outcomes or content removal decisions.
The Online Safety Act, while ambitious in its scope, does not currently provide an equivalent mechanism. As SWGfL has previously highlighted, this leaves a gap in user protection, particularly for victims of harmful content seeking swift and impartial resolution when content is not removed or when reporting processes fail.
Ensuring effective systems for redress and accountability will be essential to upholding the spirit of the legislation. As regulatory powers expand, so too must the clarity, consistency and fairness of processes that determine how individual harms are resolved.
Closing the Gaps
The progress made since the Online Safety Act became law is significant and should be recognised. Ofcom’s phased approach to consultation and implementation demonstrates both ambition and realism, given the scale of the regulatory change required.
Nevertheless, challenges remain that will need to be addressed in the years ahead. The “technically feasible” clause within Ofcom’s proposals, for instance, may require careful monitoring to ensure it does not unintentionally weaken enforcement. Smaller and mid-sized services may also need additional support to meet expectations, particularly as new forms of abuse such as synthetic NCII emerge and evolve.
SWGfL believes that these issues are best addressed through ongoing collaboration between regulators, platforms and support organisations. The direction of travel is positive; what matters now is ensuring that the pace of implementation matches the urgency of the harm.
What Will Happen Next?
Two years on, the foundations are encouraging. The United Kingdom’s regulatory framework is moving from principle to practice, and Ofcom’s forthcoming Codes of Practice will play a pivotal role in translating legislative ambition into operational impact.
There is much to commend in Ofcom’s approach so far: meaningful consultation with survivors and civil society, clear signalling to industry, and a willingness to evolve its regulatory expectations in line with technology. These are encouraging signs of a regulator finding its footing in a complex and fast-changing landscape.
The real test will come in implementation and enforcement, but for now, Ofcom should be afforded the time and space to embed these frameworks properly. SWGfL remains committed to supporting this process, sharing evidence from frontline experience and continuing to demonstrate how victim-centred technology can deliver the outcomes the Online Safety Act was designed to achieve.
Through StopNCII.org, the Revenge Porn Helpline and planned developments, SWGfL continues to show how privacy-preserving technology, victim-focused design and international cooperation can transform the way NCII is addressed worldwide.