Revenge Porn Helpline Warns About the Continued Threat of Nudification Apps Despite New Legislation Tackling “Deepfake” Imagery

The Revenge Porn Helpline welcomes the UK Government’s commitment to strengthening criminal law to address the growing harm caused by non-consensual synthetic sexual imagery. We recognise that the proposed measures to criminalise the creation and requesting of synthetic sexual imagery (“deepfakes”) represent an important, although delayed, acknowledgment of the scale and seriousness of technology-facilitated abuse.

However, while we are encouraged by the direction, we remain concerned about whether the proposed legal definitions and scope of the new offences will address the gaps that currently remain, particularly around nudification apps and tools.

Gaps Will Remain

Synthetic sexual image abuse is evolving rapidly, as evidenced by current media exposure and increasing reports to the Helpline. Perpetrators consistently adapt their behaviour to exploit legal and technical loopholes, and there is a real risk that the lack of clarity around the ban on nudification apps and tools will allow harm to continue.

The first of these issues is the provider’s perspective. The recent controversies surrounding Grok illustrate the scale of this problem. It is expected that AI tools capable of “undressing” images will still be available. Because the proposed ban applies only to apps and tools that are “designed exclusively” for nudification, there is concern that providers will be able to claim that nudification is not a primary feature in order to remain available. The harm caused by these tools will not change, but under the current framing, many providers would still fall outside the scope of the law.

How Will This Be Policed?

The second issue is the behaviour of perpetrators wanting to use these tools to create synthetic sexual imagery. While this new legislation criminalises the creation, or the requesting of the creation, of this imagery, we are seeking clarification on how it will be enforced. From our frontline experience supporting victims of intimate image abuse, the success of this legislation will not be measured by its wording alone. It must be judged by tangible outcomes for victims, as well as by platforms joining the StopNCII.org initiative and treating synthetic sexual imagery the same way as genuine NCII.

For the Revenge Porn Helpline, meaningful proof of success will be seen in:

  • A clear increase in the speed and volume of takedowns of non-consensual synthetic sexual material
  • A demonstrable rise in investigations, enforcement action and prosecutions, where perpetrators and the platforms that allow such content to be created are identified and held accountable

Without strong enforcement mechanisms, there is a risk that perpetrators will continue to find ways to cause harm while survivors remain unprotected.

Sophie Mortimer (Revenge Porn Helpline) said: “While we welcome the Government’s commitment to criminalising the creation and request of non-consensual synthetic sexual imagery, the reality is that perpetrators will continue to exploit gaps unless the wider harm is addressed. Tools that generate this content will not disappear overnight, and without clear, robust enforcement and accountability for both perpetrators and the platforms that enable this harm, many survivors will still be left without protection. For us, the true test of this legislation will be whether it leads to faster takedowns and an increase in investigations and prosecutions. Only then will we begin to see meaningful change.”
