Since the Online Safety Act received royal assent in October 2023, it has attracted significant media attention around the impact it will have on our online safety and its potential to make the internet a safer place. With so much discourse around what the Online Safety Act will do, who it will affect, and how it will work, it can be hard to understand how effective the Act will be in addressing the concerns of adult victims of intimate image abuse (IIA).
What We Know So Far
It’s still too early to say exactly how the Online Safety Act will impact victims/survivors of IIA. Whilst the Act is technically now law, the implementation process is still ongoing, meaning Ofcom (the Act’s regulator) is still developing the guidance and codes of practice that will govern how online platforms must identify and remove harmful and illegal content.
Ofcom has been working with key stakeholders, including the tech industry, charities, and campaigners, and this work will heavily shape those guidelines and, consequently, the protections available to adult victims of IIA.
There are several categories of harmful content that online platforms will have to address, such as child sexual abuse material (CSAM), extreme pornography and terrorist content. Non-consensual intimate images (NCII) will also be covered by the Act; however, the protections for NCII are currently not as robust as those for CSAM.
The Legal Status of NCII Under the Act
Whilst it is now illegal to share intimate images without consent, it remains legal to possess NCII, which means that Internet Service Providers (ISPs) cannot currently block access to the content. As a result, adult victims of intimate image abuse who discover their content on a site that refuses to comply with a removal request have very limited options to get the images blocked or taken down.
In comparison, it is illegal to possess, share or create CSAM, and online platforms are required to have robust systems in place to detect and remove it. This ensures that CSAM can be removed widely and, where it cannot be removed, blocked by Internet Service Providers.
At the Revenge Porn Helpline, we firmly believe a similar level of protection is needed for adults affected by intimate image abuse.
The Impact of the Act
Right now, the Online Safety Act does not protect adults who have had their intimate images shared on websites that refuse to remove them.
Because the Act fails to provide strong or effective measures to prevent the spread of NCII, we have seen first-hand how victim-survivors of this criminal form of abuse often feel isolated and devastated by the potential damage that circulated images could have on their futures. Robust regulations within the Online Safety Act would have a significant and positive impact on the lives of many.
What Next?
At the Revenge Porn Helpline, we are campaigning for stronger protections for all adults in the UK who have had their intimate images shared without consent. We believe it is time to ensure that non-consensual intimate images are classified as fully illegal so they can be blocked and removed across the UK, breaking the relentless cycle of abuse once and for all.
Currently, the Revenge Porn Helpline is aware of over 30,000 intimate images that cannot be removed online due to issues with the law, international boundaries and non-compliant websites. This needs to change.
We believe that an NCII Register would allow specific non-consensual videos and images to be classified as illegal through a court process, strengthening the ability to remove content and allowing internet service providers to block further access to NCII.
Whilst Ofcom’s current powers include fining and disrupting these non-compliant independent sites, the measures in the Online Safety Act are designed around wider non-compliance, not the failure to remove individual pieces of content. This means a site would have to ignore requests repeatedly before triggering the enforcement process, which can take months, and the ultimate sanction of blocking is reserved as a last resort.
Due to the nature of NCII, it is vital that action is taken swiftly to remove this content and prevent its wider spread.
What Are We Doing to Help?
The Online Safety Act represents a significant step forward in preventing online harms; however, it is clear that more still needs to be done. At the Revenge Porn Helpline, we continue to campaign for robust and effective protections for NCII victim-survivors. We will continue to work alongside the Government and Ofcom to advocate for stronger legislation and regulation to stop NCII within the framework of the Online Safety Act.
The Revenge Porn Helpline has successfully removed over 90% of the NCII reported to us, and we are determined to ensure that more can be done to remove and block the remaining 10% that we have not been able to take down.
If you need support with non-consensual intimate image abuse, our Helpline remains dedicated to assisting all adults in the UK who have been affected. You can access guidance by calling 0345 6000 459 or emailing help@revengepornhelpline.org.uk.