
Responding to AI misuse and “Undressing” Images on Social Media
We are increasingly seeing reports from individuals affected by the misuse of artificial intelligence on social media, particularly where AI tools have been used to create synthetic sexualised images of people, or to “undress” them using their publicly available photos. These images are not real, but the harm they cause is very real, and this has become an evolving area of intimate image abuse. If you have been affected by this form of abuse, it is important to know that support is available.

Understanding What’s Happening

AI-generated sexual images are often created without consent and shared online to humiliate, control, or harm someone. Perpetrators may also create them for personal use or share them among peers, meaning that those targeted often do not realise these images exist. This can include:

  • Synthetic nude or sexual images created from everyday photos
  • Images shared on social media, messaging apps, forums, or websites
  • Threats to share these images unless demands are met

Critical Legislative Gaps

Despite increasing public awareness and government commitments, there are currently significant gaps in the law when it comes to AI-generated sexual images of adults.

The UK government has legislated to criminalise the creation of synthetic sexual content through provisions in the Data (Use and Access) Bill. However, at the time of writing, this offence has not yet been commenced, meaning people affected by AI-generated sexual images have no redress from the very law that has been publicly referenced as addressing this harm. Questions have repeatedly been raised in Parliament about when these protections will come into force, but clear timelines have yet to be provided.

Similarly, while the government’s Violence Against Women and Girls (VAWG) Strategy includes commitments to ensure the removal of intimate images shared without consent and to prevent AI from facilitating abuse (for example, through “nudification” apps), these commitments have not yet translated into effective protections or enforcement mechanisms for people affected by synthetic sexual images.

This gap is also reflected in platform community guidelines. Social media companies often state that they are committed to removing illegal content. However, AI-generated sexual images of adults are frequently deemed legal, even when they are clearly abusive and non-consensual. This can result in inconsistent or delayed action, leaving those affected without protection.

As public discussion of this issue continues, it is clear that stronger, more clearly defined legal protections are needed. Until the law fully catches up and delivers on its promises, people affected by AI-generated intimate image abuse are often left relying on platform policies, reporting mechanisms and third-party support rather than on the laws that should protect them.

What to Do If This Has Happened to You

1. Report the content to the platform

Most social media platforms will have policies against intimate image abuse, including AI-generated sexual images.

  • Report the image directly using the platform’s reporting tools
  • Clearly state that the image is non-consensual and AI-generated
  • If possible, include links, usernames, and screenshots (but avoid repeatedly viewing the image if this is distressing)

If you are unsure how to report content or your report is not acted on, there are other options available.

2. Contact the Revenge Porn Helpline

If you are based in the UK, you can contact the Revenge Porn Helpline for free, confidential advice and support.

We can help you:

  • Report and request removal of non-consensually shared intimate images
  • Advise on the situation and provide additional signposting where required

If you are not in the UK, one of our 122 StopNCII.org partner NGOs around the world may be able to support you – please visit the StopNCII.org Global NGO Network page.

3. Use StopNCII.org for prevention

If your images are at risk of being shared across multiple platforms, StopNCII.org can help.

StopNCII.org allows you to:

  • Create a secure digital fingerprint (hash) of the image immediately
  • Prevent the image from being shared across participating platforms
  • Take action from your own device without uploading the image itself

This tool is free, privacy-preserving and available to anyone over the age of 18, anywhere in the world. If you are under 18, visit Take It Down.

You Are Not Alone

No one deserves to have their images misused, altered, or sexualised without consent, whether by a person or by AI. Being targeted in this way is a form of abuse.

If you are feeling overwhelmed or unsure what to do next, please remember: you are not alone, and help is available.

Sophie Mortimer Manager of the Revenge Porn Helpline: “We are seeing an increase in cases where AI is being used to create sexualised images of people without their consent, including so-called ‘undressing’ images created from everyday photos online. These images may be synthetic, but the distress experienced by those targeted is very real. At present, many adults affected by this abuse are falling through critical gaps in the law, despite the Government announcing that action is being taken. Until robust enforcement is put in place, abuse like this will continue.  Anyone affected should know they are not alone, and that through the Revenge Porn Helpline and StopNCII.org, we can support in getting images removed.”
