President Joe Biden and Vice President Kamala Harris have been steadfast in their commitment to addressing gender-based violence throughout their administration. A key component of this mission is addressing the risks and harms posed by emerging technologies such as artificial intelligence (AI), which has fueled a disturbing rise in image-based sexual abuse. This abuse includes the non-consensual sharing of intimate images (NCII) and child sexual abuse material (CSAM), often amplified through AI-generated content. As part of its ongoing fight against gender-based violence, the administration continues to highlight StopNCII.org as a tool to counter the increasing misuse of technology in this sphere.
The Rising Threat of AI-Driven Image-Based Abuse
The advent of AI has brought profound innovations, but it has also enabled new forms of abuse. AI can be exploited to create non-consensual intimate images, generate deepfakes, and disseminate manipulated content across the internet. This trend disproportionately impacts women, children, and LGBTQI+ individuals, with devastating effects on their safety, well-being, and ability to participate in everyday life, from school to work. The issue has become so urgent that Vice President Harris, in her remarks before the AI Safety Summit in London, emphasized the need for immediate global action to combat image-based sexual abuse.
As part of the Biden-Harris Administration's broader efforts to mitigate the risks associated with AI, a coalition of AI model developers and data providers, including Adobe, Anthropic, Cohere, Microsoft, and OpenAI, has pledged to take significant steps to prevent the creation and distribution of NCII and CSAM. These commitments are a crucial step forward in addressing the role of AI in perpetuating gender-based violence.
StopNCII.org: A Crucial Resource
Central to the fight against image-based sexual abuse is StopNCII.org, an online tool dedicated to helping any adult in the world regain control over their intimate images. StopNCII.org provides a free, globally available tool that allows individuals to proactively hash intimate images they fear may be shared without consent. These unique digital fingerprints are stored in a secure, encrypted database and used by industry platforms such as Facebook and Instagram to prevent those images from spreading online.
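The hash-and-match workflow described above can be sketched in a few lines of Python. This is a deliberate simplification for illustration only: the function names and sample blocklist are hypothetical, and production systems rely on perceptual hashes (such as Meta's open-source PDQ) so that resized or re-encoded copies of an image still match, whereas the cryptographic hash used here matches only byte-identical files. The key idea it demonstrates is that only the fingerprint, never the image itself, needs to leave the user's device.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    """Compute a fingerprint of the image. Only this hash is shared;
    the image itself never leaves the user's device."""
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical blocklist of fingerprints submitted by users.
# In practice this would be a shared, secure database queried by platforms.
blocklist = {fingerprint(b"example-image-bytes")}

def should_block(upload: bytes) -> bool:
    """A platform checks each uploaded image's fingerprint against the blocklist."""
    return fingerprint(upload) in blocklist

print(should_block(b"example-image-bytes"))  # known image: True
print(should_block(b"different-image"))      # unknown image: False
```

Because platforms compare fingerprints rather than the images themselves, survivors can pre-register images they fear will be shared without ever transmitting the images to a third party.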
The Biden-Harris Administration has supported StopNCII.org to create stronger defenses against image-based abuse. Microsoft, in particular, uses StopNCII.org's hash-based detection to delist duplicate non-consensual images from its Bing search engine. The tool serves as an essential lifeline for adults, giving them a way to fight back against the misuse of their images while providing an added layer of protection through partnerships with tech companies.
Industry Commitments and the Role of Technology Companies
In response to the White House’s Call to Action to Combat Image-Based Sexual Abuse, leading tech companies have made voluntary commitments to combat NCII and CSAM. These include responsible sourcing of datasets, developing AI safeguards, and removing nude images from AI training sets. Beyond AI model developers, companies like Meta and Snap Inc. have integrated StopNCII.org directly into their reporting systems, empowering victims to report violations more easily.
Meta, for example, has removed tens of thousands of Instagram accounts involved in sextortion scams, while also expanding its use of StopNCII.org to protect against image-based abuse. By ensuring that the tools to report, detect, and prevent such abuse are readily available, tech platforms are increasingly becoming allies in the fight to protect vulnerable individuals from exploitation.
Expanding Efforts to Curb Payment Services for Image-Based Abuse
In addition to tech platforms, financial service providers like Cash App and Square have joined the effort by curbing payment services for companies involved in image-based sexual abuse. These companies are investing in systems and partnerships designed to detect and mitigate payments tied to the creation, distribution, or monetization of abusive content. This move is critical, as many abusers profit from the spread of non-consensual images, often using anonymous payment platforms to evade detection.
By expanding their participation in industry initiatives to share intelligence about sextortion and other forms of image-based sexual abuse, these financial service providers are contributing to a multi-layered approach to preventing abuse at every step—from creation to monetization.
A Broader Ecosystem for Survivor Protection
As the fight against image-based sexual abuse evolves, it requires collaboration across multiple sectors—private companies, academic institutions, civil society organizations, and governmental bodies.
As part of this ecosystem, StopNCII.org remains a vital resource for survivors, giving them direct access to tools that can prevent further harm. The site's partnership with leading companies like Microsoft and Meta demonstrates that when tech companies collaborate with charitable organizations, they can create more robust solutions to stop the proliferation of harmful content.
A Continued Commitment to Survivor Protection
As we mark 30 years since the passage of the Violence Against Women Act, it’s clear that while significant progress has been made, much work remains in building a world free from sexual violence, harassment, and abuse—both online and offline. The Biden-Harris Administration continues to welcome voluntary actions from industry partners, while working closely with organizations like StopNCII.org to protect the rights, safety, and dignity of adults.
Standing with adults and ensuring their safety is not only a moral imperative but a public policy priority for this administration. The alignment between the White House, tech companies, and solutions like StopNCII.org shows that when public, private, and civil society sectors work together, we can create real change to combat image-based sexual abuse and protect vulnerable individuals across the globe.