The passage of the Take It Down Act in the United States earlier this year marked a watershed moment in the global response to non-consensual intimate image abuse (NCII). For the first time, federal law in the US requires platforms to remove reported NCII content within 48 hours, and to make proactive use of perceptual hashing technologies to identify and eliminate duplicates. This represents a clear recognition that swift intervention is necessary to protect victims from enduring the devastating consequences of their intimate images spreading online.
We have long supported the principles embodied in this legislation. Victims deserve rapid relief, and legislation should hold platforms accountable for their responsibilities. While some, including our partner the Cyber Civil Rights Initiative (CCRI), have raised concerns about the Act's limitations, we share the view that something is better than nothing. Just as we have welcomed the Online Safety Act in the UK as progress, while also recognising its limitations, we welcome the Take It Down Act as a step forward that provides important new protections for victims within the US.
The strengths and limitations
The Act’s requirement that content be removed within 48 hours is both right and necessary. Platforms already have the tools to respond within such a timeframe, and survivors should expect nothing less. Where the challenge becomes more complex is in identifying and removing duplicates of that same content. The Take It Down Act requires the use of perceptual hashing, a technology that can recognise altered or edited versions of an image or video. This is a significant advantage over cryptographic hashing, which only identifies exact duplicates. But the benefit comes at a cost. Perceptual hashing is far more computationally intensive, particularly for platforms dealing with vast volumes of user-generated content.
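The distinction matters in practice. A toy sketch below illustrates the principle: a cryptographic hash changes completely when even one byte of a file changes, while a perceptual hash, which summarises the visual content, stays stable under small edits. This is a deliberately simplified "average hash" over raw grayscale values for illustration only; production systems use far more robust algorithms (such as PhotoDNA or PDQ) operating on resized, normalised images.

```python
import hashlib

def average_hash(pixels):
    """Toy perceptual hash: one bit per pixel, set if the pixel is
    brighter than the mean. `pixels` is a flat list of grayscale values;
    a real system would first resize the image to a small fixed grid."""
    mean = sum(pixels) / len(pixels)
    return sum(1 << i for i, p in enumerate(pixels) if p > mean)

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

original = [10, 200, 30, 220, 15, 180, 25, 240]
edited   = [p + 5 for p in original]   # e.g. a slight brightness change

# Cryptographic hashes of the raw bytes differ completely...
sha_orig = hashlib.sha256(bytes(original)).hexdigest()
sha_edit = hashlib.sha256(bytes(edited)).hexdigest()
print(sha_orig == sha_edit)   # False

# ...but the perceptual hashes remain close (here, identical)
print(hamming(average_hash(original), average_hash(edited)))   # 0
```

The cost the paragraph above describes follows from this design: matching a perceptual hash means computing a distance against every stored hash (or maintaining specialised nearest-neighbour indexes), rather than a single exact lookup.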
Industry partners have rightly raised concerns about the burden of compliance. Removing a single file on report is straightforward; identifying and removing thousands of modified duplicates within the same 48-hour window is considerably more demanding. This is not to excuse delay, but to recognise the scale of the technical challenge. Policymakers must balance ambition with feasibility, ensuring that legal obligations are both enforceable and effective.
Building on progress
For all its progress, the Take It Down Act remains limited in scope. It applies only to victims within the United States, leaving others without recourse. It does not require the sharing of perceptual hashes across platforms, meaning the same image may be removed from one service while remaining available on another. And while it strengthens victim rights, it does not fully embed the kind of systemic cooperation that global scale requires.
This is where the next step must be taken. We have long argued that an NCII Register is the most efficient and effective means of disruption. By allowing verified NCII content to be logged once and then acted on universally, a Register provides the legal and technical foundation to block, delist, and prevent re-upload across platforms. It is precisely the mechanism required to address the gaps that both the UK’s Online Safety Act and the US Take It Down Act currently leave unresolved.
StopNCII.org already demonstrates how this can be achieved in practice. By enabling survivors to generate their own perceptual hashes on their device and share them securely with participating platforms, it offers a privacy-preserving and scalable model. Were the sharing of these hashes mandated and supported by safe harbour protections for platforms that deploy them responsibly, we could move from piecemeal compliance to a systemic global solution.
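The register model described above can be sketched in a few lines. This is a hypothetical illustration, not StopNCII.org's actual implementation: the `matches_register` function and the tiny 8-bit hashes are invented for clarity, and a real deployment would use long production hashes with carefully tuned distance thresholds. The key point is that only hashes, never images, ever leave the survivor's device.

```python
def hamming(a, b):
    """Number of differing bits between two perceptual hashes."""
    return bin(a ^ b).count("1")

def matches_register(candidate_hash, register, max_distance):
    """Hypothetical register-side check: flag an upload if its perceptual
    hash is within `max_distance` bits of any hash logged by a survivor.
    A nonzero threshold is what catches edited or re-encoded copies."""
    return any(hamming(candidate_hash, h) <= max_distance for h in register)

# Hashes submitted by survivors (toy 8-bit values for illustration)
register = {0b1010_1100, 0b0111_0001}

upload = 0b1010_1110   # near-duplicate of the first registered hash
print(matches_register(upload, register, max_distance=2))   # True

unrelated = 0b0000_0000
print(matches_register(unrelated, register, max_distance=2))   # False
```

Because every participating platform checks uploads against the same shared set of hashes, a single report can disrupt re-uploads everywhere, which is the gap the paragraph above identifies in the current legislation.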
Towards international convergence
The lesson from the US is not that perfection is possible in one legislative step, but that progress is possible and necessary. Just as the UK was among the first to recognise NCII as a distinct harm in law, the US has now established a clear framework for platform accountability. Other countries will undoubtedly follow. The opportunity lies in ensuring that these approaches converge, learning from one another’s strengths while addressing weaknesses.
The Take It Down Act shows what can be achieved when governments act with urgency. But urgency must be matched with sustainability. A future-proof solution will require international collaboration, technical standardisation, and the embedding of mechanisms like the NCII Register into law. Anything less risks leaving victims at the mercy of fragmented systems, where protections depend on geography, platform size, or enforcement capacity.
Conclusion
We applaud the progress made in the United States. We applaud the progress made here in the United Kingdom. But victims need more than applause. They need action that is consistent, comprehensive, and cooperative.
The path forward is clear. Take down reported content swiftly. Detect and remove duplicates. Share perceptual hashes through StopNCII. Build a global NCII Register that empowers enforcement and disruption across borders. Provide safe harbour for those platforms that adopt these measures in good faith.
The Take It Down Act is a milestone, but it is not the destination. The destination is a digital ecosystem where non-consensual intimate image abuse has no place to persist.