White House Calls for Tech Industry to Combat Synthetic Sexual Content

On Thursday 23 May, the White House formally called on technology companies and financial institutions to shut down a growing market for synthetic sexual content (also known as deepfakes), in which artificial intelligence is used to transform someone’s likeness into an explicit image.

In the absence of federal legislation in America, the White House has called for voluntary cooperation, in the hope that private sector companies can “curb the creation, spread and monetization of such non-consensual AI images”. In a document released to the media on Thursday, the calls to action cite the need for cooperation from AI developers, payment processors, financial institutions, cloud computing providers, search engines and the major companies that control which content makes it onto their platforms.

Although companies in this sector condemn the creation, sharing and monetisation of synthetic sexual content, this call demonstrates the need for more rigorous and wide-reaching action against a devastating form of intimate image abuse that is growing at an alarming pace worldwide.

View the Call to Action from the White House

The Need for StopNCII.org

The need for decisive, proactive measures is underlined by the White House highlighting StopNCII.org as a tool that should be utilised both by tech platforms and by people who have been impacted by this form of intimate image abuse. Since its inception in 2021, StopNCII.org has grown a prominent network of participating partners, all of whom have implemented the tool’s world-first hashing technology, which incorporates Microsoft’s PhotoDNA, to prevent intimate images from being shared further. With the rise of synthetic sexual content, and the accessibility of the AI tools used to create it, we urge the mandatory adoption of StopNCII.org technology to protect adults across the world.

Furthermore, anyone whose likeness has been used in these images, or who has had intimate images shared without their consent, can create a case through StopNCII.org, enabling partner platforms to take action to remove those images.

Sophie Mortimer from the Revenge Porn Helpline (operated by SWGfL, alongside StopNCII.org) says:

“This week’s call from the White House has cited a “phenomenal acceleration” of this form of damaging non-consensual intimate imagery. This growing rate is alarming for us at the Revenge Porn Helpline as we work to combat and support those who have been impacted by intimate image abuse. We encourage this call from the White House and strongly implore that industry platforms, gatekeepers and tech companies join the likes of Facebook, OnlyFans, TikTok, Bumble, and many more to implement StopNCII technology.

The creation of synthetic sexual content has created complexities within the law, and although the call from the White House highlights the specific need for comprehensive federal legislation, the participation from tech industry companies will strengthen the global fight against this content.”

To read more from the White House and better understand why this call is so vital, please read the following article. To find out how companies can join StopNCII.org as a participating partner, please visit the StopNCII.org website. There, adults who want to prevent their intimate images from being shared across participating platforms can also find out how the tool works and whether they are eligible.

The growth of synthetic sexual content has underlined the far-reaching and devastating impact of intimate image abuse. With collaboration and determined action, however, this content can be fought, giving people the full protection they deserve.

Learn more about StopNCII.org
