SWGfL welcomes this week’s announcement that social media companies will soon be legally required to prevent the promotion of self-harm on their platforms. Under the new rules, platforms will be obliged to block such harmful content before it appears, rather than relying on its removal after publication.
This important shift brings self-harm material into line with existing requirements around suicide-related content under the Online Safety Act. The government’s decision responds to growing calls to ensure young people are protected from harmful and coercive online material.
The Impact of Harmful Content
Research and testimonies have previously highlighted how exposure to self-harm content can increase vulnerability and the risk of harm, particularly for children and young people. In some cases, as tragically demonstrated by the death of Molly Russell, the consequences can be devastating. The threats posed by online grooming and coercion into self-harm, highlighted earlier this year by the National Crime Agency, have also shown how dynamic and predatory these risks have become. Effective regulation must be proactive, adaptive, and uncompromising in protecting children from evolving harms.
David Wright CBE, CEO of SWGfL, said:
Preventing harmful content before it reaches users is a necessary step towards stronger protections against this type of material. But these commitments must be matched with robust enforcement. It is not enough for the law to demand change; platforms must be held to account for how they respond, and the Online Safety Act must be equipped to ensure promises translate into real protections. Without enforcement, the risk remains that harmful content will continue to slip through the cracks.
At SWGfL, we support this significant progress but will continue to emphasise that only through consistent enforcement, strong oversight, and genuine accountability will children begin to feel safer in these digital spaces.