Reports of Harmful Content Rise by 20% in 2024

As part of the helpline services provided by SWGfL, we are proud to launch the Report Harmful Content Report for 2024, which provides a comprehensive analysis of the trends, challenges, and successes encountered in the past year.

Download Report

A Year of Growth and Greater Demand

In 2024, the Report Harmful Content service saw substantial growth. With 6,222 cases handled (a 20.5% increase on the previous year), it is clear that demand for impartial, trusted support in navigating harmful online experiences is more critical than ever. Across those thousands of cases, we have seen a surge in people seeking guidance.

Impact That Matters

Behind each statistic lies an individual seeking help. Whether someone has been harassed, had their privacy violated, or witnessed deeply distressing content, RHC exists to provide immediate, empathetic support and clear paths towards resolution.

In 2024, we successfully encouraged industry to remove harmful content in 86% of in-remit cases, often within hours of initial contact. Notably, nearly half of all takedowns were achieved within 24 hours, a testament to the strength of our trusted flagger relationships and the dedication of our team.

Harmful Content Affecting Children

One of the most concerning insights from this year’s report is the disproportionate impact of harmful content on children and young people. Over 24% of our cases involved individuals under 18, including a troubling number of children under 12.

This underscores a key priority for us: continuing to advocate for and build stronger protections for younger internet users. At the same time, we must also recognise the needs of older users, who, although fewer in number, are not immune to the threats of online abuse.

Evolving Online Threats

The complexity of online harm continues to evolve. From bullying and harassment to the circulation of child sexual abuse material (CSAM), the breadth of harms reported reminds us that technology's impact on human wellbeing is multifaceted. What remains constant, however, is the need for responsive, knowledgeable services that can intervene effectively, whether through mediation, education, or direct platform engagement.

Small Platforms, Big Risks

Another notable trend this year has been the increasing presence of harmful content on smaller, high-risk platforms. While major platforms such as Facebook and Instagram remain common locations for reported content, 37% of the harmful content we identified occurred on lesser-known services, where moderation standards are often insufficient. As the Online Safety Act continues to shape platform accountability, we urge policymakers to account for the significant damage that can arise from these under-regulated spaces.

Looking Ahead

As we publish the 2024 RHC Report, we are reminded that tackling online harm requires more than technical solutions; it requires a collective commitment to support, education, and empowerment. We remain steadfast in our resolve to ensure that every person in the UK has access to the tools, information, and support they need to feel safe online.

Find out more at reportharmfulcontent.com.
