A Sit Down With Report Harmful Content

We spoke with Kathryn Tremlett (Manager of Report Harmful Content) about the importance of the service and the impact it has had since it first began. We discuss how the helpline has developed, how it works to support victims of online harm, and the trends revealed in the annual report due to be released later this week.

Many of us know what Report Harmful Content is, but for those who don’t, could you give us a brief outline?

Report Harmful Content is a national service that helps anyone report harmful content online. It provides advice on how to report harm to industry platforms, including links to the correct reporting channels, and offers mediatory support where the correct reports have already been made to industry. Having received a report, Report Harmful Content will either explain and justify the provider's reply or provide further assistance and representation to have the harmful content actioned as required. Report Harmful Content:

  • Provides information on community standards
  • Directs victims to provider reporting mechanisms
  • Provides recourse for victims/witnesses
  • Provides mediatory support encouraging industry to take action against 8 types of harmful content online

What was the basis for deciding to launch this reporting platform?

Report Harmful Content is the first service globally to provide cross-platform recourse for victims of legal but harmful content online. Our experience in running three separate helplines tells us that harm online rarely happens in isolation. Instead, it's calculated, coordinated and often unrelenting across a multitude of different platforms. Until Report Harmful Content's creation, victims of online harm had to navigate multiple websites and industry safety centres to find out how to report and seek recourse for harmful content online.

The internet is not just for children, and there is a distinct lack of support services available for adults experiencing or witnessing harm online. Report Harmful Content's main purpose is to help everyone report harmful content online. We are the only service in the UK that offers mediatory support to adult members of the public needing assistance with harmful but legal content online. Until now, victims of online harm were unable to get this kind of support, which caused a great deal of distress and concern.

How has it developed over the years?

The idea for this national service dates back to 2011, around a coffee table with a few industry colleagues. Fast forward to 2018, when the government's drive to regulate online spaces was set out in its green paper consulting on an online safety bill. The time was right, and we collaborated with stakeholders, industry and the Department for Culture, Media and Sport to develop a platform to support victims of online harm. Throughout 2018, Report Harmful Content consulted with focus groups and gathered user research to determine next steps and how we could improve the site. We went on to refine the website and launched into a pilot phase at the end of 2018.

After a year of operating in pilot phase, Report Harmful Content launched to the public in December 2019. Shortly afterwards, we found ourselves working amidst a global pandemic with the majority of the UK population accessing almost everything via the internet.

Can you explain how Report Harmful Content helps people who have already submitted reports to industry?

Where people have used the correct reporting routes with industry but haven't had the outcome they hoped for, our mediatory role comes in. Report Harmful Content will check submitted reports and industry responses against platform-specific community standards in order to provide users with further explanation and advice on actions they can take. Our relationships with industry platforms are unique, and we work together to ensure that correct platform reporting mechanisms are utilised where possible before we act in this capacity.

The Report Harmful Content website provides a plethora of advice about all manner of harms online and how to report these on commonly used industry platforms. This is tailored to the needs of our clients and is regularly updated and refreshed based on insights obtained from our helpline services and behavioural trends.

Where we act in a mediatory capacity, we encourage industry to take action on over 90% of reports. This doesn't just involve removing harmful content, but also helping victims of harm regain access to accounts and encouraging industry to apply sensitivity filters to content that is unsuitable for a younger audience.

We are not a traditional helpline service, in that we do not communicate with victims via telephone. All our support is offered online, ensuring that everything is logged and responded to efficiently. Technology has been vital in how the helpline operates, as have our unique industry partnerships.

Part of the website includes advice on relevant legislation and online harms, which has been developed and updated in line with changes in behavioural trends and insights observed from reports. Our online safety expertise is at the heart of our advice and guidance and will shape future development.

What impact has the service had, and what trends have you seen in reports?

Report Harmful Content's latest annual report, "Through These Walls", is due to be released later this week and analyses data from January to December 2020. The report shows that the Report Harmful Content website received 17,406 visitors and practitioners dealt with 644 unique cases, a 292% rise on the previous pilot year.

Cases involving bullying and harassment were most common, followed by pornographic content, abuse and impersonation. Report Harmful Content also found that online harassment disproportionately affected women and was often perpetrated by ex-partners.

Three common trends were identified:

  • A cluster of domestic abuse, coercive control and harassment issues. This trend disproportionately affected women and, in a quarter of cases, involved intimate image abuse as an additional harm
  • A 255% rise in reports involving a wider issue of hate speech; most of these had harassment or abuse as the primary issue type
  • Young males actively searching for harmful content and reporting it; pornography was the only harm predominantly reported by males

What plans are there for the future?

The insights we have gained from running Report Harmful Content have helped shape new services for customers, including a downloadable button that organisations can install on their websites. The button gives their online communities easy access to all commonly used platform reporting mechanisms, and it can link directly to Report Harmful Content if a user experiences or witnesses harmful content online. We aim for this service to be a staple part of online reporting and, from the engagement we've seen so far, it is encouraging to see it being used across a wide demographic.
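
For illustration only, here is a minimal sketch of how an organisation might add such a button to its own pages; the function name, element id and URL below are assumptions made for this example, not the official embed code supplied by Report Harmful Content.

    // Hypothetical sketch (TypeScript, browser DOM): adds a simple link-style
    // button pointing to the Report Harmful Content website.
    function addReportButton(container: HTMLElement): void {
      const link = document.createElement("a");
      link.href = "https://reportharmfulcontent.com"; // assumed public site URL
      link.textContent = "Report Harmful Content";
      link.target = "_blank";
      link.rel = "noopener noreferrer";
      container.appendChild(link);
    }

    // Example usage on a host page (the element id is hypothetical):
    // addReportButton(document.getElementById("site-footer")!);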

As we work hand in hand with industry, we onboard many of these companies as industry partners, including social media sites and dating and gaming platforms. We have already partnered with 24 globally recognised industry platforms and look to build even more partnerships in the years to come.

Why is Report Harmful Content so important for the digital age?

At a time when the government has pledged to make the UK the safest place in the world to be online, this pioneering approach is exactly what is needed and will help with the upcoming regulation of the internet through the Online Safety Bill. Government and Ofcom policy makers are taking note as they recognise the value of Report Harmful Content in this space.

Report Harmful Content is one of SWGfL's three helplines: the only services in the UK with unique industry relationships that enable effective removal of content, with a success rate of over 90%.

We believe no one should suffer the consequences of harmful content online and Report Harmful Content exists to help realise this.
