Synthetic Media:
Support and Advice for Schools

Synthetic media is a new form of media, and knowing how to respond to the rise of synthetically created content in schools can be complex, particularly if students are being affected or harmed by it. Throughout the education sector, developing an understanding of how synthetic media may be used can help ensure that schools, students, and staff have the appropriate safeguarding measures and support available.

There are many ways synthetic media can be used in schools, and schools should have policies and practices in place to address and respond to it, particularly where media is created to cause harm. Some examples of how harmful synthetic content may be used in schools include, but are not limited to:

Impersonation and fake media: Synthetic media can be used to create fake videos, images, or messages of students or teachers, leading to bullying or reputational damage. This could have a severe impact on the individuals affected.

Academic dishonesty: Students might use synthetic media to produce content for assignments or exams, which may distort their results and make it harder to identify work that is genuinely their own.

False accusations: Teachers or students could be falsely accused of inappropriate behaviour through manipulated videos, potentially leading to unfair disciplinary action or damage to individual and organisational reputations.

Talking about Synthetic Media

Just as technology adapts, our methods of addressing it need to adapt too. Synthetic media can be used in a variety of ways in schools, and as the technology becomes more widely understood, it may be used in ways that have a greater, and potentially harmful, impact. As students engage with tools and apps that can create synthetic media, talking to them about their use and understanding of the technology is important to help protect their online safety and the safety of others.

If your school is looking to discuss synthetic media with students, it’s important to consider the following as part of your discussions:

  • The impact it can have on students
  • The ethics behind using and creating synthetic media
  • The importance of critical thinking and digital literacy
  • The school's stance on synthetic media and AI usage
  • Where students can find support

It’s important to remember that synthetic media and the technology behind it are constantly evolving, and the ways we respond to and use this form of media will change over time. To support this, embedding digital literacy and online safety into your curriculum can ensure that students receive ongoing education about their online activity while developing the critical thinking skills they need to understand the impact of technology.

Where can I go for support in addressing online safety concerns involving synthetic media?

If there has been an incident in your school that involves an online safety concern, such as synthetic harmful content being distributed, there is support and guidance available to address these incidents appropriately.

The Professionals Online Safety Helpline is available to professionals working with children and young people who may need help trying to manage a piece of synthetic content online. In past cases, we have seen AI used to generate inappropriate images depicting teaching staff and students, which can have a profound effect on those who have been targeted.

As with any case, the helpline will review the images alongside the context of what might be happening offline and, where appropriate, will help callers report the content to the platform. As well as this practical support, the Professionals Online Safety Helpline can guide you through how you might respond to the incident with the young person involved and signpost you to further sources of support.

Responding to incidents involving AI-generated or digitally manipulated nudes and semi-nudes

With children becoming more aware of AI and how it can be used, schools are seeing more cases involving AI-generated or digitally manipulated nude images. As with any form of sexting incident at a school, there is important guidance that the designated safeguarding lead (DSL) should follow.

The UK Government has published ‘Sharing nudes and semi-nudes: advice for education settings working with children and young people’, which provides guidance on how schools and organisations in England can address sexting incidents, with advice on responding to the sharing of digitally manipulated and AI-generated nudes and semi-nudes.

Key considerations around AI-generated media include:

  • Guidance on sexually motivated incidents and on how AI-generated imagery can be used in financially motivated incidents (sextortion attempts)
  • Advising young people that laws around the sharing of nudes and semi-nudes apply to digitally manipulated or AI-generated imagery of other children and young people

Further information on safeguarding can be found in the Department for Education’s Keeping Children Safe in Education (KCSIE), which provides statutory guidance on safeguarding and child protection. KCSIE also includes information addressing child-on-child sexual violence and sexual harassment, alongside how to identify concerns and shape safeguarding strategies that take account of technological interactions and advancements.

What support is available if a young person is engaging in harmful sexual behaviour?

When responding to incidents where young people are engaging in harmful sexual behaviour, such as sharing synthetic sexual content online, it is important to understand the support and resources available to help strengthen your safeguarding approach.

Digital Literacy & Critical Thinking

Digital literacy is the ability to effectively and critically navigate, evaluate, and create information using a range of digital technologies. It supports online development and encompasses the various skills and competencies required to function safely online. Digital literacy education also develops critical thinking, enabling students to make informed judgements about what they view online.

With the use of technology and social media increasing in schools, improving digital literacy is more important than ever. In the context of synthetic media, providing education about how to understand and interpret online content helps to ensure that young people develop the critical thinking skills they need to distinguish between what is real and what may be synthetic.

While digital literacy skills cannot guarantee that someone will be able to identify synthetic media, they can help students question and understand more about what they see online, and encourage them to consider the ethics of using AI and of responding to synthetic media.