The Professionals Online Safety Helpline provides free support to members of the children's workforce in the UK with an online safety concern. In this blog Kat Tremlett, Helpline Practitioner, offers an insight into the type of cases the helpline has been dealing with over the last 3 months.
The categories under which we record online issues changed as of 1st January this year, and we have just submitted our first report to Insafe using them. The headline stats from this report were that 20% of all contacts were about potentially harmful content, 20% about online reputation, 16% about data privacy concerns and 15% about questions to do with media literacy.
A large portion of the potentially harmful content cases during this quarter led back to historic or current allegations of abuse. Sadly, these are becoming more common, and knowing how to respond in the correct way can seem like a bit of a maze for designated safeguarding leads (DSLs).
The complexity is magnified tenfold when you add into the mix the fact that every Local Authority (LA) has a different protocol for how to respond to a report of an allegation. Furthermore, in the cases we've been responding to recently, the Local Authority Designated Officer's (LADO's) handling of an allegation can vary dramatically depending on whether it comes from an online source rather than verbally or via the police.
It's vital that any allegation of abuse is taken seriously regardless of how it is made or discovered (i.e. online, verbally, via the police) and that, if the allegation is serious, at the very least the LADO calls a strategy meeting to discuss further action.
Lack of resources for young people with Special Educational Needs and/or a Disability (SEND)
The majority of cases about media literacy were requests for online safety resources aimed at young people with SEND. There is a real lack of resources and understanding in this area in the UK, and little support for professionals trying to help these young people. In particular, professionals need guidance on how the social cues of a young person with SEND can differ from those of other young people their age, making them more vulnerable online. In reality this comes back to the age-old argument that it is the behaviour, rather than the technology itself, that needs to be worked on.
A handful of cases highlighted young people under 16 engaging in risky sexual behaviours online. One of our helpline practitioners discovered that the hashtag #Paypalme is being used to solicit sexual acts or the sending of indecent images when money is paid into the account holder's (often under 16) PayPal account, raising clear child sexual exploitation (CSE) concerns.
Sugar Daddy style dating apps such as Seeking Arrangements are also being used by some under-16-year-olds as a way of earning money. These behaviours are extremely risky, placing young people, potentially under the age of consent, in a very vulnerable situation. Safeguarding processes should be followed if they come to light.
The Game of the Month
Fortnite – The helpline has received several calls about this game recently in the wake of scaremongering articles about it in the national media and well-intentioned safeguarding messages sent out from local authorities to schools across the UK. It is rated PEGI 12 and does contain a lot of cartoon-style violence, but this shouldn't really be a problem if younger players are supervised well. The general problems we see with these types of games are not associated with the game itself but rather with the online chat that can be turned on, allowing younger users to potentially speak to strangers. We advise that parents supervise any game play where this functionality is enabled, and would remind them that the online chat function can be turned off from within the gaming device, in this case most likely Xbox Live. Our friend Andy Robertson gives parents all the information they need to know about the game here: www.askaboutgames.com/parents-guide-to-fortnite-pegi-12/ .
Do you have an online safety issue?
If you're a member of the children's workforce requiring assistance with an online safety concern, please contact the Professionals Online Safety Helpline for further advice.