With calls being made for the Online Safety Bill to proceed without further delay, it is clear that online safety has become a public priority.
Two recent examples highlight the widespread impact of online abuse: the airing of the BBC documentary ‘Deepfake Porn – Could You Be Next?’, which looked at the rise of intimate image abuse through deepfake technology, and the release of the Girls’ Attitude Survey by the Girl Guide Association, which revealed that one in three girls (aged 11-21) have been put off certain jobs where high-profile women are abused online.
Both reveal how online abuse affects real lives and underline the need for change to ensure everyone is safe online.
A Central Reporting Hub
Working towards that goal, SWGfL is currently developing the ground-breaking platform Minerva, which will be a central reporting hub for victims of online abuse.
Using AI technology, Minerva will take a cohesive approach to assisting users who have been affected by online abuse. This includes identifying patterns of abuse by a perpetrator, signposting users directly to support services, and assisting them with reporting and removing online abuse.
For example, a pattern of behaviour may include a user being harassed online while also being physically stalked. Minerva will connect these behaviours to the same perpetrator and give the user a more comprehensive understanding of what they are experiencing.
Building this comprehensive view of each case is an important step in online safety, as the laws around online abuse, particularly intimate image abuse, are complex and confusing, resulting in a low number of prosecutions.
Some of these complexities, such as whether victims are entitled to automatic anonymity when reporting to the police, are highlighted by Clare McGlynn KC of Durham University in the example below:
Karen’s landlord installs a camera in her bathroom and takes nude images of her without her consent. He distributes these online.
Janaya’s boyfriend takes nude images of her in the bathroom with her consent, but then distributes them online without her consent.
Both women have had images of them distributed online without their consent, but only Karen is granted automatic anonymity when reporting to the police, as the landlord may have committed a sexual offence when taking the images. Janaya’s boyfriend has potentially committed what is labelled as a communications offence which does not come with automatic anonymity, yet the harm the women experience is likely very similar.
Another complication is that while the landlord’s act of taking the images is only a voyeurism offence if done for sexual gratification, distributing the images for the same purpose is not an offence. The distribution offence is only made out on proof of a motive to cause distress.
So, if Janaya’s boyfriend distributed the images of her with the intent to cause distress, i.e. he wanted to harm her, that is an offence. But if he shared the images on his friends’ WhatsApp group ‘for a bit of fun’, there is no intent to cause distress and it is not an offence.
Likewise, if the distribution was for financial gain, it is not an offence.
It all depends on the type of offence and the type of motivation.
Different Jurisdictions
There are also different laws in different jurisdictions of the UK, for example:
- Downblousing is an offence in Northern Ireland, but not in England, Wales or Scotland.
- Distribution of deepfakes is an offence in Scotland, but not in England, Wales or Northern Ireland.
- If a boyfriend alters an ordinary image to make it pornographic and shares it online, that is an offence in Scotland, but not in England, Wales or Northern Ireland.
Commenting on the delay to the Online Safety Bill, Revenge Porn Helpline manager Sophie Mortimer said the Helpline receives reports of intimate image abuse on a daily basis.
She said, “Intimate image abuse is an issue that is becoming more widespread with millions of people around the world affected.
The Online Safety Bill has the potential to protect and support adults and children in the UK in online spaces: any delay, no matter how small, can only result in more harm caused and more reports to helplines like ours.”
Minerva is being developed by SWGfL in partnership with the Department for Digital, Culture, Media and Sport (DCMS) and funded by the Tampon Tax Fund. SWGfL is working with a team of experts, including partners and academia, to develop the pioneering platform. It will incorporate leading tech intelligence and is due to launch in Spring 2023.