In my role as Harm Reduction Officer with South West Grid for Learning (SWGfL), I’m lucky enough to work with our partners in the UK Safer Internet Centre (UKSIC) – Childnet International and the Internet Watch Foundation (IWF).
I’ve been working with the IWF since 2002. Back then I was a Police Officer working on the Internet Child Abuse Unit in the Avon and Somerset Constabulary. You can’t help but admire the team at the IWF. The work they do in battling to remove child sexual abuse imagery from the internet is truly amazing. Thanks to them, just 0.2% of the world’s known online child sexual abuse images are hosted here in the UK. Back in 1996, when the IWF was set up, that figure was approximately 18%.
Getting up to speed
Because it’s my job to educate multi-agency staff and police officers on preventative practices and harm reduction for young people online, I spend a lot of time in schools promoting the work of the IWF. I knew that the IWF had recently announced it could now proactively seek out online images and videos of child sexual abuse, instead of just handling incoming public reports. Not only this, but it had introduced new technology giving its analysts more advanced ways of identifying and removing these illegal images.
So, I asked if I could pay them a visit. I wanted to spend some time with the analysts, getting up to speed on the trends they’re seeing and getting a better grasp of the issue for the benefit of the staff and officers I talk to every day.
Arriving at the Hotline
When I arrived on Monday morning, I found a group of dedicated men and women sat at pod-type workstations, assessing and grading images and videos of child sexual abuse with their morning cuppa. They tell me that Monday mornings are the busiest part of the week. That’s because they need to work through the backlog of reports received from the public over the weekend.
Reporting images to the IWF can be done anonymously, or the reporter can provide their email address so they can request an update on what’s happened with their report. Last year (2015), these guys received 68,092 reports that were confirmed as child sexual abuse URLs (website addresses). That’s a huge 118% increase on the previous year (2014). Sadly, this is an issue which, despite the best efforts of the IWF and the wider INHOPE network, isn’t going away anytime soon. But, there’s hope yet…
New powers and new tools
Firstly, since April 2014, the IWF analysts have been able to proactively search for images of child sexual abuse online. Using intelligence from public reports and experience in assessing these images, they’re bringing this stuff down faster than ever. The team is currently 13 strong, but with the volume of content out there, they could no doubt do with an even bigger team.
The analysts told me that a lot of the time, they see the same images every day. The IWF strongly feels that each time a child’s image is uploaded, shared and viewed, that victim is re-victimised again and again. Until recently, there’s been very little anyone could do to stop this from happening.
With Facebook, Google, Microsoft, Twitter and Yahoo!, the IWF has developed a ‘game-changing’ service which stands to eliminate online child sexual abuse imagery like never before – the IWF Image Hash List. There was a real buzz in the Hotline around the newly rolled-out Image Hash List. In short, it works like this: an image of child sexual abuse is given a unique code, known as a ‘hash’, which is added to the hash list. By running the list on their services, the IWF’s 128 Member companies worldwide can now automatically identify and remove known images of child sexual abuse, matched against the codes generated by IWF analysts.
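To make the idea concrete, here is a minimal sketch of hash-list matching in Python. It uses a simple cryptographic hash (SHA-256) purely for illustration; the IWF’s real list also includes perceptual hashes such as PhotoDNA, which can match visually similar copies of an image rather than only byte-identical ones. The blocklist and image bytes below are entirely hypothetical.

```python
import hashlib

def image_hash(data: bytes) -> str:
    # A cryptographic hash produces the same 'fingerprint' for identical
    # file bytes, so a known image can be recognised without storing or
    # viewing the image itself.
    return hashlib.sha256(data).hexdigest()

# Hypothetical list of known hashes, standing in for the IWF Image Hash List.
known_bad_hashes = {
    image_hash(b"hypothetical-known-image-bytes"),
}

def should_block(upload: bytes) -> bool:
    """Return True if an uploaded file matches a hash on the list."""
    return image_hash(upload) in known_bad_hashes
```

A service checking uploads would call `should_block()` on each incoming file and refuse (and report) any match, which is essentially how Member companies can automatically identify known imagery at scale.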
What a fantastic way to clamp down on the circulation and distribution of, and access to, these horrific child sexual abuse images. It feels like the balance is tilting back our way. It feels like we’re bringing a much stronger fight to those who upload, distribute and view these images.
One of the more complex issues we deal with across the UK Safer Internet Centre is ‘youth produced sexual imagery’. Adults commonly call it ‘sexting’, but young people tend to refer to the images simply as ‘nudes’. I won’t take up time here discussing the myriad problems that regularly arise from this recent trend, but the growing issue of ‘private’ images being shared beyond the intended recipient is having a devastating effect on boys and girls in schools and communities up and down the country. I look forward to learning how we can employ the hash list to stop further distribution of these images in the future.
Thanks to the Hotline team
What I took away from a great couple of days was a sense of the quiet determination and competence of a small group of people doing a thankless but fantastic job. I owe a big thank you to all of the individuals who took time out of their daily tasks to lead me through some of the methodology and issues. In particular, I’d like to thank the Hotline Manager for facilitating the two days and giving me so much time to sit and learn from his team of analysts.
If you want to know more about the IWF and help spread the word, check out their most recent Annual Report. It makes for an interesting read (rest assured, it’s definitely not boring).