Privacy vs Safety - The Conflict

This blog was written by Carmel Glassbrook. The views expressed in the blog are Carmel’s and do not reflect the wider views held by the South West Grid For Learning Trust.

Privacy is the number one priority for Facebook and its associated platforms moving forward. In his blog, Mark Zuckerberg outlines his vision for a future of ‘privacy-focused social networking’, admitting that:

frankly we don't currently have a strong reputation for building privacy protective services.

One of the solutions for privacy is encryption. For Facebook, the initial plan is to encrypt its messaging services ‘end-to-end’.

In very basic terms, this means that the only people able to read your private conversations are you, and whomever you are talking to. It means that even the platform hosting your conversation will not be able to see it.
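For readers who like to see the principle in code, here is a minimal illustrative sketch using the open-source PyNaCl library. This is my own choice for the example, not what Facebook uses; real messaging apps such as WhatsApp rely on the far more sophisticated Signal Protocol. The underlying idea is the same, though: private keys never leave each person’s device, so anything sitting in the middle, including the platform itself, only ever handles unreadable ciphertext.

```python
# Illustrative sketch of end-to-end encryption using PyNaCl.
# Not a real messaging implementation, just the core idea.
from nacl.public import PrivateKey, Box

# Each person generates a key pair on their own device.
# Private keys stay on the device; only public keys are shared.
alice_private = PrivateKey.generate()
bob_private = PrivateKey.generate()

# Alice encrypts a message that only Bob can read.
alice_to_bob = Box(alice_private, bob_private.public_key)
ciphertext = alice_to_bob.encrypt(b"See you at 8?")

# The platform only ever relays this ciphertext; without one of the
# private keys it cannot recover the message.
print(ciphertext.hex())

# Bob decrypts on his own device using his private key and Alice's public key.
bob_side = Box(bob_private, alice_private.public_key)
print(bob_side.decrypt(ciphertext))  # b'See you at 8?'
```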

Encryption is like a safe house with no doors, windows or codes to get in. Some safety experts want there to be a back door into this house so that, if law enforcement need to, they can see what’s happening inside. The problem is that once that back door is installed, the whole house becomes vulnerable. Inevitably, someone with the skill and the time will find a way to unlock the back door; then they will have the opportunity to see what’s going on, invite others in and share what’s inside with anyone.

Building an entry for ‘authorized’ people is not a bad idea in principle, but it is highly unlikely that such a function would not be abused by others. There is also a question of corruption and of who counts as ‘authorized’. In the UK you may be quite comfortable in your trust for the police, but in other countries where corruption is rife, how could an online platform like Facebook ensure that it is handing the keys to someone with all the right intentions, someone who is not going to use what they find to falsely imprison, stalk, abuse or harass others? “Well”, you might say, “just encrypt for those countries and not ours?” Facebook is a global platform; you cannot encrypt one side of a conversation and not the other. It just doesn’t work. If we want to continue to connect globally and chat with people in other parts of the world using these platforms, we all have to fit under the same rules.

Why?

It's really easy to sit in our western privilege and think, “but why would I ever need this level of privacy?” Again, it may be easier to think globally. Think of the young LGBTQ+ person living in central Russia using these platforms to connect with other LGBTQ+ people, try dating and explore their sexuality. For them, the implications of someone seeing what’s in their inbox could be life-threatening.

Back in the UK there are other privacy concerns around this data, and quite rightly: if I chat to my friend on Messenger and say “I really love your new Doc Martens”, I should NOT then see an advert for Doc Martens in my news feed. Targeted advertising may not seem like the biggest abuse of privacy, but the implications are much wider. The Open Rights Group have done a considerable amount of research and can explain all this in much greater detail.

Here Comes the Conflict

Safety. How does one balance privacy and safety? So far, I’m not sure anyone knows. Some will argue that having this increased level of privacy makes every user’s experience safer. Others will argue that not having a link for law enforcement puts more people at risk, especially children. I attended a hackathon at Facebook in London in the last quarter of 2019, where we were tasked with coming up with ideas to marry the two. The problem we focused on was the grooming of children through Messenger.

We know that sex offenders go to the places where children are, online just as they do offline. We also know that grooming happens via messaging, and yes, whilst children are not really on Facebook itself, they are on Instagram and WhatsApp, both owned by Facebook.

Without encryption, a child could report grooming happening within Messenger (in a perfect world*). The police would then investigate, look at the messages the child reported and decide that, yes, there is a problem with this person. They could then, with the right evidence, submit a RIPA request to Facebook to look at that person’s other messages, gather more evidence, safeguard more children and prosecute. With encryption, they would not be able to investigate that far into the messages. If the initial child was astute enough to take screenshots of the messages, then there is some evidence the police could use, but this puts the onus on the child or the victim to a) make a report and b) provide evidence.

* But this is not a perfect world, and even without encryption the likelihood of the police going to these lengths is slim. Read this article to understand more about the RIPA process and why it’s not as easy as it sounds: Schools, Fake Accounts and RIPAs - What you can do

So, How Can Platforms Find the Right Balance?

At hackathons, lots of ideas are thrown about. A lot of people are still calling for a back door, or for encryption not to happen at all, but sadly this feels a bit like ‘locking the stable door after the horse has bolted’.

For me, the key is education. Platforms like Facebook do have some really good reporting features, but do you know how and when to use them? Probably, because you’re reading this article, but does your 13-year-old niece? Does she understand what grooming is, what it will look like and how it might feel? Are young people confident enough to trust their own instincts, and then strong enough to wave a flag and let someone know when something isn’t right? For some, maybe; for most, maybe not, and therein lies the vulnerability of childhood. These skills tend to be something we pick up as we move through life. For some, 13 years on this planet won’t have taught them that yet. For others with severe difficulties or disabilities, this may never come. Is there a social responsibility for online platforms to try to educate their users, especially the younger ones? Maybe.

But we can’t expect them to. If we expect this and they fail, who is really to blame?

Facebook, and likely others to follow, are really pushing the privacy angle, and at the moment the focus is on messaging. But will it go any further, and if so, why? Is it really for user safety?

This conversation isn’t going away anytime soon, and it’s not as though there are many choices either. We all care about our privacy and, despite some calling for it to end, encryption is going to happen and indeed already is. We have to learn and adapt. Having an encrypted service does not make it unsafe; it just means we have to develop some different tools. RIPA doesn’t really work as it is, so what are we actually losing?

What’s in the Pipeline?

The government has now announced it is "minded" to grant new powers to Ofcom which would allow the regulator to sanction social media companies that do not comply, though it is still very unclear what those regulations will look like and what kind of sanctions Ofcom will be able to impose. Maybe this allows scope for change and for setting boundaries around what information platforms can, and have to, share in order to safeguard users. This time of change is also reflected in the Online Harms White Paper, which will hopefully push some change in this area.

One of the things we would really like to change is the process for law enforcement to obtain important data and information from social networks; we would like this to be much more streamlined and effective. We hear from the law enforcement teams within these organisations that the information they have available to share is plentiful, but the laws restricting our law enforcement agencies from asking for it mean that a lot goes under the radar. The RIPA process can take months to complete, by which time it may be too late. If this were a simpler, more efficient process, our police would have a better chance of acting on crimes when they happen.

We must work to safeguard and protect users online, especially children. Privacy is a double-edged sword: it will offer safety in one respect, but could also impede it in another by obstructing investigations and the sharing of information needed to safeguard. The government is going to have to work with NGOs and the regulator, Ofcom, to strike that balance.

This will take some time and will be a process, but it’s a step in the right direction. Self-regulation in this industry has proven to be flawed; even Mark Zuckerberg admits that Facebook should not be making its own rules and that “we need a more active role for governments and regulators”.

So, watch this space and chuck your two pennies in where you can. It may feel like a massive challenge and it is, but it’s kind of exciting to be at the start of this journey and see how it unfolds. The technology revolution is well underway and we are just some of the first players.
