For many of us, children and adults alike, the opportunities provided by internet services are wide and varied. We can communicate easily and quickly across the globe, share content, and access products and services that would previously have been difficult, or impossible, to reach. We achieve this by using internet-connected devices and platforms that many of us fail to adequately understand. As adults, do we read and understand the terms and conditions of service that many apps and devices require us to agree to before using them? Evidence suggests that many adults don’t; in fact, many of those terms and conditions are difficult for anyone to read. If they are so difficult for an adult to understand, how can we expect a 13-year-old to read and understand them? It is for this challenge, among others, that the Data Protection Act 2018 included section 123:
“The Commissioner must prepare a code of practice which contains such guidance as the Commissioner considers appropriate on standards of age-appropriate design of relevant information society services which are likely to be accessed by children.”
The Information Commissioner’s Office (ICO) this week released its draft code, ‘Age appropriate design: a code of practice for online services’, for consultation until 31st May 2019.
Here at SWGfL we believe that ‘everyone should benefit from technology, free from harm’ and will, of course, be submitting our own response to the consultation. Meanwhile, we’ve produced this article and infographic to help you learn more about the code.
Who is covered?
The code applies only to providers of information society services (ISS), meaning that the developers who create these services should follow the code in order to safeguard children’s data. The ICO has previously stated that “If an ISS is only offered through an intermediary, such as a school, then it is not offered ‘directly’ to a child.” However, schools, in their role as Data Controllers, should make sure that any service they use has been assessed in accordance with the Data Protection Act 2018 and the General Data Protection Regulation (GDPR). Schools remain responsible for any personal data shared with an ISS.
What is expected?
There are 16 standards contained in the code:
- Best interests of the child
- Age-appropriate application
- Transparency
- Detrimental use of data
- Policies and community standards
- Default settings
- Data minimisation
- Data sharing
- Geolocation
- Parental controls
- Profiling
- Nudge techniques
- Connected toys and devices
- Online tools
- Data protection impact assessments
- Governance and accountability
The code covers a wide and well-considered range of areas, some of which are worthy of further discussion.
Transparency is a particularly interesting standard, and it links directly back to the start of this article. If providers of ISS are required to be more open, there will be clear benefits for children, enabling them to understand more fully the basis of their relationship with a provider, the personal data they are going to share, and what the provider intends to do with it. The code suggests ways in which the information needed to make an informed choice could be presented to children and their parents.
Data minimisation, already a principle of the GDPR, is given further detail in the code. It is essentially concerned with limiting collection to the data a provider actually needs in order to deliver its service: if the provider doesn’t always need the data, it shouldn’t collect it all the time.
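As a rough illustration of the principle (the draft code contains no sample code, and every name below is invented), a service might read a piece of data only while the feature that needs it is actively in use, and discard it as soon as that feature is closed:

```typescript
// Hypothetical sketch of data minimisation; all names are illustrative only.
type Location = { lat: number; lon: number };

class RoutePlanner {
  private current: Location | null = null;

  // Location is read only while the directions feature is actively in
  // use, not continuously in the background.
  startDirections(readLocation: () => Location): string {
    this.current = readLocation();
    return `Routing from ${this.current.lat}, ${this.current.lon}`;
  }

  closeDirections(): void {
    // Discard data the service no longer needs to provide the feature.
    this.current = null;
  }
}

const planner = new RoutePlanner();
console.log(planner.startDirections(() => ({ lat: 51.45, lon: -2.59 })));
planner.closeDirections();
```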
We’ve been talking for many years about geolocation and its implications, both positive and negative, for child safety. Being able to track a child can be reassuring to a parent or carer, but it can also expose the child to abuse. The code suggests that geolocation should be off by default, with a clear indication to the child whenever their location is visible.
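In practice, that might look something like the following sketch, where sharing starts off, switches an on-screen indicator on whenever it is enabled, and resets at the end of each session (the names here are our own invention, not taken from the code):

```typescript
// Hypothetical sketch of child-safe geolocation defaults.
interface LocationSettings {
  sharingEnabled: boolean;   // off unless deliberately turned on
  indicatorVisible: boolean; // shown whenever location is visible to others
}

function defaultSettings(): LocationSettings {
  return { sharingEnabled: false, indicatorVisible: false };
}

function enableSharing(): LocationSettings {
  // Turning sharing on also turns the indicator on, so the child can
  // always see when their location is visible.
  return { sharingEnabled: true, indicatorVisible: true };
}

function endSession(): LocationSettings {
  // Sharing does not persist between sessions; it reverts to off.
  return defaultSettings();
}

let settings = defaultSettings();
settings = enableSharing();
settings = endSession();
console.log(settings); // { sharingEnabled: false, indicatorVisible: false }
```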
The code also sets out expectations around parental controls and the balance that needs to be struck between keeping a child safe and the child’s rights, under the UN Convention on the Rights of the Child, to privacy and freedom of association. The proposed mechanism is that the provider should make it clear to the child when they are being tracked or monitored (overt rather than covert use).
Nudging is a technique used by many providers to encourage the user to select the option that benefits the provider rather than the user: for example, making the button or font larger on the option the provider would prefer you to select, or making it ‘difficult’ to decline or control your choices. In many cases this leads to more personal data being shared with the provider. The code requires that providers do not use nudge techniques to encourage children to share more personal data, and instead encourages ‘nudging’ children towards healthy behaviours such as taking screen breaks.
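By way of a simple, hypothetical contrast (nothing here is taken from the code itself), a provider avoiding nudge techniques might present both options with identical prominence and no preselected default:

```typescript
// Hypothetical sketch: a consent prompt with no nudging.
function renderConsentPrompt(question: string, options: string[]): string {
  // Identical formatting for every option: no bigger button, no
  // highlighted default, no extra steps to decline.
  return [question, ...options.map((o, i) => `  [${i + 1}] ${o}`)].join("\n");
}

console.log(renderConsentPrompt("Share your activity with friends?", [
  "Yes, share my activity",
  "No, keep it private",
]));
```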
We hope you have found our summary of the code useful and that, if relevant, you find the time to review the code more thoroughly and consider responding to the consultation. As the Commissioner herself states:
“The answer is not to protect children from the digital world, but to protect them within it.”