On 2 September 2021, the Age appropriate design code (known informally as the Children's code) came into force in the United Kingdom.
This is a data protection code of practice, created by the Information Commissioner's Office (ICO), that all 'information society services likely to be accessed by children' must abide by. In practice, this covers most organisations providing online services, such as social media platforms, streaming services and educational websites.
This code sets out 15 standards of 'age appropriate design', which are designed to minimise how much of a child's data is collected and used by online services without preventing their access to those services. The standards are as follows:

1. Best interests of the child
2. Data protection impact assessments
3. Age appropriate application
4. Transparency
5. Detrimental use of data
6. Policies and community standards
7. Default settings
8. Data minimisation
9. Data sharing
10. Geolocation
11. Parental controls
12. Profiling
13. Nudge techniques
14. Connected toys and devices
15. Online tools
In this article, we will look at each of these standards in more detail and explain what online services need to do to comply with them.
The first standard requires organisations to consider the best interests of the child when developing and designing online services. This may include considering how to support their needs for safety, health, wellbeing, freedom of expression and privacy.
Under the second standard, organisations are required to carry out a data protection impact assessment (DPIA) to assess and mitigate any risks to the rights and freedoms of the children who may access their service. As part of this assessment, organisations need to consider the differing ages, capacities and development needs of the children who may access the service, and how these can be supported.
This standard may be complex for some organisations to meet. More information on DPIAs can be obtained from the organisation's data protection officer, or found on the ICO website.
The third standard, age appropriate application, requires an online service to take a risk-based approach to identifying the age of its users, so that the standards of this code are applied correctly.
The level of certainty with which a user's age must be identified will vary depending on the risks present. If there is no reasonable way to identify a user's age, it may be the case that these standards need to be applied to all users of the service.
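The risk-based approach above can be sketched in code. This is purely illustrative and not part of the code of practice: the AgeCheck type, the certainty threshold and the function name are all hypothetical.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class AgeCheck:
    """Hypothetical result of an age-assurance check (self-declaration,
    a verification service, usage-pattern analysis, etc.)."""
    estimated_age: Optional[int]  # None when age could not be established
    certainty: float              # 0.0-1.0 confidence in the estimate

def apply_childrens_code(check: AgeCheck, required_certainty: float) -> bool:
    """Decide whether the code's standards should be applied to this user.

    required_certainty should scale with the risks the service presents.
    When age cannot be established with enough certainty, the safe
    fallback is to apply the standards to all users.
    """
    if check.estimated_age is None or check.certainty < required_certainty:
        return True  # unknown or uncertain age: treat as a child
    return check.estimated_age < 18  # under-18s are children in UK law
```

A low-risk service might accept a self-declared age at a modest threshold, while a high-risk service would demand a higher required_certainty, making the all-users fallback more likely when age assurance fails.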
The fourth standard, transparency, is similar to the transparency principle of the GDPR. It requires that all privacy information provided to users, as well as other published terms, policies and community standards, be concise, prominent and written in language suited to the age of the child.
It also requires services to give users clear, bite-sized explanations of how their personal data will be used, at the point at which that use is activated or permitted.
Again, the fifth standard is similar to a principle of the GDPR, specifically the requirement to process data lawfully. It prevents children's personal data from being used in ways that are detrimental to their physical or mental health or wellbeing, and prevents any use of data that is unlawful or that goes against Government advice or regulatory provisions.
The sixth standard is quite simple in principle: organisations must abide by their own published terms, policies and community standards.
Under the seventh standard, when a child accesses an online service, their settings must be set to 'high privacy' by default, unless the organisation running the service can demonstrate a compelling reason to do otherwise.
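As a minimal sketch of what 'high privacy by default' can look like in practice (the setting names and the compelling_reason parameter are illustrative, not taken from the code):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class PrivacySettings:
    # 'High privacy' defaults: nothing is shared or personalised until
    # the user actively changes a setting.
    profile_public: bool = False
    personalised_content: bool = False
    data_shared_with_partners: bool = False

def default_settings(is_child: bool,
                     compelling_reason: Optional[str] = None) -> PrivacySettings:
    """Return an account's starting settings.

    For a child the defaults must be high privacy unless the organisation
    can demonstrate a compelling reason to do otherwise; here any such
    reason must be passed explicitly, mirroring the need to document it.
    """
    if is_child and compelling_reason is None:
        return PrivacySettings()  # all privacy protections on by default
    # Adult accounts, or a documented exception, may start elsewhere.
    return PrivacySettings(personalised_content=True)
```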
The eighth standard is almost identical to the data minimisation principle of the GDPR: organisations must collect and process only the minimum amount of personal data needed to provide the service.
Under the ninth standard, organisations cannot disclose or share children's personal data with third parties unless they can demonstrate a compelling reason to do so.
Using a child's geolocation data can pose a number of risks to them. The tenth standard therefore requires organisations to switch all geolocation options off by default for children, and to provide a clear indicator whenever location tracking is active.
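The default-off behaviour and the 'active tracking' indicator can be sketched as follows; the class and method names are hypothetical.

```python
class LocationService:
    """Geolocation state for one user session."""

    def __init__(self, is_child: bool):
        self.is_child = is_child
        self.tracking = False  # geolocation is off by default

    def enable_tracking(self) -> None:
        # Tracking is only ever enabled by an explicit user action,
        # never silently or as a default.
        self.tracking = True

    def indicator_visible(self) -> bool:
        # The UI must show an obvious sign whenever tracking is active.
        return self.tracking
```

The key design point is that nothing in the constructor, and no other code path, turns tracking on: only a deliberate call tied to a user action can do so, and the indicator reads directly from that same state so it can never disagree with it.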
The eleventh standard requires any online service that provides parental controls to make the child aware of them; recommendations on how to do so for each age group can be found on the ICO website. The service must also provide an obvious sign to the child when parental controls are being actively used to monitor their activity or track their location.
Profiling is when personal data is processed automatically to evaluate a person and create a profile about them. Under the twelfth standard, profiling must be switched off by default for children unless an organisation can demonstrate that it has a compelling reason to enable it.
The thirteenth standard prevents online services from using nudge techniques, design features that steer a user towards a preferred choice, to encourage children to weaken their privacy protections or provide more personal data than is necessary. For example, a service might display one button more prominently than another, or give a disingenuous description of what an option does.
Under the fourteenth standard, manufacturers of children's toys and other devices that can connect to the internet are required to conform to the standards of this code. This does not apply to electronic toys and devices that cannot connect to the internet.
Finally, the fifteenth standard requires online services to provide prominent and accessible tools to help children exercise their rights under data protection law and report any concerns about how their data is used. These tools will likely be similar to those an organisation already offers to help people exercise their rights under the GDPR.
For more information on the Children's code, visit the ICO website. If you are looking for data protection training, consider taking one of our GDPR courses below: