In September of 2020, the Information Commissioner’s Office (“ICO”) finalised the Age-Appropriate Design Code (the “Code”), which is also known informally as the Children’s Code.
The Code is a requirement under the Data Protection Act 2018 and aims to protect children’s digital privacy while ensuring young people receive the “best possible access” to the Internet in the United Kingdom (“UK”). Under the Code, a child is any person under the age of 18.
It came into force in September of 2021, following a 12-month transition period.
Background of the ICO Children’s Code
The Children’s Code was championed by children’s rights campaigner and member of the House of Lords, Baroness Kidron. In 2018, Kidron successfully lobbied for an amendment to the Data Protection Act (“DPA”) focused specifically on children’s data protection. This amendment became section 123 of the DPA, which mandated the ICO to introduce an Age-Appropriate Design Code.
Who does the code apply to?
The Code applies to “relevant information society services which are likely to be accessed by children” in the UK. An information society service, broadly speaking, is a service provided by electronic means at a distance, and therefore covers services delivered online.
Because the Code applies to services accessed in the UK, it is also relevant to service providers based outside the UK that process the data of children in the country.
Notably, the Code applies to services that are likely to be accessed by children, not just those specifically targeting them. This makes the Code’s scope expansive. Examples of services it applies to include mobile applications, online games, connected toys, social media platforms, websites and news platforms.
Read our in-depth blog on Data Protection by Design to understand how to implement best compliance practices when developing software.
The 15 standards in The Code
The Code takes the form of 15 standards. The ICO notes that these standards are not technical policies that must be implemented. Rather, they are:
“a set of 15 flexible standards – they do not ban or specifically prescribe – that provides built-in protection to allow children to explore, learn and play online by ensuring that the best interests of the child are the primary consideration when designing and developing online services.” – Elizabeth Denham, Information Commissioner
It’s worth noting that the Code itself is not a law, but a statutory code that sits within the DPA. The Code builds on and applies principles from the UK GDPR to children’s data protection. This means that, when it comes to legal repercussions, the ICO and courts will consider the Code in the context of UK GDPR violations, rather than issuing fines for a direct violation of the Code itself. As a result, there is the potential for costly fines of up to 4% of a company’s annual worldwide turnover, as laid out in the UK GDPR.
Below is an overview of the Code’s principles.
- The best interests of the child: Online services should be designed and developed with the child’s best interest at heart. In practice, this principle relates to Articles 5(1)(a) and 8 of the UK GDPR, whereby personal data should be “processed lawfully, fairly and in a transparent manner in relation to the data subject”, with children meriting specific protection due to their potential unawareness of the risks of sharing their data.
- Data protection impact assessment (“DPIA”): Organisations should carry out a DPIA before launching a new service, and for services already on the market, to ensure that they are processing children’s data ethically. Read our post for further guidance on conducting a DPIA.
- Age-appropriate application: Companies should “take a risk-based approach to recognising the age of individual users” and effectively apply the Code standards to users under 18. To establish the age of a user, the ICO suggests a range of solutions, including self-declaration, artificial intelligence and hard identifiers. The measure used will depend on the individual needs of the business and the data it processes. For example, if a business processes high-risk user data, then self-declaration may not be stringent enough. In instances where a business cannot achieve a level of certainty over its users’ ages, the ICO advises organisations to “apply the standards in the code to all users”.
- Transparency: Transparency is already a key data protection requirement under the UK GDPR, with Article 12 stating that privacy notices aimed at children must be provided in a manner that they can easily access and understand. The Code reinforces this objective, highlighting that privacy information must be “concise, prominent and in clear language suited to the age of the child”.
- Detrimental use of data: Organisations must not process or share children’s personal data in ways that could be detrimental to the child’s wellbeing. Potential issues include using children’s personal data to encourage prolonged engagement with an app or service, and behavioural advertising that markets prohibited products (alcohol, high-fat foods, etc.) to children.
- Policies and community standards: In line with obligations in the UK GDPR to implement appropriate organisational measures, organisations must adhere to the standards and policies they have published regarding personal data, including data protection policies and age restriction rules.
- Default settings: Article 25 of the UK GDPR requires data protection by design and by default, which requires a ‘privacy first’ approach to data processing settings and configurations. The Code further encourages this, by mandating that privacy settings should be high by default, unless organisations can “demonstrate a compelling reason for a different default setting, taking account of the best interests of the child.”
- Data minimisation: The ICO defines data minimisation as “collecting the minimum amount of personal data that you need to deliver an individual element of your service. It means you cannot collect more data than you need to provide the elements of a service the child actually wants to use.” Children should be given as much choice as possible over the elements of a service they wish to use, and how much data they need to provide.
- Data sharing: Children’s data should not be disclosed unless there is a “compelling reason” to do so, and this reason should be in line with the best interests of the child. This principle is already built into the UK GDPR, in article 5(1)(b), which provides that data should be “collected for specified, explicit and legitimate purposes and not further processed in a manner that is incompatible with those purposes”.
- Geolocation: Geolocation data presents a risk to child safety if misused. The Code therefore mandates that geolocation options are switched off by default, unless the organisation has compelling grounds to switch them on by default, taking into account the best interests of the child. Where a child’s location is tracked, this must be transparently communicated to the user. Where a child switches on their location and it is visible to other users, this setting must revert to off at the end of each session.
- Parental controls: While parental controls are there to help protect a child’s privacy, they can also impede their “rights to association, play, access to information and freedom of expression”. Organisations must therefore notify children about parental controls and monitoring in age-appropriate language.
- Profiling: Recital 38 of the UK GDPR already states that children require special protection regarding profiling, while Recital 71 provides that automated decision-making based on profiling should not concern children. The Code reinforces these rules, advocating that organisations turn profiling options off by default unless a compelling reason can be given that takes into account the best interests of the child.
- Nudge techniques: Nudging uses design features to gently manipulate a desired outcome from the user. Nudging techniques should not be used to encourage children to share unnecessary personal data or turn off privacy protections. However, they can be used to promote high privacy options and parental controls.
- Connected toys and devices (IoT): Digitally enabled toys and devices that connect to the internet must conform to the standards laid out in the Code. They must also “avoid passive collecting of data” while in listening mode. Devices that do not connect to the internet are exempt.
- Online tools: The UK GDPR gives data subjects numerous rights over their data, including the right of access and qualified rights to erasure, to restrict processing and to object to processing. Organisations should enable children to exercise these rights by providing prominent tools that are easy for them to use and understand.
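Several of these standards describe default-setting behaviour that can be expressed directly in code. The sketch below is a minimal, hypothetical settings model (the class and field names are our own illustration, not taken from the Code) showing high-privacy defaults and a location-visibility setting that reverts to off at the end of each session:

```python
from dataclasses import dataclass

# Hypothetical settings model: every privacy option defaults to its most
# protective value, per the "default settings" standard.
@dataclass
class ChildPrivacySettings:
    geolocation_enabled: bool = False          # off by default (geolocation standard)
    location_visible_to_others: bool = False   # off by default, session-scoped
    profiling_enabled: bool = False            # off by default (profiling standard)
    behavioural_ads_enabled: bool = False      # off by default (detrimental use)

class Session:
    """Per-session wrapper: visibility of a child's location to other
    users must not persist beyond the session in which it was enabled."""
    def __init__(self, settings: ChildPrivacySettings):
        self.settings = settings

    def __enter__(self) -> ChildPrivacySettings:
        return self.settings

    def __exit__(self, *exc) -> bool:
        # Revert session-scoped visibility when the session ends.
        self.settings.location_visible_to_others = False
        return False

settings = ChildPrivacySettings()
with Session(settings) as s:
    s.location_visible_to_others = True   # child opts in during this session
assert settings.location_visible_to_others is False  # reverted after session
```

The design choice here is to make the protective state the zero-configuration state, so that forgetting to set an option can never weaken a child's privacy.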
The new Code aims to positively reinforce and strengthen the rights of young people already laid out in both the EU GDPR and UK GDPR. Despite this good intent, the breadth of services that the Code applies to means that many organisations will have to take significant strides to comply.
For example, news sites, social media applications and online games – which are likely used by, but not aimed at, teenagers – will need to address how they design, innovate and create online products and services used by children. Risk-assessment techniques such as threat modelling can help here, and are best carried out before any new software build begins. In truth, a mature approach to GDPR compliance, accountability, risk management and data protection by design should put such organisations in a good position to comply with the Code, but many still have some way to go.
It will be interesting to see whether the ICO will take a more proactive approach to enforcement against controllers within the scope of the Code than they have so far for general enforcement of the GDPR.
With the Code now in force, organisations should already be taking steps to comply. If you are at the start of your journey, we advise reviewing the ICO’s Children’s Code Hub to familiarise yourself with the principles and whether they apply to your company.
If you are in scope, you will need to conduct a DPIA for both existing and new services, and validate that the principles of the Code have been met. To help with this, the ICO has published a template DPIA. As part of the assessment, you should assess the ages of the children accessing your services and the means you will use for age verification.
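The risk-based choice of age-assurance measure can be sketched as a simple decision rule. This is illustrative only: the risk tiers and method names below are our own assumptions, not categories prescribed by the ICO.

```python
def choose_age_assurance(data_risk: str, can_establish_age: bool) -> str:
    """Pick an age-assurance approach proportionate to processing risk.

    Hypothetical tiers for illustration; the ICO does not prescribe a
    fixed mapping from risk level to verification method.
    """
    if not can_establish_age:
        # Per the Code: if certainty over users' ages cannot be achieved,
        # apply the standards to all users.
        return "apply the Code's standards to all users"
    tiers = {
        "low": "self-declaration",
        "medium": "age estimation (e.g. AI-based)",
        "high": "hard identifier (verified document)",
    }
    return tiers.get(data_risk, "apply the Code's standards to all users")

print(choose_age_assurance("high", True))   # hard identifier (verified document)
print(choose_age_assurance("low", False))   # apply the Code's standards to all users
```

In practice the mapping would come out of your DPIA, which is where the risk level of each processing activity is established.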
You will then need to review your existing privacy notices, policies and tools and analyse their appropriateness for the younger age groups that access your services. This information should then be tailored to be more accessible and transparent for those ages. In some cases, the design of services and data collection practices may need to be altered for compliance – particularly in the case of nudge settings and profiling practices.
In cases where a company is confident that children will not be accessing their services, this will need to be documented, with compelling reasons given, to demonstrate to the ICO that due diligence has been carried out. Needless to say, keeping on top of your compliance will help to build further consumer trust in your brand and reputation.
You may also be interested in the differences between the ICO’s Children’s Code and California’s Age Appropriate Design Code.
Do you need help understanding your GDPR compliance obligations?
If you need support with meeting the requirements of the Children’s Code, we’re here to help. From DPIAs to implementing age-appropriate design, we can help you every step of the way. Get in touch today.