The TikTok GDPR fine: A lesson in mishandling children’s data 

November 14th, 2023 Posted in Data Protection

In recent years, TikTok has taken the social media world by storm, captivating millions of users with its short-form video content. However, the platform has faced increasing scrutiny regarding its data handling practices and its impact on children and teenagers. One significant development in this ongoing saga is the recent fine imposed by the Data Protection Commission (DPC). In this blog post, we’ll delve into the facts surrounding TikTok’s fine by the DPC and explore how it ties into the ICO’s Children’s Code (the “Code”).

The Facts: TikTok’s Fine by the DPC

The Data Protection Commission, Ireland’s independent regulator for data protection, plays a pivotal role in ensuring that companies handle user data in compliance with the EU General Data Protection Regulation (GDPR). In September 2021, the DPC initiated an investigation into TikTok’s data processing practices, focusing on how the platform handled the personal data of minors between 31st July 2020 and 31st December 2020.

In its investigation, the DPC raised several concerns about TikTok’s data practices, including:

  1. Lack of Transparency: The DPC found that TikTok’s privacy policy and practices were not clear or comprehensive enough, making it difficult for users, especially children and their parents, to understand how their data was being used. This infringed the transparency requirements laid out in Articles 12 and 13 of the GDPR.
  2. Data Minimisation and Default Settings: TikTok was found to have breached the GDPR’s principles of data minimisation and of data protection by design and by default. The platform set child accounts to public by default, rather than private, demonstrating that TikTok had not adequately assessed the potential privacy risks for children.
  3. “Family Pairing” Risks: TikTok’s “family pairing” feature conflicted with the GDPR’s principles of integrity and confidentiality (security) and of data protection by design and by default, as it allowed users who could not be verified as the child’s parent or guardian to apply less privacy-protective settings to the child’s account.
  4. Dark Patterns: TikTok used manipulative user interface design tactics, often referred to as “dark patterns,” to nudge child users towards settings that compromised their privacy, infringing the GDPR’s principle of fairness. Notably, this finding arose from an objection raised by the Berlin Data Protection Authority, which was subsequently upheld by the European Data Protection Board, and was not included in the DPC’s initial draft decision.

As a result of these findings, the DPC issued a significant administrative fine against TikTok, totalling €345 million. This fine underscored the importance of complying with GDPR and ensuring the protection of user data, especially when it comes to minors.

Relevance to the ICO’s Children’s Code

The ICO’s Children’s Code, also known as the Age-Appropriate Design Code, is a set of guidelines and standards aimed at protecting the privacy and data rights of children in the digital age. The Code came into effect on 2nd September 2020 (with conformance required by 2nd September 2021) and applies to all online services that are likely to be used by children in the UK, regardless of where the service is based.

The TikTok fine by the DPC is highly relevant to the Code for several reasons:

  1. Age Verification: One of the core principles of the Code is the need for robust age-verification mechanisms. TikTok’s inadequacies in this area serve as a cautionary example of what the Code aims to address. Online platforms must implement effective age-verification processes to ensure that children are not exposed to age-inappropriate content. Additionally, under the Code, platforms must use designs that encourage children to choose settings that protect their privacy rather than diminish it.
  2. Transparency and Privacy: The DPC’s findings regarding TikTok’s lack of transparency and unclear privacy policies align with the Code’s emphasis on clear, age-appropriate explanations of data processing practices. The Code requires online services to provide children and their parents with understandable information about data usage.
  3. Data Protection: Just like the GDPR, the Code places a significant emphasis on data protection compliance. Online services, including social media platforms like TikTok, must prioritise the security and protection of children’s data, especially in light of the growing concerns about online privacy.
  4. International Impact: TikTok’s fine by the DPC is significant not only because it involves a popular platform but also because it serves as a warning to online services worldwide. The Children’s Code is designed to influence global best practices in child online privacy, as its principles can apply to any online service accessible to UK children.

The importance of GDPR accountability

TikTok’s recent fine by the Data Protection Commission highlights the importance of implementing robust data protection measures, particularly when it comes to children and teenagers. The platform’s shortcomings in age verification, transparency, consent, and data protection serve as a pertinent case study in implementing the ICO’s Children’s Code for organisations that provide online services to children.

As the digital landscape continues to evolve, the protection of children’s privacy and data rights must remain a top priority. The TikTok case demonstrates the need for greater vigilance and accountability from online platforms and regulators worldwide, echoing the principles laid out in the Code.

In an age where children are increasingly connected online, ensuring their safety and privacy is not just a legal obligation but a moral imperative. TikTok’s fine serves as a reminder that the digital world can no longer afford to overlook the unique vulnerabilities of its youngest users.

Does your organisation handle children’s data?

If you need help understanding and mitigating the risks your products or services may bring to children, or simply want to discuss how you could approach child-friendly data protection design in accordance with the applicable regulations and codes of practice, then please get in touch. We can steer you in the right direction or assist at every level to ensure that you are covered.

Written by Ben Lyszkowski

Ben is part of the Evalian® Data Protection Practice, working as a Data Protection Consultant. His work history includes healthcare, database construction and visa compliance. He has a First Class Honours Degree in Law from the University of Law, Bristol.