The ICO fines TikTok £12.7m for failing to protect children's privacy

ICO issues reduced fine to TikTok for breaching UK GDPR.

06 April 2023

Publication

On 4 April 2023, the ICO issued a £12.7m fine to TikTok1 for breaching UK data protection law between May 2018 and July 2020. This is the third largest fine imposed by the ICO to date. The ICO found that TikTok had not done enough to prevent children under 13 from using its platform without parental consent. The £12.7m fine is, however, markedly less than the £27m fine which TikTok was threatened with under the notice of intent in September 2022.

The law

TikTok was found to have breached multiple UK GDPR provisions:

  • TikTok used an estimated 1.4m underage users' personal data without explicit parental consent and failed to make reasonable efforts to verify parental consent, in breach of Article 8.
  • TikTok failed to provide users with clear information as to how their data was being collected, used and shared, in breach of Article 12, meaning users (especially children) were unlikely to be able to make informed decisions.
  • Accordingly, personal data had not been processed lawfully, fairly and in a transparent manner, in breach of Article 5(1)(a).

Notably, TikTok's terms of service explicitly prohibit children under 13 from creating an account, but the systems in place from 2018 to 2020 were not sufficient to enforce this ban. The ICO found that children's data could have been used to track or profile them, potentially presenting them with harmful or inappropriate content.

TikTok's breaches were found to have been aggravated by the fact that senior employees had been alerted that underage children were using the platform but failed to respond adequately.

The ICO had initially intended to impose a higher fine on TikTok, totalling £27m. However, as a result of TikTok's representations to the ICO, it decided not to pursue an additional breach relating to TikTok's unlawful use of "special category data" (ie, personal data revealing racial or ethnic origin, political opinions, religious beliefs and health or biometric data). The removal of this initially alleged breach reduced the fine by over half to £12.7m.

TikTok welcomed this reduction, adding that it has invested heavily in updating internal systems and safeguards since July 2020, including launching a safety team of 40,000 employees tasked with ensuring the safety of the platform.

Commercial context

TikTok has been a focus of significant regulatory attention in other jurisdictions: it was issued a record $5.7m (£4.5m) fine by the US Federal Trade Commission in 2019, a 186m won (£123,000) fine in South Korea in 2020 and a €5m (£4.3m) fine from the CNIL in France in 2022, in each case for breaches of the relevant data protection law. Australia, the US, Canada, New Zealand, Norway, the EU and the UK have imposed partial bans on TikTok, while Afghanistan and India have banned the app entirely.

Regulatory action against social media platforms has significantly increased over the last five years and we expect interest to only intensify. Regulators and companies alike are fighting to keep up with rapidly developing technology in the sector to keep users safe.

Recent changes to guidance

This investigation period pre-dates the ICO's Children's "age appropriate design" Code, introduced in August 2020 to help protect children using digital services. The Code's 15 standards cover transparency, data sharing, parental controls and profiling, and are designed to work alongside more general parental control and guidance.

Whilst the Code is not legally binding, it has been described by the ICO as a "statutory code of practice" and provides guidance as to how the UK GDPR applies to services engaging children in the digital world. It is therefore a strong indication of the ICO's expectations and the standard expected of companies operating in the UK.

Next steps

  • Use this as a prompt to review your internal systems and safeguards (even if your company does not process the data of children). The ICO has been clear that ignorance of an issue with your safeguards is not an adequate defence.
  • Stress test your internal reporting processes. Do your employees know who to report to? How quickly are issues investigated? Do you keep a log of those decisions? Are decision-makers comfortable with their differing obligations in each relevant jurisdiction?
  • If children are likely to access your service (even if they are not targeted), familiarise yourself with the ICO's Children's code (also known as the Age appropriate design code).

Read more about the fine here.  

We have experts across the world ready to answer your questions on data protection and cybersecurity in your jurisdiction. Contact us to find out how we can help.


1 TikTok Inc and TikTok Information Technologies UK Limited ("TikTok")

This document (and any information accessed through links in this document) is provided for information purposes only and does not constitute legal advice. Professional legal advice should be obtained before taking or refraining from any action as a result of the contents of this document.