TikTok whacked with £12.7m fine for UK privacy failings

The Information Commissioner’s Office has slapped a £12.7m fine on TikTok Information Technologies UK and TikTok Inc for a host of breaches of data protection law, including failing to use children’s personal data lawfully.

The original ICO notice of intent – issued in September last year – set the fine at £27m, although the regulator did include a number of caveats in its initial ruling.
And, taking into consideration TikTok’s representations, the ICO says it decided not to pursue the provisional finding related to the unlawful use of special category data, meaning this potential infringement was not included in the final fine.

The ICO estimates that TikTok allowed up to 1.4 million UK children under 13 to use its platform in 2020, despite its own rules not allowing children that age to create an account.

UK data protection law says that organisations that use personal data when offering information society services to children under 13 must have consent from their parents or carers.

TikTok failed to do that, even though it ought to have been aware that under 13s were using its platform. It also failed to carry out adequate checks to identify and remove underage children from the platform.

The ICO investigation found that a concern was raised internally with some senior employees about children under 13 using the platform and not being removed. In the ICO’s view, TikTok did not respond adequately.

Information Commissioner John Edwards said: “There are laws in place to make sure our children are as safe in the digital world as they are in the physical world. TikTok did not abide by those laws.

“As a consequence, an estimated one million under 13s were inappropriately granted access to the platform, with TikTok collecting and using their personal data. That means that their data may have been used to track them and profile them, potentially delivering harmful, inappropriate content at their very next scroll.

“TikTok should have known better. TikTok should have done better. Our £12.7m fine reflects the serious impact their failures may have had. They did not do enough to check who was using their platform or take sufficient action to remove the underage children that were using their platform.”

The ICO found that TikTok breached the UK GDPR between May 2018 and July 2020 by providing its services to UK children under the age of 13 and processing their personal data without consent or authorisation from their parents or carers.

It also failed to provide people using the platform with clear, easy-to-understand information about how their data is collected, used and shared. Without that information, users of the platform, in particular children, were unlikely to be able to make informed choices about whether and how to engage with it.

And finally, the ICO ruled that TikTok failed to ensure that the personal data belonging to its UK users was processed lawfully, fairly and in a transparent manner.

Since the conclusion of its investigation into TikTok, the regulator has published the Children’s Code to help protect children in the digital world. It is a statutory code of practice aimed at online services, such as apps, gaming platforms, and web and social media sites, that are likely to be accessed by children.
