Fresh off the back of an embarrassing “grilling” by the US Congress on national security grounds, TikTok has received a more concrete reprimand from the UK’s Information Commissioner’s Office (ICO) – a fine of £12.7 million ($15.8 million) for “misusing children’s data.”
Despite TikTok’s own rules disallowing children under the age of 13, the video-sharing app’s whirlwind success has meant that some 1.4 million kids in the UK used it in 2020 by the ICO’s estimates.
As anyone familiar with TikTok knows, it’s not a healthy environment for pre-teens thanks to its bizarre gamut of user-generated content. Heck, it’s not a healthy environment for any age.
Under UK data protection law, organizations that use personal data when offering “information society services” (basically any internet service you can think of) to children under 13 must have consent from their parents or guardians.
The ICO accused TikTok of failing to carry out adequate checks to identify and remove underage kids from the platform. During its investigation, the ICO found that senior employees had raised internal concerns that children under 13 were not being removed, and the watchdog deemed TikTok’s response inadequate.
The ICO found that TikTok had breached the UK General Data Protection Regulation (UK GDPR) between May 2018 and July 2020.
Information Commissioner John Edwards commented: “There are laws in place to make sure our children are as safe in the digital world as they are in the physical world. TikTok did not abide by those laws.
“As a consequence, an estimated one million under-13s were inappropriately granted access to the platform, with TikTok collecting and using their personal data. That means that their data may have been used to track them and profile them, potentially delivering harmful, inappropriate content at their very next scroll.
“TikTok should have known better. TikTok should have done better. Our £12.7m fine reflects the serious impact their failures may have had. They did not do enough to check who was using their platform or take sufficient action to remove the underage children that were using their platform.”
The ICO’s original notice of intent set TikTok’s fine at £27 million over provisional findings that the app was processing “special category data” (ethnic and racial origin, political opinions, religious beliefs, sexual orientation, health data etc.) without legal grounds to do so. Following representations from TikTok, the ICO did not pursue that provisional finding and reduced the fine.
A spokesperson for TikTok said: “We invest heavily to help keep under-13s off the platform and our 40,000-strong safety team works around the clock to help keep the platform safe for our community.
“While we disagree with the ICO’s decision, which relates to May 2018 – July 2020, we are pleased that the fine announced today has been reduced to under half the amount proposed last year. We will continue to review the decision and are considering next steps.”
It is reasonable for the ICO to take the protection of children online seriously following a number of deaths of young people tied to social media. In 2017, 14-year-old Molly Russell took her own life after falling into what her father described as the “bleakest of worlds” on Instagram.
In 2021, two girls, eight and nine years old, died attempting the TikTok “Blackout Challenge.”
Last year, an investigation by the Center for Countering Digital Hate found that teen girls can be exposed to potentially harmful content about mental health and body image on TikTok every 39 seconds.
Once the algorithm has its claws in a vulnerable and impressionable mind, it can be taken to dangerous territory with ease. ®