Short-form video hosting platform TikTok has been fined £12.7m for multiple breaches of data protection law, including using the personal data of children under the age of 13 without parental consent.
The Information Commissioner’s Office (ICO), which upholds information rights in the public interest, estimated that TikTok allowed up to 1.4 million UK children under 13 to use its platform in 2020, despite its own rules prohibiting children of that age from creating an account.
UK data protection law requires organizations that use personal data when offering information society services to children under 13 to obtain consent from their parents. TikTok failed to comply, knowing full well that the platform was being used by some children under the age of 13.
The social media giant has been called out for failing to carry out due diligence to identify underage individuals on the app and restrict their access to it. The ICO disclosed that this information came to light after some senior employees at the company raised concerns that children under 13 were using the platform and not being removed.
The UK Information Commissioner John Edwards said,
“There are laws in place to make sure our children are as safe in the digital world as they are in the physical world. TikTok did not abide by those laws. As a consequence, an estimated one million under 13s were inappropriately granted access to the platform, with TikTok collecting and using their data. That means that their data may have been used to track them and profile them, potentially delivering harmful, inappropriate content on their very next scroll.
“TikTok should have known better. TikTok should have done better. Our £12.7m fine reflects the serious impact their failures may have had. They did not do enough to check who was using their platform or take sufficient action to remove the underage children that were using their platform.”
Responding to the ICO’s findings, a TikTok spokesperson said,
“TikTok is a platform for users aged 13 and over. We invest heavily to help keep under-13s off the platform and our 40,000-strong safety team works around the clock to help keep the platform safe for our community.
“While we disagree with the ICO’s decision, which relates to May 2018 to July 2020, we are pleased that the fine announced today has been reduced to under half the amount proposed last year. We will continue to review the decision and are considering the next steps.”
TikTok emphasized that it had changed its practices since the period the ICO investigated. Now, in common with social media peers, the site uses more signals than a user’s self-declared age when trying to determine how old they are, including training its moderators to identify underage accounts and providing tools for parents to request the deletion of their underage children’s accounts.
TikTok’s recent fine in the UK comes amid calls for the app to be banned in the US over national security concerns, with government officials alleging that it could be passing sensitive data on US users to Beijing.