
TikTok fined £12.7m

by Mark Rowe

The UK data protection regulator, the Information Commissioner’s Office (ICO), has fined the social media platform TikTok £12.7m for breaches of data protection law, including failing to use children’s personal data lawfully.

The ICO estimates that TikTok allowed up to 1.4 million UK children under 13 to use its platform in 2020, despite its own rules not allowing children of that age to create an account. By law, organisations that use personal data when offering information services to children under 13 must have consent from their parents or carers.

John Edwards, UK Information Commissioner, said: “There are laws in place to make sure our children are as safe in the digital world as they are in the physical world. TikTok did not abide by those laws. As a consequence, an estimated one million under 13s were inappropriately granted access to the platform, with TikTok collecting and using their personal data. That means that their data may have been used to track them and profile them, potentially delivering harmful, inappropriate content at their very next scroll.

“TikTok should have known better. TikTok should have done better. Our £12.7m fine reflects the serious impact their failures may have had. They did not do enough to check who was using their platform or take sufficient action to remove the underage children that were using their platform.”

Since the ICO’s investigation into TikTok, the regulator has published its Children’s code: a statutory code of practice aimed at online services, such as apps, gaming platforms and web and social media sites, that are likely to be accessed by children. The code sets out 15 standards: visit ico.org.uk/childrenscode.

Comment

Chris Linnell, Principal Data Privacy Consultant at the cyber services consultancy Bridewell, said: “TikTok’s fine from the ICO illustrates why organisations must be transparent with their data protection practices. One of the main aims of Data Protection legislation is to give us more control over how our personal data is being used. Transparency is crucial when it comes to providing this control. Individuals have a right to understand how their data will be collected, used, and shared. This is to empower them to make informed choices about whether and how they choose to engage with applications like those provided by TikTok.

“Equally, it is crucial for organisations to understand the demographic of individuals using their products and services. This can be achieved through process mapping and understanding the processing activities that are performed by each business unit. Where children’s data is likely to be processed, measures such as age verification or mechanisms to obtain parental consent should be put in place to ensure adequate protection.”
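The age-verification and parental-consent measures Linnell describes come down to a policy decision made at sign-up. The sketch below is a minimal illustration of such an age gate, not TikTok’s implementation: the 13-year threshold reflects the UK rule cited above, while the sign-up fields, function names and the parental_consent_verified flag are assumptions made for the sake of the example.

    from datetime import date
    from typing import Optional

    # UK rule cited in the article: under-13s need consent from a parent or carer
    # before an information service may use their personal data.
    UK_DIGITAL_CONSENT_AGE = 13

    def age_on(dob: date, today: Optional[date] = None) -> int:
        """Whole years of age on the given day (defaults to today)."""
        today = today or date.today()
        had_birthday = (today.month, today.day) >= (dob.month, dob.day)
        return today.year - dob.year - (0 if had_birthday else 1)

    def may_create_account(dob: date, parental_consent_verified: bool,
                           today: Optional[date] = None) -> bool:
        """Return True if sign-up may proceed under this assumed policy."""
        if age_on(dob, today) >= UK_DIGITAL_CONSENT_AGE:
            return True
        # Under 13: refuse unless a parent or carer has given verified consent.
        return parental_consent_verified

    # Example: a seven-year-old signing up in 2020 with no verified consent is refused.
    assert not may_create_account(date(2013, 6, 1), False, today=date(2020, 7, 1))

The hard part, of course, is not the comparison but everything behind the parental_consent_verified flag: checking who is actually using the platform and obtaining verifiable consent is precisely what the ICO found TikTok did not do enough of.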
