TikTok Enhances Age-Verification Technology Across EU

TikTok is set to launch new age-verification technology across the European Union in the coming weeks. The move responds to escalating demands for stricter regulation of social media use among minors, particularly in the wake of Australia’s recent ban on social media for users under 16.

The platform, owned by ByteDance, is under increasing pressure, alongside other popular services such as YouTube, to improve how it identifies and manages accounts belonging to children. The newly developed system has been tested in the EU over the past year; it analyzes indicators such as profile information, posted videos, and user behavior to assess whether an account may belong to someone under the age of 13.

Accounts flagged by the age-verification technology will not face immediate bans. Instead, they will be reviewed by specialist moderators, allowing for more nuanced judgments than automated enforcement alone. In an earlier pilot program in the UK, this process resulted in the removal of thousands of accounts that breached the platform’s minimum-age rules.
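In outline, the flag-then-review process described above resembles a weighted scoring pipeline: each behavioral signal contributes to an overall likelihood score, and only accounts above a threshold are queued for human moderators rather than banned automatically. The sketch below illustrates that general pattern in Python; the signal names, weights, and threshold are purely hypothetical and are not drawn from TikTok’s actual system.

```python
from dataclasses import dataclass, field

# Hypothetical signal weights for illustration only -- not TikTok's.
SIGNAL_WEIGHTS = {
    "bio_mentions_school_grade": 0.35,
    "follows_mostly_child_creators": 0.20,
    "active_during_school_hours": 0.15,
    "voice_model_flags_minor": 0.30,
}
REVIEW_THRESHOLD = 0.5  # Above this, route the account to a human moderator.

@dataclass
class Account:
    user_id: str
    signals: dict[str, bool] = field(default_factory=dict)

def likely_underage_score(account: Account) -> float:
    """Sum the weights of the signals that fired for this account."""
    return sum(
        weight
        for name, weight in SIGNAL_WEIGHTS.items()
        if account.signals.get(name, False)
    )

def triage(accounts: list[Account]) -> list[str]:
    """Return user IDs to queue for specialist moderator review.

    Flagged accounts are not banned automatically; a human reviewer
    makes the final call, mirroring the process described above.
    """
    return [
        a.user_id
        for a in accounts
        if likely_underage_score(a) >= REVIEW_THRESHOLD
    ]

if __name__ == "__main__":
    suspect = Account("u123", {"bio_mentions_school_grade": True,
                               "voice_model_flags_minor": True})
    adult = Account("u456", {"active_during_school_hours": True})
    print(triage([suspect, adult]))  # ['u123'] -- only the high-scoring account
```

Running the example flags only the account whose combined signal weight clears the threshold, leaving the final decision to a human reviewer, consistent with the process TikTok describes.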

Other companies already rely on third-party verification: Meta, the parent company of Facebook and Instagram, uses Yoti, an age-verification service, to confirm users’ ages on its platforms. In December 2025, Australia’s ban on social media access for individuals under 16 came into force. Following that move, the country’s eSafety Commissioner reported that over 4.7 million accounts had been removed across ten different platforms, including TikTok, Instagram, and Facebook.

As TikTok prepares to roll out its new system, European regulators are stepping up scrutiny of how social media platforms verify user ages and whether those checks comply with data protection rules. Recently, UK Prime Minister Keir Starmer expressed openness to a similar social media ban for young people in the UK, voicing concern about the amount of time children and teenagers spend on smartphones and citing reports of very young children spending hours in front of screens each day.

Starmer had previously opposed an outright ban, citing enforcement challenges and the risk of pushing teenagers towards less regulated corners of the internet. Pressure for tougher action has also come from parents such as Ellen Roome, whose 14-year-old son, Jools Sweeney, died in circumstances she believes may be linked to an online challenge. Roome has campaigned for greater parental rights to access the social media accounts of deceased children.

The European Parliament is actively pursuing age restrictions on social media platforms, while Denmark is considering a ban on social media access for users under the age of 15. TikTok has said its new technology is designed to align with EU regulatory standards, and that it worked with Ireland’s Data Protection Commission, its lead privacy regulator in the EU, during the system’s development.

Concerns about child safety on social media have intensified since a 2023 Guardian investigation revealed that moderators had been instructed to let under-13s remain on TikTok if they claimed a parent was overseeing the account. The revelation underscored the need for more robust age-verification measures.

As TikTok rolls out its enhanced age-verification technology, the platform aims not only to comply with regulatory standards but also to foster a safer online environment for younger users. The effectiveness of these measures will be watched closely as pressure mounts on social media companies to take responsibility for protecting minors in the digital space.