The European Union has initiated a comprehensive investigation into TikTok, scrutinizing the social media giant for potential breaches of the Digital Services Act, particularly focusing on issues related to child safety and the platform's "addictive design." This inquiry marks a significant escalation in regulatory efforts aimed at ensuring online safety and protecting young users from exposure to harmful content.
Owned by the Chinese company ByteDance, TikTok has found itself at the center of regulatory attention in Europe, following preliminary investigations into its compliance with the new legislation that mandates stringent content moderation practices. The Digital Services Act empowers EU regulators to impose penalties of up to 6 percent of a company's global revenues for violations, highlighting the seriousness with which the EU is addressing digital safety concerns.
The investigation's scope includes TikTok's algorithmic recommendation systems, which are suspected of fostering behavioral addiction and drawing users into so-called "rabbit holes" of increasingly narrow content, thereby endangering their physical and mental well-being. Margrethe Vestager, the European Commission's executive vice president in charge of digital policy, emphasized the importance of TikTok scrutinizing its services to mitigate risks to users of all ages.
In response to the regulatory scrutiny, TikTok has defended its efforts to safeguard teenage users and restrict access for those under 13, citing the implementation of innovative features and settings as part of its commitment to user safety. The company has expressed its willingness to engage with the European Commission and detail its protective measures.
The backdrop of this investigation includes broader concerns about TikTok's data collection practices, its impact on young people's mental health, and its ties to China. These issues have drawn attention from policymakers and parents alike, with TikTok’s chief executive, Shou Chew, facing questions from U.S. lawmakers about the platform's influence and operational independence.
In Europe, where TikTok has more than 150 million monthly users, previous regulatory findings criticized the platform for making accounts public by default and for using dark patterns that compromise user privacy. The current EU investigation will also evaluate TikTok's age verification tools and its compliance with advertising transparency requirements under the Digital Services Act.
This move by European regulators underscores a growing global effort to hold digital platforms accountable for user safety, particularly for children and teenagers. As the investigation unfolds, TikTok's practices and policies will be under intense scrutiny, potentially setting a precedent for how social media companies address the challenges of protecting young users in the digital age.