October 21, 2025


Amnesty International France Accuses TikTok of Steering Vulnerable Youth Toward Harmful Content

Amnesty International has once again spotlighted TikTok for allegedly failing to safeguard children and teenagers from exposure to harmful content that could exacerbate mental health issues. In a recent study, the organization reiterated findings from 2023, which showed that young users seeking mental health information often find themselves plunged into a disturbing array of content featuring depressive themes and suicide romanticization.

The research, conducted in France, used three test accounts to examine how the platform's algorithms curate the "For You" feed after users show interest in mental health topics. Disturbingly, the accounts soon began displaying videos that not only dealt with depression but also romanticized suicide, including the controversial "lip balm challenge" linked to self-harm.

The issue gained a human face in 2021 with the suicide of 15-year-old Marie Le Tiec, whose death was linked to such harmful online content. Her family, among others, has taken legal action against TikTok, accusing it of inadequate moderation. Despite TikTok's defense, which cites its large moderation team and mental health support initiatives, the platform is under scrutiny for failing to meet business and human rights standards and the obligations of the European Digital Services Act (DSA), which requires platforms to mitigate systemic risks to children.

In 2024, the European Commission opened formal proceedings against the social media giant under the DSA, bolstered by Amnesty's findings. Experts, however, remain divided over the direct impact of social media on youth mental health. Marion Haza, a clinical psychologist, argues that while harmful content poses risks, it predominantly affects those already vulnerable to self-harm, and she emphasizes the importance of community support over strict prohibitions.

By contrast, Amnesty's 2024 survey points to a significant impact of disturbing content on young people: 58 percent of participants reported negative effects, and only 20 percent managed to avoid depressive content in their feeds.

As global concern grows, some countries are taking legislative action. France is considering a "digital curfew" for children under 16, while Australia has already passed a law banning social media access for the same age group starting in December.

This ongoing issue highlights the complex challenges and responsibilities social media platforms face in balancing user engagement with the safety and well-being of their most vulnerable users.