TikTok pushes suicide and self-harm content to children within three minutes of joining app, study finds

Online safety campaigners warn the app could encourage eating disorders, self-harm and suicide

Some young TikTok users are being shown potentially harmful content which could encourage eating disorders, self-harm and suicide, an online safety group claims.

The group found that certain accounts were served self-harm and suicide-related content in as little as 2.6 minutes after joining the platform, while eating disorder content was recommended within eight minutes.


The video-sharing app includes a For You page, which uses an algorithm to recommend content, refining its suggestions as users interact with the app and it gathers more information about their interests and preferences.

To test the algorithm, Center for Countering Digital Hate (CCDH) researchers created two accounts in each of the US, UK, Australia and Canada, posing as 13-year-olds. In total eight accounts were created, and data was collected from each for the first 30 minutes of use.


One account in each country was given a female name, and the other was given a similar name but with a reference to losing weight included in the username. The CCDH said it took this approach because previous research has shown that some users with body dysmorphia issues express this through their social media handles.

In its report, the CCDH said the accounts indicated a preference for videos about body image, mental health and eating disorders by pausing on relevant videos and pressing the like button. The report does not distinguish between content with positive and negative intent: the CCDH argues that in many cases it was not possible to determine the intent of a video, and that even content with a positive intention could still be triggering to some viewers.


The online safety group’s report argues that the speed at which TikTok recommends content to new users is harmful: on average, its accounts were served videos about mental health and body image every 39 seconds.

The research also revealed that more vulnerable accounts – those which included references to body image in the username – were served three times more harmful content and 12 times more self-harm and suicide-related content.

The CCDH said the study had also found an eating disorder community on TikTok which uses both coded and open hashtags to share material on the site, with its videos amassing more than 13 billion views.

‘Poisoning young minds’

Imran Ahmed, chief executive of the CCDH, accused TikTok of “poisoning the minds” of younger users.


He said: “It promotes to children hatred of their own bodies and extreme suggestions of self-harm and disordered, potentially deadly, attitudes to food.

“Parents will be shocked to learn the truth and will be furious that lawmakers are failing to protect young people from big tech billionaires, their unaccountable social media apps and increasingly aggressive algorithms.”

The CCDH has published a new Parents’ Guide alongside the Molly Rose Foundation, which was set up by Ian Russell after his daughter Molly took her own life having viewed harmful content on social media.

The guide encourages parents to speak “openly” with their children about social media and online safety and to seek help from support groups if concerned about their child.


In response to the research, a TikTok spokesperson said: “This activity and resulting experience does not reflect genuine behaviour or viewing experiences of real people.

“We regularly consult with health experts, remove violations of our policies, and provide access to supportive resources for anyone in need.

“We’re mindful that triggering content is unique to each individual and remain focused on fostering a safe and comfortable space for everyone, including people who choose to share their recovery journeys or educate others on these important topics.”
