TikTok’s ‘For You’ feed pushes children & young people towards harmful mental health content, study finds

TikTok users are quickly being drawn into “rabbit holes” of potentially harmful content, including videos that romanticise and encourage depressive thinking, self-harm and suicide, according to a new study.
TikTok's ‘For You’ feed pushes children to harmful mental health content, according to a new 2023 study from Amnesty International. Photos by Adobe. Composite image by NationalWorld/Kim Mogg.

TikTok users are exposed to videos that ‘romanticise, normalise or encourage’ harmful behaviour, including suicide, within minutes of scrolling on the website or app, according to a new study. The social media giant’s business model is “inherently abusive and privileges engagement to keep users hooked on the platform” so that it can collect data from them, according to research from the global human rights movement Amnesty International.

The movement carried out two studies in partnership with the Algorithmic Transparency Institute and AI Forensics, using automated accounts posing as 13-year-olds to measure the effects of TikTok’s ‘For You’ recommendation system on young users. The research found that after five to six hours on the platform, almost half of the videos shown were mental health-related and “potentially harmful”.

“Between three and 20 minutes into our manual research, more than half of the videos in the ‘For You’ feed were related to mental health struggles with multiple recommended videos in a single hour romanticising, normalising or encouraging suicide,” a spokesperson from Amnesty International said. TikTok’s ‘For You’ feed is a personalised and infinitely scrollable page of algorithmically recommended content, picked out to reflect what the system has inferred to be a user’s interests.

‘An addictive rabbit hole’

The two reports, Driven into the Darkness: How TikTok Encourages Self-harm and Suicidal Ideation and I Feel Exposed: Caught in TikTok’s Surveillance Web, highlight the alleged abuses experienced by children and young people using TikTok, and the ways in which these abuses are caused by TikTok’s recommender system and its underlying business model. The findings show how children and young people who watch mental health-related content on TikTok’s ‘For You’ page are quickly drawn into “rabbit holes” of potentially harmful content, including videos that romanticise and encourage depressive thinking, self-harm and suicide.

“The findings expose TikTok’s manipulative and addictive design practices, which are designed to keep users engaged for as long as possible. They also show that the platform’s algorithmic content recommender system, credited with enabling the rapid global rise of the platform, exposes children and young adults with pre-existing mental health challenges to serious risks of harm,” said Lisa Dittmer, a researcher at Amnesty International.

The ‘Driven into the Darkness’ study details how TikTok’s “relentless pursuit” of young users’ attention risks exacerbating mental health concerns such as depression, anxiety and self-harm. A further study, ‘Addictive by Design’, used focus groups and simulations of children’s TikTok habits to show how “TikTok’s platform design encourages the unhealthy use of the app”, according to the researchers. All focus group participants were given pseudonyms.

Luis, a 21-year-old undergraduate student who has been diagnosed with bipolar disorder, told Amnesty International that TikTok’s algorithm keeps resurfacing videos even when users don’t actually like them. “It’s a rabbit hole because it starts with just one video. If one video is able to catch your attention, even if you don’t like it, it gets bumped to you the next time you open TikTok and because it seems familiar to you, you watch it again and then the frequency of it in your feed rises exponentially,” he said.

Francis, an 18-year-old student, said: “When I watch a sad video that I could relate to, suddenly my whole ‘For You’ Page is sad and I’m in ‘Sadtok’. It affects how I’m feeling.” Another focus group participant explained: “The content I see makes me overthink [even] more, like videos in which someone is sick or self-diagnosing. It affects my mentality and makes me feel like I have the same symptoms and worsens my anxiety. And I don’t even look them (videos) up, they just appear in my feed.”

Children and young people interviewed for the study said they felt their TikTok use affected their schoolwork and social time with friends, and led them to scroll through their feeds late at night instead of getting enough sleep.

‘TikTok must respect the rights of all its younger users’

The ‘I Feel Exposed’ study highlights how TikTok’s “rights-abusing data collection practices” are “sustained” by the app’s addictive design. Amnesty Tech Deputy Programme Director Lauren Armistead also said TikTok targets places with weaker data protection regulation. “Children living in countries with weak regulation, including many countries of the Global Majority, are subject to the worst abuses of their right to privacy,” she said. “TikTok must respect the rights of all its younger users by banning all targeted advertising aimed at those younger than 18 globally.”

Amnesty International also called on TikTok to stop hyper-personalising the ‘For You’ feed by default, and instead to let users actively choose which interests shape their content recommendations, based on their informed consent.

Responding to the findings, TikTok pointed us to its Community Guidelines, which set out the types of content that are banned and removed from the platform when reported or otherwise identified. These include bans on content “showing, promoting, or providing instructions on suicide and self-harm, and related challenges, dares, games, and pacts”, “showing or promoting suicide and self-harm hoaxes” and “sharing plans for suicide and self-harm”. TikTok also stated that it is developing a “company-wide human rights due diligence process which will include conducting periodic human rights impact assessments.”
