Exclusive: TikTok challenges: grieving parents hit out at tech platform for being too slow to remove ‘dangerous’ videos
Grieving parents, whose children’s deaths have been linked to social media, have hit out at TikTok for being slow to take down “dangerous” breathing challenge videos.
Lisa Kenevan’s little boy Isaac passed away in March 2022, aged 13, after likely taking part in a choke challenge that had become popular on social media. An inquest ruled Isaac, who was “highly inquisitive and intelligent”, died due to misadventure after police found two videos on his phone of him performing the dare.
Since then Lisa has become a campaigner for greater internet safety, and has repeatedly reported videos on TikTok of challenges that encourage people to hold their breath for around a minute. The videos, one of which, named “killer”, was viewed more than 5.9 million times, appeared to be aimed at children, as they were designed in the style of the Minecraft game and used voices from Super Mario.
When Lisa, 51, first reported the videos to TikTok last year, the platform found no violation. However, after a NationalWorld investigation, the tech giant removed more than a dozen breathing challenges and banned the profile that had posted them.
Earlier this year, Lisa again raised concerns about breathing challenges with TikTok, but it was only after a high-profile appearance on BBC Breakfast, alongside other parents whose children’s deaths have been linked to social media, that the dangerous clips were removed.
Lisa told NationalWorld that fighting to get the “awful” challenges taken down is “emotionally draining and exhausting”. “It should be very, very easy,” she said. “A human eye should see those [challenges] as a violation.
“If you look at the wording - which is ‘hold breath’ - you should pick those up. But they [TikTok] don’t - they come back within minutes, or less than 24 hours, to say ‘no violation’. I think I speak for all of us when I say we’re not getting taken seriously enough, and our children who have been lost on the way are just numbers that don’t equate to a human life.”


TikTok: we remove 99% of content that breaks our rules
The platform’s guidelines state: “We don’t allow the following: showing or promoting dangerous activities, games, dares, challenges, or stunts that cause or could cause significant physical harm.”
A TikTok spokesperson said: “Dangerous challenges are not allowed on TikTok and we proactively find and remove 99% of content that breaks our rules through a combination of human and automated technology. We have strict interventions, including blocking harmful search terms and directing people to an Online Challenges Safety Centre created with youth safety experts."
TikTok said it has some of the strictest policies and strongest enforcement action. Every teenager who joins TikTok is shown a video that explains how they should engage with online challenge content, and when someone tries to search for a known dangerous challenge, no results are shown.
These kinds of videos are sadly not limited to TikTok. Similar challenges, which appear to be aimed at children, have attracted tens of millions of views on YouTube and Instagram - some even joking about when a breathing dare goes wrong.


‘It's not the job of parents to monitor the output of tech companies’
The 5Rights Foundation, a charity campaigning for children’s safety online, said it shouldn’t be up to parents like Lisa to monitor TikTok. Executive director Leanda Barrington-Leach told NationalWorld: “Let's be clear, it's not the job of parents to monitor the output of tech companies to ensure they comply with their own rules as well as the letter of the law.
“That's a responsibility that belongs to the company itself; it's theirs and theirs alone. This acts as a stern reminder to companies that the design of their products must not allow pushing dangerous content; that the terms and conditions children sign up to need to be respected; and that complaints need to be swiftly acted upon.”
The Labour MP for Hornsey and Wood Green, Catherine West, told NationalWorld: “Tech companies have to take responsibility for a lot of the content that’s on their websites. Much more needs to be done, the government has taken an awful long time to get around to the regulation side - there’s a lot more work to be done.”
West was speaking at the opening of a new canteen at Mulberry Academy Woodside, North London. She said that in her constituency she was concerned about violent crime being glamourised on social media.


‘TikTok are only doing something because of media exposure’
Hollie Dance’s son Archie Battersbee, 12, died after a strangulation prank or experiment went wrong. He watched TikTok for seven minutes before his death - the contents of which remain unknown. Senior Coroner for Essex, Lincoln Brookes, said he could not "rule out the possibility" that Archie had taken part in an online challenge, but police hadn't found any evidence he had.
Together, Lisa and Hollie have launched a campaign to educate parents and teachers about the dangers of social media. Hollie also said she thinks TikTok needs to be swifter at taking down challenges.
“They’re only doing something because they’re getting media exposure,” she told NationalWorld. “Do they think it’s all going to calm down in a few weeks? All their harmful stuff reappears again. It’s very frustrating as a parent.
“I can’t bring Archie back, I still don’t know whether he watched one of these videos, but as a parent my eyes have now been opened having to look into this stuff, and I am so concerned about the safety of other people’s children.
“I don’t think [TikTok] are taking it seriously. I can tell that by their response - that they don’t allow harmful content on their platform. That’s not true at all. They’re being screened by bots. With something so life and death, you can’t have a bot screening it; it’s got to be a human safeguarding team.”
TikTok says that every video uploaded to the platform is reviewed by automated moderation technology. Then if a potential violation is found, this system will either pass it on to human safety teams for further review or remove it automatically.
Ofcom complaints process ‘very, very frustrating’
Lisa also wants Ofcom to take a greater role in policing this content. It comes after the communications regulator published its draft children’s safety codes of practice, which set out how it expects online services to meet their new legal responsibilities to protect children online under the Online Safety Act.
It will require social media platforms to take action to stop their algorithms recommending harmful content to children, and put robust age-checking measures in place to protect them. However, Ofcom has said it will not accept reports from parents about individual violations, including from Lisa.
Lisa said she found this “very, very frustrating”. “You’re bouncing back and forth and your hand is not being held through this procedure,” she said of Ofcom’s complaints processes.
“Especially after you’ve lost a child, you don’t know where to turn to, how to get the data [from child’s phone]. It needs to be really clear and concise. A parent is going through hell [after losing a child] and trying to digest what has happened in their lives, just to be left on their own devices of not knowing where to turn.”
On BBC Breakfast, Dame Melanie Dawes, the regulator’s CEO, responded to Lisa by saying: “Ofcom won’t have the power to look at individual complaints about individual bits of content on a social media site, so it’s different from, for example, TV complaints. The reason for that is just the scale - in the last few minutes Facebook alone will have moderated 50,000, 100,000 individual bits of content in their system.
“So what the laws do is focus on them having their systems and processes in place, but also improving complaints for members of the public, users of the service, and that will be a big thing Ofcom is holding them to account on.”
Ralph Blackburn is NationalWorld’s politics editor based in Westminster, where he gets special access to Parliament, MPs and government briefings.