A new lawsuit alleges that staff at four of the world's biggest social media companies were aware of the harmful effects their platforms could have on teenage users but chose to ignore that information.
Facebook and Instagram owner Meta, TikTok owner ByteDance, Snapchat owner Snap, and YouTube owner Google are the subjects of a newly unsealed lawsuit over social media addiction. Bloomberg reports that the unredacted version, filed over the weekend in federal court in California, offers details about how much engineers and other key players – including Meta CEO Mark Zuckerberg – knew about the potential harms of social media.
Some had even expressed their misgivings: one Meta employee wrote in 2021 that “no one wakes up thinking they want to maximize the number of times they open Instagram that day. But that’s exactly what our product teams are trying to do,” Bloomberg said.
The Oakland case is made up of dozens of claims alleging that TikTok, Instagram, Facebook and other social media sites are designed to hook young users – even at the cost of their physical and mental health – Reuters reports. The claims centre on scores of complaints by young people that the social media sites caused them to suffer anxiety, depression, eating disorders and sleeplessness.
More than a dozen suicides have also been blamed on the companies, Bloomberg reports, based on claims that they knowingly designed algorithms that drew children down dangerous and addictive paths. Several public school districts have also filed suits, claiming they could not fulfil their educational duties while students were coping with mental-health crises.
More specifically, Yahoo News reports, the complaints cite internal data in which Meta's own researchers warned the company that Instagram can create a high level of social comparison among teen users, which can send them into a “downward spiral.”
The lawsuit further alleges that Snap, the owner of Snapchat, designed its app to capitalize on teens’ attachment to instant exchanges, rewarding users with different titles and statuses based on their engagement. Likewise, it claims that TikTok made a coordinated effort to market to children and that Google’s YouTube was specifically engineered to exploit user addiction.
According to the new filing, internal documents at TikTok parent ByteDance show that the company knows young people are more susceptible to being lured into trying dangerous stunts they view on the platform — known as viral challenges — because their ability to weigh risk isn’t fully formed.
Young people are more likely to “overestimate their ability to cope with risk,” and their “ability to understand the finality of death is also not fully fledged,” according to the filing.
Bloomberg reports that another unsealed portion of the filing claims that, instead of moving to address the problems around children using Instagram and Facebook, Meta defunded its mental health team. It says Zuckerberg was personally warned: “We are not on track to succeed for our core well-being topics (problematic use, bullying & harassment, connections, and SSI), and are at increased regulatory risk and external criticism. These affect everyone, especially Youth and Creators; if not addressed, these will follow us into the Metaverse.”
In their defense, the social media giants have pointed to Section 230 of the Communications Decency Act, a 1996 law that gives internet platforms broad immunity from claims over harmful content posted by users.
A Meta spokesperson also told Bloomberg that the claim it defunded work to support people’s well-being is false. “In fact, because this is so important to our company, we actually increased funding, shown by the over 30 tools we offer to support teens and families,” they said. “Today, there are hundreds of employees working across the company to build features to this effect.”
Lawyers for the three plaintiffs who are leading the lawsuit, Lexi Hazam, Previn Warren and Chris Seeger, said in a statement the never-before-seen documents showed that "social media companies treat the crisis in youth mental health as a public relations issue rather than an urgent societal problem brought on by their products".
"This includes burying internal research documenting these harms, blocking safety measures because they decrease ‘engagement,’ and defunding teams focused on protecting youth mental health," they said.