YouTube to remove all anti-vaccine misinformation videos - the new rules explained

The Google-owned video platform will crack down on any health misinformation about all vaccines



YouTube has announced that it will remove videos containing misinformation about any vaccine, expanding the health misinformation policies it introduced during the coronavirus pandemic.

Last year, YouTube implemented a ban on Covid-19 vaccine misinformation, which has led to the removal of 130,000 videos so far.


But the company said broader measures were needed to clamp down on false claims about other vaccines appearing online.

What are the new rules on YouTube?

Under the new rules, which come into effect on Wednesday 6 October, any content falsely alleging that an approved vaccine is dangerous or causes chronic health problems will be removed. This includes videos containing misinformation about the ingredients of vaccines.

The video platform said it was taking this action after seeing vaccine misinformation branch out into other false claims.

Social media and internet platforms have been repeatedly urged to clamp down on the spread of online misinformation.


Although millions of posts have been blocked or taken down, and platforms have introduced new rules and prompts directing users to official health information, critics argue that not enough has been done to stem the spread of harmful content since the pandemic began.

What has YouTube said?

In a blog post announcing the rule update, YouTube stated: “We’ve steadily seen false claims about the coronavirus vaccines spill over into misinformation about vaccines in general, and we’re now at a point where it’s more important than ever to expand the work we started with COVID-19 to other vaccines.

“Specifically, content that falsely alleges that approved vaccines are dangerous and cause chronic health effects, claims that vaccines do not reduce transmission or contraction of disease, or contains misinformation on the substances contained in vaccines will be removed.

“This would include content that falsely says that approved vaccines cause autism, cancer or infertility, or that substances in vaccines can track those who receive them. Our policies not only cover specific routine immunisations like for measles or Hepatitis B, but also apply to general statements about vaccines.”


However, YouTube added that there would be "important exceptions" to the guidelines, such as "vaccine policies, new vaccine trials and historical vaccine successes or failures", which the company said were vital parts of public discussion about the scientific process.

“Today’s policy update is an important step to address vaccine and health misinformation on our platform, and we’ll continue to invest across the board in the policies and products that bring high-quality information to our viewers and the entire YouTube community,” the company added.

