YouTube Will Remove Anti-Vaccine Misinformation From the Platform


YouTube has announced in a blog post that it will remove harmful vaccine misinformation from the platform.

According to the video platform, “Today, we’re expanding our medical misinformation policies on YouTube with new guidelines on currently administered vaccines that are approved and confirmed to be safe and effective by local health authorities and the WHO.”

Here are the types of content they will remove:

  • Content that falsely alleges that approved vaccines are dangerous and cause chronic health effects
  • Content that claims vaccines do not reduce transmission or contraction of disease
  • Content that contains misinformation about the substances in vaccines
  • Content that falsely says that approved vaccines cause autism, cancer, or infertility, or that substances in vaccines can track those who receive them

These policies apply both to specific routine immunizations (such as those for measles or hepatitis B) and to general statements about vaccines.

The blog post also noted that YouTube’s community guidelines already prohibit certain kinds of misinformation. “At the onset of COVID-19, we built on these policies when the pandemic hit,” YouTube said, adding that it worked with experts to develop policies on COVID-19 and medical misinformation. Since those policies were implemented, YouTube has removed over 130,000 videos that violated them.

YouTube likewise consulted local and international health organizations and experts in developing its new vaccine misinformation policies.

