YouTube promises to crack down on videos that contain misinformation about vaccines

YouTube said Wednesday it is changing its policies to prevent the spread of misinformation on all vaccines — not just COVID-19 vaccines.

In a blog post on Wednesday, the company said it will begin removing any video that falsely "claims that approved vaccines cause chronic health effects, claims that vaccines do not reduce transmission or contraction of disease, or contains misinformation on the substances contained in vaccines."

YouTube says the new guidelines cover videos that promote widely debunked conspiracy theories about vaccines, including claims that shots contain tracking devices and that vaccines cause autism, cancer or infertility.

The company added that the new misinformation policies apply to videos about specific vaccinations and statements about vaccines in general.

YouTube noted some exceptions to the guidelines, including "personal testimonials" about vaccines, as well as videos that discuss a vaccine's successes, failures and testing trials.

"We've steadily seen false claims about the coronavirus vaccines spill over into misinformation about vaccines in general, and we're now at a point where it's more important than ever to expand the work we started with COVID-19 to other vaccines," the company said in its blog post.

Earlier this year, YouTube said it had removed 30,000 videos containing misinformation about COVID-19 vaccines over a six-month period. Despite those efforts, The Washington Post reported over the summer that the platform was still struggling to contain the spread of misinformation about the virus.