(Bloomberg) — YouTube will begin removing content questioning any approved medical vaccine, not just those for Covid-19, a departure from the video site’s historically hands-off approach.
The division of Alphabet Inc.’s Google announced Wednesday that it will extend its policy against misinformation to cover all vaccines that health authorities consider effective. The ban will include any media that claims vaccines are dangerous or lead to chronic health outcomes such as autism, said Matt Halprin, YouTube’s vice president for trust and safety.
A year ago, YouTube banned certain videos critical of Covid-19 vaccines. The company said it has since pulled more than 130,000 videos for violating that rule. But many videos got around the rule by making dubious claims about vaccines without mentioning Covid-19. YouTube determined its policy was too limited.
“We can imagine viewers then potentially extrapolating to Covid-19,” Halprin said in an interview. “We wanted to make sure that we’re covering the whole gamut.”
YouTube has faced some — but not all — of the same political pressure as Facebook Inc. and Twitter Inc. for its handling of health information. President Joe Biden and U.S. lawmakers have blasted the platforms for inadequately policing vaccine skepticism and falsehoods. Politicians and pundits on the right have charged YouTube with censoring speech.
About 64% of the U.S. population has received at least one shot against Covid-19, with health experts saying the illness is increasingly becoming a disease of the unvaccinated. Employers are beginning to require shots for workers, even as some high-profile celebrities and professional athletes express skepticism about the effectiveness of the vaccines.
Garth Graham, YouTube’s global head of health care, said the company didn’t speak with the Biden administration about this recent policy update, but consulted with health experts. “There’s a lot of scientific stability around vaccine safety and effectiveness,” Graham said.
In February, Facebook said it would ban accounts that repeated conspiracies about vaccines and has tried taking stronger measures to combat vaccine hesitancy.
To date, YouTube has let people broadcast almost anything about vaccines. Its new policy, which goes into effect Wednesday, will lead to the removal of several prominent accounts that Biden has criticized. That includes Joseph Mercola, a vaccine skeptic with almost half a million YouTube subscribers, and Children’s Health Defense, a group affiliated with Robert F. Kennedy Jr.
Still, YouTube’s new rule will have two caveats. Halprin said the company will allow “scientific discussion” — videos about vaccine trials, results and failures. And YouTube will continue to permit personal testimonies, such as a parent talking about their child’s experiences getting vaccinated.
If videos aggregate these testimonials into one clip or make broader claims questioning vaccine efficacy, then they will be removed, according to Halprin. “It’s only when somebody then turns around and generalizes, ‘It’s not just about my child,’” he said. “Obviously there’s a line there.”
YouTube has said it has taken down more than 1 million videos since February 2020 for misinformation related to all aspects of Covid-19. The company also treats certain health videos as “borderline,” reducing their appearance in search results and recommendations. YouTube typically doesn’t disclose when this happens.
Halprin said it’s likely the footage targeted under the new policy is already being treated as borderline. But the YouTube executives didn’t share traffic numbers for these videos or how often YouTube’s systems have recommended them to viewers.