Exclusive: YouTube removed 30,000 videos with COVID vaccine misinformation

YouTube has removed more than 30,000 videos over the past six months for making misleading or false claims about COVID-19 vaccines, YouTube spokesperson Elena Hernandez said, marking the first time the company has released figures for such removals.

Why it matters: Various surveys show that about 30% of Americans remain hesitant about or suspicious of vaccines, and many of those doubts have been fueled by online falsehoods and conspiracy theories.

What's happening: Videos spreading misinformation about COVID-19 vaccines continue to appear online even as more and more Americans get vaccinated.

  • Platforms including Facebook and Twitter have implemented policies to reduce the spread and reach of such content, but enforcement remains an ongoing challenge.

Background: YouTube began including vaccine misinformation in its COVID-19 medical misinformation policy in October 2020.

  • Since February 2020, YouTube has taken down more than 800,000 videos containing coronavirus misinformation. Videos are first flagged by the company's AI systems or human reviewers, then receive another level of review.
  • Under YouTube's rules, videos violate the vaccine policy if they contradict the expert consensus on vaccines from health authorities or the World Health Organization.
  • Accounts that violate YouTube's rules are subject to a "strike" system, which can result in permanent bans.

Our thought bubble: Platforms are eager to share data on the amount of misinformation they catch, and that transparency is valuable. But the most valuable data would tell us the extent of the misinformation that goes uncaught.
