Facebook on Monday announced a major crackdown on the spread of what the tech giant considers “false claims” about COVID-19 and vaccines.
Moving forward, the company will remove misinformation about the coronavirus and vaccines from its platform, including claims that the virus was manufactured in a laboratory, that vaccines are not effective at preventing the disease, that it is safer to contract the disease than to get the vaccine, and that vaccines in general are toxic, dangerous or cause autism.
“In addition to sharing reliable information, we are expanding our efforts to remove false claims on Facebook and Instagram about COVID-19, COVID-19 vaccines and vaccines in general during the pandemic,” said the company’s head of health, Kang-Xing Jin, in a newsroom update. “Today, after consultation with leading health organizations, including the WHO, we are expanding the list of false claims we will remove to include additional debunked claims about COVID-19 and vaccines.”
The full list of false claims now subject to removal is available on Facebook’s Help Center page. It broadly includes any content that downplays the severity of COVID-19 or discourages good health practices, such as wearing a mask.
“We are starting to enforce this policy immediately, with a particular focus on pages, groups and accounts that violate these rules, and we’ll continue to expand our enforcement over the coming weeks,” wrote Facebook’s Vice President of Integrity, Guy Rosen. “Groups, pages and accounts on Facebook and Instagram that repeatedly share these debunked claims may be removed altogether.”
Rosen added that the company plans to improve access to reliable information by surfacing relevant, authoritative results when people search for vaccine- or coronavirus-related content on the platform.
He noted that enforcement against coronavirus misinformation has been underway since December; what has changed is that the list of false claims has now expanded.
The change marks a significant departure from previous actions, however. As the New York Times reported, in the past the company chose to “demote” coronavirus misinformation or reduce its distribution in people’s news feeds. Now the company is moving to remove that content outright.
According to the newsroom update, the new action is a response to a decision by the company’s Oversight Board, which concluded that its rules and standards for health-related misinformation were “inappropriately vague.”
The news comes as Facebook and other major social media companies face intense scrutiny over their censorship practices. If the new action is any indication, Facebook has no intention of backing away from aggressive content removal.