Anti-vax content and vaccine misinformation are now banned on Facebook


Almost a year after the start of the Covid-19 pandemic, Facebook is taking its strictest stance yet against vaccine misinformation, banning it entirely. The ban applies not only to misinformation about the Covid-19 vaccines but to false claims about vaccines in general. This means, for example, that posts stating that vaccines cause autism, or that measles cannot kill people, are no longer allowed on Facebook. At the same time, the platform will encourage Americans to get vaccinated and point people to information about when it will be their turn to get the Covid-19 vaccine and how to find an available dose.

These changes, part of the company’s broader pandemic response, are significant because, with nearly 3 billion users, Facebook is one of the most influential social media networks in the world. And as vaccines roll out around the world, many are concerned that misinformation, including misinformation on Facebook, could worsen some people’s refusal or hesitancy to get vaccinated.

In a blog post published on Monday, Facebook explained that these changes are part of what it is calling “the world’s biggest campaign” to promote reliable information about Covid-19 vaccines. The effort is being developed in consultation with health authorities such as the World Health Organization, and will include reliable information from organizations such as the United Nations and various health ministries. (A list of banned vaccine claims, drawn up with the help of health authorities, is available here.) The general approach seems similar to Facebook’s US voter registration initiative, which the company says helped register several million people to vote in the November election.

“Covid-19 was declared a public health emergency a year ago, and since then we have helped health authorities reach billions of people with accurate information and supported health and economic relief efforts,” wrote Kang-Xing Jin, Facebook’s head of health, on Monday. “But there is still a long way to go, and in 2021 we are focused on supporting health leaders and public officials in their work to vaccinate billions of people against Covid-19.”

A major caveat of the new policy is that just because Facebook says its rules on vaccine misinformation are changing, that does not mean vaccine misinformation won’t end up on the platform anyway. Changing the rules and enforcing them are two different things. Despite earlier Facebook rules banning misinformation specifically about Covid-19 vaccines, images suggesting that coronavirus inoculations came with extreme side effects could still go viral on the platform, and some accumulated tens of thousands of “likes” before Facebook took them down.

A Facebook spokesperson told Recode that the company will enforce its expanded rules as it becomes aware of content that violates them, regardless of whether that content has already been published or is posted in the future. The spokesperson did not say whether Facebook is increasing its investment in content moderation to match the broader scope of its vaccine misinformation rules, but told Recode that expanding enforcement will take time as the company trains its moderators and content systems.

Still, Monday’s changes are significant because Facebook CEO Mark Zuckerberg, who has repeatedly defended the principles of free speech, now says the company will pay special attention to pages, groups, and accounts on Facebook and Instagram (which Facebook owns) that regularly share vaccine misinformation, and may remove them entirely. It is also adjusting its search algorithms to reduce the prominence of anti-vaccine content.

Like other enforcement actions Facebook has taken, on everything from the right-wing, anti-Semitic QAnon conspiracy theory to incitements to violence posted by Donald Trump, some say the company’s move is long overdue. “This is a classic case of Facebook acting too little, too late,” Fadi Quran, a campaign director at the nonprofit Avaaz who leads its disinformation team, told Recode. “For more than a year, Facebook has been at the epicenter of the disinformation crisis that has worsened the pandemic, so the damage has already been done.” He said that at this point, much more needs to be done to serve users who have already seen vaccine misinformation.

Facebook’s announcement comes at a time when major technology platforms are grappling with their role in the Covid-19 crisis. In the fall, experts warned that social media platforms were walking a fine line with respect to the global vaccination effort: although social networks must promote accurate information about Covid-19 vaccines, they said, platforms should also leave room for people to raise honest questions about these relatively new vaccines.

“We have a new virus associated with a new vaccine associated with a new way of life – it’s all too new for people,” Ysabel Gerrard, a digital sociologist at the University of Sheffield, told Recode at the time. “I think resistance to the Covid-19 vaccine will be on a scale never seen before.”

It is unclear to what extent Facebook will enforce its new rules, or how many people the platform will help get vaccinated. The changes announced on Monday came after experts repeatedly warned of Facebook’s role in promoting anti-vaccine conspiracy theories. For years, researchers have flagged Facebook as a platform where false and misleading information about vaccines, including the idea that vaccines may be linked to autism, has proliferated.

Open Sourced is made possible by Omidyar Network. All Open Sourced content is editorially independent and produced by our journalists.
