Facebook is finally taking a tougher stance on misinformation about vaccines. With help from the World Health Organization (WHO), the company has expanded the list of claims it will remove from its platform. You can see the full list in its Help Center, but some of the most notable include claims that COVID-19 is man-made or that it is safer to get sick from the disease than to be vaccinated against it. In a big step forward, the company says it will also remove claims that vaccines are toxic or that they can cause autism.
Facebook’s enforcement actions will initially focus on pages, groups and accounts that violate its new rules, and the company says it will remove repeat offenders. Meanwhile, moderators of groups that have previously violated Facebook’s policies on COVID-19 and vaccine misinformation will have to approve each post in their group. As an additional safeguard, third-party fact checkers can still examine claims that don’t outright violate the company’s COVID-19 or vaccine policies. If those posts are found to be false or misleading, Facebook says it will label and demote them.
“These new policies will help us to continue to take aggressive action against misinformation about COVID-19 and vaccines,” said the company.
In addition to taking a stricter stance on vaccine misinformation, Facebook says it will simultaneously take additional steps to get accurate information to people. Like Google, the company will help people find out how they can get vaccinated. Starting this week, Facebook’s COVID-19 Information Center will include links to local health authorities whose websites have details about who can get a vaccine right now and how to get one. As that information becomes more widely available elsewhere, Facebook will share it in other countries as well, in addition to making the Information Center available via Instagram.
To complement these efforts, the company will donate $120 million in advertising credits to public health agencies, NGOs and the UN, and will provide these organizations with training and support as it works to bring reliable information to people.
Historically, Facebook’s efforts to curb vaccine misinformation on its platforms have been ineffective, due in part to the fact that the company has stopped short of banning this type of content altogether. Even with Facebook pointing people to credible sources, accounts promoting conspiracy theories and inaccurate information have come to dominate its search results.