Facebook is doing its best to combat the anti-vaccination damage caused by Facebook

On Monday, Facebook unveiled a plan to vaccinate 50 million people, the latest in a series of efforts by the social media company to combat the Covid-19 pandemic and the misinformation that flourished on its platform. The campaign follows years of criticism directed at Facebook for not doing enough to combat the dangers of the anti-vaccination movement.

First announced in a post by CEO Mark Zuckerberg, Facebook’s plans include launching a tool to help people find and book appointments at local vaccination sites, expanding reliable vaccination information for health workers, and adding labels to posts about the coronavirus that point people to information from the World Health Organization. The company is also expanding official WhatsApp chatbots to help people register for vaccines and offering new stickers on Instagram “so that people can inspire others to get vaccinated.” (WhatsApp and Instagram are owned by Facebook.)

In addition to all of this, and perhaps more critically, Facebook is doing something it hates: limiting the spread of information. The company announced that it would temporarily reduce the distribution of content from users who violated its Covid-19 and vaccine misinformation policies, or who continued to share content that its fact-checking partners had debunked. Determining what is and isn’t misinformation is a complicated business, and it is difficult to tell the difference between people intentionally deceiving others and people asking legitimate questions.

These efforts build on Facebook’s existing promises. In February, Facebook announced that it would remove false anti-vaccination information and use its platform for what it called the world’s largest Covid-19 vaccine information campaign, the launch of which it announced this week. The social media company also partnered with public health researchers to study the causes of vaccine hesitancy — and how to fight it — through research on the platform.

Critics say Facebook’s efforts are not enough to contain the enormity of the situation that the platform itself helped to create.

Anti-vaccination rhetoric has flourished for years on the platform, which has provided a safe space for vaccine misinformation groups and has even recommended these groups to users. And much of the content that pushes vaccine hesitancy would not be considered misinformation but opinion, so Facebook’s guidelines would not ban it, according to David Broniatowski, a professor at George Washington University who researches anti-vaccination communities.

“People who oppose vaccination are not putting forward arguments based mainly on science or facts, but on values like freedom of choice or civil liberties,” Broniatowski told Recode. “These are opinions, but very corrosive opinions.”

For example, a post saying “I don’t think vaccines are safe, do you?” probably wouldn’t be flagged as misinformation, but its tone can be insidious.

Facebook is aware that posts like these, which do not violate its rules, are driving vaccine hesitancy, according to a new report by the Washington Post. “While the research is very early, we’re concerned that the harm from non-violating content may be substantial,” the story quotes from an internal Facebook document.

While Broniatowski praised Facebook’s initiatives to partner with health organizations and promote vaccine facts, he believes it could do something more effective: allow public health officials to target vaccine-hesitant groups with arguments as compelling as those put forward by vaccine detractors. He noted that vaccine hesitancy was being promoted by a relatively small but highly influential share of Facebook users, and that, likewise, a small group of public health experts could be enlisted to fight it.

“You have some very sophisticated actors who make a number of arguments, whatever it is, to prevent people from being vaccinated,” he said. “We need a more nuanced response that is more sensitive to people’s real concerns.”

Facebook did not immediately respond to a request for comment.

People who refuse to be vaccinated have a wide variety of reasons, according to data released Monday by the Delphi Group at Carnegie Mellon University in partnership with Facebook. Of the respondents, 45 percent said they would avoid being vaccinated for fear of side effects, and 40 percent cited concerns about the vaccine’s safety. Smaller percentages of respondents pointed to distrust of vaccines and of the government. Addressing these concerns directly could have a significant impact on people’s willingness to get vaccinated.

Facebook must also ensure that its efforts to limit Covid-19 misinformation are more than just its latest public relations campaign, Imran Ahmed, CEO of the Center for Countering Digital Hate, told Recode in a statement.

“Since Facebook’s last announcement of its intention to ‘crack down’ on anti-vaccine misinformation more than a month ago, almost no progress has been made,” said Ahmed.

“Facebook and Instagram still do not remove the vast majority of posts reported to them for containing dangerous misinformation about vaccines,” he said. “The main propagators of anti-vaccine lies are all still present on Instagram or Facebook, despite promises to remove them.”

Since announcing its ban on vaccine misinformation in February, the company says it has removed an additional 2 million pieces of content from Facebook and Instagram. It remains to be seen whether this and the new measures will help get another 50 million people vaccinated.