Small number of Facebook users responsible for most Covid vaccine skepticism – report

A small subset of Facebook users is said to be responsible for most of the content that expresses or encourages skepticism about Covid-19 vaccines, according to the first results of an internal Facebook study.

The study, first reported by the Washington Post, confirms what researchers have long argued: echo chamber effects can amplify certain beliefs within social media communities. It also shows how discourse that falls short of outright vaccine misinformation, which is banned on Facebook, can still contribute to vaccine hesitancy.

A document outlining the study – which has not been made publicly available – was obtained by the Washington Post. Facebook researchers divided users, groups and pages into 638 “population segments” and studied them for “vaccine-hesitant beliefs”, according to the Post. Such beliefs may surface in language like “I am worried about getting the vaccine because it is so new” or “I don’t know if the vaccine is safe”, rather than in outright misinformation.

Each “segment” can contain up to 3 million people, meaning the study may examine the activity of more than 1 billion people – less than half of Facebook’s roughly 2.8 billion monthly active users, the Post reported. The scale of the study also highlights how much information Facebook can glean from its user base, and how the company is using this trove of data to examine public health outcomes.

The Post reported that in the population segment with the highest incidence of vaccine hesitancy, just 111 users were responsible for half of all the content flagged in that segment. The study also showed that just 10 of the 638 population segments contained 50% of all vaccine-hesitancy content on the platform.

Facebook’s research into vaccine hesitancy is part of an ongoing effort to support public health campaigns during the pandemic, said spokesperson Dani Lever, and is one of a series of studies the company is conducting.

“We routinely study things like voting, bias, hate speech, nudity and Covid – to understand emerging trends so that we can build, refine and measure our products,” said Lever.

Meanwhile, Facebook last year partnered with more than 60 global health experts to provide accurate information about Covid-19 and vaccines. In December 2020 it announced it would ban all vaccine misinformation, suspending users who break the rules and eventually banning those who repeatedly violate its policies.

The study is just the latest to illustrate the outsized effect a small number of actors can have on the online information ecosystem. It comes in the wake of a study by the Election Integrity Partnership, which found that a handful of right-wing influencers on social media were responsible for most election misinformation in the run-up to the attack on the Capitol. In that report, experts outlined a series of recommendations, including the outright removal of “super-spreader” accounts.

The Facebook study also found significant overlap between users who exhibit anti-vaccine behavior on Facebook and supporters of QAnon, the unfounded conspiracy theory alleging that a “deep state” of Democrats and Hollywood celebrities is engaged in pedophilia and sex trafficking.

The overlap points to another long-term effect of the rise of QAnon, which was also linked to the Capitol insurrection in January. Many far-right actors, including QAnon followers and promoters, understand how to manipulate social media algorithms to reach a wider audience, said Sophie Bjork-James, an anthropology professor at Vanderbilt University who researches the white nationalist movement in the United States.

“QAnon is now a threat to public health,” said Bjork-James. “Last year, QAnon spread widely in the online anti-vaccination community and, by extension, in the alternative health community. The Facebook study shows that we are likely to face the consequences of this for some time to come.”