Facebook’s Oversight Board overturns 4 removals in its first decisions

Facebook’s Oversight Board released its first round of decisions on Thursday, overturning several of the company’s decisions to remove posts under its policies on hate speech, violence and other issues.

The first decisions, which Facebook said it will comply with, come ahead of a far more consequential ruling the board will make in the coming weeks: whether to uphold Facebook’s decision to suspend former President Donald Trump’s account in the wake of the January 6 riot in Washington.

The board, a group of 20 journalists, politicians and judges from around the world, was formed last year and is tasked with ruling on how the social media giant handles its most difficult content decisions. It claims full independence from Facebook, and Facebook has said the decisions it makes will be binding.

Thursday’s decisions are a sign that the social media giant’s newly formed “Supreme Court” intends to err on the side of free speech.

“For all board members, you start from a presumption in favor of freedom of expression,” said Alan Rusbridger, one of the 20 board members and former editor-in-chief of The Guardian, in an interview before the decisions were made public. “So you look at each case and ask: what, in this particular case, is the reason why freedom of speech should be restricted?”

The board’s first decisions cover five cases in which Facebook removed posts for violating its policies. In four of the five, the board voted to overturn Facebook’s original decision. The board also asked Facebook to give users greater clarity about its policies and how it intends to enforce them.

Two of the decisions concerned Facebook’s hate speech policy; one removal was overturned and the other upheld.

In the first case, Facebook removed a post from a user in Myanmar that appeared to disparage Muslims as psychologically inferior. Although the company concluded that the post violated its policy, the board determined that the terms used “were not disparaging or violent.”

“While the post may be considered pejorative or offensive to Muslims, it does not advocate hatred or intentionally incite any form of imminent harm,” the board wrote.

In the second case, a user posted a term describing Azerbaijanis that Facebook interpreted as a slur. The board agreed that “the context in which the term was used makes it clear that its intention was to dehumanize its target” and upheld Facebook’s decision.

The third case concerned nudity: the board overturned Facebook’s decision to remove an Instagram post from a user in Brazil intended to raise awareness about breast cancer. The post included five photographs showing women’s nipples, which the board deemed permissible under Facebook’s own exception for “breast cancer awareness.”

The fourth case concerned violence: a user quoted Joseph Goebbels, the Nazi propagandist who is on Facebook’s list of “dangerous individuals.” Facebook’s policy treats quotes attributed to such individuals as an expression of support for them unless the user makes clear otherwise. But the board found that the quote “did not support the Nazi party’s ideology or the regime’s acts of hatred and violence.”

The fifth and final case concerned misinformation: Facebook removed a post from a user in France who falsely claimed that a cure for Covid-19 existed and criticized the French government for not making it available. Facebook said the post could lead people to ignore health guidance or attempt to self-medicate.

The board, considering the context of the post, found that the user was “opposing a government policy and intended to change that policy,” and that the post would not lead people to self-medicate, since the combination of drugs in question is not available without a prescription.
