“We feel this is an important first step; however, many essential questions remain. Much about the mechanics of this body needs to be specified. We need guarantees of real independence, as well as decision-making that conforms to international legal standards for freedom of speech,” said RSF San Francisco office director Sabine Dolan.
A challenging task
With more than 2 billion Facebook users worldwide, selecting a board that reflects the globe’s diversity will be a challenging feat. The board is expected to comprise up to 40 experts, free from commercial influence, who specialize in human rights, technology, and journalism, among other fields. They will be charged with making difficult calls on harassment, incitement to violence, and how to rein in misinformation while respecting freedom of expression.
But for now, the draft charter published by Facebook’s vice president of global affairs and communications, Nick Clegg, raises more questions than it answers. How will the cases presented to the board be selected? How will the independence of its members be guaranteed? How will transparency in the decision-making process be ensured? And how will the board develop a coherent jurisprudence?
The world’s largest social media platform says it will spend the next six months addressing these difficult questions, with the help of feedback from workshops held in Singapore, Delhi, Nairobi, Berlin, New York, Mexico City and other cities, along with submitted proposals.
Troubling unanswered questions
The creation of an institution designed to tackle Facebook’s most challenging and contested cases marks an important step forward for the company, but it’s insufficient. Here are a few preliminary points from the draft charter that caught RSF’s attention:
According to the draft, the board would operate within the framework of Facebook’s community standards; Facebook’s own norms would therefore be applied.
The draft seems to indicate that content removed because of non-conformity to a national law could not be reintroduced by the board, even if it adhered to Facebook’s community standards.
Furthermore, absent further precision, national laws seem to take precedence in all cases: the board cannot rule against a national law, whether in a totalitarian regime or a democratic state.
It’s unclear who will determine conformity with the law. Would it suffice for an authoritarian government, for example, to declare that content is contrary to its law for that content to be removed, without the board being able to challenge the decision?
What about content removed because it does not conform to Facebook’s community standards, even when it conforms to the law? From that angle, Facebook’s community standards take precedence over both national law and international standards of freedom of expression.
The draft charter also indicates that Facebook would retain the final say. If the board makes a decision that is not “consistent” with its own case law, the company could decide not to take it into account, as suggested in the draft charter’s question 11: "Facebook is ultimately responsible for making decisions related to policy, operations and enforcement."
Finally, one wonders what will happen to the millions of significant but low-profile cases of contested content removal. This is something RSF has dealt with in recent years as it relates to journalists, cartoonists, and citizen bloggers across the world. Will this oversight effort ultimately promote greater checks and balances throughout Facebook’s content moderation policies?
Here’s hoping for greater transparency and accountability; RSF will be following progress closely.