Facebook has announced that the suspension of Donald Trump’s Facebook and Instagram accounts will remain in effect but will be limited to an initial period of two years from the date they were suspended – 7 January 2021. It took this decision in response to the decision issued by the Oversight Board on 5 May, in which the Board upheld the suspension of the former president’s accounts but said “the indeterminate and standardless penalty of indefinite suspension” was inappropriate.
Facebook also announced the adoption of new enforcement protocols including the possibility of suspensions of one month to two years for serious violations by public figures. In all, it said it would fully adopt 15 of the 19 recommendations that the Board made on 5 May.
“The undertaking given by Facebook to implement the Oversight Board’s recommendations is positive in the very short term,” RSF secretary-general Christophe Deloire said. “Nonetheless, a privately-owned entity cannot be the judge of online information. The standards established by Facebook cannot substitute for those established and enforced by democratic institutions, which the company itself recognises. This self-regulatory initiative by Facebook is symptomatic of a regulatory deficiency on the part of democratic states. We must move quickly to something else. It is becoming urgent for state regulators to define democratic obligations for platforms such as Facebook, especially with regard to transparency and promoting trustworthy information.”
The scope of the Oversight Board’s remit is extremely limited and does not include Facebook’s algorithmic mechanisms, which amplify or reduce the visibility of certain kinds of content and shape the public debate.
In the course of its deliberations on Trump’s suspension, the Oversight Board asked Facebook 46 questions, of which Facebook declined to answer seven entirely and two partially, arguing that the information was “not reasonably required” or “not technically feasible to provide,” or was covered by attorney/client privilege or other forms of protection. They included questions about how Facebook’s news feed and other features affected the visibility of Trump’s posts and may have contributed to the attack on the US Capitol on 6 January.
While content moderation decisions are important, so are the mechanisms that determine how content is delivered on the platform. It was these mechanisms that created an environment conducive to the dissemination of the false information and hate speech posted by Trump during his four years as president.