Facebook must explain how Feed bug made disinformation more visible
For six months, a bug in Facebook’s content-ranking algorithm increased the visibility of content previously flagged by fact-checkers as disinformation. Reporters Without Borders (RSF) calls for detailed public explanations from the US social media mega-company about this bug and its impact.
The flaw’s existence has just been revealed by The Verge, a US tech website that obtained a confidential internal memo about the bug written by engineers at Facebook’s parent company, Meta, the content of which is staggering.
Meta’s engineers discovered a “massive ranking failure” that may have increased views of disinformation in Facebook’s News Feed by as much as 30%. The bug may also have promoted nudity, violence, and even Russian state media content. RSF calls on Meta to provide precise explanations about its impact and to publish the relevant data.
According to the report obtained by The Verge, the technical problem dates back to 2019, began causing a noticeable surge in disinformation in October 2021, and was not fixed until last month. Meta spokesperson Joe Osborne claimed in a tweet that the bug had no “meaningful long-term impact,” but provided no evidence to support this claim.
“Meta is trying to be reassuring but it’s hard to take their word for it in the circumstances,” said Vincent Berthier, the head of RSF’s Tech Desk. “Why didn’t they make this memo public? The role that this social media platform plays in providing access to content does not allow them to remain silent about such serious flaws.”
Empires too vast to be governed secretly
Since its creation in 2004, Facebook has become increasingly ensnared in a labyrinth of complexity. Whistleblower Frances Haugen’s revelations in October 2021 highlighted how tinkering with social media software can cause malfunctions that its creators cannot control. Such problems are not limited to Facebook. They affect all social media platforms, which together constitute an excessively powerful but unstable content universe, capable of escaping its creators’ control and of being manipulated by powerful actors.
“The need for access to algorithmic data and for algorithmic transparency is more urgent than ever,” Berthier added. “Independent researchers must know how the platforms select and assemble the content offered to their users. The results of these studies should be made public in order to correct design errors and protect users.”
The report on Infodemics published under the aegis of the Forum on Information & Democracy (an entity launched by RSF) included calls for such measures in its 250 specific recommendations for platforms and decision-makers.
The Digital Services Act (DSA), which the European Union will soon adopt, will increase platform transparency and access to some platform data. But, although ambitious, the DSA does not go far enough in this area, as RSF deplored last month. The Code of Practice on Disinformation, currently being negotiated between the European Commission and the platforms, must also get the platforms to commit to greater transparency and to opening up data to researchers.
The United States is ranked 44th out of 180 countries in RSF's 2021 World Press Freedom Index.