The Network Enforcement Act apparently leads to excessive blocking of content

(This article is available on the Reporters Without Borders Germany website: http://ogy.de/zgsc)


 

In view of recently published figures, Reporters Without Borders concludes that Facebook and Google are blocking content that is in fact legal. Germany’s so-called Network Enforcement Act (Netzwerkdurchsetzungsgesetz, NetzDG), which came into force at the start of 2018, obliges social networks to quickly remove illegal content from their platforms or face penalties (http://ogy.de/nare). Apparently, this pressure has led these companies to delete large amounts of content that was in fact legal in an effort to ensure that they will not be punished under the Network Enforcement Act. When deleting the content, Facebook and Google cite their community standards. In these standards they stipulate what kind of content users may share on their platforms and reserve the right to remove even content that is protected by communicative freedoms.

 

“With the Network Enforcement Act, the German government has turned private companies into judges presiding over press freedom and freedom of information on the Internet, without providing for official oversight of the deletion procedure. An independent supervisory body is, however, necessary in order to detect ‘over-blocking’, or in other words the deletion of legally admissible content,” said ROG’s Executive Director Christian Mihr. “Facebook and Google delete content according to their own rules because they see themselves solely as private companies and are trying to impose their own digital house rules. Their platforms, however, have become part of the modern public sphere, so people must be able to say anything there that doesn’t contravene any laws.”

 

THOUSANDS OF DELETIONS BECAUSE OF “HATE SPEECH”

 

Google says it received complaints against approximately 215,000 videos on its video platform YouTube pursuant to the Network Enforcement Act in the first half of 2018. According to the company it removed around 27 percent of this content. The company offers users a very simple option for invoking the Network Enforcement Act: if they want to report a video, all they have to do is tick a box on the online form. Google first examines whether the video breaches its own community standards. Only if this is not the case is the content then checked for compliance with the Network Enforcement Act.

 

In the same period Facebook received just 886 complaints pursuant to the Network Enforcement Act, in which 1,704 posts were reported as objectionable. Several posts can be reported in a single complaint, but Facebook’s reporting procedure is far more complicated: users must, for example, specify the concrete offence in a separate form. Facebook deleted 21 percent of the content reported under the Network Enforcement Act. How many posts were deleted by Facebook in Germany on the basis of its own standards is not known – but it is likely to have been several times that amount. In its own transparency report, the network claims to have removed around 2.5 million posts in the first quarter of 2018 on the grounds that they contained so-called “hate speech”. In 2017 in Germany, according to the company’s own figures, it removed approximately 15,000 posts per month for containing “hate speech”.

 

Figures for the messaging service Twitter released on the same day also point to an “over-blocking” of content on the basis of community standards. In the first half of 2018 Twitter received 264,818 complaints. In 28,645 (10.82 percent) of these cases the messaging service took measures against the content, for example deleting it. However, Twitter – like Facebook and Google – first screens content according to its own community standards, so that a large amount of content had already been deleted in this first filtering phase. These community standards, however, also allow for the removal of legal content, as Twitter invokes its own set of “digital house rules” here.

 

FACEBOOK, GOOGLE AND CO. HAVE BECOME KEY PROVIDERS OF INFORMATION

 

The figures above raise the question of how much latitude these platforms should have in defining their community standards. These companies are sources of information for billions of people. Reporters Without Borders considers them to be a key component of the process of keeping societies informed, meaning that they play an essential part in ensuring that people can inform themselves freely and independently in a democratic public sphere. Yet on the basis of their community standards they are deleting content that is in fact admissible under German law. They are effectively imposing their own set of digital house rules that users must agree to if they wish to use these services.

 

The state-imposed pressure to delete content resulting from the Network Enforcement Act has therefore clearly led to these community standards being used to “purge” platforms of questionable content – and in cases of doubt also of content that would normally be protected by the law. The Network Enforcement Act lists 21 norms from the penal code. If a company deems that certain content contravenes one of these norms, it is in general obliged to remove the content within 24 hours. If it systematically fails to meet this obligation, it faces large fines. But if companies already delete large amounts of problematic content on the basis of their community standards, they avoid the danger of such fines. There is no independent monitoring of their deletion practices.

 

CREATE AN INDEPENDENT SUPERVISORY BODY

 

Reporters Without Borders believes that, in view of the recently published figures, the German government has the obligation to correct the Network Enforcement Act without delay. ROG calls for the creation of an independent supervisory body to oversee companies’ deletion procedures. In addition to operators, representatives of the judiciary and prosecutors, this body could incorporate “users’ advocates” and members of civil society organisations. The supervisory body would, in particular, have the role of overseeing the procedures of private operators as a whole, or in other words making decisions beyond individual cases, and also of developing guidelines for dealing with content that is reported as illegal. Moreover, it should be obliged to keep the public informed and could also function as an appeals body in the event that a user objects to a decision to delete content. Cases of dispute would then not have to be dealt with directly by a court, but would still be extricated from the non-transparent deletion procedures of individual companies.

 

In their coalition agreement the Christian Democratic Union (CDU) and the Social Democratic Party (SPD) already announced their intention to review terms of use such as community standards in order to determine whether they adequately protect consumers’ rights. The grand coalition must now put this into practice. The size of the companies in question and their relevance for the information interests of society make such a step necessary. Their special status as key providers of information goes hand in hand with special obligations of diligence and the duty to submit to public monitoring.

 

HIGH RELEVANCE FOR PRESS FREEDOM

 

Digital platforms like Facebook, Google and Twitter are an essential component of the modern public sphere. They offer huge potential for journalism to inform the public. This is especially clear in countries with limited freedom of information and press freedom: in media systems dominated by censorship, Facebook and co. offer a space in which independent information can be spread freely and new audiences reached. One example from the activities of Reporters Without Borders is the Egyptian opposition media outlet Mada Masr, which reaches most of its readers through the social networks – without which it would hardly be able to survive.

 

Reporters Without Borders, however, is observing with concern that in states all over the world this potential for freedom is being disproportionately hindered, for example by legal obligations to delete content on a massive scale. The Network Enforcement Act is a negative example of such legislation, and other states have already adopted it as a model. This makes it all the more imperative that the systematic errors in this law be corrected.

 

Germany ranks 15th out of 180 states on the World Press Freedom Index.

 

FURTHER INFORMATION

 

Video statement by ROG Executive Director Christian Mihr on the Network Enforcement Act:

http://ogy.de/jwxh

 

More information about press freedom in Germany: www.reporter-ohne-grenzen.de/d...

Updated on 03.08.2018