Law on confidence in the digital economy (LEN): first ruling on hosting providers' responsibility picks holes in the concept of "manifestly illegal" content

The central Paris court has handed down a first explicit decision under the Law on confidence in the digital economy (LEN), ruling on an Internet hosting provider's responsibility for content. The case demonstrated the difficulty of applying the concept of "manifestly illegal" content introduced by the LEN.

The central Paris court on 15 November 2004 handed down a first explicit decision under the Law on confidence in the digital economy (LEN), ruling on an Internet hosting provider's responsibility for content, and found in favour of the provider. The case, which concerned the 1915 Armenian genocide and pitted the Armenian National Committee (CDCA) against the Turkish consul in France and Wanadoo, demonstrated the difficulty of applying the concept of "manifestly illegal" content introduced by the LEN. The CDCA had filed a complaint against the Turkish consul in Paris, Aydin Sezgin, and against Wanadoo over articles contesting the Armenian genocide posted on the consulate's website (http://perso.wanadoo.fr/tcparbsk/). The court explicitly referred to the LEN in assessing Wanadoo's responsibility "in the light of the interpretation contained in the decision of the Constitutional Council on 10 June 2004".

Reporters Without Borders has consistently campaigned against the regime of provider responsibility established by the LEN and, since the law's adoption, has urged judges to show extreme vigilance in interpreting it. "The law imposes on technical providers the obligation to decide on the legality of content to which they provide access and effectively to take over the work of the courts. The Constitutional Council that ruled on the law had limited the responsibility of providers by introducing the concept of 'manifestly illegal' content. But the CDCA case against Wanadoo demonstrates that the concept is too vague to effectively protect freedom of expression. Deciding on the legality of content turns out to be an arduous business, which cannot be accepted by providers, particularly smaller ones," said the organisation.

The entire case rested on one question: does contesting the Armenian genocide constitute a manifestly illegal act? To reach a decision the court had to examine a variety of national and international legal texts produced by the Armenian association: the 1881 press law; the statutes of the international military tribunal annexed to the London agreement of 8 August 1945; the French law of 29 January 2001 recognising the Armenian genocide; the international convention for the prevention and repression of genocide adopted by the UN on 9 December 1948; and a resolution of the European Parliament of 18 June 1987. The judges even went so far as to check the minutes of parliamentary sittings.

The court finally decided that nothing in these texts established that contesting the Armenian genocide was manifestly illegal, and the ruling exonerated the provider from any responsibility in the case. Nevertheless, the court had to examine three international texts and two French laws to reach that conclusion. What would the judges have decided if one of the international conventions had stipulated that the Armenian genocide was not contestable? Would the court then have found the provider responsible? The question is far from rhetorical. The judgement seems to suggest that a provider faced with this type of content should check its legality in the light of national laws as well as international legal texts. Certainly ignorance of the law is no excuse, but is it reasonable to expect providers, particularly the smaller ones, to be able to handle such a task?
According to Lionel Thoumyre of the French consultative body the Internet Rights Forum, the concept of "manifestly illegal" content should be interpreted only in relation to the degree of legal competence that can reasonably be expected of a provider. On that interpretation, it would be easier to establish that the provider, who could not be considered an expert in international law, was not responsible in this case. It seems, however, that this ruling, even if favourable to the provider, demonstrates more fundamentally that the legal responsibility placed on providers is too heavy, despite the protection offered by the phrase "manifestly illegal". In wanting to put companies in the place of judicial authorities, the legislator has opened a Pandora's box whose consequences will gradually make themselves felt. The risk of advance censorship of the Internet by its technical providers remains on the agenda.

Responsibilities established by the LEN

The Constitutional Council gave its ruling on the Law on confidence in the digital economy on 10 June 2004. It reaffirmed the principle of the legal responsibility of providers in cases where illegal content has been drawn to their attention. Even though this step established a system of private justice on the Internet, the council took the view that it conformed to the European directive on which the French law was based. It did, however, tone down the law as voted by parliament, so that providers can only be held responsible if a judge has ruled the content illegal or if a web page is "manifestly illegal". This last point picked up a recommendation from the Internet Rights Forum and provides judges with limits on this regime of responsibility. As a result, French jurisprudence recognises as "manifestly illegal" only content involving revisionist statements, child pornography, justification of war crimes and the like. It has therefore become unlikely that service providers would be convicted for hosting defamatory articles, for example.
Updated on 20.01.2016