Online hate speech is a topic that has gained importance in recent years. In 2021, according to the report of the Office for the Protection of the Constitution (Verfassungsschutz), there were approximately 7,500 cases of insults and incitement to hatred from an extremist background in Germany, the vast majority of which had right-wing extremist content. The Grand Chamber of the European Court of Human Rights (ECtHR) made an important ruling in this context on 15 May 2023 in Sanchez v. France. It decided that politicians’ freedom of expression does not shield them from criminal liability for hate speech that their Facebook friends had posted on their publicly accessible Facebook page and that they had failed to delete promptly. The applicant, Mr Sanchez, was a local councillor who stood as mayor and as chairman of the regional parliament for the party then still called Front National (FN). The comments in dispute referred to the opposing political candidate. One of them read: “This Bigwig has turned Nîmes into Algiers, there is no street without a kebab shop and a mosque; drug dealers and prostitutes rule supreme, no surprise that he has chosen Brussels as the capital of the new world order of Sharia….” In further comments, another person lamented the “drug trafficking and further criminality carried out by Muslims”. The French criminal courts fined the applicant alongside the authors of the comments for inciting hatred against persons of Muslim faith.
The challenge of this decision was to balance the applicant’s right to freedom of expression, exercised as an opposition politician who had used his own Facebook page for election campaign purposes, against the protection of individual dignity and safety as well as public order against the publication of hate speech, which, according to the Court, could have a particularly negative impact in times of election campaigning. The decision on the proportionality of the criminal liability imposed on the applicant was complicated by the fact that the case concerned responsibility for comments made by third parties. The Grand Chamber took two significant decisions in this regard. It confirmed its jurisprudence on the limitation of political speech during election campaigns where such speech qualifies as hate speech. And it decided for the first time that users of social media can be held criminally liable for statements left by third parties on their profile pages without this constituting a violation of Art. 10 ECHR. From the perspective of democratic theory and individual rights, I would endorse the first decision because it tackles the so-called “silencing” and “desensitization” effects of hate speech. The second decision, however, runs the risk of adversely affecting free political debate, especially when individual politicians are called upon to delete comments by third parties.
Hate speech during election campaigns
With its decision, the Grand Chamber confirms that freedom of expression finds its limits in hate speech even during electoral campaigns. As set out in the judgment, the ECtHR could rely on the legal situation in the Convention States for this purpose: while not all of them formally prohibit hate speech, the majority prohibit forms of “incitement to hatred”. By contrast, the US Supreme Court confirmed in 2017 in Matal v. Tam that the First Amendment protects hate speech. As Justice Kennedy put it, countering hate speech should be left to open and free societal debate, since the state could otherwise use its authority to exclude from public discourse the views of minorities or views that differ from those of the majority.
The ECtHR bases its case law on the protection of human dignity. However, the ruling in Sanchez v. France can also be justified on the basis of democratic theory, more specifically the collective-democratic function of freedom of expression. Hate speech can exert psychologically mediated constraints on individuals of a particular social group. This can cause them to withdraw from public discourse (the so-called “silencing effect”, see here and here). Moreover, hate speech causes desensitization, a loss of the ability to understand others’ pain. In a downward spiral, this can destroy a common basis for political communication on a general societal level beyond inter-group relations. Quite contrary to Justice Kennedy’s assumption, the regulative function of an open and free social debate then ceases to exist. Especially during election times, it is important that the debate includes as many viewpoints as possible so that voters can make informed decisions. Only then does democracy retain its constitutive openness to the formation of new majorities. In this respect, the Court’s approach of delimiting the right to freedom of expression where it openly descends into hate speech seems prudent.
At the same time, it remains the Court’s task to curb excessive or even abusive restrictions of freedom of expression. For this purpose, the Court should not defer as much as it did in this case to the legal assessment of domestic courts when it comes to classifying certain expressions as hate speech, even if the definition of hate speech contains a strong contextual element. While the Court must rely on domestic findings of fact, the question of whether a particular statement is likely to have a harmful impact on individual social groups and public order is for the Court to decide.
Politicians’ liability for third-party comments on their Facebook pages
The Court’s finding that the criminal liability imposed on the applicant is compatible with Art. 10 ECHR risks having an adverse impact on the exercise of freedom of expression in its collective-democratic dimension. The Court is attentive to the negative impact that the imposition of criminal liability, or the lack thereof, could have on freedom of expression as well as on the rights of those affected by hate speech, considering arguments such as self-censorship or the risk that the number of similar cases of online hate speech could increase. However, it does not reflect on the practical effect that the ruling creates a certain dilemma for politicians who, like the applicant, hold municipal office and use their Facebook page for political purposes: either they risk their own criminal liability for not deleting comments, or they risk unduly restricting the freedom of expression of third parties through so-called “overblocking” by censorship or automatic filtering (on the resulting “chilling effect”, see the dissent by Judges Wojtyczek and Zünd as well as the dissent appended to the Chamber judgment; for automatic filtering, see here), which could potentially even give rise to administrative claims by third parties for the restoration of comments (cf. Knight First Amendment Institute v. Donald J. Trump). Unlike commercial actors (cf. Delfi AS v. Estonia), not all politicians can rely on legal advice that would allow them to make reliable decisions in this regard. Imposing this not inconsiderable burden on politicians could therefore render the use of social media as a channel for public exchange unattractive.
Conversely, I would worry that the ECtHR has unduly ignored the risk that politicians could use the responsibility imposed on them as a justification to remove disagreeable comments – a means that could be quite effective in excluding other political opinions from political debates. The responsibility of politicians for an orderly public debate, as subsumed by the Court under the “duties and responsibilities” of Art. 10 (2) ECHR, might then threaten to slip into a content-based control of public debate by individual politicians. In doing so, it seems that the ECtHR expects the fox to guard the henhouse.
Overall, the Court does not sufficiently justify why the liability of a Facebook page operator, which it even describes as desirable (“should”, para. 185), is “necessary in a democratic society” (cf. the dissent of Judges Wojtyczek and Zünd). It fails to adequately answer the question of why imposing criminal liability on the applicant is a necessary means to effectively counter online hate speech or to ensure effective legal protection against it. The obvious argument in favour is that an additional person becomes responsible for deleting such comments. However, it is questionable whether Meta’s obligation to cooperate in eliminating online hate crime, whose breach is punishable under Article 6 (7) of Loi n° 2004-575 du 21 juin 2004 pour la confiance dans l’économie numérique, does not already constitute an equally suitable, milder means, since it can ensure the rapid deletion of such comments. In any case, the effectiveness of the legal protection gained seems small. Given the seriousness of the interference with the right to freedom of expression, which enjoys heightened protection during election campaigns, the balance of interests struck by the Court appears questionable.
Against this background, I would argue that a more balanced approach would have required, at the very least, that the applicant be informed of the unlawful content as a necessary precondition for establishing criminal liability (see the separate opinion of Judges Wojtyczek and Zünd; cf. the statement by the UN Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression et al., p. 2, cited by the ECtHR). Such a notification requirement would justify the conclusion that criminal liability imposed on the applicant for his inaction is proportionate to the aim pursued, for such inaction deepens the violation of rights caused by hate speech by perpetuating it. Seen in this light, the ruling would be in line with the data protection case decided by the Court of Justice of the European Union (Unabhängiges Landeszentrum für Datenschutz Schleswig-Holstein v. Wirtschaftsakademie Schleswig-Holstein GmbH, 5.6.2018, C-210/16, referred to by the Chamber). There, the CJEU held that not only Meta but also the operator of the Facebook page owed a duty to protect the users’ data. Since the operator decided independently on the purpose and means of data processing, the CJEU found the associated data protection violations to be directly attributable to the operator. In a case like Sanchez v. France, such imputability can reasonably be established in relation to the failure to delete despite notification. Indeed, digital service providers such as Meta enjoy certain liability privileges as far as responsibility for illegal content published on their channels is concerned. In the European Union, the E-Commerce Directive (Directive 2000/31/EC, transposed into French law by Loi n° 2004-575 du 21 juin 2004 pour la confiance dans l’économie numérique) has established the “notice and take down” principle together with certain host provider privileges.
The Directive expressly rejects a general monitoring obligation for host providers and limits their liability to cases in which they have positive knowledge of illegal content. To avoid inconsistencies, similar privileges must apply if a Facebook page operator is held responsible, like a host provider, for deleting unlawful content on their page.