22 January 2018

The German Network Enforcement Act and the Presumption in Favour of Freedom of Speech

The Network Enforcement Act (NetzDG) violates the presumption in favour of freedom of speech. This does not mean that social networks should not be regulated. However, such regulation must not only combat “underblocking”; it must also counteract “overblocking”.

1. Since the beginning of the new year, the Internet in Germany has been governed by new rules. Although the NetzDG has been in force since 1 October 2017, a transitional rule applied to its central provision: the procedure ensuring rapid removal of illegal content, which social networks have to maintain (§ 3 NetzDG), had to be introduced within three months of the entry into force of the Act (§ 6(2) NetzDG; cf. the English translation of a draft here). It is therefore not unlikely that cases of removal or blocking of content will now occur with increasing frequency. In all of these cases, the question will arise whether the provisions of the Act are compatible with freedom of expression.

2. The NetzDG responds to the massive difficulties in enforcing civil and criminal prohibitions in the social networks even in cases of blatant and unequivocal violations of the right to respect for personality (allgemeines Persönlichkeitsrecht, Art. 2(1) in conjunction with Art. 1(1) Grundgesetz; cf. Art. 8 ECHR).

If such utterances are deleted only after several days, they have often already done most of their harm irreparably – due to the speed of communication in the social networks.

The NetzDG therefore obliges the large social networks (with two million or more registered users in Germany) to provide an “effective and transparent” procedure for complaints about unlawful content. This procedure has to “ensure”, among other things, that the provider of the network removes (or blocks access to) “manifestly unlawful content” (generally) within 24 hours and all other unlawful content usually within seven days (§ 3(1-3)). The breach of the obligation to maintain such a procedure can be punished with a regulatory fine of up to 5 million euros (§ 4(2)).

3. There is nothing wrong with removing and blocking unlawful content. The problem arises from the fact that the attempt to bring it about will always affect lawful content as well.

However, the procedural obligations and regulatory fines of the NetzDG are aimed solely at preventing unlawful content. They do not contain any safeguards to protect legitimate contributions from being deleted or blocked.

Only “underblocking” (of unlawful contributions) is threatened with fines, while “overblocking” (of lawful contributions) remains unsanctioned. This suggests that the providers of the social networks, in order to avoid a fine, will establish procedures that ensure that, when in doubt, a contribution is erroneously blocked rather than erroneously left in place.

The regulatory model of the NetzDG thus amounts to a call for massive “overblocking”. It provides incentives exclusively in one direction – to remove and block content.

It is this legal one-sidedness of its regulatory model which makes the statute so blatantly unconstitutional (there seems to be wide agreement among German scholars on the statute’s unconstitutionality; see also my contribution on Hate Speech in the Internet, in: Marion Albers / Ioannis Katsivelas [ed.], Recht & Netz, Nomos [forthcoming] [in German], on which this blogpost is partly based).

The one-sided regulatory approach of the NetzDG is problematic regardless of how big its actual impact will be (there may after all also be a countervailing economic incentive to avoid damaging the company’s image and losing revenue through violations of freedom of speech).

A possible deterring effect on the use of fundamental rights (“chilling effect”) is (here as well as in other contexts like informational privacy) less a question of empirical claims about “big numbers” and more a question of normative constitutional assessment: Severe constitutional damage is done even if only a few persons refrain from speaking although they have a constitutional right to speak.

4. When balancing freedom of expression against other protected interests, there is a fundamental presumption in favour of freedom of speech. This presumption applies at least to speech on matters which substantially affect the public (cf. the long-standing case law of the German Federal Constitutional Court since BVerfGE 7, 198 [212] – Lüth [1958]; see also, extending this presumption to all areas: BVerfGE 7, 198 [208]; 124, 300 [342] – Wunsiedel [2009] [in English here, § 74]; on the seminal Lüth case see the leading work in English on the Court’s jurisprudence: Kommers/Miller, here, p. 442 et seq.).

The presumption in favour of freedom of speech also (indirectly) binds private Internet companies such as Facebook, Google and Twitter. Since Lüth, it has (also) been applied in the context of the general indirect third-party effect of freedom of expression in private relations (on the doctrine of indirect third-party effect – “mittelbare Drittwirkung” – see here, p. 60 f.). To the providers of the large social networks, however, it must apply all the more.

What is prevented or deleted on Facebook, YouTube or Twitter is often censored or restricted more effectively than would be possible by a state ban. Today, therefore, the privately operated social networks are as a matter of fact – and in the crucial first few days probably inevitably so – largely responsible for protecting freedom of expression (as well as the right to respect for personality).

The constitutional obligations of these companies (which follow from the state’s duties to protect – Schutzpflichten – that create the indirect third-party effect) must keep pace with this development. In its Fraport judgment, the Federal Constitutional Court rightly made it clear that the scope of freedom of assembly also extends to privatized public spaces which, like Frankfurt Airport, form a generally accessible “public forum” (see the reasoning in BVerfGE 128, 226 [250-255] – Fraport [2011] [in English here, §§ 62 et seq., especially § 70], and the reference to this judgment in the dissenting opinions of the ECtHR’s Raelian Movement decision of 2012).

The basic idea of the Fraport decision is equally applicable to the privatized “public fora” of the Internet: the more the public debate shifts to privately controlled areas of network communication such as Facebook, Instagram or Twitter, the stronger the legal obligations arising from fundamental rights that the large companies which design, open and manage these spaces for commercial purposes have to meet.

5. But do the populists’ successes and the psychological dynamics of uninhibited group polarization in the virtual echo chambers of the Internet not indicate that we have to rethink the legal limits of freedom of expression and draw them more narrowly than has until now been widely recognized under the German Constitution?

Right-wing populism in particular promotes a revival of lines of conflict along ethnic and religious identities, using the psychological mechanisms of intuitive “groupthink” that reinforce our existing inclination to favour arguments and information that confirm our views (on “confirmation bias” see Kahneman, here, pp. 80 f.; on group polarization Sunstein, here, pp. 111-145): in the filter bubbles of social networks, like-minded people get together and radicalize their hatred of other groups, mutually reinforcing one another in their convictions. Based on an in-depth analysis of these group dynamics, Stefan Magen, for example, suggested at the last annual meeting of German scholars of public law that the limits of freedom of expression should be readjusted in the face of these threats.

However, at least if one found a strong presumption in favour of freedom of speech convincing before, the new technological possibilities do not seem sufficient to justify a change of course now. Such a narrowing of free speech would probably not correspond to the original meaning of the German Constitution or, for that matter, the European Convention on Human Rights. Both legal instruments were created only a few years after the end of the Second World War – under the immediate impression of those crimes against humanity to which the racist hatred of National Socialism had led. Although social networks are a new phenomenon whose risk potential is hard to gauge as of yet, the enormous threat to human dignity and democracy posed by populist groupthink unleashed by xenophobic demagoguery is not.

6. The presumption in favour of freedom of speech (and of the other applicable freedoms of communication) should not only govern insofar as the Grundgesetz is applicable (or even has to remain applicable because the ordering of the freedoms of speech, press and assembly belongs to those “important areas” where “sufficient space” has to be left to the Member States, cf. BVerfGE 123, 267 [358] – Lisbon [2009]; in English here, § 249). It also rightly follows from freedom of expression under Article 10 of the European Convention on Human Rights and Article 11 of the Charter of Fundamental Rights.

The European Court of Human Rights in Strasbourg (ECtHR), too, generally (and rightly) assumes that political speech carries particular weight when balancing conflicting interests (see, for example, here, pp. 112-114), even though it has, to my knowledge, not yet expressly formulated a corresponding presumption in favour of freedom of speech.

In its Delfi judgment of 2015 (§ 147), however, the ECtHR expressly referred to the Google judgment of the Court of Justice of the European Union in Luxembourg (ECJ) of 2014, which, quite to the contrary, proceeds from a presumption in favour of the right to respect for personality – and therefore in favour of deleting the respective search results (see §§ 87, 91; for a critique see, e.g., here).

Both the Google judgment of the ECJ and the Delfi judgment of the ECtHR require, to my mind, a correction in this respect: In case of doubt, at least as far as statements on matters of public importance are concerned, freedom of opinion should take precedence.

7. Therefore, if the NetzDG is not repealed and its basic regulatory model is to be retained, then it will need supplementary safeguards to ensure its compatibility with freedom of speech.

The extent to which the major social media providers have to bear constitutional responsibilities in view of their novel, central role in guaranteeing freedom of expression will have to be debated in the future with regard to particular individual questions.

A completely one-sided regulatory structure like that of the NetzDG in its current form, however, cannot be compatible with freedom of expression. Providers of social networks cannot be subject to special sanctions, which are aimed solely at the removal or blocking of unlawful content, if this imbalance of regulatory incentives is not offset by equally effective mechanisms protecting freedom of expression. Otherwise, the unconstitutional asymmetry is predetermined: the presumption in favour of the freedom of political speech will be replaced by a presumption against it.

The necessary regulatory compensation could, for example, be a complementary threat of a fine for overblocking. That is, as long as the fine for underblocking remains in place, the provider should also face the threat of a fine if the procedure it has to maintain does not ensure that contributions which have been removed or blocked on the basis of their alleged unlawfulness (or manifest unlawfulness), but which are in fact lawful (or manifestly lawful), are reinstated or unblocked within seven days (or 24 hours). In order to secure a larger corridor of non-sanctioned behavior, the threat of a fine could be confined – on both sides, mind you – to cases of manifest lawfulness or unlawfulness.

Such a regulatory counterweight would not create a right of users to post any lawful content, but would instead be limited to cases in which the removal or blocking is based on the (erroneous) assumption that the content is unlawful (on the provider’s duty to ensure that the user is given the reasons for a decision to remove or block content cf. § 3(2) Nr. 5 NetzDG).

Such additional obligations would of course increase the burden on social networks – and would, for example, exert some pressure on them to buy in qualified legal expertise for the necessary legal assessments. Ultimately, however, this only reflects the increased factual responsibility of the providers of Facebook & Co. for the fundamental rights interests at stake. The companies are in a somewhat tight spot between freedom of expression and the right to respect for personality. However, exaggerated pity does not seem appropriate – if one considers the enormous economic opportunities that also present themselves to them.

This article is based on a German version that has previously appeared on Verfassungsblog.


SUGGESTED CITATION  Hong, Mathias: The German Network Enforcement Act and the Presumption in Favour of Freedom of Speech, VerfBlog, 2018/1/22, https://verfassungsblog.de/the-german-network-enforcement-act-and-the-presumption-in-favour-of-freed