08 February 2018

The German NetzDG: A Risk Worth Taking?

The German Gesetz zur Verbesserung der Rechtsdurchsetzung in sozialen Netzwerken (Netzwerkdurchsetzungsgesetz) (literally: Law on the improvement of law enforcement in social networks – NetzDG) has attracted much media attention since fully entering into force on 1 January 2018. This was sparked to a significant extent by a few high-profile deletions, including the deletion of a tweet by the responsible Minister for Justice.

This contribution will give a succinct overview of the NetzDG and explain how some of the criticisms are overstated and partially misguided. While the NetzDG is unlikely to resolve all challenges surrounding social media and freedom of expression, and undoubtedly presents a certain risk of stifling expression online, I believe it is nonetheless a significant step in the right direction. Rather than undermine freedom of expression, it promises to contribute to more inclusive debates by giving the loud and radical voices less prominence. In any case, it appears reasonable to let this regulatory experiment play out and observe whether fears over a ‘chilling effect’ on free expression are borne out by the evidence. A review of the law and its effects is planned after an initial three-year period of operation, which should deliver ample data and regulatory experience while limiting the scope for potential harm.

The statute in a nutshell

The NetzDG provides compliance regulations for social media platform operators with at least two million users within Germany. Social media networks are defined as internet platforms that seek to profit from providing users with the opportunity to share content with other users and the broader public. Platforms which provide individualised communication services, such as email or messaging apps, as well as platforms providing editorialised content, such as news websites, are explicitly excluded from the scope of the law (§ 1 I NetzDG).

The core obligations are setting up an effective and transparent complaints management infrastructure (§ 3 NetzDG) and compiling half-yearly reports on complaints management activity (§ 2 NetzDG). The latter reporting obligations in particular are quite detailed and include provisions setting out training and management oversight requirements for social media platform operators. The complaints management infrastructure must chiefly ensure that social networks delete or block illegal content within specified timeframes. Deletion removes the content from the platform globally, while blocking merely makes it unavailable in Germany. Although blocking and deleting are thus distinct, both will be referred to collectively as deletion throughout this post.

Content is designated illegal if it falls under one of the enumerated provisions of the German criminal code (Strafgesetzbuch – StGB). The most important ones for the purposes of freedom of expression are insult (§ 185), defamation (§ 186 and § 187), public incitement to crime (§ 111), incitement to hatred (§ 130) and dissemination of depictions of violence (§ 131). It is important to note that the obligation to delete or block is not novel. The NetzDG merely enforces an existing legal obligation under § 10 of the Telemedia Act (Telemediengesetz – TMG). Under that provision, social media operators can be held liable under criminal and private law for illegal content on their platforms once they become aware of it and fail to remove it expeditiously.

The NetzDG further distinguishes between manifestly illegal and merely illegal content and prescribes different deadlines for deletion. Manifestly illegal content must be deleted within 24 hours of receiving a complaint, while merely illegal content must be acted upon within seven days. The most important exception to the seven-day deadline applies where operators refer the decision on whether to delete to an independent body of industry self-regulation.

Such bodies must be set up and funded collectively by social media platform operators and reach independent decisions that the operator accepts as binding. Certain conditions apply to bodies of industry self-regulation (§ 3 VI NetzDG), and they must be accredited by the Ministry of Justice. Such bodies are a common feature of the German regulatory landscape and have been set up, for instance, by the film, TV and computer games industries to rate the age appropriateness of content (the FSK, FSF and USK respectively).
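For readers who prefer a schematic view, the following minimal sketch simply restates the deadline structure described above. It is purely illustrative: the function and variable names are hypothetical, it is not part of the statute or of any operator's actual compliance system, and it merely encodes the 24-hour rule, the seven-day rule and the self-regulation referral as summarised in this post.

    from datetime import timedelta
    from typing import Optional

    def deadline_after_complaint(assessment: str,
                                 referred_to_self_regulation: bool = False) -> Optional[timedelta]:
        """Time available to delete or block content, measured from receipt of a complaint.

        Illustrative only; restates the deadlines of § 3 NetzDG as described in this post.
        """
        if assessment == "manifestly_illegal":
            return timedelta(hours=24)   # manifestly illegal: delete or block within 24 hours
        if assessment == "illegal":
            if referred_to_self_regulation:
                return None              # decision passed to an accredited self-regulation body,
                                         # whose ruling the operator accepts as binding
            return timedelta(days=7)     # merely illegal: act within seven days
        return None                      # content not illegal: no deletion obligation under NetzDG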

Separately from this duty on social media platforms to delete content, it remains possible for anyone to seek criminal prosecution over content that violates the criminal code, and social media platforms must preserve deleted content for evidentiary purposes for ten weeks (§ 3 II 4 NetzDG). Additionally, the NetzDG requires social media platform operators to name an agent in Germany responsible for receiving complaints. Failure to name such an agent, or a failure of the agent to respond, attracts a fine of up to 500,000 euros, while any other failure to implement a complaints management scheme as set out in the NetzDG can draw a fine of up to 5 million euros. The latter rises to 50 million euros for legal persons and corporations under § 30 II 3 of the Code on Administrative Offences (Gesetz über Ordnungswidrigkeiten), see § 4 II 2 NetzDG.

NetzDG and ‘hate speech’

It is debatable whether it is useful to view the NetzDG as an attempt at curbing hate speech on social media. This is largely due to the specific criminal law provisions referenced by the statute and the peculiarities these produce. While the provisions referenced in the NetzDG cover some aspects of what is conventionally, and in some jurisdictions legally, defined as ‘hate speech’, there is no general definition or use of this terminology in German law. Collectively, the enumerated provisions of the German criminal code simultaneously criminalise more and less than would be encompassed by a generic ban on hate speech.

For instance, one might attract criminal liability for defamation when describing an abortion doctor’s work as ‘babycaust’, even though this does not fall under most definitions of hate speech, as it is not based on attributes such as race, religion, ethnic origin, sexual orientation, disability or gender. Conversely, posters by a far-right party depicting ethnically stereotyped people on a flying carpet with the caption ‘Have a good flight home’ did not attract criminal liability, even though they at least arguably constitute hate speech on grounds of race, religion and ethnic origin.

Hence, the analytical value of the concept of hate speech is limited, given the particular criminal provisions on which the NetzDG is based. It would be more accurate to say that the statute itself does precious little beyond seeking the removal of content that one could not already express in public without the risk of criminal prosecution and sanctions. The law expressly avoids creating new criminal offences and does not, in any real sense, seek to expand existing limitations on freedom of expression in Germany. The fact that, in the past, one could express many views constituting incitement to hatred on social media platforms without any real fear of repercussions does not fundamentally alter that conclusion.

An assault on freedom of expression?

It is rare for a legal system to treat freedom of expression as an absolute right. Most European jurisdictions, including Germany under its Basic Law, recognise that there are limits. As a matter of German constitutional law, it is not clear that the NetzDG would run afoul of freedom of expression. At this stage it is useful to distinguish two scenarios.

In the first scenario, a social media platform operator deletes content that is illegal: in this case freedom of expression is not violated. Under the German Basic Law, freedom of expression does not cover insult, defamation or incitement to hatred. To the extent that deletion of illegal content amounts to an infringement, it is justified because it rests on provisions of general laws within the meaning of Article 5 II Basic Law. Moreover, deleting illegal content appears a measured sanction, given that such statements, when made offline, often attract criminal prosecution which may result in fines and prison sentences.

Conversely, in the second scenario the operator deletes content, mistakenly deeming it illegal. Here, the issues become more complicated. The German Federal Constitutional Court has recognised that there is a presumption in favour of freedom of expression whenever it is unclear whether an expression is illegal, at least on topics of public interest. This has been settled case law ever since the famous decision in Lüth and was more recently confirmed in Wunsiedel. Notably, this protection extends to public forums, even where access to them is regulated through private law relationships. However, the NetzDG does not require censorship (i.e. pre-emptively scrutinising content before it is shared, which is unconstitutional under Article 5 I 3 Basic Law), nor does it discriminate against specific content (which would generally be unconstitutional). Rather, it primarily enforces existing legal obligations under § 10 TMG, and the requirement to delete extends only to content already illegal under the enumerated criminal law provisions.

Hence, arguments alleging unconstitutionality rely primarily on the unsubstantiated contention that the NetzDG will promote an overly aggressive deletion policy (so-called ‘overblocking’) with a ‘chilling effect’ on freedom of expression for users of social media platforms, reducing their readiness to make use of their rights. If overblocking does take place as a result of the NetzDG, this would indeed be problematic under the German Basic Law.

The overstated danger of ‘overblocking’

Despite their prevalence in legal writing on the subject, concerns that social media platforms will, when in doubt, delete content rather than risk a fine appear overstated. Overblocking is likely to arise, so the argument goes, from the structure of the fines that apply to a systematic failure to delete illegal content: a prudent social media platform operator would, when in doubt and confronted with a flurry of complaints, delete questionable content rather than risk a fine.

With respect to illegal content, the matter is unproblematic from a constitutional perspective: for the reasons stated earlier, illegal content posted by social media users does not enjoy the protection of freedom of expression.

Again, the more problematic scenario arises where the social media platform operator mistakenly deletes legal content. For the user, this represents an infringement of freedom of expression. Indeed, if overblocking becomes a prevalent phenomenon beyond the occasional erroneous decision of the complaints management infrastructure, it could dissuade users from expressing their views on the platform. This, in turn, would render the NetzDG significantly more problematic, and arguably unconstitutional. The Federal Constitutional Court has, for instance, found a violation where the publishers of a satirical magazine were ordered to pay compensation to an individual over an allegedly defamatory article, chiefly basing its ruling on the risk that such an order would discourage the future exercise of freedom of expression.

However, it is not clear that such a chilling effect is inevitable: occasional, non-systematic mistakes by social media platform operators within an otherwise lawful complaints management infrastructure would arguably not suffice to produce such an effect. Notably, and contrary to the impression given by some reports, no fines attach to decisions in individual cases. Rather, a fine requires a systemic and persistent failure of the complaints management infrastructure, which must be substantiated by reference to content that a court has ruled illegal (§ 4 V NetzDG).

It is difficult to see why a social media platform operator, which ultimately requires continuous user engagement and content creation to be profitable, would adopt an overly aggressive deletion policy. An exodus of users would be sure to follow the consistent and arbitrary deletion of legal content, and thus critically undermine the viability of the social media platform. It therefore appears more likely that the limited scope of the fines and the inherent economic interests of social networks encourage a more nuanced deletion policy: one that complies with existing laws but avoids removing more content than necessary. However, even assuming a measurable ‘chilling effect’, this would not necessarily render the NetzDG unconstitutional.

A limited free speech environment

The argument is that freedom of expression does not necessarily entail a right of access to any specific means of expression. For instance, a recipient of social security was not entitled to claim the transportation costs necessary to travel to a protest meeting. Moreover, access to and expression on most social networks is already significantly limited through private law terms and conditions. These grant platform operators wide-ranging powers to delete content or even indefinitely suspend the accounts of users for actions that are unlikely to fall afoul of German criminal law. Against this backdrop, it is difficult to sustain the argument that the potential unintended side effects of the NetzDG are unique, or that they would as such suffice to render it unconstitutional. Social media platforms are hardly a free speech paradise: users already operate in an environment where arbitrary limitations are placed on freedom of expression, where deletion and suspension of accounts may occur without the possibility of appeal or redress to courts, and where the terms and conditions of participation can be altered at the sole discretion of the platform operators at any time.

Conclusion

Overall, the NetzDG might after all form part of a civilising influence on online debate, instead of having a one-sided chilling effect on freedom of expression. The fact that, to date, social media and online interaction more generally have created a space for a significantly more laissez-faire approach to expression is neither here nor there on the question of constitutionality. The obligations to delete illegal content are based on well-established limits to freedom of expression, to which the NetzDG chiefly adds a more robust enforcement mechanism.

The constitutionality of the NetzDG may to a considerable extent rest on an evaluation of the complaints management infrastructure that social media operators develop. Should it consistently, and inevitably, lead to a chilling effect on freedom of expression, the argument for unconstitutionality grows stronger, though it is in my view by no means straightforward. Conversely, the Federal Constitutional Court would be less likely to take issue with this novel regulatory approach if the deletion of legal content is limited to individual, non-systematic mistakes. Ultimately, the goal must be to limit the divide between the online and offline worlds as far as possible: it is not evident why constitutionally acceptable limits on freedom of expression should not be extended to and enforced on social networks.


SUGGESTED CITATION  Theil, Stefan: The German NetzDG: A Risk Worth Taking?, VerfBlog, 2018/2/08, https://verfassungsblog.de/the-german-netzdg-a-risk-worth-taking/, DOI: 10.17176/20180208-184652.

2 Comments

  1. Katja Thu 8 Feb 2018 at 21:53

    “Rather than undermine freedom of expression, it promises to contribute to more inclusive debates by giving the loud and radical voices less prominence.”

    I am not following here. The view that a suppression of “loud and radical” voices could be a legitimate goal of limitations on freedom of expression was categorically rejected by the Federal Constitutional Court in its Wunsiedel judgment.

    Similarly, the ECtHR has regularly held that freedom of expression also applies to views that “offend, shock or disturb the State or any sector of the population”.

    Unless I am misunderstanding this and something else entirely is implied, the goal of shaping public discourse is not a permissible goal of limitations on freedom of expression.

    While this obviously does not make other rationales impossible, I do not think that this particular dog is going to hunt.

    Also, I would question that the fact that social media are “hardly a free speech paradise” can be made into an argument to allow the state to deputize them for purposes that the state itself could not pursue. By that token, most any restriction on freedom of speech could be legitimized by the simple expedient of putting the onus of enforcement on private actors.

    The main problem here is that any single private actor cannot effectively censor anything, due to competition with other private actors; but the state is not limited in such a fashion, as the universality of laws prevents competition in the marketplace of ideas from being an effective escape hatch from potential censorship.

  2. Stefan Theil Fri 9 Feb 2018 at 11:10

    Thanks for engaging with my piece and for your thoughtful comments, Katja. A few quick rejoinders, if you will indulge me:

    1) I do not think it is constitutionally problematic to suppress speech that is illegal under criminal law – barring a finding from the Constitutional Court that these provisions are unconstitutional. Wunsiedel rejected the suppression of loud and radical, but legal expression (the same goes for the ECtHR rulings). So it does not really touch on my argument, unless and to the extent that you argue that ‘overblocking’ will occur. In that case however, I would humbly ask you to show me the evidence of that.

    2) The point about the free speech paradise serves merely to illustrate the difficult argument faced by those who argue for the unconstitutionality of NetzDG: if there is a requirement on private actors to uphold the basic rights of users, then there is a reasonable argument that Facebook is already violating freedom of expression through its terms and conditions. This in turn makes the limited attempt of NetzDG to enforce German criminal law online seem far less problematic. Private actors are not asked to delete legal content, only to live up to their existing legal obligations to delete: whether they could or could not do so due to market pressures is neither here nor there on that question. I do not think that is deputizing them in an effort to suppress expression in any real sense, nor do I think it falls under the definition of censorship.




