05 November 2024

Art. 21 DSA Has Come to Life

Art. 21 DSA is a new, unusual and interesting framework for settling disputes over online content moderation decisions. By now, the first four online dispute settlement bodies (ODS bodies) have been certified, and most of them have already started taking cases. In this article, based on recent interviews with representatives from all certified bodies, I explore how these very first ODS bodies are set up and what their initial experiences have been.
03 September 2024

Auditing Platforms under the Digital Services Act

Taming the power of online platforms has become one of the central areas of the European Union's policy in the digital age. The DSA increases the accountability of very large online platforms and very large search engines by introducing an auditing system. The audit process as defined by the DSA risks producing counterproductive consequences for the European policy objectives. From a constitutional perspective, the outsourcing of competence and decision-making from public to private actors articulates a system of compliance and enforcement based on multiple centres of power.
26 July 2024

The „Grundgesetz der Sozialen Marktwirtschaft" Meets Fundamental Rights

Where economic power is abused in competition, competition law steps in. That the abuse of economic power can lead not only to restrictions of competition but also to violations of fundamental rights has so far played little role in German competition law. Nevertheless, in a recent decision, the Landgericht Düsseldorf incorporated fundamental rights into its competition-law assessment; unsurprisingly, the case concerned content moderation. This is remarkable and shows that, within its scope of application, competition law can indeed help enforce fundamental rights.
08 April 2024

To Define Is Just to Define

Social media allows users to share content worldwide. This also enables users to distribute illegal content. The laws of the EU Member States vary greatly as to what content they consider illegal, especially regarding hate speech. It therefore matters which national law applies in cross-border cases concerning online content. Ultimately, this question is closely linked to the broader reshuffling of power in the digital sphere: will it be actual ‘law’ that platforms enforce online, or norms made by platforms themselves? So far, the law of the 27 Member States plus the EU itself remains utterly chaotic compared to the more uniform Terms of Service (ToS) of the internet giants.
14 February 2024

Absolute Truths and Absolutist Control

Last week, the Bombay High Court delivered its judgment in Kunal Kamra v. Union of India, a split verdict on the constitutional validity of the Information Technology Rules, 2023. The rules install an institutional regime for determining content relating to the Central Government to be “fake, false or misleading” and warranting its takedown by social media intermediaries. This regime was challenged on three main grounds: first, that it violates citizens’ free expression, since “fake, false, or misleading” speech is constitutionally protected; second, that placing state-related information on a pedestal, so that it enters public discourse in a single, truthful formulation, is an illegitimate and disproportionate measure; and third, that it violates natural justice by enabling the state to determine truth and falsity concerning itself.
08 January 2024

Putting X’s Community Notes to the Test

All of the biggest social media platforms have a problem with disinformation. In particular, a flood of false information was found on X, formerly Twitter, following the terrorist attack by Hamas on 7 October 2023 and the start of the war in Ukraine. The EU Commission therefore recently initiated formal proceedings against X under Art. 66 para. 1 of the Digital Services Act (DSA). One of the subjects of the investigation is whether the platform is taking sufficient action against disinformation. Despite these stakes, X takes an approach different from all other platforms: as can be inferred from the X Transparency Report dated 3 November 2023, posted information is not subject to content moderation but is regulated solely through a new tool: Community Notes.
08 January 2024

Community Notes Put to the Test

The biggest social media platforms have a problem with disinformation. On X, formerly Twitter, in particular, a flood of false information could be observed after the terrorist attack by Hamas on 7 October 2023 and the start of the war in Ukraine. The EU Commission therefore recently announced that it has opened formal proceedings against X under Art. 66 para. 1 of the Digital Services Act (DSA). One of the subjects of the investigation is whether the platform is taking sufficient action against this problem. X is putting all its eggs in one basket: as can be inferred from the X Transparency Report dated 3 November 2023, disinformation is not subject to so-called content moderation but is to be countered solely through a new tool. This means that user content on X is checked for false information by the operating company neither by algorithms nor by persons tasked with doing so.
18 October 2023

A Step Forward in Fighting Online Antisemitism

Online antisemitism is on the rise. Especially since the recent terror attack by Hamas in Southern Israel, platforms like X are (mis)used to propel antisemitism. Against this backdrop, this blog post analyses the legal framework for combatting online antisemitism in the EU and the regulatory approaches taken so far. It addresses the new Digital Services Act (DSA), highlighting some of the provisions that might become particularly important in the fight against antisemitism. The DSA improves protection against online hate speech in general and antisemitism in particular by introducing procedural and transparency obligations. However, it does not provide any substantive standards against which the illegality of such manifestations can be assessed. To effectively reduce online antisemitism in Europe, we need to think further, as this blog post outlines.
23 September 2023

Be Careful What You Wish For

The European Court of Human Rights has issued some troubling statements on how it imagines content moderation. In May, the Court stated in Sanchez that “there can be little doubt that a minimum degree of subsequent moderation or automatic filtering would be desirable in order to identify clearly unlawful comments as quickly as possible”. Recently, it reiterated this position. This shows not only a surprising lack of knowledge of the controversial debates surrounding the use of filter systems (in fact, there’s quite a lot of doubt), but also an uncritical and alarming approach towards AI-based decision-making in complex human issues.
31 July 2023

Why Misinformation, Disinformation, and Hate Speech Should Not Be Treated Alike

How to deal with misinformation, disinformation, and hate speech on the internet is a highly topical issue. A UN policy guideline presented in June 2023 aims to combat precisely these phenomena. However, it does not seem expedient to treat misinformation, disinformation, and hate speech similarly or identically, as the UN draft currently envisages. This blog post therefore argues that at least misinformation, i.e. unintentionally inaccurate statements, must be treated differently from deliberately false or injurious statements on the internet.