18 December 2020

Institutionalizing Parallel Governance

The Digital Services Act, Platform Laws, Prosecutors, and Courts

On 15 December, the European Commission published its proposal for the Digital Services Act (DSA-P). One major challenge, if not the major challenge, for the regulation of social platforms is determining which content is disseminated on such platforms and how it is moderated.

At least when it comes to so-called very large online platforms (VLOPs, Art. 25) – read Facebook, maybe TikTok, YouTube, Twitter – the DSA-P’s path seems quite clear: Put platforms and the Commission in charge. Platform ‘Laws’, ‘Prosecutors’, and ‘Courts’ are set up to lead the way out of the mess together with the Commission. State courts, prosecutors, law enforcement and state law as such take the back seat. In the construction of a new social order for online platforms they are apparently no longer needed.

Tackling the problems

In recent years, a myriad of problems has arisen with respect to platforms’ content moderation: disinformation and fake news, hate speech, political advertising, the limits of free speech, social bots, fake accounts, radicalization, filter bubbles, over-blocking, automated content review, and discriminatory practices – to name just a few.

Under the current legal framework, the e-Commerce Directive, these questions are not addressed at all. Platforms are granted a limited, conditional exemption from liability for illegal content – at most, they have to implement a notice-and-takedown procedure. Dealing with otherwise harmful content is left entirely to the platforms. With the DSA-P, the Commission has set out on a sweeping overhaul of the e-Commerce framework in an attempt to tackle all current and potentially emerging problems of online platforms.

Platform Laws

The DSA-P strengthens the role of platforms’ “Terms and Conditions” (T&Cs, Art. 12) and thus their ‘law-making’ powers. Though the DSA-P signals that it is first and foremost concerned with illegal content (particularly for liability issues, Art. 5, and in line with the e-Commerce Directive), T&Cs are, when it comes to content moderation, treated all in all on a par with national laws: It is the T&Cs that establish what counts as “otherwise harmful content and activities online” (recital 52).

Acknowledging that T&Cs are in fact of predominant importance for the regulation of content on platforms, the DSA-P now directly addresses their content (as, until now, only the P2B Regulation did). The DSA-P gives a clear nudge as to what they may entail: “[content] policies, procedures, measures and tools used for the purpose of content moderation, including algorithmic decision-making and human review” (Art. 12). In addition, platforms are obliged to establish a sanctions system: a “policy in respect of the misuse” of the service through illegal content as well as of the content moderation systems (Art. 20) – their own ‘criminal law’, so to speak.

When interpreting and applying these T&Cs, human rights law takes the back seat, particularly national human rights law: Platforms shall give “due regard” only to the “fundamental rights of the recipients of the service as enshrined in the Charter” (Art. 12) – full horizontal effect, towards which the case law of the German Federal Constitutional Court is heading, seems off the table. At the same time, the DSA-P offers fewer protections than the P2B Regulation, which prohibits retroactive changes to T&Cs (Art. 8(a) P2B).

Platform Prosecutors

Under the existing e-Commerce framework, as well as under current platform T&Cs, every user is a potential prosecutor, as they can flag content via a platform’s notice system.

The DSA-P seeks to professionalize such prosecutorial efforts. Newly established “Trusted flaggers” will receive a privileged reporting role via technical interfaces (APIs) and a guarantee that their notices “are processed and decided upon with priority and without delay” (Art. 19). Though technically only required for notices of illegal content (Art. 14), platforms will surely extend this to T&C violations, their preferred grounds for content moderation. These trusted flaggers are appointed not by national law enforcement authorities, but by the newly established Digital Services Coordinators (Art. 19(2)).

Although “relevant national judicial or administrative authorities” are not completely out of the game – they may issue an “order to act against a specific [!] item of illegal content” (Art. 8) – the bulk of reporting will come from trusted flaggers.

Platform Courts

Lastly, the role of courts is handed to the platforms (and their algorithms) as well as to newly established private Platform Courts.

The first and second instances of adjudication are handled by the platforms themselves. For illegal content, the DSA-P now expressly requires a notice and action mechanism (Art. 14). However, we know, and the DSA-P rightfully presumes (e.g., in Art. 15, 17), that platforms already have and will keep a similar system for T&C violations. For any decision taken, platforms have to provide the user with a statement of reasons (Art. 15). As a second instance for all decisions regarding content and sanctions (suspension, termination) – whether based on illegal content or on content “incompatible” with T&Cs – the DSA-P demands an “internal complaint-handling system” (Art. 17). It seems that users must in fact make use of these first two instances before being able to go to national courts: Art. 14 and 17, unlike Art. 18, contain no clause stating that the mechanism is without prejudice to the right to go to court.

On top, the DSA-P proposes a third instance acting alongside national courts: “Out-of-court dispute settlement” bodies (Art. 18), to which all decisions under Art. 17 can be appealed. These Platform Courts must be “impartial and independent” and provide “clear and fair rules of procedure”. Platforms in fact “shall be bound” by their decisions – only users retain the right to seek redress against a decision before national courts. Fees are on the platform, unless the user loses, in which case users have to cover their own fees. The dispute settlement bodies may serve more than one platform. Again, these bodies are “certified” by the Digital Services Coordinators. Facebook’s Oversight Board seems ideally timed. Presumably, most users will choose Platform Courts, not state courts, as the third instance: low costs and easy digital procedures work in their favor. The task of establishing EU-wide, if not global, free speech standards for platforms is thus officially vested in such private bodies.

The EU Commission: The Platform Monarch?

Well, is this it in the new platform world: three neatly set up branches of government? Not quite – the Commission can rein them all in at any time.

The Commission reserves broad enforcement powers for itself. At first sight, the DSA-P seems to suggest that national institutions will have their say, as national Digital Services Coordinators are called upon to apply and enforce the DSA-P (Art. 38, 41). However, the DSA-P makes clear that these (once again) completely independent authorities (Art. 39) effectively serve at the pleasure of the Commission: If the Digital Services Coordinators’ actions are not to its satisfaction (“did not take any investigatory or enforcement measures, pursuant to the request of the Commission”), the Commission may step in. It can then take all the measures the Digital Services Coordinators can take (Art. 52-59), including “request[ing]” the Digital Services Coordinators to go to national court (Art. 65).

The potential enforcement measures the Commission can take seem very broad. What opens the door to the Commission’s influence on content moderation is a probably well-intentioned tool: risk assessments by platforms (Art. 26). Platforms are obliged to take measures mitigating identified systemic risks in light of “any negative effects” on a broad range of issues such as freedom of expression and non-discrimination, as well as any intentional manipulation of the service with an actual or foreseeable negative effect on civic discourse, electoral processes, or public security (Art. 27) – that is, all currently problematic content moderation issues. However, in the end it is the Commission that will decide whether such measures or “commitments” (Art. 56) are appropriate. This will give it a powerful say in the platforms’ content moderation setup – as an out-of-the-box platform monarch.

To be fair…

The DSA-P offers more Platform Procedure than the e-Commerce Directive ever did, thus addressing a broad range of criticism. In particular, the IGF Dynamic Coalition on Platform Responsibility’s (DCPR) “Best Practices on Due Process Safeguards” – calling for users’ rights to prior notification and contestation, counter-notice, human review, appeal, and a (voluntary) independent and impartial alternative dispute resolution mechanism – seem to have been heard.

But this comes at a cost. The DSA-P mainly codifies content moderation as platforms like Facebook currently handle it. Rather than devising more inclusive approaches, it knights this model as the future of all online content moderation. Thereby, the DSA-P pushes established national institutions and authorities, as well as national legal principles and human rights law, to the sidelines. Their influence on how content moderation is carried out, and on establishing a certain practice, diminishes further. The question remains whether European and national courts will play along or demand to have their say.


SUGGESTED CITATION  Weinzierl, Quirin: Institutionalizing Parallel Governance: The Digital Services Act, Platform Laws, Prosecutors, and Courts, VerfBlog, 2020/12/18, https://verfassungsblog.de/institutionalizing-parallel-governance/, DOI: 10.17176/20201219-052754-0.
