10 September 2024

Brave New World

Out-Of-Court Dispute Settlement Bodies and the Struggle to Adjudicate Platforms in Europe

The exhilaration and enthusiasm which followed the passing of the Digital Services Act (DSA) are long over. An initially prevailing sense of achievement and optimism seems to have been replaced by a sceptical outlook: the DSA confers an excessive amount of power on the executive, big platforms comply only reluctantly, and the implementation of the DSA poses extraordinary challenges. No matter one’s perspective on the DSA, it seems clear that the party is over and the work begins. One of the oddest provisions of the DSA is perhaps Article 21. It calls for the creation of private quasi-courts that are supposed to adjudicate content moderation disputes. User Rights, based in Berlin, is one of the first organisations to assume this role. The self-proclaimed goal of User Rights is to provide a model of out-of-court dispute settlement (ODS) that other organisations can follow and to set standards for the newly emerging landscape. To develop these standards, it has created the Article 21 – Academic Advisory Board. Such an advisory board was neither foreseen by the law nor anticipated by regulators. It is an innovative response that aims to provide solutions to hard questions that both the law and regulators leave open. This blogpost outlines the opportunities and challenges of implementing the DSA in practice from the perspective of the “Article 21 – Academic Advisory Board”.

Out-of-court dispute settlement under the DSA and challenges of the emerging landscape

The DSA unifies a complex network of supervisory and enforcement structures to achieve its goal of a safe and trustworthy online environment. In addition to the Commission and national supervisory authorities, other stakeholders, including civil society, play an important role in the DSA’s regulatory regime. Beyond “trusted flaggers” (Article 22) and auditors (Article 37), the DSA now establishes the possibility for users to turn to an out-of-court dispute settlement (ODS) body under Article 21. The creation of independent bodies with a legal mandate to review all sorts of content moderation actions is unprecedented. Until now, platform users’ access to redress mechanisms that review content moderation has been rather limited. Optimistic visions of how ODS could work exist alongside scepticism and concern about its impact on the rule of law.

The DSA requires online platforms (Article 3(i)) to establish an internal complaint-handling system that allows users to lodge complaints against content moderation decisions, e.g. the blocking or removal of content, the suspension of monetary payments or the termination of the user’s account (Article 20). Following this, users have the right to a reasoned decision by the platform, including information about the possibility of turning to ODS bodies. The latter are organisations certified under Article 21 DSA by national Digital Services Coordinators (DSCs). The DSA envisions ODS decisions as non-binding but requires platforms to cooperate with ODS bodies in good faith (Article 21(2)). Conversely, it follows from Article 21(2) that platforms may only refuse to participate in dispute resolution proceedings for the reasons listed therein; otherwise, they may be fined. There is also hope for a pull effect: the more users turn to ODS bodies, the greater the pressure on platforms to comply with their decisions.

The objective of out-of-court dispute settlement under Article 21 is to improve platform accountability and protect user rights and democracy. Yet, it is still unclear how ODS bodies should function in practice. The first ODS bodies have to answer difficult questions to make non-judicial redress work in digital environments. It is likely that the practices they develop will set standards that shape the broader development of the ODS landscape under the DSA.

User Rights, the first ODS body to be certified in Germany and the second in Europe, has therefore created an “Article 21 – Academic Advisory Board” to provide guidance on what these standards should look like. In addition, all ODS bodies focusing on social media content will be invited to work with the Advisory Board. They can bring the most difficult and consequential issues arising from their establishment and operations to the attention of the Board. The Advisory Board selects the most important issues, discusses them in bi-monthly meetings, and then publishes publicly accessible discussion reports. It has already held its first meeting and published its first discussion report on Wednesday, 21 August.

In its first meeting, the Board engaged with the question of whether shortcomings relating to statements of reasons should affect the decisions of ODS bodies. It discussed whether ODS bodies should comprehensively review platforms’ content moderation decisions for compliance with the DSA, including errors such as inadequate reasoning, or focus only on a substantive assessment. It reached differentiated conclusions on which ODS bodies can rely for concrete guidance. This solution is explained in detail in the discussion report. The following overarching themes shaped the Board’s discussion.

What standard of review?

One of the most important issues for ODS bodies is the standard of review against which they measure user complaints. For instance, the explanations provided by platforms thus far regularly fail to meet the requirements for a clear and comprehensible statement of reasons stipulated in Article 17 DSA. The DSA itself does not specify a concrete standard of review; ODS bodies therefore have different options, ranging from a limited mandate that covers only the content and not the justification provided by the platform, to a full review of, for example, all the requirements of Article 17.

In our view, the best approach at this time is to adopt a differentiated assessment in light of the purpose of Article 17(3). The primary aim is to enhance the protection of fundamental rights, particularly the right to effective legal redress for users. When determining the relevance of fundamental rights, insights from administrative law may be borrowed, specifically the distinction between substantive and formal requirements. Content moderation decisions, as de facto “self-executing acts”, should undergo comprehensive review by ODS bodies, analogous to administrative court proceedings, concerning both the legal basis of the moderation decision and its justification (Article 17(3)(d), (e)). However, a review beyond the legal grounds provided by the platforms should not be required, as this would exceed the scope of effective redress in the specific case. Furthermore, formal requirements, such as references to the possibility of appealing to an ODS body, need not be reviewed if the user’s complaint has already been addressed.

It is important to note that ODS bodies are not substitutes for courts, but rather an additional option for out-of-court dispute resolution. In many cases, the concept of “rational apathy”, familiar from consumer protection, takes hold, with users avoiding the expense of court proceedings over what might be a relatively minor moderation decision by a platform. Consequently, ODS does not contradict the objective of strengthening legal protection before state courts, and that objective should not be overlooked.

Contribution to the gradual improvement of platforms’ practices

Another important theme emerging from the discussion was the extent to which ODS bodies could contribute to the gradual improvement of platforms’ practices regarding statements of reasons. These statements are a crucial element of the DSA’s effort to enhance user rights and promote platform accountability. The regime under Article 21 requires that ODS bodies engage with platforms’ statements of reasons under Article 17. Despite the challenges this entails, it also presents an opportunity for ODS bodies to positively shape the quality of platforms’ practices in this regard.

However, to achieve this, a coherent and constructive approach by ODS bodies is necessary. As noted, it is likely that a significant percentage of platforms’ statements of reasons do not fully comply with the requirements of Article 17. In such cases, one possibility would be for ODS bodies to adopt a default position of overturning platforms’ moderation decisions on formal grounds. However, doing so would largely prevent ODS bodies from fulfilling their core function of reviewing the substance of the content behind those moderation decisions. Moreover, a strictly formal approach would overlook the current context, namely the relative novelty of the DSA’s obligations and of the ODS bodies themselves. In this regard, it is reasonable to allow time and provide guidance for platforms to adjust and improve their compliance practices, including their statements of reasons. This is particularly important given that the quality of these statements already appears to have improved since the DSA came into force. It is our view that ODS bodies should foster and contribute to this ongoing systemic improvement.

The unique role of ODS bodies in the broader development of platform accountability in the EU

More broadly, ODS bodies are one instrument within a wider system created by the DSA and other EU laws to enhance platform accountability. If done right, such a system will help ensure that the decision-making of online platforms is exposed to a higher level of scrutiny, while offering users a practical means of seeking redress. Even if ODS bodies do not replace administrative and judicial remedies, they could still play a central role in bringing users closer to redress and in holding platforms more accountable for moderating content according to the standards mandated by the DSA. Indeed, platforms’ decision-making will increasingly be subject to further review, opening the process of content moderation, at the very least, to different views and standards.

Nonetheless, it is critical to consider that the role the EU envisages for ODS bodies also brings responsibilities. If done well, these actors can play another critical part in counterbalancing platforms’ power, as part of a new DSA policy landscape composed of different stakeholders, including trusted flaggers and auditors. While their role supports the DSA’s broader objectives of creating a safer and more accountable online environment, ODS bodies also raise significant constitutional challenges given their position. Their review will include assessing how platforms have dealt with fundamental rights in reaching a particular decision, and they will be primarily responsible for providing reasoned conclusions based on that review.

This substantive assessment potentially gives users access to an effective remedy that requires less effort and lower costs, which are instead borne by the platform. We cannot exclude that this process could also lead to workload issues, de facto limiting the efficiency and effectiveness of ODS. Nonetheless, such concerns should not serve as a justification for curtailing the possibility of restricting platforms’ discretion in content moderation and of providing users with access to effective remedies.

Outlook: Cooperation of ODS bodies with other important actors, such as fact checkers and the news media

In their work, ODS bodies will inevitably encounter content moderation disputes related to misinformation and disinformation. While the large-scale spread of disinformation is recognised as a systemic societal risk under the DSA, errors in content moderation or poorly reasoned actions by platforms can also result in a systemic risk to the exercise of fundamental rights, including freedom of expression, media freedom, and pluralism (Articles 34 and 35).

Furthermore, another recent EU regulation, the European Media Freedom Act (EMFA), establishes in its Article 18 that media content is distinct from other types of content on very large online platforms and should thus be given special treatment in content moderation. This provision of the EMFA, however, applies only when platforms act based on their terms and conditions, not when they address systemic risks as defined by the DSA.

The actions of major platforms against disinformation have been guided by their commitments under the Code of Practice on Disinformation, a form of self-regulation and the central instrument of the EU’s policy against disinformation. The Code is now transitioning to a co-regulatory model of compliance with the DSA’s systemic risk management.

Due to the complexity of this area, ODS bodies need to identify their roles within the broader framework of the DSA and in relation to other relevant EU laws, and determine how they should engage with existing mechanisms and networks. ODS bodies are likely ill-suited to assess whether information constitutes harmful misinformation. Therefore, it would be advisable for them to cooperate with fact-checking organisations and networks, such as the one established within the European Digital Media Observatory (EDMO). EDMO also closely monitors developments related to the Code of Practice on Disinformation through its Policy Analysis pillar. As regards the special consideration for media content and the new requirement for its distinct treatment in content moderation, ODS bodies should work with representative organisations of media and journalists.


SUGGESTED CITATION  Ruschemeier, Hannah; Quintais, João Pedro; Nenadić, Iva; De Gregorio, Giovanni; Eder, Niklas: Brave New World: Out-Of-Court Dispute Settlement Bodies and the Struggle to Adjudicate Platforms in Europe, VerfBlog, 2024/9/10, https://verfassungsblog.de/ods-dsa-user-rights-content-moderatin-out-of-court-dispute-settlement/, DOI: 10.59704/46b8611eb2d96a84.
