The Digital Services Act (DSA) is a bold attempt by the EU to create a safer digital space. It provides a whole set of notice and action mechanisms to address online harms. The codified mechanisms, together with detailed procedures, are foreseen for content that is illegal but also for content incompatible with platforms’ terms and conditions. But the DSA also has another goal: to ensure that the new rules respect fundamental human rights. Mindful of the effects that the notice and action mechanisms could have on the right to freedom of expression and access to information, the DSA includes remedies for situations in which action leads to over-reaction. Such over-reactions may affect anyone whose content is considered shocking, controversial or otherwise “undesirable”, but not illegal. Often, content restrictions affect members of marginalized communities, leaving them with no meaningful recourse (see here, here and here). Does the DSA include sufficient mechanisms to prevent that and to ensure access to justice?
The underlying rationale of the DSA is that everyone whose rights have been violated should have access to justice to remedy the situation. The idea that any violation of rights requires correction is reflected in Art. 17 of the DSA, which provides that any restriction on content by a hosting service provider should be followed by a statement of reasons to the affected recipients of the service. The statement of reasons should also include information about available redress mechanisms, such as internal complaint-handling mechanisms (Art. 20), out-of-court dispute settlement (Art. 21) and judicial redress. The DSA, therefore, offers three different redress routes that can be used in sequence or separately. Judicial redress does not have its own provision in the DSA, as it remains subject to national legislation and procedures. The DSA recalls, however, on several occasions, that this redress route must be available.
The following paragraphs explain the core elements of access to justice: the right to a fair trial and to an effective remedy, as interpreted by the ECtHR and the CJEU. The next part summarizes the most relevant elements of the DSA provisions on internal complaint-handling systems and out-of-court dispute settlement and provides a critical assessment of those provisions. The analysis indicates that, while the new remedies are definitely a good step towards more effective protection of users’ rights, their true effect will depend on their practical implementation. It is further argued that some elements of the new regime may be a bold experiment whose result is not fully predictable.
Access to justice and the DSA
Access to justice is not a right on its own. It is a notion that encompasses a number of core human rights, such as the right to a fair trial (Art. 6 ECHR and Art. 47 CFREU), and the right to an effective remedy (Art. 13 ECHR and Art. 47 CFREU). These are primarily procedural rights requiring states to organise domestic procedures to ensure better protection of rights. In a way, they serve as tools that help to give maximum effect to substantive rights, for example the right to freedom of expression, the right to privacy and reputation, freedom of assembly or freedom of thought.
The right to a fair trial is composed of multiple elements, including procedural fairness. Procedural fairness refers to the way in which a case is handled rather than its outcome (Steel and Morris v. the United Kingdom, para. 95). Ensuring that the process of resolving conflicts is handled in a fair manner improves the perception of its legitimacy. The DSA attempts to achieve that by increasing the transparency and foreseeability of the process in its detailed provisions on notice and action (e.g. Arts. 16-23).
The right to an effective remedy embodies the principle ubi ius ibi remedium, meaning that where there is a right conferred on individuals, there must be an accompanying remedy to ensure its enforcement. The purpose of the right, therefore, is to allow a victim of a violation to obtain appropriate relief (Vilho Eskelinen and Others v. Finland, para. 80). Appropriate relief involves a measure that can stop the violation, or that allows the victim to obtain adequate redress, including compensation (Kaya v. Turkey, para. 106). The available remedy should be effective in practice and in law (Kudla v. Poland, para. 157). The “effectiveness” of a “remedy” does not depend on the certainty of a favourable outcome for the applicant (M.S.S. v. Belgium and Greece, para. 289; also Gebremedhin [Gaberamadhien] v. France, para. 53). Effectiveness also does not mean that a single remedy should entirely satisfy the requirement. Rather, a system of effective remedies may result from the combination of different mechanisms that are available to those affected.
The DSA refers to the right to an effective remedy and to a fair trial in multiple instances, in particular in Art. 17, as well as in Recitals 39, 52, 55 and 59. The statement of reasons, described in Art. 17, should inform about the available redress mechanisms in a way that is clear, comprehensible, precise and specific, to allow recipients to exercise them effectively. Recital 39 highlights that Member States, when applying the DSA, should respect the fundamental right to an effective remedy and to a fair trial, as provided in Art. 47 of the CFREU. The DSA, furthermore, should not prevent national judicial or administrative authorities from issuing orders reinstating content that was in compliance with terms and conditions but has been erroneously considered illegal and removed. Further, Recital 59 adds that the possibilities to contest decisions of online platforms should not affect the possibility to seek judicial redress.
The DSA requires the EU Member States to ensure that redress mechanisms are in place. But it also addresses the platforms falling within the scope of the DSA, by enumerating what is expected from them, i.e., creating the prescribed internal systems and participating in out-of-court dispute settlements. It should be highlighted that even though the DSA certainly took cues from the two human rights instruments (ECHR and CFREU) and the accompanying case law, their application is not strict. Full compliance with the provisions of the human rights instruments that are originally addressed to States – giving instructions on how to organize their judicial systems (Inuit Tapiriit Kanatami and Others v Parliament and Council, para. 100; see also Unión de Pequeños Agricultores v Council, para 41; Commission v Jégo-Quéré, para. 31) – cannot be achieved in the private enforcement context.
Internal complaint-handling systems
According to Art. 20, providers of online platforms should create an internal complaint-handling system and make it available for at least six months from the time a measure against a piece of content was taken. The system should allow users to contest platforms’ decisions that led to content removal, restrictions on visibility, suspension or termination of a service or an account, as well as decisions restricting the ability to monetize content. Complaints can refer to an action taken as a result of a submitted notice or as a result of a platform’s own initiative. Further, the complaint system should allow the contestation of decisions based both on the illegality of content and on its incompatibility with the provider’s terms and conditions. Art. 20 highlights that the complaint system should be available not only to the affected users but also to third parties who are not users but may want to submit a notice (e.g. regarding the post of a user). It should, furthermore, apply equally to decisions honouring or rejecting a submitted notice (e.g., removing or blocking content, or leaving it online). The crucial element of Art. 20 is that platforms should reverse their previous content moderation decisions (i.e., either honouring the notice or disregarding it), if the complaint contains sufficient grounds to justify such a reversal.
Art. 20 DSA brings some balance to the notice and action mechanism by specifically mandating platforms to reinstate content that, upon review, turns out to be neither illegal nor incompatible with terms and conditions. But Art. 20 also allows users to appeal decisions in which the content was left intact, disregarding a notice.
In the context of online expression, the right to an effective remedy comes into play on two separate occasions. First, when a victim of infringing expression attempts to stop it, for example by requesting removal. Second, in case of successful removal, when the author tries to contest the removal and asks for the expression to be reinstated. It can be used, therefore, by both sides of a conflict to remedy possible infringements of their rights. Including both scenarios is an improvement, since the initial DSA proposal did not foresee an appeal mechanism for disregarded notices.
The complaint-handling system should be easy to access and user-friendly, and it should enable and facilitate the submission of complaints that are sufficiently precise and adequately substantiated (Art. 20.3). Platforms should handle complaints in a timely, non-discriminatory, diligent and non-arbitrary manner (Art. 20.4). The same requirements appear earlier in Art. 16 on the handling of notices. There is no further indication of what non-discriminatory and non-arbitrary mean exactly in this context, although Recital 58 adds that the system should lead to fair outcomes. It will be interesting to see whether the two requirements will bring an end to platforms’ special rules providing more leeway for high-profile accounts. Such whitelisting practices have been revealed to give more protection to speech by those with large numbers of followers (politicians, journalists, celebrities and athletes).
Complaints should not be resolved solely on the basis of automated means and decisions should be taken under the supervision of appropriately qualified staff (Art. 20.6). After resolving the complaint, platforms should inform the parties about their reasoned decision, without undue delay. They should also include information about the possibility of out-of-court dispute settlement provided for in Art. 21 and other available possibilities for redress (Art. 20.5).
Operating a sophisticated internal complaint system per Art. 20 will inevitably be costly. It might be challenging for smaller platforms, especially if users start to appeal en masse. While providing for an appeal mechanism is arguably positive from the perspective of the right to an effective remedy, handling a large number of appeals will not be easy. It will also most certainly lead to more content being reinstated, whether because of successful appeals, decisions of the settlement bodies under Art. 21 or court orders. Does that mean that the DSA is effectively pushing for a right to forum, forcing platforms to host all content that is not illegal? This conclusion would be too far-reaching. It is also not the intention of the DSA. Platforms can still decide in their terms and conditions what content they do not welcome, subject to the qualifications under Art. 14 (due regard given to freedom of expression, freedom and pluralism of the media and other fundamental rights and freedoms). They will most likely become more restrictive and more precise in listing all unwanted content, also as a result of the transparency requirement of Art. 14. That would also help them stay in line with recent decisions, such as those in Germany, where courts started ordering the reinstatement of content that violated vaguely formulated terms and conditions but was not illegal. More clarity and transparency regarding the internal rules of platforms is certainly a good thing.
Out-of-court dispute settlement
Art. 21 describes the functioning of out-of-court dispute settlement mechanisms to contest platforms’ content moderation decisions. Platform users, but also those who have previously submitted a complaint (as in Art. 20), should be able to select any certified out-of-court dispute settlement body. This route of redress can be used as a follow-up, a form of second instance for complaints that have not reached a satisfactory outcome through the internal complaint-handling system. It could also be a self-standing mechanism for complaints that have not been subject to review through the internal system. Art. 21 clarifies that the possibility to refer the complaint to a settlement body does not impact the users’ right to initiate proceedings before a court, at any stage (see also Recital 59). Both parties to the dispute should engage in good faith with the selected body in an attempt to find a solution. Service providers, however, may refuse to engage if a dispute has already been resolved on the basis of the same information and the same grounds of alleged illegality or incompatibility of content (Recital 59, Art. 21.2).
It is crucial to note that the dispute settlement bodies do not have the power to impose a binding settlement on the parties (Art. 21.2). This is another significant twist, as the initial EC proposal mandated the decisions to be binding (in Art. 18.1). The change of position reflects the arguments that binding decisions would create extra-judicial bodies, would lead to must-carry orders and would prevent platforms from obtaining redress. The approach is also in line with Directive 2013/11 on alternative dispute resolution. It leads to the question, however, of what will happen if a platform engages but then simply ignores any unfavourable decisions. The user could of course still seek recourse in court, but will they? Would such an approach be considered a failure to comply with the obligations of the DSA, and trigger the consequences foreseen in Arts. 51 and 52? Users of the platforms (and organisations representing them) could also file a complaint with the Coordinators on the infringement of the DSA (Art. 53). Strictly speaking, there is no obligation to comply with decisions that are not binding. Such continuous and systemic disregard, however, would definitely attract the interest of the Digital Services Coordinators as well as the Commission (especially in the case of VLOPs).
Art. 21 provides further details in nine paragraphs. The provisions specify that the rules of procedure should be clear, fair and easily accessible (Art. 21.3). They specify time limits for reaching a decision (90 days, with a possible extension of another 90 days for complex disputes) (Art. 21.4). They also describe the payment system, which differs depending on whose claim succeeds (Art. 21.5). For the users of the service, the dispute settlement should be available free of charge or at a nominal fee, putting the main financial burden on the platforms. But if the costs are almost exclusively borne by platforms, and it is the users who initiate and select a settlement body, will these bodies be inclined to rule more in favour of the users? Will it lead to high numbers of put-back decisions, especially for content that is “awful but lawful”? Again, platforms could become overly restrictive, banning awful but also merely “undesirable” content, to prevent such an effect.
Finally, certified bodies should report annually to the Digital Services Coordinator, specifying the number of disputes they received, information about the outcomes, the average time taken to resolve them, and any shortcomings or difficulties encountered (Art. 21.4). The Digital Services Coordinators should, in turn, compile reports on the functioning of the out-of-court dispute settlement bodies, identifying best practices and recommendations on how to improve their functioning. Both reports will surely be helpful in tweaking the process along the way.
All things considered, it is uncertain what the impact of the out-of-court dispute settlement mechanism of Art. 21 will be. A specifically regulated system in this form is rather novel in the area of content moderation. Due to the difference in scale, it cannot really be compared to the functioning of other mechanisms, e.g., the Facebook Oversight Board. At the moment, Art. 21 seems more of an experiment that leaves many open questions. The main one is whether the first stage of this experiment should not have been limited to VLOPs only, as they have the appropriate resources to handle the process in a non-disruptive manner (see, e.g., Daphne Keller’s intervention in the EP during the legislative process). After the initial findings on the functioning of the dispute settlement systems, and potential corrections of the issues identified in the first reports, the scope could be broadened to other platforms.
The DSA takes a multi-step approach, offering not one remedy but three different options that can be used in sequence or separately. The triple-layered system in the DSA is certainly a positive development in comparison to the E-Commerce Directive, which did not contain any remedies to address unwarranted content restrictions. It is laudable that the involvement of courts is emphasized (e.g., in Recital 59) in a way that is rather unique for EU instruments. After all, independent courts of law remain the most apt institutions to adjudicate on conflicts between different (fundamental) rights.
Arts. 20 and 21 of the DSA, and the multiple reminders throughout the Regulation of the need to inform users about the available redress mechanisms, certainly contribute to strengthening the right to an effective remedy and to a fair trial for those affected by platforms’ actions. They provide an additional safeguard to strengthen the respect for fundamental rights in the DSA, and for that, they should be considered a big step forward. The question remains, of course, how the provisions will play out in practice. They could (and hopefully will) lead to a more effective exercise of the right to freedom of expression online. Or maybe a more pragmatic approach will win and, after the initial storm of appeals, platforms will become more restrictive in their terms and conditions, to protect their prerogative to choose the speech they want to host.