31 October 2022

The DSA has been published – now the difficult bit begins

The Digital Services Act (DSA) was finally published in the Official Journal of the European Union on 27 October 2022. This publication marks the end of a years-long drafting and negotiation process, and opens a new chapter: that of its enforcement, practicable access to justice, and potential to set global precedents.

As many authors will analyze in more detail throughout this symposium, the DSA is intended as the EU’s new landmark piece of legislation for addressing illegal and harmful content and activity online. It has been portrayed as Europe’s new “Digital Constitution”, which affirms the primacy of democratic rulemaking over the private transnational ordering mechanisms of Big Tech. While it extends the e-Commerce Directive’s core principles for the regulation of online services that handle third-party content, and codifies existing self-regulatory practices initiated by online platforms, it also introduces several significant legal innovations: a tiered system of due diligence obligations for (very large) intermediary services, the regulation of content moderation through terms of service enforcement, systemic risk assessment obligations for the most widely used platforms, and access to data for researchers. In sum, with the DSA, the EU aims once again to set a global standard in the regulation of the digital environment. But will the DSA be able to live up to these expectations, and under what conditions?

These and related questions are what this online symposium will address. We have invited a group of leading experts to comment on the DSA, focusing on the overall outcome of the DSA and on three main themes moving forward: (i) the implementation and enforcement of the DSA; (ii) access to justice in relation to content moderation processes; and (iii) the DSA’s international impact and the standards it may further globally.

Implementation and Enforcement of the DSA

A crucial aspect for the success of the DSA is the proper implementation of its enforcement framework and the application of its due diligence requirements in practice. The enforcement framework combines new regulatory authorities at the national level (Digital Services Coordinators) with oversight at the EU level (within the European Commission). As underlined by many observers (particularly civil society and academics), these elements will be decisive for whether the DSA delivers on its goals, and whether its rules will be capable of meaningfully protecting fundamental rights. In the long run, the question of legitimacy will be particularly important. Will the independent authorities tasked with overseeing content moderation processes, an area perhaps even more contested than online privacy, be broadly accepted by market players and the general public?

As shown by the GDPR, ambitious substantive rules are nothing but a “paper tiger” without effective enforcement. Serious failures in the GDPR’s enforcement have clearly influenced the DSA’s enforcement chapter and surrounding negotiations. In effect, the DSA opts for more centralized enforcement against the most powerful platforms by the European Commission, and includes strict deadlines for Digital Services Coordinators and the Commission to act.

At the national level, the EU member states must decide how to position and equip their national regulators. Dealing with this hot potato of regulatory competence will not be easy, as the DSA cuts across media law, telecommunications regulation, consumer protection, data protection, intellectual property and criminal law. Some countries may decide to create new regulatory agencies in the process, while others may allocate the relevant oversight tasks to (a possible combination of) existing agencies. Pragmatism, path dependency and national particularities may give rise to a plethora of institutional approaches to platform governance.

Given the profound fundamental rights implications of the DSA, the choice to develop the European Commission into the most important regulatory authority for online content governance deserves continued debate and scrutiny. The Commission is also set to gain further enforcement powers under other pieces of legislation recently adopted or still under discussion. This raises constitutional issues relating to independence and the separation of powers. In the area of platform regulation, the Commission is not an independent regulatory authority but the executive branch of the EU, which put forward the DSA proposal and played an active role in its finalization.

Other stakeholders (users, researchers, civil society organizations) are also given a significant role to play in the DSA enforcement architecture. Vetted researchers, for instance, will be able to gain access to platform data to investigate relevant harms and dynamics in platform governance. One of the most relevant questions concerns whether the DSA provides these actors with adequate tools to contribute meaningfully and effectively to the enforcement of its rules, particularly from a fundamental rights perspective.

Finally, the DSA also regulates content moderation practices based on terms of service, requiring that services moderate transparently, proportionately, and with due regard to the fundamental rights and interests of users and other stakeholders. The precise interpretation of this new provision, which builds on the horizontal effect of fundamental rights between users and online services, will involve complex balancing exercises and an interplay between national constitutional safeguards and EU law.

Access to Justice and Content Moderation

One of the main policy goals of the DSA is to create a safer online environment. It is one thing for the DSA to provide new mechanisms to address online harms; it is quite another for those mechanisms to deliver on their promise in practice. Whether this goal is met will depend on whether the DSA succeeds in offering adequate access to justice to people confronted with online harms. In this regard, the codification of notice-and-takedown and complaint mechanisms can be seen as a step forward. However, it is an open question to what extent these offer sufficient remedies, given the breadth of online harms the DSA addresses. For example, whether the DSA provides individual or collective opportunities to contest terms and conditions remains to be seen. The matter is further complicated by the fact that substantial barriers to justice often prevent meaningful access to complaint and redress mechanisms or remedies against these harms. Ultimately, effective remedies against online harm and abuse will remain dependent on the platforms’ implementation of the DSA requirements, and on national particularities of procedural law more generally.

Even though it is clear that the impact of online harms is spread unevenly, it is still insufficiently understood what online harms are faced by different (including marginalised) groups, how these harms differ and intersect, and where access to justice and opportunities for contestation of platform practices are needed. In particular, various types of unlawful content (such as harassment or racism), as well as over-removals or bans, disproportionately harm marginalised communities. For the DSA to succeed in contributing to a healthy digital environment for all, it will be essential to understand these different needs, and to involve civil society organisations representing these interests in the implementation and enforcement debate.

International Implications of the DSA

Finally, EU regulation has an undeniable impact beyond European borders. The so-called “Brussels effect” – the ability of the EU to shape global standards by exercising its regulatory power – has been a distinctive feature of earlier EU law, in particular the GDPR. Since its announcement, discussions about the DSA proposal have been accompanied by the awareness that the DSA may have a profound regulatory resonance on a global level. US-based platform regulation experts and policymakers have thus followed the DSA debate closely, perhaps not least because the largest platforms, which face the heaviest regulation under the DSA, are mostly US firms.

The same issues and societal risks which the DSA seeks to address are affecting – perhaps even more significantly, and with additional complexities – countries in the so-called Global South. The possible adoption of DSA standards outside the EU raises the question of whether these rules, if implemented, could help advance platform regulation efforts elsewhere and promote fundamental rights and other democratic values. At the same time, the DSA’s approach could pose risks in less democratic countries, particularly in light of the civil society critiques of some aspects of the DSA, including the centralization of certain enforcement powers.

The line between the safeguarding of fundamental freedoms and democratic values online versus regulatory competition with other regions is thin. A question which thus accompanies deliberations on the DSA’s extraterritorial effects is, fundamentally, why the EU is attempting to set international standards, and whether it does so mindful of possible collateral effects.

A Preview 

In the few days since the DSA was officially published, its necessity has already become abundantly clear. Against this backdrop, more than a dozen expert authors spanning policy, academia and civil society across five continents will come together in the coming days to debate some of the questions sketched out above, and to raise many more. We have learnt a lot from the texts in this symposium and hope they will move the debate forward.

We are looking forward to hearing from Julian Jaursch on what member states must take into account when designing a strong Digital Services Coordinator, on which the DSA’s enforcement – and hence its overall success – stands or falls. Ilaria Buri offers a perspective on the role of the European Commission as a central enforcement authority; while this role was clearly ascribed to the Commission as a lesson from the GDPR, it raises fundamental questions about the separation of powers, and the Commission is already caught between conflicting policy objectives.

Alessandro Mantelero scrutinises the fundamental rights impact assessments foreseen by the DSA; the risk-based approach adopted is not supported by adequate models, and existing frameworks from human rights impact assessment contexts are of limited use when extended to the digital context. Asha Allen invokes intersectionality in the risk assessment context; the DSA highlights the risk of online gender-based violence, but its approach to addressing such risks must adopt an intersectional methodology, without which mitigation measures and access to remedies will fail to provide the necessary mechanisms for those most acutely impacted by these rights violations.

Folkert Wilman looks at the DSA through the lens of CJEU rulings relating to the e-Commerce Directive; it may appear as if the DSA had simply preserved and codified the status quo established by case law, but in terms of the intermediary liability framework, a notable evolution has taken place. Nicolo Zingales looks at the DSA’s meta-regulatory approach of regulating the self-regulation of very large online platforms; while this shift should be welcomed for enabling reflexive and adaptive regulation, we must also be wary of its risk of collapsing in the absence of well-resourced and independent institutions.

Aleksandra Kuczerawy turns to a central lesson the DSA has learnt from the e-Commerce Directive – it codifies three avenues for access to justice in the case of unwarranted content restrictions, to be used in sequence or separately; while they appear comprehensive on paper, practice may be another story. Tomiwa Ilori looks at the DSA from a pan-African perspective and treats it with cautious optimism; while the DSA’s precedents may have a Brussels Effect in Africa, it will only be positive insofar as local contexts are foregrounded in the transposition. 

“Now what?” asks Catalina Goanta; how shall we approach the DSA’s omissions? She explores native advertising in the influencer economy on digital platforms and highlights how it currently falls into a grey area between the DSA and sectoral regulation. Relatedly, Nayanatara Ranganathan shows that while the DSA has set its sights on recommender systems and ‘influence’, it sidesteps the crucial operative question that characterizes online advertising: how and why advertisements reach whom they reach.

Pietro Ortolani explores the DSA’s “Procedure Before Substance” approach to content moderation – rather than pursuing any major harmonization of the substantive law applicable in this very broad and porous area, the DSA concentrates on proceduralising access to justice on and off digital platforms. Whether this approach will pay off remains to be seen. Daphne Keller considers “the good”, “the bad” and “the future” of how the DSA will be received outside of the EU. While the procedural turn may provide positive impetus, the rest of the world should see the DSA as no more than a starting point.

Sebastian Becker Castellaro and Jan Penfrat concur that the DSA is useful but misses the bigger picture: not even the most carefully designed content moderation policy will protect us from harmful online content as long as we do not address the dominant, incredibly damaging surveillance business model of most large tech firms. Alexandra Geese sees these business models as drivers of the rise of authoritarian regimes worldwide, algorithmically amplified into visibility and success. She is optimistic that, via audits, risk assessments and researcher access, the DSA tackles the information asymmetry which allows platforms to polarize.

Finally, Martin Husovec foregrounds that the DSA’s success depends entirely on societal structures that the law can only foresee and incentivize but cannot build; only people can. From consumer protection groups to research communities – how can people be supported in building bottom-up enforcement structures?

 

This online symposium was realized thanks to funding by the Digital Legal Studies initiative, an interuniversity research program on law and digital technologies in the Netherlands, and the Institute for Information Law (IViR). The DSA Observatory is supported through funding by the Open Society Foundations and the Civitates initiative for democracy and solidarity in Europe. The work of João Pedro Quintais in this symposium is further funded by his VENI Project “Responsible Algorithms: How to Safeguard Freedom of Expression Online” funded by the Dutch Research Council (grant number: VI.Veni.201R.036).


SUGGESTED CITATION  van Hoboken, Joris; Buri, Ilaria; Quintais, João Pedro; Fahy, Ronan; Appelman, Naomi; Straub, Marlene: The DSA has been published – now the difficult bit begins, VerfBlog, 2022/10/31, https://verfassungsblog.de/dsa-published/, DOI: 10.17176/20221031-095722-0.
