31 October 2022

A Regulator Caught Between Conflicting Policy Objectives

Reflections on the European Commission's Role as DSA Enforcer

The Digital Services Act (DSA) centralizes significant enforcement powers in the hands of the European Commission (EC). The Regulation grants the Commission exclusive supervision and enforcement powers vis-à-vis the biggest platforms and online search engines for their most important due diligence obligations (such as those concerning the assessment of systemic risks, access to research data and crisis protocols). In addition, the Commission is also competent – together with the Member States – to supervise the same platforms for their compliance with rules which do not apply exclusively to very large online platforms (VLOPs) and very large online search engines (VLOSEs). However, the national regulators (the Digital Services Coordinators, “DSCs”) will only be competent to step in where the Commission has not taken any initiative against the same suspected infringement. The final text also introduces an annual supervisory fee, to be paid by VLOPs and VLOSEs, to cover the costs incurred by the EC as a result of its supervisory tasks.

As observed in a previous analysis, the rationale behind centralized EC enforcement is understandable, particularly in light of the experience with GDPR enforcement. At the same time, this choice raises new issues that are worth discussing.

This analysis focuses on the implications, from a fundamental rights and democratic values perspective, of choosing the EC as the body in charge of supervising and enforcing the DSA against the most powerful online platforms. Given the importance and broader implications of the DSA, the policy choice of making the EC the central enforcer in the DSA architecture deserves closer scrutiny, especially as the centralization of enforcement powers around the EC may become recurrent in future legislation. Two aspects in particular deserve more attention: the difference between the EC and a separate, independent EU supervisory authority, and the tensions inherent in the different policy objectives pursued by the EC, which might affect the way it performs its oversight tasks under the DSA.

The DSA regulators and their independence: the Digital Services Coordinators and the European Commission

Article 50 of the final DSA text states that DSCs must carry out their tasks in “an impartial, transparent and timely manner” and that they must exercise their tasks and powers “with complete independence, […] free from any external influence, whether direct or indirect, and [without taking] instructions from any other public authority or any private party”. This language is identical to that of Article 52 GDPR on the independence of supervisory authorities.

Under the GDPR, the independence and willingness of certain national supervisory authorities to enforce the law have been questioned. This is particularly the case for the Irish DPC and its regulatory performance under the one-stop-shop mechanism, which also led to a formal complaint before the EU Ombudsman about the EC’s failure to ensure that the GDPR is adequately applied across the EU. In the context of the DSA debate, this situation contributed to the legislative choice of granting the EC – which is presumed to be more resilient to regulatory capture than national regulators – key functions in the oversight and enforcement of the DSA. Indeed, as recently admitted by the EC’s Vice-President Vestager, “there was distrust” among member states that Ireland could act as an effective regulator against Big Tech. As the DSA’s rules and their enforcement will have a clear and undeniable impact on fundamental rights and democratic values, the question arises of how the above-mentioned requirements of independence play out when it comes to the EC. Complex assessments involving fundamental rights such as freedom of expression and the right to privacy, and any restrictions thereto, are normally entrusted to independent bodies not vulnerable to direct political control.

The EC, however, is not an independent regulator, but the main executive body of the EU. Starting from its very composition and appointment, it is a deeply political body, entrusted with the power of legislative initiative and playing a crucial role in legislative negotiations. Through its many legislative proposals and institutional tasks, the Commission pursues and combines a variety of policy objectives, with significant implications for fundamental rights.

The European Commission and its many (often conflicting) policy objectives  

The main policy objectives of the DSA are promoting the digital single market, addressing online harms (in particular illegal content) and protecting fundamental rights online. The coexistence of these policy objectives is complex: it is marked by inevitable tensions, which require policy choices and continuous balancing. Vis-à-vis these objectives, the position of the EC – both as the executive body holding the monopoly of legislative initiative and as a DSA enforcer – is also very complex. Different parts of the EC (its Directorates-General, DGs) relate differently to different policy objectives, which are often in tension with each other (more often than not, promoting the single market and favouring trade versus protecting fundamental rights). As a consequence, it seems unlikely that the Commission’s assessments and initiatives as a DSA enforcer will remain uninfluenced by the agenda the same institution pursues in DSA-related domains and other policy areas.

Systemic risks in the DSA and the EC’s policies in the area of data protection

One area where the EC’s action and initiatives might conflict with its role as a DSA enforcer is EU data protection law and its enforcement. Online platforms’ services (particularly those of VLOPs) entail the processing of massive amounts of personal data, and some of the most relevant systemic societal risks are connected to the adverse impact of these practices on the fundamental rights to the protection of personal data and privacy. It is in recognition of these issues that the final DSA text mentions privacy and data protection among the fundamental rights which might be affected by systemic risks, and expressly refers to targeted advertising systems and data-related practices in Articles 34 and 35 on systemic risks. More generally, the debate about the risks of tracking-based ads became one of the most heated issues of the entire DSA process, including the idea of restricting the use of minors’ personal data and of special categories of data for the purposes of online ads (Articles 26 and 28 of the DSA final text). Overall, these provisions build upon a recognition of the impact that business models relying on the systematic collection of personal data have on fundamental rights and other important societal values. Given the wealth of data protection and privacy-related aspects in the DSA framework and their enforcement, the European Data Protection Board (EDPB) also called on the co-legislators to ensure that the DSA foresees enforcement cooperation with data protection authorities.

Against this background, the EC’s role as the main enforcer for VLOPs and VLOSEs might be difficult to reconcile with (that is, to keep uninfluenced by) the policy choices and legislative proposals that the same institution is pursuing in the area of data protection law or in related domains. In other words, it could be argued that the way the EC perceives possible systemic risks connected to the fundamental right to data protection (and the adequacy of platforms’ measures to mitigate them) is heavily influenced by the policy choices that the same institution has taken or is pursuing in that domain or connected ones.

In the area of international data transfers, for instance, the EC typically juggles different (and often conflicting) policy objectives: international trade, on the one hand, and the protection of fundamental rights, on the other. With regard to EU-US personal data transfers, the EC’s assessment of how to balance these policy goals resulted in two adequacy decisions, the EU-US Safe Harbour and the EU-US Privacy Shield, both of which were invalidated by the CJEU, in 2015 and 2020 respectively, for failing to provide adequate protection to the rights of EU citizens. A new framework for transatlantic data flows, with great implications for the VLOPs, is currently being negotiated by the EC and might be referred to the CJEU again.

This example shows, first, that in a hypothetical scenario where the EC is the central data protection regulator for big platforms, conflicts of interest would be inescapable, and, second, that some of these same tensions might easily characterize the EC’s DSA supervisory and enforcement functions.

The EC’s proposal on Child Sexual Abuse Material

The controversial new proposal on combating child sexual abuse material (CSAM), presented by the EC in May 2022, similarly illustrates the conflicting policy objectives the Commission has to deal with. The draft regulation obliges providers to scan private communications to detect CSAM. In reaction to the proposal, civil society organizations have warned against the staggering risks it poses for the privacy, security and integrity of private communications and other fundamental rights. The German Federal Commissioner for Data Protection has called the proposal incompatible with EU values and data protection law, as it deeply interferes with fundamental rights and democratic principles such as the confidentiality of private communications.

As explained by the EC, the CSAM Regulation builds upon the DSA’s horizontal framework, thus acting as lex specialis. While the DSA provides a framework for addressing illegal content online in general, the CSAM Regulation introduces more specific rules as regards the fight against a particular form of illegal content. Providers would therefore be subject to a more general systemic risk assessment obligation under the DSA and a more specific one under the CSAM Regulation.

Thus, one could legitimately wonder whether and how risk assessments and choices of mitigation measures – undertaken by platforms under the DSA and overseen by the Commission – would be influenced by the CSAM framework (and similar specific regulations adopted in the future). Could the assessment of DSA systemic risks relating to illegal content and fundamental rights, and the enforcement of the corresponding obligations, be shaped in practice by (and assimilated to) CSAM obligations and standards? The Explanatory Memorandum to the proposal seems to confirm this: “Those providers can build on the more general risk assessment in performing the more specific one, and in turn, specific risks identified for children on their services pursuant to the specific risk assessment under the [CSAM] proposal can inform more general mitigating measures that also serve to address obligations under the DSA” (Explanatory Memorandum, p. 5). Technologies implemented in the context of CSAM compliance, which amount to extensive forms of surveillance, could therefore also be used to comply with DSA-related obligations.

In particular, conflating the operationalization of DSA and CSAM assessments and mitigation measures raises the question of whether the Commission might be tempted to adopt CSAM standards, and the underlying fundamental rights balancing (proposed by the same EC), when overseeing and enforcing VLOPs’ risk assessment and mitigation under the DSA.

All these problematic aspects are also clearly related to the extensive surveillance risks inherent in the CSAM proposal. While providers’ obligations under the DSA (and the e-Commerce Directive) build upon the principle of ‘no general monitoring or active fact-finding’, the CSAM proposal amounts to an overhaul of that prohibition of generalised monitoring. In other words, with the CSAM regulation the EC opts for a very different balancing of the (conflicting) rights which underlie that prohibition.

All these issues raise concerns about how the EC, as a DSA enforcer caught between its many other legislative proposals, will resolve important and complex assessments involving a variety of fundamental rights and the tensions between them. Given the interlinkages between the CSAM proposal and the DSA, knowing how the EC intends to operationalize DSA enforcement in practice is more urgent than ever.

Freedom of expression and responses to the Ukraine war 

Another important area where tensions might emerge between the EC’s enforcement role under the DSA and its other institutional initiatives is the protection of the right to freedom of expression. In this regard, it is worth stressing that content moderation is highly contested and politicized, and questions connected to the EC’s perceived legitimacy, across Europe, to oversee the regulation of these matters might have been underestimated.

Further, the war in Ukraine has prompted a number of unanticipated developments in the domain of content moderation and platform regulation which are clearly relevant to the DSA discussion. The EC played a crucial role in some of them: at the end of February, it announced a ban on the Russian media outlets Russia Today and Sputnik, which was immediately followed by Council measures prohibiting the broadcasting in the EU of media outlets considered essential tools of Russian propaganda. While the measures have been upheld by the General Court of the EU (in the proceedings initiated by RT France), experts have raised doubts about the proportionality of the ban and warned about its implications for freedom of expression and access to information in the EU.

During the third trilogue in March 2022, following the Russian invasion of Ukraine and the related Russian disinformation campaigns, the EC proposed to introduce a crisis management mechanism for exceptional circumstances (Article 36 of the final text), in order to supplement the anticipatory and voluntary crisis protocols already set out under Article 37 of the DSA proposal. Thirty-eight civil society organizations active on digital rights warned that “decisions that affect freedom of expression and access to information, in particular in times of crisis, cannot be legitimately taken through executive power alone”. They therefore urged the DSA negotiators to ensure that this new crisis management system complies with human rights law and includes safeguards against abuse (in particular, time limits, ex-post scrutiny by the European Parliament and a specific definition of crisis).

Concluding remarks

The final DSA text confirms the EC’s central role in the DSA supervision and enforcement architecture vis-à-vis VLOPs and VLOSEs.

However, the implications of this legislative choice, from a fundamental rights and democratic principles perspective, have not yet been adequately discussed and explored.

The examples discussed in this analysis indicate that central issues of the separation of powers should take center stage in the current conversation on platform regulation. Careful attention should be paid to the independent design of the DSA’s oversight and enforcement actors, with a view to ensuring a fundamental rights-supportive regulatory structure. In this regard, it is essential to understand how the supervision of VLOPs and VLOSEs will be concretely operationalized within the EC.


SUGGESTED CITATION Buri, Ilaria: A Regulator Caught Between Conflicting Policy Objectives: Reflections on the European Commission's Role as DSA Enforcer, VerfBlog, 2022/10/31, https://verfassungsblog.de/dsa-conflicts-commission/, DOI: 10.17176/20221031-220451-0.
