02 November 2022

The DSA as a paradigm shift for online intermediaries’ due diligence

Hail to meta-regulation

In late October 2022, the final version of the Digital Services Act (DSA) was published in the Official Journal. The importance of this legislation in shaping the governance of online content in the years to come cannot be overstated. While several provisions are worth highlighting, in this blogpost I focus on one specific aspect: the adoption of a meta-regulatory approach. Specifically, after providing a definition of this concept, I discuss its virtues and limits, and illustrate how this approach is operationalized in the DSA with regard to a subset of online intermediaries: providers of Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs). The bottom line is that, while the shift to a meta-regulatory model should be welcomed for enabling reflexive and adaptive regulation, we must also be wary of its risk of collapsing in the absence of well-resourced and independent institutions. Indeed, this risk affects the extent to which exporting the DSA outside Europe would be in the public interest.

The concept of meta-regulation

The DSA marks a fundamental shift towards a regime of due diligence obligations for online intermediaries: first, it departs from a system of liability limitations that left a wide range of issues up to self-regulation, in the absence of specific provisions of national law. Second, it produces a comprehensive set of obligations which are imposed directly by EU law but require specific implementation by providers, through a framework that combines self-assessment with close monitoring by the regulator. This approach, which on the one hand leaves businesses with a significant amount of discretion in the implementation of regulatory principles, and on the other involves a process of continuous evaluation and monitoring of the results, has been called “meta-regulation” or “enforced self-regulation”: “meta” because one (macro) regulator oversees another (micro) regulator in its management of risk; “enforced” because, where the self-regulatory practices prove inadequate, the (macro) regulator has the power to take enforcement measures. To determine whether such measures are warranted, meta-regulation establishes norms of organization and procedure against which the self-regulatory practices can be assessed. By doing so, it assumes a fundamentally “reflexive” character: it focuses on enhancing the self-referential capacities of social systems and institutions outside the legal system to achieve broad social goals, rather than on prescribing particular actions. Furthermore, as noted by Morgan and Yeung, at the core of meta-regulation are participatory procedures for securing regulatory objectives and mechanisms that facilitate and encourage deliberation and mutual learning between organizations.

Considering these characteristics, the meta-regulation model is particularly apt to deal with complexity and uncertainty, where some experimentation and dialogue between different stakeholders may be necessary. According to Ayres and Braithwaite, the model has other inherent advantages: rules can be tailored to the specifics of each regulated entity and adapt more quickly to an evolving environment; they typically generate a higher level of commitment because the company elaborates them itself; and a large share of the costs of regulation is borne by the regulated entities (as opposed to the regulator). On the other hand, weaknesses of the model include the regulator’s cost of regularly monitoring and approving a vastly increased number of rules, the possibility that regulated entities write rules in a way that helps them evade the spirit of the law (as occurred, for instance, with the implementation of the NetzDG law in Germany), and the lack of effective independence of those who certify the adequacy of the measures undertaken. I return to these points below, explaining how they might apply in the context of the DSA.

Meta-regulation in the DSA

Chapter III of the DSA deals with due diligence obligations for intermediary service providers. To provide a harmonized framework for those obligations and to promote a safe, predictable and trustworthy online environment where respect for fundamental rights is ensured, the Regulation distinguishes different types of intermediaries, based on the type, size and nature of their services. The most demanding obligations concern very large online platforms (VLOPs) and very large online search engines (VLOSEs), which are the focus of this contribution. This is because it is with regard to these categories of intermediaries that the meta-regulatory character of the DSA is most evident: once designated, these entities are effectively required to act as risk regulators, subject to oversight and enforcement by the European Commission, the national Digital Services Coordinators and the European Board for Digital Services in their capacity as meta-regulators. Specifically, VLOPs and VLOSEs are required under Article 34 to conduct regular assessments of any systemic risks stemming from the design or functioning of their service and its related systems (including algorithmic systems), or from the use made of their services, and to provide information to the Commission and the Digital Services Coordinator upon request. They must also put in place, pursuant to Article 35, reasonable, proportionate and effective measures to mitigate such risks. Further, a similar obligation was introduced relatively late in the DSA negotiations (in 2022, after Russia’s invasion of Ukraine) to deal with the event of a “crisis”, i.e. extraordinary circumstances leading to a serious threat to public security or public health in the EU or in a significant part of it. According to Article 36, in such a situation the Commission can require VLOPs and VLOSEs to assess and mitigate the risks of their contribution to the serious threats that have been identified, and to report on them at regular intervals.

As a mechanism to document compliance with the above-mentioned measures, under Article 37 VLOPs and VLOSEs shall be subject, at their own expense and at least once a year, to independent audits. They must also transmit to the competent Digital Services Coordinator, the Commission and the Board (and make public within three months) the audit reports, as well as audit implementation reports (showing how the audit’s recommendations have been addressed). These audit obligations constitute a critical element for the functioning of the meta-regulatory framework, providing a necessary check on the implementation of the measures that have been undertaken as part of the providers’ due diligence. The same auditing applies to the implementation of commitments contained in voluntary codes of conduct, which can be drawn up to contribute to the proper application of the DSA under Article 45 and whose effectiveness must be regularly monitored and evaluated by the Commission and the Board. The codes of conduct facilitate this by establishing clear objectives and key performance indicators, drawing on the lessons the Commission learned from the Code of Practice on Disinformation about the ineffectiveness of general commitments without concrete measurement criteria. Furthermore, Article 41 of the DSA requires VLOPs and VLOSEs to set up a compliance function, independent from their operational functions, which serves as a channel of cooperation with the Commission and the Digital Services Coordinators. Among other duties, the provider’s management body must devote sufficient time to the consideration of risk management measures, ensure that adequate resources are allocated to risk management, and approve and review at least once a year the risk management, monitoring and minimization policies of that VLOP or VLOSE.

All these obligations are preparatory to a process of dialogue with the regulator, in particular on the adequacy of the measures adopted, possibly leading to the adoption of enforcement measures. For instance, in the case of systematic failure to comply with the codes of conduct, the Commission and the Board may invite the signatories to the codes to take the necessary action. Similarly, in the context of the crisis response mechanisms, the Commission may, on its own initiative or at the request of the provider, engage in a dialogue to determine whether the implemented measures are effective and proportionate. If it considers that they are not, the Commission may (after consulting the Board) request the provider to review them. Ultimately, Digital Services Coordinators may accept and make binding the compliance commitments offered by those providers, impose fines and periodic penalty payments, and exercise a range of further enforcement measures under Articles 51 and 52. These backstops are essential incentive mechanisms for the due diligence that meta-regulation seeks to promote.

The meta-regulatory framework is also supplemented by flanking obligations, such as a data access framework for vetted researchers and transparency reporting to the broader public about risk assessment and identification (in addition to the audit and audit implementation reports), as well as about the human resources dedicated to content moderation by each VLOP and VLOSE provider. These create an opportunity for further monitoring of the adequacy of the measures adopted, thus potentially improving the regulator’s detection of non-compliance. In fact, the Board will draw on these sources when publishing, in cooperation with the Commission, yearly reports identifying the most prominent and recurrent systemic risks, along with best practices for VLOP and VLOSE providers.

Open issues and criticism

Having explained the dynamics at play in the DSA, let us return to some of the criticism that has been raised against the use of meta-regulation. The first point mentioned above, concerning the cost of monitoring and approving a vastly increased number of rules, has been directly addressed by the final version of the DSA: its Article 43 now provides that the Commission shall charge an annual supervisory fee to providers of VLOPs and VLOSEs upon their designation as such. While the detailed methodology for determining the amount is to be laid down in implementing acts of the Commission, on the basis of pre-established criteria, one could question the rationale for capping the fee at 0.05% of the provider’s worldwide annual net income in the preceding financial year. Indeed, considering that the fee is intended to cover the estimated costs that the Commission incurs in relation to its supervisory tasks under the DSA, and that there is concern about its insufficient enforcement resources, one may wonder whether the Commission has underestimated the costs that a non-cooperating firm can generate.
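To give a rough sense of the orders of magnitude at stake, the cap can be illustrated with a back-of-the-envelope calculation; the €100 billion figure below is purely hypothetical and is not drawn from the DSA or from any provider’s accounts:

\[
\text{fee cap} = 0.05\% \times \text{worldwide annual net income} = 0.0005 \times \text{€}100\ \text{billion} = \text{€}50\ \text{million per year}
\]

Whether a ceiling of this order would suffice to fund the supervision of an uncooperative provider is precisely the open question raised above.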

The second concern relates to the possibility that regulated entities pursue a strategy of stylized compliance, crafting rules in a way that enables them to evade the spirit of the law. In principle, the regular reporting and monitoring should permit the detection of this kind of behavior and trigger remediation, with a request to modify the risk identification and management measures. However, there is a risk that the depth of inquiry into each relevant document will depend on the resources available to the relevant regulator – a matter that, as seen above, is not uncontroversial. To prevent regulatory failure, a further instrument in the toolbox is the possibility for the European Commission or the national Digital Services Coordinator to receive this information from a researcher who has obtained access pursuant to Article 40, or from anyone who has examined the auditing and self-assessment documents made public by the relevant VLOP or VLOSE under Article 42. This could give rise to a complaint by a user of those services or by a body mandated to exercise rights under the DSA pursuant to Article 53, or even to a private action for compensation of any consequent damages (a measure introduced under Article 54 by the final version of the DSA). Notably, providers are only required to make risk assessments, mitigation measures and auditing reports public three months after the receipt of each audit report, which delays possible detection. In the absence of this documentation, the data access framework might be insufficient to detect misconduct in real time. Furthermore, the qualified researchers that are granted access to data may not have access to complete datasets, owing to the need to take into account the interests of VLOPs and VLOSEs (including the protection of trade secrets) and those of their users (including privacy and data protection). Compared to Digital Services Coordinators, they may also lack the overarching structure necessary to conduct a comprehensive and systematic review of each provider’s compliance practices.

A different type of safeguard used in the DSA to ensure that VLOP and VLOSE providers undertake appropriate commitments is the inclusion of other stakeholders from the start of the meta-regulatory conversation. For instance, Recital 90 requires that providers’ risk assessments and mitigation measures be based on the best available information and scientific insights, and that the assumptions underlying them be tested with the groups most affected by the risks and by the measures taken. This may entail involving representatives of groups potentially affected by their services. Additionally, Article 45(2) grants the Commission the power, where significant systemic risk emerges and concerns several VLOPs and VLOSEs, to invite relevant stakeholders to participate in drawing up codes of conduct, including by setting out commitments to take specific risk mitigation measures, as well as a regular reporting framework on any measures taken and their outcomes. However, the practical effect of these provisions remains to be seen: the latter is a highly circumscribed possibility, while the former is only contained in a Recital and not in the operative text of the DSA.

The third and perhaps most contentious point concerns the lack of effective independence of those who certify the measures undertaken. In the original formulation by Ayres and Braithwaite, this criticism was directed at the insufficient independence of compliance directors, who are required to report to regulators, on pain of criminal liability, any management overruling of compliance directives. In the context of the DSA, such criminal liability is not envisaged, and no specific requirements are laid down for the independence of the compliance function. As a result, the effectiveness of this safeguard may be questioned. On the other hand, more elaborate criteria are established for the independence of the auditors: Article 37 requires that they do not perform audits for contingency fees; that they have not provided non-audit services on the matters audited to the provider in the 12 months preceding the audit and do not provide such services for 12 months after its completion; and that they have not provided auditing services to the provider or to any legal person connected to it for more than 10 consecutive years. Nevertheless, it is easy to foresee that the mere expectation of providing auditing services to the same provider in the future might influence the auditor’s objectivity. As convincingly argued on this blog, this situation could only be tackled through a public auditing framework – although for this to work effectively, a robust system of safeguards against regulatory capture must be defined.

Effects beyond the EU?

There is one additional reason why we should not simply brush aside healthy scepticism about the institutional capacity to ensure the proper application of the DSA: the rest of the world is watching. Since the Regulation seeks to deal with content moderation challenges that are faced in a similar manner by regulators, intermediaries and users across the globe, it won’t be long before we see legislation in other jurisdictions inspired by the DSA. By way of example, the Brazilian Congress has already been debating a bill that would replicate some of the dynamics of the DSA, including the meta-regulatory approach. The latest version of the bill attributes a crucial role to self-regulation for social networks, search engines and messaging services, overseen by a self-regulatory institution of their own creation which would have the power to adopt and disseminate codes of conduct for the implementation of the law. Unlike under the DSA, these codes would not be validated by a public authority: instead, the Brazilian Internet Steering Committee (a multistakeholder body composed of 9 government representatives, 4 business representatives, 4 civil society representatives, 3 science and technology representatives, and one representative with recognized expertise in Internet matters) would issue guidelines for the implementation of those codes and certify the self-regulatory institution’s compliance with the principles set out in the bill. More worryingly, the burden of monitoring and enforcement would be placed on the market, in particular through its self-regulatory institution. Institutional arrangements of this kind may be the norm rather than the exception in countries where public institutions suffer from insufficient resources and a low level of trust, with foreseeable consequences for the protections that the legislation seeks to provide to platform users and society.

One should also not underestimate a second type of Brussels effect, which has to do with the possibility that regulated entities themselves export outside the EU the compliance framework that they establish under the DSA. While this could substantially improve the dialogue between platforms and regulatory institutions abroad, in the absence of adequate institutional backing it raises the twofold risk of selective importation and insufficient consideration of the local context. To prevent this, we need to ensure that the complexities of meta-regulation are properly communicated and understood. This starts from the realization that the due diligence obligations imposed on providers are not to be taken in isolation: they are part and parcel of a broader ecosystem geared to enable appropriate experimentation, monitoring, and regulatory dialogue with possible escalation to enforcement. And crucially, robust mechanisms of oversight and accountability must be built into this framework if it is to deliver on its promises.


SUGGESTED CITATION  Zingales, Nicolo: The DSA as a paradigm shift for online intermediaries’ due diligence: Hail to meta-regulation, VerfBlog, 2022/11/02, https://verfassungsblog.de/dsa-meta-regulation/, DOI: 10.17176/20221102-215609-0.
