03 September 2024

Auditing Platforms under the Digital Services Act

Taming the power of online platforms has become one of the central areas of European Union policy in the digital age. The Digital Services Act (DSA) is a landmark example of this new regulatory architecture for online platforms. It establishes an articulated framework of obligations, ranging from transparency safeguards to risk-based requirements, primarily oriented towards the protection of fundamental rights, and it imposes additional obligations on platforms falling within the “very large” category.

Among these safeguards, the DSA increases the accountability of very large online platforms and very large search engines by introducing an auditing system. This ex-ante check aims to ensure that independent experts verify compliance. The DSA thus not only introduces a new set of obligations and an enforcement structure but also extends the monitoring pressure on online platforms by providing a further mechanism to check compliance with the DSA, which would otherwise be scrutinised only ex post, for instance in a case brought by a user alleging a specific violation.

Nonetheless, the DSA’s auditing obligation for very large online platforms rests on broad principles and open clauses. Even if the aim is to fight the opacity of the internal compliance processes of very large online platforms, such an approach risks introducing a further layer of complexity. The wide interpretative space left by the European Commission puts auditors in a critical position to define the boundaries of correct compliance with the DSA, effectively pushing these actors to act as regulators. This centralisation of decision-making is all the more concerning considering the control exercised by online platforms over their spaces and over the selection of auditors.

Within this framework, the audit process as defined by the DSA risks producing counterproductive consequences for the European policy objectives. From a constitutional perspective, the outsourcing of competence and decision-making from public to private actors articulates a system of compliance and enforcement based on multiple centres of power. The choice to delegate ex-ante monitoring to auditors could indeed become not a tile in a broader mosaic of compliance designed to increase the accountability of very large online platforms, but itself an interpretative jigsaw shaped by the discretion of private actors. Avoiding this outcome would require the European Commission to expand the level of detail introduced by delegated acts and to rely on participatory policy solutions, including co-regulation.

Auditing Very Large Online Platforms

According to the DSA, providers of very large online platforms and search engines are required to undergo independent audits annually, at their own expense, to ensure compliance with the extensive obligations defined in Chapter III and, additionally, with any commitments made under codes of conduct and crisis protocols. While auditors check this large set of obligations, platforms have to cooperate by providing access to all necessary data and premises. Even if the DSA acknowledges the relevance of confidentiality in this process, confidentiality cannot serve as a justification to impede the audit itself or to limit the scope of the requirements established by the DSA on transparency, supervision and enforcement.

The DSA further clarifies that audits must be performed by independent organisations without conflicts of interest. In particular, this requirement is fulfilled only when auditors did not provide non-audit services to the platform in the year before the audit and commit not to provide such services for a year after its completion; when they have not provided auditing services to the platform for more than ten consecutive years; and when they do not perform the audit in return for fees which are contingent on its result. Furthermore, auditors must have proven expertise and objectivity and must adhere to professional ethics.
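Since these independence conditions are cumulative, they can be read as a single conjunctive test. The following minimal Python sketch illustrates that reading; the field names and the way the thresholds are encoded are editorial paraphrases of Article 37(3) DSA, offered purely for illustration, not as legal definitions:

```python
from dataclasses import dataclass

@dataclass
class AuditorProfile:
    """Illustrative auditor attributes; the names are editorial, not DSA terms."""
    months_since_last_non_audit_service: int  # services provided to the audited platform
    committed_to_one_year_cooling_off: bool   # no services for a year after the audit
    consecutive_years_auditing_platform: int
    fees_contingent_on_result: bool

def is_independent(auditor: AuditorProfile) -> bool:
    """The conditions are cumulative: failing any single one disqualifies the auditor."""
    return (
        auditor.months_since_last_non_audit_service >= 12
        and auditor.committed_to_one_year_cooling_off
        and auditor.consecutive_years_auditing_platform <= 10
        and not auditor.fees_contingent_on_result
    )
```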

The result of this process is a detailed report for each audit, which includes specific information such as the entities involved, the methodology, the findings, the third parties consulted, and an audit opinion on compliance (namely ‘positive’, ‘positive with comments’ or ‘negative’), along with recommendations whenever the opinion is not fully positive. The assessment focuses on checking compliance with a wide-ranging set of obligations, from transparency obligations to risk assessment, including those which do not apply only to very large online platforms. The process is particularly challenging when it comes to assessing how platforms mitigate systemic risks to fundamental rights.
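As a rough illustration of how these required elements fit together, the sketch below models the report as a simple data structure; the field names are editorial shorthand rather than the wording of the DSA or the delegated regulation:

```python
from dataclasses import dataclass, field
from enum import Enum

class AuditOpinion(Enum):
    POSITIVE = "positive"
    POSITIVE_WITH_COMMENTS = "positive with comments"
    NEGATIVE = "negative"

@dataclass
class AuditReport:
    """Illustrative summary of the main elements a DSA audit report contains."""
    audited_provider: str
    auditing_organisation: str
    methodology: str
    third_parties_consulted: list[str]
    findings: list[str]
    opinion: AuditOpinion
    recommendations: list[str] = field(default_factory=list)

    def requires_recommendations(self) -> bool:
        # Recommendations must accompany any opinion that is not fully positive.
        return self.opinion is not AuditOpinion.POSITIVE
```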

Where auditors have not been able to address specific elements in their investigations, they are required to explain in the audit report the circumstances and the reasons why those elements could not be audited. If a platform receives an audit report that is not positive, it must act on the recommendations and prepare an audit implementation report within one month, detailing the actions taken. Providers must justify their reasons if recommendations are not followed and propose alternative measures to address any non-compliance.

Despite the importance of this obligation, the DSA provides only minimal guidance on the auditing process. It entrusts the European Commission with creating additional rules to guide how audits are conducted, including procedural steps, methodologies, and reporting templates, taking into account any voluntary auditing standards. This choice underlines not only the DSA’s limited degree of detail on the auditing process but also the relevance of adopting more specific rules to increase legal certainty and to avoid expanding the power of auditors to shape compliance with the DSA.

Delegated Acts to Limit Delegation

The DSA leaves wide spaces in the definition of these entities and in the performance of their role. For instance, rather than entrusting an independent body with the task of identifying competent auditors, the DSA leaves this choice to online platforms, thus allowing the very actors being audited to select who will check their internal compliance with the DSA. Likewise, the criteria for auditors’ expertise are not defined: there is no reference to the type of competence required to review the compliance processes of a very large online platform.

For this reason, in October 2023, the Commission adopted the delegated regulation on the performance of DSA audits. These rules entered into force in January 2024 and apply to the first round of audits in August 2024. This step aims to provide further guidance and direction to prevent the exercise of extensive discretion in the auditing process and to increase legal certainty for online platforms which, as already stressed, are required to collaborate with auditors in order to put them in the best position to check compliance with the DSA.

However, the delegated acts provide a very limited set of specifications and details on audits. Most importantly, they do not establish auditing standards. Instead, the delegated acts entrust auditors with the role of defining different approaches, on the basis of ‘sufficient flexibility for the development of good practices’, as the delegated acts put it. This approach could put consulting organisations with larger resources in a position to dominate the market for DSA compliance. It also risks marginalising stakeholders such as non-profit organisations and experts on platform governance and content moderation. The concern extends to civil society organisations which, given the lack of any reference to them in the delegated acts, are now looking more closely into the possibility of participating in this process and, more generally, into policy involvement in the DSA, including by accessing data from online platforms through Article 40.

Within this framework, the delegated acts further clarify the DSA’s audit rules, but they remain broad and primarily based on general principles. Even if, according to Article 44(1)(e) DSA, the Commission can address potentially inadequate or incomparable auditing criteria by promoting voluntary standards, the primary issue remains the concentration of decision-making powers in the hands of auditors, who thus inevitably shape the rules and standards of compliance with the DSA.

Auditors as Regulators

The DSA assigns a critical role to auditors. Like trusted flaggers or out-of-court dispute settlement bodies, they become part of a new system of entities introduced by the DSA. Their role includes evaluating critical parts of platforms’ compliance, including their systemic risk management. This underlines their powerful position in the DSA architecture governing online platforms. The ambiguity of DSA provisions contributes to transforming auditors into regulators or decision-makers. Lacking binding norms or technical guidance, auditors are almost free to determine how the DSA should be interpreted, thus shaping the scope of the obligations applying to very large online platforms. That is problematic whenever private entities are entrusted with assessing risks to fundamental rights, as already underlined in the case of online platforms themselves.

Remarkably, auditors themselves have not generally welcomed this approach. They have already underlined how the “reasonable level of assurance” required by the delegated acts would be an opaque burden. The European Contact Group, representing the six largest international professional services networks in Europe, issued a public letter to the European Commission in response to the request for feedback on the draft Delegated Regulation laying down rules on the performance of audits. The group raised concerns about specific areas of the DSA where the subject matter is more complex and open to interpretation. It also highlighted that the delegated acts, as drafted, established specific requirements for the performance of the audits, alongside guidance on certain audit strategies, which could give rise to incomplete, confusing, and even contradictory directions to auditors and undermine their ability to perform an effective independent audit. The letter further noted that certain aspects of the audit, such as the lack of pre-existing benchmarks and the reliance on very large online platforms to develop their own criteria, could be highly subjective, prone to widely differing views and open to legal interpretation, with a negative impact on the comparability and consistency of the criteria used by auditors.

Furthermore, the issues related to the novelty and ambiguity of DSA provisions, also due to the lack of case law and of details from the European Commission on obligations such as trusted flaggers or risk assessment, have been amplified by constant change throughout the audit period. New obligations have been added depending on decisions taken by the European Commission or on joint interpretations of certain DSA articles by the Contact Group. This evolving framework has contributed to uncertainty for very large online platforms as to which obligations and controls need to be assessed by auditors.

These positions underline the risks of auditing without further guidance. The audit community created by the DSA would be increasingly involved in defining standards whose definition should belong to the legislator or to a delegated technical regulator. Even if the Digital Services Act represents a significant step forward in regulating the complex and powerful landscape of very large online platforms, the implementation of its auditing system, while enhancing accountability and transparency, also introduces new challenges and potential risks. The ambiguity and limited guidance of the DSA, coupled with the substantial discretion given to auditors, may inadvertently shift regulatory power to these entities, allowing them to shape the compliance landscape in ways that could undermine the broader objectives of the DSA. Ensuring effective ex-ante oversight would therefore require the European Commission to be more ambitious, particularly by expanding the content of the delegated acts and by relying more on different stakeholders to define guidelines and codes of conduct.


SUGGESTED CITATION  De Gregorio, Giovanni; Pollicino, Oreste: Auditing Platforms under the Digital Services Act, VerfBlog, 2024/9/03, https://verfassungsblog.de/dsa-auditors-content-moderation-platform-regulation/, DOI: 10.59704/f09bcd357d6c99b0.
