14 April 2025

The Premise of Good Faith in Platform Regulation

On the Changing Conditions of the Digital Services Act’s Regulatory Approach

The DSA has now been in force for over a year and aims to ensure a secure, trusted and innovation-friendly online environment respecting fundamental rights (Art. 1 para. 1 DSA). Hardly any other piece of recent European legislation has sparked such heated debate, raised so many fundamental questions, and been implemented so quickly in practice. Just think of the controversy surrounding “trusted flaggers” (Ruschemeier, 2024; Lindner, 2024; Rosati, 2024), data access for science and its enforcement (Seiling/Klinger/Ohme, 2024; Klinger/Ohme, 2023), or now the independence of out-of-court dispute resolution bodies (Gradoni/Ortolani, 2025; Holznagel, 2024). However, anyone who takes a closer look at the DSA will not be surprised by this. The European legislator took a bold step with this regulation. The DSA not only establishes ground rules and procedural frameworks for digital services, but also addresses broader issues such as election integrity, disinformation or risks of addiction.

This blog post argues that the DSA’s regulatory framework rests on a central premise: a baseline level of trust in the good faith of its addressees. Given the rapid politicisation and reorientation of major very large online platforms (VLOPs), this premise is seriously challenged. As a result, the achievement of the DSA’s objectives may be at risk.

The principle of enforced self-regulation

At key points, the legislative methodology of the DSA follows the principle of enforced self-regulation (Barata, 2021; Templin, 2025, 127). On the one hand, VLOPs and other addressees face a wide range of due diligence obligations with very ambitious objectives. Systemic risks such as disinformation, influence on elections or addictiveness are to be identified, assessed and minimised (see Uhlenbusch, 2025; Harfst, 2025). Terms and conditions governing content moderation must take fundamental rights into account, Art. 14 para. 4 DSA (see Senftleben, 2024; Eifert/Metzger/Schweitzer/Wagner, 2021, 1013 et sqq.). Platforms should develop codes of conduct to combat illegal content and other risks, Art. 45 para. 1 DSA (Panahi/Sequeira, 2024).

On the other hand, online platforms retain significant discretion in implementing these objectives. The obligations for VLOPs and search engines in Art. 34, 35 DSA are paradigmatic in this regard. The services must identify and mitigate systemic risks – meaning they must evaluate and improve the functioning of their algorithms or moderation systems. While the legislator makes suggestions for the categories of systemic risks (Art. 34 para. 1 DSA), the parameters to be observed (Art. 34 para. 2 DSA) and conceivable mitigation measures (Art. 35 para. 1 DSA), enforcement relies largely on self-assessment. In short: Platforms must act if they identify risks, but the entire process – from identification to mitigation – remains in their hands (Harfst, 2025).

Some argue this approach respects the freedom to conduct a business (Art. 16 CFR) and the principle of proportionality, Art. 52 para. 1 CFR (Alexander, 2023, 22). However, the principle of enforced self-regulation is also an acknowledgement that the regulation of the digital economy pushes legislators to their limits. Digital services are a “moving target”: their business models and algorithms change constantly. So how can laws effectively regulate algorithms and recommender systems without becoming obsolete the moment they take effect?

To tackle these regulatory challenges, the DSA embraces enforced self-regulation: setting ambitious targets is better than ignoring systemic risks altogether. This approach grants platforms considerable organizational freedom, while enforcement falls to national authorities and, most importantly, the European Commission. The latter exclusively oversees VLOPs and very large search engines (Art. 56 para. 2 DSA). Enforcement is supported by a number of transparency, documentation, and reporting obligations (see Art. 15, 24, 26, 27, 35, 38, 42 DSA).

Early criticism

This regulatory approach was identified during the drafting of the DSA and has been the subject of lively debate ever since. Some argue it shifts fundamental rights protection duties from the state to private actors (Denga, 2023, 24). With regard to freedom of expression, Peukert even describes it as a “communications surveillance bureaucracy”, thereby also criticising the Commission’s indirect power (Peukert, 2022, 43 et sqq.). However, it is not the DSA that causes this shift; rather, the regulation seeks to create a regulatory framework for it. Nonetheless, it does consolidate the power of the platforms – precisely through the enforced self-regulation approach. But what could be an alternative?

Without a fundamental reorganisation of the private platform landscape (such as Free our Feeds), delegating significant responsibility to platforms remains unavoidable (see Eifert, 2017, 1451). Ultimately, all lines of argument seem to have one thing in common: For capacity reasons alone, the state cannot take the regulation of the digital information space into its own hands. Furthermore, fundamental communication rights suggest it shouldn’t (see Kreißig, 2023). Of course, the final say always remains with the courts under the rule of law (Eifert, 2017, 1451).

The politicisation of major digital platforms

These considerations must now be re-evaluated in light of shifting realities. Elon Musk has dramatically reshaped his platform X (formerly Twitter), minimizing content moderation and openly disobeying European law (van de Kerkhof, 2025). Mark Zuckerberg is following the same path with his services Facebook and Instagram (see only Bayer, 2025). Other leaders of major U.S. platforms have taken clearer political stances than ever before and may follow suit (Windwehr, 2025; Mullin, 2025). All of this aligns with the 47th U.S. president’s efforts to dismantle democracy at a breathtaking pace (Snyder, 2025).

These developments, among other obvious problems, undermine a central premise of the DSA. The European legislator was certainly aware that it was dealing with powerful private actors primarily interested in maximizing their advertising revenues. This is precisely where many of the due diligence obligations come into play, for instance for algorithms that must take risks to certain public interests into account (Artt. 34, 35 DSA). However, it was generally assumed that online platforms had their own economic interest in moderating illegal content, primarily to maintain a suitable advertising environment (Klonick, 2018, 1627; Raue, 2018, 8 f.). Nevertheless, major VLOPs are now dismantling their moderation teams and ending their collaboration with fact-checkers (Bayer, 2025). What impact does this have on the DSA’s principle of enforced self-regulation?

The necessity of trust

The principle of enforced self-regulation requires a basic leap of faith (also Harfst, 2025). Under Artt. 34, 35 DSA, the VLOPs conduct a risk analysis of their platforms and implement appropriate, customised risk mitigation measures. The Commission, external auditors and researchers monitor this process after the fact but do not actively participate in it. For this system to function, platforms and regulators must share the common goal of mitigating systemic risks. However, if the platforms instead pursue fundamentally different goals, such as ending content moderation and fact-checking or withdrawing from codes of conduct, there is not much left to monitor. In other words, the whole process is poisoned.

In late 2024, the VLOPs submitted the first round of risk assessment reports. Unsurprisingly, these reports have done little to ease concerns. On the contrary, they reveal a “pattern of evasion” and “fundamental failure” (Reich/Calabrese, 2025; critical also DSA Civil Society Coordination Group, 2025).

Furthermore, the enforcement structure does not solve this problem. If VLOPs neglect systemic risks, the damage often becomes apparent far too late (e.g., regarding elections, Harfst, 2025; Reich/Calabrese, 2025). Enforcement by means of an order (Art. 51 para. 2 subpara. 1 lit. b DSA) will not be effective for most risks, as digital communication platforms are far too dynamic. A wave of disinformation will already have spread before the Commission has even finished its assessment. Additionally, Art. 35 para. 1 DSA merely requires platforms to do something – enforcing specific obligations remains a challenge (Rademacher, 2025, Art. 51, para. 22). Finally, by their very nature, self-regulatory obligations are not suitable for enforcement under private law (Zurth, 2023, 1334).

What remains are sanctions such as fines, which can be severe at up to 6% of annual turnover, Art. 52 para. 3 DSA. However, it would be fatal to rely solely on this threat. The VLOPs act with considerable self-confidence towards the European enforcer. After all, they know that, following their political alignment, they have the new US administration on their side (Windwehr, 2025; Reuter, 2025). The shortcomings in enforcing the DSA’s objectives against reluctant addressees are evident not only in the risk assessment reports, but also in the access to research data under Art. 40 DSA (Jaursch/Ohme/Klinger, 2024). Most importantly, fines alone cannot secure a predictable and trusted online environment that upholds fundamental rights (Art. 1 para. 1 DSA).

What are the consequences?

Enforced self-regulation was likely the best available approach at the time – balancing ambitious goals with concerns about information asymmetries and overregulation. It also addresses the players with the highest level of knowledge and is therefore, in theory, effective (Eifert, 2017, 1450; Siegrist, 2025, 90). Enforced self-regulation may lead to partial improvements, especially in addressing long-term problems such as the risk of addiction. However, the notion that it may be just the right tool to tame politicised digital platforms (Uhlenbusch, 2025) is, so far, not convincing.

It is essential to read and interpret the DSA under these changed circumstances. Here, academia and civil society play a crucial role. Their task must be to analyse the provisions of the DSA in relation to this issue and to identify potential needs for change. A major step forward would be well-functioning access to research data, which the DSA promises but has not yet delivered (Jaursch/Ohme/Klinger, 2024; GFF, 2025).

The same applies to the European legislator, which in this legislative term aims to evaluate the regulatory legislation adopted in recent years. This offers a good opportunity to review the self-regulatory approach and the tools to enforce it.

Finally, it also applies to the European Commission as the competent supervisory and enforcement authority. In particular, enforcement must take the changed circumstances into account (in this sense also Uhlenbusch, 2025; Windwehr, 2025).

The DSA as a blessing

The challenges facing the DSA and its implementation have been outlined above. Taking a step back, however, the DSA is also proving to be a blessing in these times. An indication of this may be the aggressiveness with which the US platforms and administration are responding to the legislation. Despite the shortcomings described here, the DSA has the potential to be a very sharp sword in other areas. Moreover, the self-regulatory obligations are not completely useless (Uhlenbusch, 2025) – academics, enforcers and the legislator play a key role in making the most of them.

The DSA lays important foundations for defending our digital communication infrastructure, which constitutes “de facto public spaces” (Commission, 2020, 9). The immediate priority is to withstand the pressure from the US. This requires not only defending the DSA against political pressure, but also enforcing it vigilantly and remedying its shortcomings.

 


SUGGESTED CITATION  von Bernuth, Nikolaus: The Premise of Good Faith in Platform Regulation: On the Changing Conditions of the Digital Services Act’s Regulatory Approach, VerfBlog, 2025/4/14, https://verfassungsblog.de/the-premise-of-good-faith-in-platform-regulation-dsa/, DOI: 10.59704/5494324c1cc2203d.
