03 June 2024

Soft law, hardcore?

The Legal Enforcement of Codes of Conduct under the Digital Services Act

Soft law offers the possibility of agile and flexible regulation that can adapt to dynamic digital developments. Due to its non-binding nature, however, soft law is generally not considered very effective. With the Digital Services Act (DSA), the EU is taking an – at least from a legal-doctrinal perspective – unconventional approach by combining hard and soft law in a unique way. The DSA itself is a legally binding EU regulation, but it provides for soft law instruments and even contains provisions for their legal enforcement. Although such regulatory techniques are well known in EU law, they at least call into question the public perception of the DSA as the ‘constitution of the internet’. How far-reaching can such a constitution be if it outsources essential issues to (executive-initiated, privately set) soft law?

One example is the EU Code of Practice against Disinformation (Disinformation Code). Although the DSA proclaims the “voluntary nature” of such codes, they have binding effects for providers of very large online platforms such as TikTok, X and Instagram. We consider this problematic: private companies are not only taking regulatory action in areas sensitive to fundamental communication rights, but are doing so while remaining largely unregulated themselves. As soft law ‘hardens’, significant questions for democracy and the rule of law also arise.

Soft Law in the EU

Soft law is distinguished from hard law by its non-binding nature and unenforceability (see Hartlapp, 2019, p. 1). At the EU level, soft law includes, for example, recommendations and declarations by EU institutions, but also codes of conduct that are drawn up by means of co-regulation between the EU and private actors.

Indeed, EU institutions are increasingly using soft law in various areas (Cappellina et al., 2022, p. 754). The attractiveness of soft law for the EU lies, first, in the fact that it avoids the long delays and hurdles of the regular legislative process (Slominski/Trauner, 2001, p. 4 ff.). Furthermore, soft law is seen as promising and even necessary in the digital context, particularly in view of its adaptability and flexibility in the design of rules (cf. Hagemann/Skees/Thierer, 2018, p. 79). This is also in line with the OECD recommendation on “agile regulation”. Co-regulation seems particularly advantageous in this context: the legislator sets the objectives of a legal act and delegates the specification and implementation of those objectives to non-state parties, e.g. economic actors. This approach appears particularly suitable for the governance of very large online platforms such as Instagram, TikTok and X, whose community standards already function as highly normative rules under private law.

The “hardening” of Soft Law

However, a key problem with soft law is its effectiveness. In principle, soft law only creates a self-binding obligation on the part of the committing actors and is not legally enforceable. Nevertheless, soft law can have at least indirect legal effects and de facto consequences (Cappellina et al., 2022, p. 744). Through the influence of binding legal acts, soft law can even, figuratively speaking, “harden”, i.e. become more binding. In particular, prescriptive formulations, implementation deadlines, reporting obligations and “comply or explain” mechanisms contained in binding legal acts can lead to such hardening (Andone/Coman-Kund, 2022, p. 34). Although the hardening of soft law can contribute to the effectiveness of these rules, it is legally problematic. Co-regulation raises particular difficulties.

Firstly, democratic legitimacy is called into question if democratic institutions are not involved, or not involved sufficiently, in soft law acts with a de facto binding effect. It is problematic, for example, when the interests of non-state actors are incorporated into co-regulation and become binding for third parties and/or have an indirect external impact on a large circle of uninvolved parties.

Secondly, it is difficult to ensure that the interests of all stakeholders are considered equally (Hartlapp, 2019, p. 1; Keller, 2008, p. 269). In co-regulation, non-state actors have the opportunity to bring in their own interests. However, the actual opportunities for participation can vary considerably depending on financial and human resources, the timing of participation, and other factors. There is also a fundamental dilemma: if a binding law makes a static reference to an existing co-regulation, the parties that are subsequently bound by it have fewer opportunities to participate than those who originally drafted it. If, on the other hand, a dynamic reference is made, the democratic legitimacy of the norm may be diminished, since the text can subsequently be amended, bypassing the legislator.

Finally, the binding legal act that requires compliance with “soft law” can interfere with fundamental rights. On the one hand, the fundamental rights of the obligated party may be violated. It follows from the freedom of occupation under Article 12 of the Basic Law and the freedom to conduct a business under Article 16 of the EU Charter of Fundamental Rights that submission to soft law must in principle be voluntary. It may nevertheless be permissible under certain conditions; in individual cases, options for co-determination, the flexibility to withdraw and opt-outs from individual obligations must be examined in particular (for further requirements: Latzer et al., 2002, p. 67 ff.). On the other hand, the content of the soft law act in conjunction with the mandatory legal act can affect various fundamental rights of the parties concerned, but possibly also the fundamental rights of third parties if the soft law act has an indirect external effect.

Digital Services Act and its codes of conduct

The EU has also been using soft law to regulate online platforms for years and is deepening this strategy with the DSA, which recently came into force. The DSA is an EU regulation that fully harmonizes the rules for intermediary services such as online platforms in the internal market in order to ensure a safe, predictable and trustworthy online environment that, among other things, counteracts the dissemination of illegal content and disinformation (cf. recital 9 DSA). This central framework for platform regulation provides for various soft law elements, such as codes of conduct, guidelines and crisis protocols.

Precisely because the DSA regulates framework conditions for the fundamental rights-sensitive area of online communication, the hardening of soft law must be critically examined. The following section takes a closer look at the Disinformation Code as an example.

EU Code of Practice against Disinformation

The current Disinformation Code was published on 16 June 2022 and has so far been signed by 34 parties, including large social media companies such as Meta and TikTok, as well as fact-checking organizations. Because the previous version of the Code from 2018, which was designed as pure self-regulation, was widely criticized as ineffective (see, e.g., Teeling/Kirk, 2020), a new course was taken with the 2022 Code.

In addition to changes in content,[1] which are largely bracketed out in this blog post, the Code has been transformed from self-regulation into co-regulation (see preamble lit. h Disinformation Code, recital 104 DSA). This co-regulatory approach is particularly evident in the close interaction between the Disinformation Code and complementary provisions of the DSA.

For example, the Commission and the European Board for Digital Services (hereinafter “Board”) are tasked with promoting the development of “voluntary codes of conduct” (Article 45 para. 1 sentence 1 DSA). In doing so, they are to ensure that the objectives of the codes are clearly set out and that key performance indicators are included to measure their achievement (Article 45 para. 3 sentence 1 DSA). The DSA itself identifies areas for which codes of conduct are envisaged, including “disinformation” (recital 104).

Forced to be voluntary?

In principle, the Disinformation Code is non-binding soft law. This understanding is reflected at least in part in the wording of the DSA: According to Article 45 para. 1 sentence 1, codes of conduct are described as “voluntary”. In addition, the “voluntary nature” of such codes and the freedom of choice regarding participation are expressly emphasized in recital 103 sentence 4 DSA.

For “very large online platforms” and “very large online search engines” (VLOPs/VLOSEs, Art. 33 DSA), however, the picture looks different, as the voluntary nature of participation is questionable. It can start with a seemingly innocuous “request”: the Commission can request a VLOP to participate in the development of a code of conduct in accordance with Article 45 para. 2 sentence 1 DSA, for example where significant and far-reaching systemic risks within the meaning of Art. 34 para. 1 DSA emerge. In addition, a VLOP that has already signed up to a code of conduct can be requested to take the “necessary measures” in the event of systematic non-compliance (Article 45 para. 4 sentence 3 DSA). Although the wording here remains vague, it can be assumed from the context that the necessary measures are aimed at achieving compliance with the code of conduct.

Although such “requests” appear harmless in themselves, this impression changes when looking at the overall system of the DSA. The contradiction with the proclaimed “voluntary nature” becomes clear in the provisions on systemic risk mitigation. According to Article 35 para. 1 sentence 2 lit. h DSA, VLOPs are obliged to take risk mitigation measures, which may also include participation in codes of conduct. It is not clear from the wording of the regulation whether or not VLOPs are free to choose the risk mitigation measures. However, recital 104 sentence 6 DSA[2] clarifies that a refusal to participate in a code of conduct without a “reasonable explanation” can be taken into account when determining whether the VLOP has breached the obligations of the DSA itself (Griffin/Vander Maelen, p. 6). Such a “comply or explain” mechanism is a typical feature of the hardening of soft law (see above).

Various supervisory provisions also contribute to this hardening. In general, the Commission and the Board are authorized to regularly monitor and assess the achievement of the objectives of the codes of conduct (Article 45 para. 4 sentence 1 DSA). In addition, compliance with commitments undertaken under codes of conduct is subject to independent audit pursuant to Art. 37 para. 1 lit. b DSA, with the audit results to be reported to the Commission (Art. 42 para. 4 lit. c and d DSA). Furthermore, a VLOP can be requested to submit an action plan as part of the so-called “enhanced supervision” pursuant to Article 75 para. 2 sentence 3 DSA, whereby participation in a code of conduct is one of only a few options explicitly mentioned in the provision. In addition, the Commission takes commitments to comply with codes of conduct into account when assessing the action plan and monitors its implementation (Article 75 para. 3 sentences 2 and 3 DSA).

However, the factor that contributes most to hardening is the possibility of coercive measures and sanctions, which can be imposed for systemic violations. For example, fines can be imposed as part of a non-compliance decision under Article 73 para. 1 and Article 74 para. 1 DSA if the VLOP does not comply with the “relevant provisions” of the DSA or with binding commitments. Periodic penalty payments are also possible under Article 76 DSA. If the action plan submitted is not sufficient, the Commission may even, as a last resort, initiate a procedure to block the service (Article 75 para. 4, Article 82 para. 1, Article 51 para. 3, recital 145 DSA). Thus, an “invitation” to participate in a code of conduct, combined with the threat of sanctions or coercive measures, can indirectly amount to coercion to join and comply with a code of conduct. There is a risk that a VLOP may appear to join a code of conduct voluntarily following a request by the Commission, but ultimately does so only in order to avoid sanctions, i.e. is “forced to do so voluntarily”.

Final remarks

It is true that the DSA represents a milestone in EU platform regulation due to its many valuable provisions, which are, at the very least, meaningfully supplemented by the Disinformation Code. However, structural problems are recognizable. The possibility under the DSA of requiring VLOPs to participate in and comply with codes of conduct, and of imposing sanctions, can undermine the voluntary nature of such codes and represents a questionable hardening of soft law. We must be wary lest private companies take on the role of legislators and impose quasi-legal obligations on other companies, which would be fundamentally incompatible with democracy and the rule of law. To complicate matters further, VLOPs in particular, as providers of structurally significant online communication spaces, have considerable influence on the process of opinion formation. Since a free process of opinion formation is the basis of democracy and of fundamental communication rights, submission to obligations designed and decided upon by other private players with different business models must be viewed far more critically than co-regulation in communication-neutral sectors.

An important question that needs to be answered is whether there are equal opportunities for co-determination in the Disinformation Code. In principle, all signatories regularly have the opportunity to at least suggest changes to the Disinformation Code as part of a permanent task force (Commitment 37 Disinformation Code). As the rules governing the working methods of the task force are only defined by the task force itself, it is not yet possible to conclusively assess whether equal opportunities for participation actually exist and whether the Code will thus gain legitimacy.

Outlook

The Commission has opened the first proceedings under Article 66 DSA against the providers of the VLOPs TikTok, Meta and X, inter alia for failing to combat disinformation. While TikTok is a signatory to the Disinformation Code, X CEO Elon Musk announced the service’s withdrawal from the Code in May 2023. It is conceivable that, as part of these proceedings, X will be asked to rejoin the Disinformation Code and TikTok will be required to comply with it. This could set precedents that herald a new phase in the enforcement of communication-related co-regulation.

References
[1] Changes to the content include commitments on the demonetization of disinformation, the handling of political advertising and the transparency of AI-generated content. Financial commitments are also included (Commitments 27.2, 30.2, 31.3 and 38), such as the financing of a fact-checking archive.
[2] Recitals are not part of the normative part of a regulation. Nevertheless, it can be assumed that the supervisory authorities will be guided by them.

SUGGESTED CITATION  Panahi, Tahireh; de Bittencourt Siqueira, Andressa: Soft law, hardcore?: The Legal Enforcement of Codes of Conduct under the Digital Services Act, VerfBlog, 2024/6/03, https://verfassungsblog.de/soft-law-dsa-co-regulation/, DOI: 10.59704/73aaa5ac9d80b65b.
