22 February 2024

The DSA’s Trusted Flaggers

Revolution, Evolution, or mere Gattopardismo?

One of the most-publicized innovations brought about by the Digital Services Act (DSA or Regulation) is the ‘institutionalization’ of a regime that had already emerged and consolidated over the preceding decade through voluntary programs introduced by the major online platforms: trusted flaggers. This blogpost provides an overview of the relevant provisions, procedures, and actors. It argues that, ultimately, the DSA’s much-hailed trusted flagger regime is unlikely to have groundbreaking effects on content moderation in Europe.

The DSA’s trusted flaggers

The (unsurprising) rationale of the system found in Article 22 DSA is encapsulated in recital 61: by prioritizing the handling of notices submitted by trusted flaggers, “[a]ction against illegal content can be taken more quickly and reliably”. Trusted flagger status shall be awarded by the Digital Services Coordinator (DSC) of the Member State where the applicant is established. Once awarded, such status shall be recognized by all online platforms falling within the scope of the DSA.

During the negotiations leading up to the adoption of the Regulation, a key issue became the eligibility criteria for trusted flaggers. Indeed, the European Commission’s original proposal was that only entities (not individuals) representing “collective interests” could – among other requirements – aspire to receive such a recognition. If such a proposal had made its way into the eventual text of the DSA, this would have meant, for example, that corporate entities representing only private interests would not have been in a position to access the DSA trusted flagger regime.

The final text of the DSA (thankfully) does not contain such a requirement and instead indicates ‘private bodies’ as also potentially eligible for a trusted flagger designation. Overall, Article 22(2) provides that an entity (thus, like the Commission’s proposal, also excluding individuals) aspiring to receive such a status shall: (a) have particular expertise and competence for the purposes of detecting, identifying and notifying illegal content; (b) be independent from any provider of online platforms; and (c) carry out its activities for the purposes of submitting notices diligently, accurately and objectively.

Recital 61 itself provides examples of entities that will be eligible to become trusted flaggers under the DSA. Reference is made to internet referral units of national law enforcement authorities or of Europol, organizations part of the INHOPE network of hotlines for reporting child sexual abuse material, and organizations committed to notifying illegal racist and xenophobic expressions online.

The list is merely illustrative. Hence, with reference to, e.g., the creative industries, their trade bodies and industry associations are also obvious candidates for trusted flagger status under the DSA, given that (i) one of their key tasks is the online enforcement of their members’ rights through specialized and experienced teams, and (ii) for that very reason they already act as trusted flaggers under private agreements with platforms, from which they are clearly independent.

Does all this suggest, however, that the trusted flagger ‘floodgates’ are now open to many, if not all? The answer appears to be in the negative, as otherwise the very rationale for having a fast-track notice handling procedure would be lost. Indeed, the DSA specifies that “the overall number of trusted flaggers awarded in accordance with this Regulation should be limited” in order “[t]o avoid diminishing the added value of such mechanism”.

All this means that, while trade bodies and industry associations are encouraged to submit applications to the competent DSC, the DSA shall not affect the ability of private entities and individuals to conclude agreements with online platforms outside of the DSA trusted flagger framework. To be blunt, this sounds like a ‘nothing new under the sun’ result as such agreements have been in place for a long time already. If one thinks for example of copyright, YouTube inaugurated its trusted flagger program as early as 2012.

Nevertheless, the institutional framework that the DSA has created has the potential to be still meaningful, at least for two reasons. The first is that it will likely prompt a standardization of practices and approaches. This consideration is further reinforced by the (very welcome and much needed) harmonization of notice-and-action brought about by Article 16 DSA. The second reason is that it will serve to complement – in a lex generalis to lex specialis fashion – the regimes contained in subject-matter specific legislation. One such example is Article 17 of Directive 2019/790 (DSM Directive).

Trusted flaggers and Article 17 of the DSM Directive

Article 17 of the DSM Directive proceeds from the consideration that, by storing and making available user-uploaded content, online content-sharing service providers (OCSSPs) directly perform acts of communication and making available to the public. The operators of such platforms are therefore required to secure relevant authorizations from the rightholders concerned in order to undertake such activities. Nevertheless, it might be the case that, despite the “best efforts” made by OCSSPs in accordance with Article 17(4)(a), no such authorization is ultimately secured, given that rightholders are not required to grant it. In such a case, OCSSPs can still escape liability by complying with the cumulative requirements under Article 17(4)(b)-(c).

In Poland, C-401/19, the Grand Chamber of the Court of Justice of the European Union (CJEU) considered that the liability mechanism referred to in Article 17(4) “is not only appropriate but also appears necessary to meet the need to protect intellectual property rights.” In this regard, two notable points may be extrapolated.

The first is that the use of automated content recognition technologies appears unavoidable under Article 17(4)(b)-(c): content moderation at scale cannot be performed manually. Nevertheless, the CJEU has only allowed such technologies insofar as they are capable of distinguishing adequately between lawful and unlawful uploads. In this regard the DSA will once again play a key role: the transparency obligations set forth therein will indeed serve to determine whether the technologies employed by platforms that qualify as OCSSPs satisfy the CJEU’s mandate.

The second point reflects the scale of OCSSPs’ content moderation obligations: obviously, someone must be sending all those notices! In this regard, it is apparent that, at least in certain sectors (think of music, for example), ‘trusted rightholders’ will continue playing a very substantial role within the architecture of Article 17. In turn, platforms will need to prioritize their notices in order to comply with the obligations set forth in Article 17(4)(b)-(c).

The latter point is further confirmed if one considers the six key safeguards identified by the CJEU in Poland, notably the third one: OCSSPs shall be led to make content unavailable under Article 17(4)(b)-(c) on condition that rightholders provide them with the relevant and necessary information. Clearly, entities that qualify as trusted flaggers in the creative industries will play a most significant role, whether through the DSA-sanctioned model or through existing or new private agreements with OCSSPs. In this sense, it will be intriguing to see whether competition arises between private trusted flagger programs and DSC-run ones, in the sense that the former might prove more attractive to rightholders than the latter (also because of fewer and/or less stringent obligations than those under Article 22 DSA). In any event, it appears that the notices that rightholders will submit shall comply with the requirements set forth in the DSA.

So what?

In light of everything that precedes, is the DSA’s much-publicized trusted flagger regime to be regarded as a ground-breaking innovation? For the time being, that does not seem to be the case. All this might evoke – at least in the minds of the most cynical readers, perhaps even including myself – that statement from Giuseppe Tomasi di Lampedusa’s Il Gattopardo, which famously reads: “Se vogliamo che tutto rimanga com’è, bisogna che tutto cambi” (“If we want things to stay as they are, things will have to change.”)

Nevertheless, and at the very least, the institutional and harmonized shape conferred on trusted flaggers has the potential to smooth out divergences that have emerged in practice and to meaningfully complement the legal regimes provided for in subject-matter specific legislation, including but obviously not limited to the field of copyright.

For this (positive) development to happen and thus avoid an insidious form of gattopardismo, however, it will first be necessary to see how the appointed DSCs will handle their role, who will be awarded trusted flagger status, and how the procedure will work in practice, including having regard to trusted flaggers’ own obligations under Article 22. In any event, it appears safe to conclude that the ‘institutionalized’ trusted flagger regime of the DSA shall not replace but, rather, complement (or even compete with!) the voluntary trusted flagger programs already in place.


SUGGESTED CITATION  Rosati, Eleonora: The DSA’s Trusted Flaggers: Revolution, Evolution, or mere Gattopardismo?, VerfBlog, 2024/2/22, https://verfassungsblog.de/the-dsas-trusted-flaggers/, DOI: 10.59704/37e9e89d5812f78a.

2 Comments

  1. Daniel Holznagel Sat 24 Feb 2024 at 18:24 - Reply

    Thank you for the great blog post. I also think Art 22 is overrated. One more reason for this, I think, is that non-compliance (failing to treat trusted flagger notices with a little bit more priority) will be difficult to measure from the outside, but also otherwise imo …

  2. Daniel Holznagel Sat 24 Feb 2024 at 18:31 - Reply

    And another thought on why I think it is an unnecessary rule: it is kind of contradictory that a trusted flagger can need a trusted flagger rule – if an entity wants to be or is a trusted flagger, it should accomplish priority handling by its very nature (competence, or the litigation firepower to pressure for a platform reaction), as is the case, as you also write … Art 22 seems one of the not so few over-bureaucratizations of the DSA
