16 November 2020

Walking from Luxembourg to Brussels in two hours

The European Court of Justice will rule on the legality of upload filters

We have published a German version of this article here.

A little over a year ago, close to 200,000 people took to the streets to protest the European copyright reform. At the core of the controversy about Directive 2019/790 on Copyright in the Digital Single Market (DSM Directive) lies Article 17, which makes certain online platforms directly liable for copyright infringements committed by their users. In order to avoid liability, platforms will have to block access to works at the request of rightsholders, raising concerns that legal uses of protected content will be blocked in the process. Protests have died down since the adoption of the directive, as Member States grapple with the difficult task of transposing Article 17 into national law. It would be a mistake, however, to take this relative calm as an indication that the conflict has been resolved. As the implementation deadline for Member States draws closer, the conflict has been taken to court.

Article 17 before the Court

A public hearing before the European Court of Justice (ECJ) last Tuesday, November 10, dealt with the compatibility of Article 17, more precisely of those provisions of Article 17 that require platforms to block copyright infringements, with the Charter of Fundamental Rights (Case C-401/19). The Court heard the complaint by the Republic of Poland against the European Parliament and the Council (representing the European Union’s legislative branch), which seeks the annulment of Article 17 or parts thereof on the grounds that it violates the fundamental right to freedom of expression and information by requiring platforms to install upload filters. Reflecting on the results of the hearing, the German Society for Civil Rights (Gesellschaft für Freiheitsrechte e.V.) is publishing a study today, of which I am a co-author, demonstrating that Article 17 is incompatible with the Charter in several respects.

It may seem ironic to the casual observer that this fundamental rights case has been brought before the ECJ by the Polish government, which itself has a less than stellar fundamental rights record. However, it must not be forgotten that Poland was part of an informal coalition of six governments spanning the political spectrum that voted against the DSM Directive in 2019, citing its failure to balance the rights of copyright-holders with those of citizens and companies.

Alternatives to upload filters

Tuesday’s hearing was structured around four questions. First, the Court asked whether Article 17 will render the use of upload filters mandatory. This question is relevant to the fundamental rights assessment because the Court has found in the past that general obligations on platforms to monitor all user uploads for illegal activities violate the fundamental rights of platform operators and their users.

While some commentators have argued that mandatory upload filters which prevent the use of specific copyright-protected works do not constitute a general monitoring obligation, this interpretation of the ban on general monitoring is inconsistent with the ECJ’s case law. It is based on the false assumption that when the ECJ rejected injunctions requiring the installation of automated filtering systems in Scarlet and Netlog, it did so because those injunctions would have required providers to look for infringements of unknown works, without receiving specific information that would help them identify those works. This is not the case. The judgement in the main proceedings in Scarlet shows that the dispute concerned the installation of the filtering system Audible Magic, which only works on the basis of fingerprints submitted by copyright-holders. This is precisely the kind of technology that platforms are expected to use when implementing Article 17. In McFadden, the Court even ruled that an obligation to automatically detect a single, clearly identified song would constitute a prohibited general monitoring obligation.

In order to meet the requirements formulated by the Court in Glawischnig-Piesczek, a monitoring obligation must at a minimum be specific to an act that has been deemed illegal by a court. Copyright filters based on rightsholder information about protected works, rather than about specific infringing uses of those works, are incapable of meeting those requirements and of preventing the collateral blocking of legal uses.
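
To illustrate this point, here is a minimal, purely hypothetical sketch (not based on any real platform’s system; the excerpts, names and matching logic are invented for illustration) of the only decision a fingerprint-based filter can actually make: whether an upload contains a referenced work. Whether that use is lawful is information the filter never has.

```python
# Hypothetical sketch of a fingerprint-style upload filter. Real systems such as
# Audible Magic use perceptual audio/video fingerprints; a simple byte-substring
# match stands in here only to show the structure of the decision.

REFERENCE_EXCERPTS = {
    b"melody of Song A": "Song A",  # reference material supplied by a rightsholder
}

def filter_upload(upload: bytes) -> str:
    """Block any upload in which a referenced work is detected."""
    for excerpt, work in REFERENCE_EXCERPTS.items():
        if excerpt in upload:
            # The match only establishes *which* work appears in the upload.
            # Whether it appears as an infringement or as a lawful quotation,
            # parody or review is context the filter does not have.
            return f"blocked (matches {work})"
    return "allowed"

# A verbatim re-upload and a lawful quotation are indistinguishable to the filter:
print(filter_upload(b"full re-upload of the melody of Song A"))            # blocked
print(filter_upload(b"critical commentary quoting the melody of Song A"))  # blocked
```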

In the hearing, the supporters of Article 17 (the European institutions, backed by France and Spain) argued that upload filters are not mandatory. They pointed out that Article 17 does not mention particular technological solutions by which platforms can demonstrate that they have made best efforts to block copyright infringements, and that different technologies are available, which are likely to change over time and could be combined with human review.

Poland argued that while Article 17 does not explicitly prescribe the use of upload filters, no alternatives are available to platforms to fulfil their obligations under Article 17(4). In a similar vein, Advocate General Saugmandsgaard Øe asked Parliament and Council whether a person who is required to travel from Luxembourg to Brussels within two hours can really be considered to have a choice between driving and walking. Poland also correctly pointed out that the alternatives presented by the European institutions, such as fingerprinting, hashing, watermarking, Artificial Intelligence or keyword search, all constitute alternative methods of filtering, but not alternatives to filtering.

Balancing freedom of expression and intellectual property

The second and third questions concerned the risks that upload filters pose to freedom of expression, and whether those risks can be mitigated by interpreting Article 17 as requiring only the blocking of manifestly infringing content. These questions exposed the deep divisions among the supporters of Article 17. The European Commission, supported by Parliament and Council, defended its interpretation of Article 17, which it recently presented to Member States in the form of draft implementation guidance: While the obligation to block copyright infringements in Article 17(4) is merely an obligation to make best efforts, the obligation stemming from Article 17(7) to keep legal content online is an obligation of result. Therefore, the rights of users to upload legal content must prevail. Platforms should only be required to block access to manifestly infringing content. Uploads that could constitute a legal quotation, for example, must stay online until human review has been concluded.

France and Spain presented the opposite interpretation of Article 17. According to them, a national implementation in line with the Commission’s draft guidance would violate Article 17. The right to intellectual property should be prioritized over freedom of expression in cases of uncertainty over the legality of user uploads, because the economic damage to copyright-holders from leaving infringements online even for a short period of time would outweigh the damage to the freedom of expression of users whose legal uploads may be blocked.

Poland argued that, regardless of the legal interpretation of Article 17, platforms do not have the technical means to distinguish between legal and illegal uses of protected content, leading to overblocking in practice. The existence of a complaint and redress mechanism alone would not be sufficient to mitigate the damage to freedom of expression, because online communication relies on speed. Whereas copyright-holders could be compensated for the time during which an infringing work was available online, no equivalent compensation could be envisioned for restrictions on freedom of expression.

The balancing of competing fundamental rights is likely to be central to the Court’s judgement. Aside from the intellectual property of rightsholders and the freedom of expression and information of users, other fundamental rights need to be taken into account, most notably the platform operators’ freedom to conduct a business. Article 17 leaves few guarantees for the freedom to conduct a business: it introduces extremely broad obligations on platform operators while limiting those obligations only through a general reference to the principle of proportionality.

An analysis of the relevant case law on freedom of expression, including that of the European Court of Human Rights (most notably Yıldırım v Turkey, Kharitonov v Russia and Kablis v Russia), which the ECJ has to take into account, makes clear that the ECtHR has consistently rejected ex-ante restrictions on freedom of expression that risk the blocking of lawful content, treating them as a form of prior restraint because they block information that has not been deemed unlawful by a court. This finding holds true even for cases of hate speech, where the impact of leaving unlawful material online on the fundamental rights of victims is at least as severe as the impact on the intellectual property of copyright-holders. The ex-ante restrictions on freedom of expression established by Article 17 can therefore only be justified in exceptional circumstances and require precise and specific safeguards against overblocking.

Article 17 fails to meet those strict requirements, because the only specific safeguards against overblocking apply after legal content has already been blocked. While the European legislator has expressed the goal that legal content should not be affected by Article 17, it does not instruct Member States on how to achieve this goal. Although the case law of the ECJ on freedom of expression to date is far less detailed than that of the ECtHR, the hearing gave an indication that the ECJ is paying particular attention to the issue of safeguards in this case.

EU responsibility for fundamental rights safeguards

In its final question, the Court asked whether the EU legislator has met its obligation to establish the necessary minimum safeguards against the violation of fundamental rights in Article 17. In its recent Schrems II ruling, the Court reiterated that “the legal basis which permits the interference with [fundamental] rights must itself define the scope of the limitation on the exercise of the right concerned”.

While the proponents of Article 17 considered the safeguards included in it, such as the complaint and redress mechanism and the obligation to leave legitimate uses unaffected, to be sufficient, Poland argued that the EU legislator had deliberately passed these difficult questions on to the national legislators and ultimately the platforms, in an effort to sidestep politically sensitive issues. This is supported by the fact that the fundamental rights safeguards included in Article 17 lack enforcement mechanisms, and that those Member States that have presented implementation proposals so far – with the exception of Germany – appear intent on ignoring those safeguards altogether. Judging by the standard developed by the Court in Digital Rights Ireland and Schrems II, the European legislator has failed to meet its central responsibility for safeguarding fundamental rights.

The judgement will likely come too late

While the fate of Article 17 is far from certain, the clock is ticking for Member States, who are obliged to implement the DSM Directive by next June. Advocate General Saugmandsgaard Øe announced at the hearing that he will publish his opinion in the Polish case on 22 April 2021. The implementation deadline is therefore almost guaranteed to pass before the Court delivers its judgement. Member States now face the dilemma of either implementing a provision that could be invalidated by the Court on fundamental rights grounds shortly afterwards, or violating their obligation to implement the DSM Directive on time.