A General Obligation to Monitor
How the ECJ Is Turning the Liability Regime of European Platform Regulation on Its Head
Notice-and-takedown – this has been the core principle of European platform regulation since the introduction of the eCommerce Directive in 2000. It is the principle of an internet in which every user becomes a “content creator”, without prior selection and evaluation of content. It marks the fundamental difference between newspapers, radio, and television on the one hand and the internet on the other, between one-to-many and many-to-many communication. Although this liability privilege has come under increasing pressure – legally through exceptions for copyrighted content and special monitoring obligations for violations of personality rights, politically through civil society initiatives such as Save Social, which call for the liability privilege to be reviewed – there has been no fundamental legislative decision to move away from this principle in the last 25 years.
Instead, the revolution now seems to be starting in Luxembourg. In Russmedia Digital, the ECJ ruled at the beginning of December that in cases concerning data protection violations – such as those caused by defamatory content – the notice-and-takedown procedure does not apply; instead, the respective platform is (jointly) liable for illegal content from the moment of its publication. Clearly unaware of the enormous implications of its decision for the freedom of expression and information of millions of users in the EU, the Court is thus demanding the establishment of a comprehensive monitoring system for communication in the digital public sphere.
Defamation on Online Marketplaces
The starting point for the ECJ’s ruling was a set of questions referred by a Romanian court. An unknown third person had placed an advertisement on an online marketplace (owned by Russmedia Digital, which gives the case its name) that falsely suggested that the applicant in the original proceedings offered sexual services. The advertisement included photos of her and her telephone number, and was also reproduced on various other websites. After the applicant informed the owner, Russmedia, about the defamatory advertisement, Russmedia deleted it in less than an hour, but it remained accessible on third-party websites. The applicant then claimed non-material damages from Russmedia, including for violations of her right of personal portrayal and her rights to honour and reputation, as well as for unlawful processing of personal data. After various lower courts initially disagreed on whether the applicant was entitled to damages, the Court of Appeal (Curtea de Apel Cluj) referred the question to the ECJ, essentially asking, first, whether Russmedia should be considered the data controller responsible for the processing of personal data – which includes the false information about the offer of sexual services, the applicant’s photo, and her telephone number – and, second, whether such liability for unlawful data processing is determined solely by the General Data Protection Regulation (GDPR) or whether the specific liability regime for online platforms applies (while the applicable law in this case was still the eCommerce Directive, the relevant provisions have since been transferred to the Digital Services Act (DSA)).
The issue of responsibility
Are operators of online platforms (or, in this specific context, online marketplaces) responsible for illegal content on their platforms, and if so, under which circumstances? The answer to this question is perhaps the most important legislative decision for the structure of the internet as we know it today. While the infamous Section 230 of the Communications Decency Act (“The Twenty-Six Words That Created the Internet”) protects platforms in the US from almost any kind of liability, Europe has taken a different path. Since the introduction of the eCommerce Directive 25 years ago, hosting providers (which include both online marketplaces and online platforms) have been protected from liability for illegal content only as long as they have no knowledge of it. As soon as they are notified, they must act – otherwise they become liable for the content (“notice-and-takedown”, Art. 14 eCommerce Directive, now Art. 6 DSA). Both the eCommerce Directive and the DSA also emphasize that providers cannot be subject to a “general obligation to monitor” the content published on their sites (Art. 15 eCommerce Directive, now Art. 8 DSA). The importance of this so-called “liability privilege” cannot be overstated. Because platforms are not responsible for third-party content at the moment of publication, we have an internet where, in principle, everyone can express their opinion and share creative content without prior review – with all the advantages and disadvantages that such broad and initially uncontrolled participation in public discourse entails.
In Russmedia, however, the liability issue is now approached from another side: instead of examining the (traditional) responsibility for content in accordance with the provisions of the eCommerce Directive and the DSA, data protection law serves as the starting (and end) point. The ECJ first (convincingly) establishes that the applicant’s information – i.e., her photo and telephone number, among other things – constitutes personal data within the meaning of the GDPR, regardless of its untrue nature (paras. 47-53). Since the false claims relate to the (alleged) sex life of the person concerned, they also constitute “sensitive data” within the meaning of Art. 9 GDPR, which is subject to special protection. The unlawfulness of the data processing, specifically the publication of (inaccurate) sensitive data about the data subject without her consent, is likewise unproblematic (paras. 80-84).
It is, however, substantially more difficult to determine who is the “controller” for this unlawful data processing in terms of data protection law. The key distinction to be made is between “controllers” (who decide on the “purposes and means” of data processing, i.e., process data in their own interest, Art. 4(7) GDPR) and “processors” (who process data on behalf of someone else, Art. 4(8) GDPR). The unknown user who created the advertisement clearly qualifies as a controller (para. 64). However, in the opinion of the ECJ, Russmedia (jointly with the unknown user) also qualifies as a controller. By granting itself extensive rights to user-created content in its terms and conditions (including use, distribution, reproduction, modification, removal, and transfer to third parties), the company “can exploit those data for its own advertising and commercial purposes” (paras. 67-68) and thus does not merely process the data on behalf of the user. Furthermore, the mere provision of an online marketplace as such leads the ECJ to the conclusion that Russmedia had an impact on the “means” of the publication, in particular by influencing parameters for its dissemination, such as the target audience, the presentation, and the duration of the advertisement (paras. 70-73). In doing so, the ECJ further develops its established jurisprudence on the joint responsibility of content creators and online platforms/marketplaces under data protection law (see in particular Wirtschaftsakademie Schleswig-Holstein).
GDPR trumps the eCommerce Directive
The Court therefore finds (convincingly, up to this point) that Russmedia is responsible for the unlawful processing of personal data created by the unknown user. While the ECJ’s further comments on identification obligations for online users (paras. 77-106) and on measures necessary to prevent the copying of personal data (paras. 107-126) require their own critical assessment elsewhere, the focus of this article is on the consequences of the responsibility for data protection violations (under the GDPR) for the general liability regime for online platforms (within the meaning of the eCommerce Directive, now the DSA). In this context, as the ECJ notes, “[t]he question therefore arises as to the relationship between those two instruments of EU law” (para. 128). In a mere ten paragraphs, the ECJ then revolutionizes the established understanding of liability for online content, apparently without being aware of the implications of its assessment.
The ECJ starts by reiterating that the GDPR (Art. 2(4)) and the eCommerce Directive (Art. 1(5)(b), now Art. 2(4)(g) DSA) mutually emphasize that they are “without prejudice” to the application of, or, respectively, “shall not apply to”, questions governed by the other legal act (paras. 129-133). In the present case, however, the outcomes under the GDPR (liability for unlawful data processing from the moment of publication) and under the eCommerce Directive (liability privilege as long as the hosting provider has not been notified) are clearly in conflict. The fact that the legislator stipulates that two legal acts do not apply to each other’s legal questions, even though they do in fact heavily interfere with each other, obviously puts the judiciary in a difficult situation. One would expect these legal texts to be examined intensively using the various methods of interpretation, reviewing general legal principles governing the relationship between lex specialis and lex generalis, and taking into account relevant primary law, in particular the Charter of Fundamental Rights.
The ECJ, however, took a different approach: it succinctly states that the GDPR being “without prejudice” to the liability rules of the eCommerce Directive merely means that an operator is “not automatically preclude[d] from being able to rely on [the liability rules] for matters other than those relating to the protection of personal data” (para. 134). The content of Art. 2(4) GDPR is therefore reduced to a declaratory repetition of the scope of application of data protection law. Only if the GDPR is not applicable is there room for other legal regimes. Thus, the liability privilege of the eCommerce Directive does not apply to GDPR violations, and Russmedia is liable for the unlawful advertisement published by its (unknown) user (paras. 135-136).
A comprehensive filter regime
This is a bombshell. It reduces the scope of the liability regime for online marketplaces and platforms to an absolute minimum. As soon as content – be it a comment, an image, or a video – contains personal data, the online service becomes liable for it from the moment of publication. Content containing (false) personal information, be it defamation, deepfakes, or other violations of personality rights, has been a major focus of the debate on platform responsibility in recent years (including fundamental ECJ rulings such as Glawischnig-Piesczek). In all of these cases, the liability regime of the GDPR now “overrules” the liability privilege. Consequently, instead of notice-and-takedown, it is publish-and-perish that applies to online platforms: if a platform allows its users to publish illegal content, it becomes liable.
How does this look in practice? In order to avoid the risk of becoming liable under the GDPR, platforms must install comprehensive AI-supported filter systems that identify (potentially) illegal content among the wide variety of statements expressed by users and prevent its publication. In this way, the obligation to implement “technical and organizational measures” (TOMs) to ensure the lawfulness of data processing under Articles 24 and 25 GDPR becomes a “general monitoring obligation” (for the questionable understanding of TOMs as an “identification obligation” for users, see paras. 85-106 of Russmedia). Under these circumstances, content which is first deemed legal but later assessed as illegal poses legal risks for services, creating significant incentives for online platforms to prevent the publication of legal content in cases of doubt (over-blocking).
It is precisely this danger of suppressing legitimate contributions that brought tens of thousands of people onto the streets in 2019, in the context of the EU’s copyright reform, to protest against “upload filters”. The provision in question restricted the liability privilege for “online content-sharing service providers”, such as YouTube, with regard to copyright-protected content, unless they had “made, in accordance with high industry standards of professional diligence, best efforts to ensure the unavailability of specific works” (Art. 17(4)(b) DSM Directive). The ECJ emphasized that this open, technology-neutral wording ultimately constitutes an obligation to use automated detection and filtering systems for “prior review” (Poland v. Parliament/Council, para. 54). If such a filtering system were to lead to the “blocking of lawful communications”, it would be incompatible with the freedom of expression and information enshrined in Art. 11 of the Charter of Fundamental Rights (para. 86). In this regard, the Court also refers to its case law in Glawischnig-Piesczek and reiterates that rights holders (in copyright law) or the persons concerned (in the case of violations of personality rights) must specify the relevant content in such a way that its identification does not “require an independent assessment” by the service provider (paras. 89-90).
Back in 2022, in response to Poland’s action for annulment, it took the ECJ 41 paragraphs, an extensive consideration of the comprehensive set of procedural safeguards laid out in the Copyright Directive, and an interpretation of ordinary law in accordance with fundamental rights to confirm the compatibility of an explicit legislative decision for “upload filters” with primary law. Now, in Russmedia, the ECJ does not test the compatibility of such a filter regime with fundamental rights at all. If European data protection (!) law is indeed to be interpreted as providing for comprehensive monitoring and filtering of communication – indirectly and without any indication that this was the intention of the legislator – the ECJ should have examined in detail whether these provisions are compatible with the freedom of expression and information of the users affected by the filtering systems. Instead, the ECJ simply states, without any further explanation or justification, that compliance with the obligations under the GDPR cannot be classified as a “general monitoring obligation” (para. 132). The ECJ has thus missed the opportunity to clarify the relevance of the liability privilege (introduced by ordinary law) for the exercise of fundamental rights (and to further develop the line of jurisprudence established in Poland v. Parliament/Council), while at the same time reducing legal uncertainty regarding the vague exceptions for “special monitoring obligations”. Instead, the unsettling impression remains that the ECJ was not even remotely aware of the significance of its decision.
A new line of case law is in its infancy
After reading this ruling, one is left somewhat puzzled as to the current state of liability of hosting services for third-party content in the EU. Although the decision directly concerns only an “online marketplace” and provisions of the eCommerce Directive that have since been replaced, the ECJ’s reasoning does not contain any restriction according to which these principles would not be equally applicable to online platforms and to liability under the DSA. One needs to keep in mind that these rules affect not only Big Tech services like X (formerly Twitter), Facebook, and TikTok, but also, in principle, every niche and common good-oriented online platform. From very large online platforms, such as Wikipedia, to smaller Mastodon instances, the impact is likely to be significant. Without any limitation of liability, the very existence of these services would be threatened.
There are two entry points for more restrictive interpretations of the judgement in the future: first, the requirements for joint responsibility under data protection law (paras. 66-67; to what extent does this line of argument apply, for example, to non-commercial services without algorithmic curation?); and second, the manifest unlawfulness of the content and its harmful nature in the specific case (see paras. 39-40 on the questions referred by the referring court). Furthermore, the ECJ has also established notice-and-takedown-like procedures in other data protection decisions (Google Spain; see also Bäcker in BeckOK Datenschutzrecht, 54th ed., Art. 2 GDPR, para. 35).
Germany’s Federal Court of Justice (BGH) now has the opportunity to contribute to the development of this new line of case law: for more than three and a half years, Renate Künast, a former German federal minister, and Facebook have been arguing before various courts about Facebook’s obligation to prevent the repeated publication of a misquote attributed to Künast on the platform. While the Regional Court and the Higher Regional Court of Frankfurt am Main had still regarded this as a question of the interpretation of the liability provisions of the eCommerce Directive, the BGH recognized the significance of data protection law and suspended the proceedings to await the decision in Russmedia. It remains to be seen whether the BGH will now be satisfied with the ECJ’s argumentation or whether it will offer the Court the opportunity to rectify its decision by referring further questions on its interpretation.