From the DMCA to the DSA
A Transatlantic Dialogue on Online Platform Regulation and Copyright
For much of the early 21st century, EU law on online intermediaries was sparse, with no comprehensive harmonization of intermediary liability. The centerpiece of the legal framework was the 2000 e-Commerce Directive, which contained conditional liability exemptions, or “safe harbors,” shielding certain types of intermediary services (mere conduit or access, caching, and hosting) from claims for damages, as well as a prohibition on Member States imposing general monitoring obligations on intermediary service providers (arts. 12-15 e-Commerce Directive). Under this regime, intermediaries may still be required to take measures against the infringement of third-party rights, since they remain subject to injunctions, notably as regards intellectual property rights, and to duties of care (van Hoboken et al. 2018, Wilman 2021). The interpretation of this constellation of provisions is complex and far from settled (see e.g. Angelopoulos 2020). It suffices here to note that the case law of the Court of Justice of the EU (CJEU), which interpreted specific subject-matter rules – such as those on intellectual property – so as to extend the reach of harmonized EU law to online intermediaries, fueled an increasing push for additional regulation of online platforms.
This push has been justified by reference to a somewhat blurry concept of legal, societal, political and even moral “responsibility” of online platforms (Helberger, Pierson and Poell 2018; Taddeo and Floridi 2017). The potential result could “represent a substantial shift in intermediary liability theory,” signaling a “move away from a well-established utilitarian approach toward a moral approach by rejecting negligence based intermediary liability arrangements”, practically leading to a “broader move towards private enforcement online” (Frosio and Husovec 2020).
In Europe, this state of affairs has led to a deluge of new “platform regulation” legislation in recent years, featuring the adoption of rules on terrorist content online, video-sharing platforms, copyright content-sharing platforms and – in what is the centerpiece of this push – horizontal rules for all online intermediaries in the Digital Services Act (DSA) (Buiten 2021, Farrand 2019). The DSA – which became fully applicable on 17 February 2024 – takes a novel regulatory approach to intermediaries by imposing not only liability rules for the (user) content they host and moderate, but also separate due diligence obligations concerning the provider’s own role and conduct in the design and functioning of its services (Husovec and Laguna 2022, Wilman 2022; van Hoboken et al. 2023). The main targets of these obligations are Big Tech companies, namely very large online platforms and search engines, which are subject to the largest set of obligations, including on due process and on risk assessment and mitigation. These obligations extend to algorithmic moderation systems and to the effect of such services on users’ fundamental rights. This legislative push has also featured non-binding instruments like Codes of Conduct, Memoranda of Understanding and Recommendations on hate speech online, counterfeit goods, disinformation, and piracy of live events (Quintais, Appelman and Ó Fathaigh 2023).
The US platform regulation story is different. It is undeniable that most of the largest and most successful internet intermediaries – at least in the Western world – originate from the US. Authors like Kosseff causally link this fact to the US legal landscape (Kosseff 2022), in particular to Section 230 of the Communications Decency Act (CDA) – passed in 1996 – which immunizes online platforms from liability arising from significant amounts of user-generated content.
Importantly, Section 230 contains a number of exceptions, such as for the enforcement of federal criminal law, copyright law, and electronic communications privacy laws. The copyright exception is addressed by Section 512 of the 1998 US Digital Millennium Copyright Act (DMCA). Section 512 sets out a notice-and-takedown system that caching, hosting and linking platforms must comply with in order to qualify for the safe harbors. This regime directly influenced the design of the intermediaries’ “safe harbors” in the e-Commerce Directive and, as Sag notes, has also shaped online copyright enforcement, leading to the implementation of “DMCA-plus” private agreements between rightsholders and large commercial platforms “in the shadow of those safe harbors” (Sag 2018). These have ultimately resulted in automated copyright content moderation systems, including sophisticated filtering tools like YouTube’s Content ID or Meta’s Rights Manager.
A similar sector-specific path was followed in Europe, based on the combined application and CJEU interpretation of the direct liability rules for communication to the public of copyrighted works, the “safe harbors” in the e-Commerce Directive, and (non-harmonized) rules on secondary liability under national law. The latest development in this legislative story has seen the EU adopt a highly complex special regime for “online content-sharing service providers” (OCSSPs) in Article 17 of the Copyright in the Digital Single Market Directive (CDSMD). This provision applies to OCSSPs that host and provide public access to copyrighted content. The regime is unique in that it imposes direct liability on OCSSPs, sets aside the application of the hosting safe harbor, and establishes its own special liability exemption mechanism, featuring best-efforts obligations to obtain licenses and to implement measures for notice and takedown, notice and stay-down, and preventive filtering (see Quintais et al. 2022, 2024; and COM/2021/288 final).
In the US, too, there is significant pressure to reform these legal regimes. For the moment, efforts to implement a solution similar to Article 17 CDSMD have largely failed (Samuelson 2021), in part due to skepticism surrounding its adoption (e.g. Bridy 2019) and its roll-out in Europe, which has already included a challenge to its validity on fundamental rights grounds (Case C-401/19 – Poland v Parliament and Council; Quintais 2022, Husovec 2023). Section 230 CDA, on the other hand, has faced much more persistent frontal attacks, including in ongoing US Supreme Court litigation (see e.g. Funk et al. 2023, Rozenshtein 2023) and in calls for reform that enjoy bipartisan support, even if on different grounds (see e.g. Anand et al. 2021, Jurecic 2022, Perault 2023).
Against this background, a group of European and American scholars convened in 2023 to discuss the potential benefits and risks of the EU’s new approach in its transatlantic context. They debated the DSA’s potential to lead to a new EU/US consensus, or even to EU influence on US platform regulation and liability debates (see Urban 2023). The first meeting, in the US, led to the publication of a special issue on the topic in the Berkeley Technology Law Journal. The second workshop, in Amsterdam, gave rise to this blog symposium.
The contributions to this symposium come from leading academics in the EU and US, often in collaboration with each other. They can be divided into two larger themes. A first set of contributions considers transversal issues of platform regulation in the EU and US, namely consistency (Rebecca Tushnet), due process (Eric Goldman and Sebastian Felix Schwemer), fundamental rights (Christophe Geiger and Giancarlo Frosio; Martin Senftleben) and the potential “Brussels Effect” of the DSA (Martin Husovec and Jennifer Urban). A second set of contributions zooms in on key regimes, critically assessing the rules on trusted flaggers (Eleonora Rosati), human in the loop (Rachel Griffin and Erik Stallman), access to data for researchers (Niva Elkin-Koren), and transparency (Pamela Samuelson and Natali Helberger).