Spillovers and Unexpected Interactions
Reading the La Quadrature du Net II decision in context
For more than a decade now, the Court of Justice of the European Union (“CJEU”) has struggled with the legality of various bulk surveillance mandates imposed under European and national law. Since its 2014 judgment in Digital Rights Ireland, the CJEU has been unequivocal about the need for non-trivial legal constraints on data collection (in that case, under Articles 7, 8 and 11 of the Charter of Fundamental Rights). In Tele2 Sverige (2016) and Privacy International (2020), and in subsequent rulings, the Court crafted a reticulated, multi-tiered framework matching different objectives with different regimes for bulk data collection and retention.
Its April 2024 judgment in La Quadrature du Net II (C-470/21) extends this proportionality-oriented framework to the retention and sharing of IP addresses with Hadopi, a French public authority that “protect[s] works and subject matter covered by copyright or related rights against infringement” (Para. 52). Hadopi used that data to identify the transmission of unlicensed material. The Court declined to find a Charter violation in Hadopi’s authorized access to, or use of, internet protocol (“IP”) addresses provided, inter alia, that such data was strictly partitioned from other bodies of data (including information about the work downloaded) that could be used to reveal sensitive personal information (e.g., “sexual orientation, political opinions, religious, philosophical, societal, or other beliefs,” the so-called “special categories of personal data” of Article 9 of the EU General Data Protection Regulation (Para. 110)).
To our eyes, the La Quadrature du Net II decision does not mark a sea-change in the CJEU’s approach. The Court again applied the general principle that the intensity of any limitation of personal rights and freedoms must mirror the seriousness of the interest put at risk; the identification and categorization of what is “intense” and what is “serious” now falls to the courts. Both the direct and the indirect impacts of this decision on privacy, we argue here, arise from interactions with other bodies of law and commercial practice related to data.
Assessing the marginal impact of mandatory disclosure
The effect of a novel mandatory retention and official access regime depends on the other ways in which covered entities already come into contact, and share data, with officials. Where those firms are already sharing user data (including perhaps IP addresses) with officials, whether voluntarily or pursuant to a legal mandate, the effect of such retention and sharing mandates will be diminished.
The French law at issue in La Quadrature du Net II was the Intellectual Property Code (or “CPI”). It applied to “[e]lectronic communications providers … and service providers,” which covers firms that provide access to the internet for individual private consumers. The CJEU did not ask whether such entities are subject to any other legal regimes that might lead to the sharing of IP addresses with agents of a European state. The gap is puzzling. A proportionality analysis should logically take account of the way in which extant law already gives a (potentially ill-intentioned) state actor a path to access such data.
In this regard, consider the effect of the Digital Services Act (“DSA”), which became fully applicable on February 17, 2024 (i.e., two months before La Quadrature du Net II), upon the electronic communications providers covered by the CPI. The DSA’s best-known provisions concern its implications for very large online platforms. But these are not the only entities reached by that extensive and intricate legal measure. Chapters II and III of the DSA impose new rules on many entities that likely also qualify as electronic communications providers. For instance, Article 18 requires hosting services to “promptly inform” the state of certain suspected criminal activity. The DSA also requires certain intermediaries to trace sellers on online marketplaces as a means to protect purchasers.
While the exact scope and implementation of the DSA are a work in progress, it is hardly far-fetched to posit that the DSA’s obligations will fall on some of the firms covered by the CPI, and that firms subject to the DSA will find themselves in close and frequent contact with regulators. Article 18, for instance, envisages information sharing on an ongoing basis. To verify compliance with the DSA, regulators will also need to peer inside communications systems. Whatever the formal terms of the law, it would be very surprising if, in practice, this did not lead to some leakage of data from firms to officials, and to private-public relationships that could serve as effective springboards for informal cooperation.
If officials (especially ill-intentioned ones) already have a way of accessing IP addresses and other data, are the CJEU’s new constraints doing any work at all? The effect of the DSA on privacy is not necessarily a negative one, so there is no easy answer to this question. After all, officials’ familiarity with how electronic communications providers structure and preserve their data may create a new or additional interest in finding ways to get data lawfully. That is, it may stimulate the very problem to which the CJEU responded. But the interaction does underscore the oddity of evaluating risks to privacy in a vacuum.
There is a second way in which existing electronic communications practices interact with the privacy risks of the CPI. When packets of data are moved across the physical infrastructure of the internet, they are generally labeled with both the source and the recipient IP address. According to Vadim Nikitin, some 70 percent of this traffic flows through physical switches and data centers in the United States. And, as Henry Farrell and Abraham Newman document, the U.S. has long taken advantage of its unique access to the physical infrastructure of the internet to access data without the permission of other sovereigns. To our knowledge, such access is not constrained by the rules promulgated by the European Data Protection Board. In practical effect, the security of IP addresses, which was the specific kind of data at issue in La Quadrature du Net II, turns on the nature of the relationship between a given European country and the U.S. national security apparatus. While it might seem that mere access to source and recipient IP addresses does not reveal a person’s civil identity, we suspect that the application of data-intensive AI analytics will often (perhaps almost always) allow accurate inferences of civil identity.
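The technical point is easy to verify: the source and destination addresses sit in cleartext at fixed offsets of every IPv4 header, so anyone with access to the physical infrastructure can read them without touching the (possibly encrypted) payload. The following minimal Python sketch is offered purely as an illustration, with addresses drawn from documentation-only ranges rather than any real traffic:

```python
import socket
import struct

def extract_ip_addresses(packet: bytes) -> tuple:
    """Return the (source, destination) IPv4 addresses of a raw packet.

    The IPv4 header stores both addresses in cleartext at fixed
    offsets (bytes 12-15 and 16-19), regardless of whether the
    payload itself is encrypted.
    """
    # Unpack the fixed 20-byte IPv4 header; the last two '4s' fields
    # are the source and destination addresses.
    fields = struct.unpack("!BBHHHBBH4s4s", packet[:20])
    return socket.inet_ntoa(fields[8]), socket.inet_ntoa(fields[9])

# Purely illustrative header using documentation-only addresses
# (RFC 5737): a packet "from" 192.0.2.1 "to" 198.51.100.7.
sample_header = struct.pack(
    "!BBHHHBBH4s4s",
    0x45, 0, 40, 0, 0, 64, 6, 0,
    socket.inet_aton("192.0.2.1"),
    socket.inet_aton("198.51.100.7"),
)
print(extract_ip_addresses(sample_header))  # ('192.0.2.1', '198.51.100.7')
```

Encryption protocols such as TLS protect the content of a communication, but not these routing labels, which is why this kind of metadata is so readily available to whoever controls the switches.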
Again, the reason to highlight this is not to undercut the CJEU’s legal conclusions, but to point out how they might be enriched and complicated through contextualization. It also underscores the threshold need for the CJEU to produce decisions that are more specific and that provide clearer guidance.
The Hidden Regulatory Ambition of the CJEU
Our observations so far have raised questions about the efficacy of the La Quadrature du Net II decision as a protection of privacy. In another respect, however, the decision has an unanticipated, even hidden force: it engenders a right against fully automated decisions far beyond the scope of the extant European law concerning that right.
Explaining why requires some backtracking: one of the questions discussed by the CJEU was whether the CPI’s retention mandates triggered a demand for “prior review” by a court or an independent administrative body (Para. 123). The Court’s ruling on this point is complex. Not all applications of the CPI, it explained, involved serious violations of fundamental rights. Where they did, however, the CJEU held that prior review was required. In response to this threatened ruling, the French authorities had suggested that such review could be “entirely automated” because of the sheer volume of such instances (Para. 147). The CJEU baulked at this suggestion. It directed instead that “in no case” could prior review be “entirely automated,” since this would make it impossible to strike a “fair balance” in an individual case (Para. 148). The upshot is that the data subject has a right against a fully automated decision by the French government and a parallel right to a fair assessment by a human mind, able to contextualize and understand the full picture as a prerequisite to a balanced decision.
This ruling is striking because European law elsewhere considers the scope of such a right to a human decision (as it might be paraphrased), and does not extend it to these circumstances. Article 22 of the General Data Protection Regulation creates an individual “right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her.” It explicitly limits that right, though, in several ways. One limit concerns instances in which a fully automated decision is “authorised by Union or Member State law to which the controller is subject and which also lays down suitable measures to safeguard the data subject’s rights and freedoms and legitimate interests.” That could include the CPI. If it does, the CJEU seems to have extended the Article 22 right to a human decision considerably further than its legislative specification. As such, it may be innovating beyond the available legislated materials in unexpected ways.
One of us has argued elsewhere that the Article 22 right to a human decision does not have sturdy normative foundations. But that is beside the point: the important observation is that in crafting new rules for privacy protection in the bulk retention and surveillance contexts, the CJEU may be engaging in legal innovations that run far beyond what written law imagines. Perhaps this is desirable, perhaps not. But the spillover effects of its La Quadrature du Net II decision are more complex than commonly appreciated, and require some contextual analysis in order to be excavated. Paradoxically, what seemed an innovation (or a step back) to those who looked only at the decision itself can be analyzed as a specification of existing principles. What is genuinely innovative are the subtle implications that emerge from putting the decision in perspective, that is, from its interaction with other principles and bodies of law.
Another question is worth asking: if we consider the previously mentioned “intensity” of the limitation of personal rights and freedoms, how “intense” is sharing an IP address? More pointedly, how much does an IP address say about an individual? In this debate, it seems crucial to recall that not all personal data are equal. Some data, in fact, say little about the person and do not allow immediate identification by the general public. We hence think it desirable that a clearer analysis be conducted of what, practically speaking, can be considered “telling” personal data, the disclosure of which would seriously damage one’s freedom and reputation. There certainly is variability in the significance of personal data, and it alters the practical effects of disclosure and the possibility of identifying (and potentially damaging) an individual. The GDPR seems to hint at this graduated approach by referring to indirect identification (Article 4) and sensitive data (or special categories of personal data, Article 9). Furthermore, the safeguards surrounding interfering measures (e.g., confidentiality obligations imposed on public agents) have to be weighed when assessing how much an individual is actually affected.
Conclusion
We understand decisions such as La Quadrature du Net II best by locating them in their legal and socioeconomic context: considering how data protection rules exist and are applied in very practical settings, and how they ought to operate to protect individual rights without sacrificing general interests. We have tried to show how this might be done, and how it can yield analytic payoffs and a better understanding of implications that, without appearing as immediate consequences, are powerful in their effects. We hope that these methods can be used elsewhere in respect of other important questions of European data privacy law. What is much needed in our time is constant contextualization, an ability to put things in perspective and in communication with one another, without ever adhering to data protection orthodoxies that could, in the end, damage far more seriously the very individuals whose privacy we want to protect.