In his seminal book “The Morality of Law”, Lon L. Fuller explained that, among other things, the rule of law cannot be reconciled with the existence of secret laws, unclear laws and laws that cannot be obeyed. Yet these seemingly straightforward postulates may turn out to be surprisingly difficult to realize in practice. This is especially so in regulatory areas where full transparency is at odds with the legislative goals; where a certain degree of flexibility of rules is necessary to address the changing circumstances in which those rules operate; and where a disconnect arises between the visions of the lawmaker and the reality created by the modern technologies used to pursue them.
In the following, we reflect on the Passenger Name Record Directive (PNR Directive), as recently interpreted by the Court of Justice of the European Union (CJEU) in the Ligue des droits humains judgement, from the perspective of the aforementioned rule of law postulates. The CJEU was asked to answer a number of pertinent questions regarding the asymmetry of powers that the PNR Directive creates in relation to individuals whose personal data and privacy are affected by the processing of PNR data. In this contribution, we focus specifically on how the Court addressed the problem of the foreseeability of measures established under the PNR Directive. We argue that the Court shared the concern about the PNR system’s compliance with the rule of law, but failed to provide conclusive guidance on the minimum criteria to demand of the quality of the law that governs modern security measures.
The PNR Directive and Asymmetry of Powers
The PNR Directive provides for the collection and transfer of PNR data of passengers on extra-EU flights by air carriers to designated state institutions, namely the Passenger Information Units (PIUs), the processing of the PNR data by Member States, and the exchange of PNR data between Member States. By imposing on air carriers the obligation to collect and transfer information for the purposes of preventing, detecting, investigating and prosecuting terrorist offences and serious crime, the EU legislator made the air carriers an important link in the chain of the modern security architecture. This architecture is characterized by the ever-increasing role of private actors in contributing to the performance of law enforcement and security tasks by the state. The joining of forces of public and private actors creates a new social reality, in which an individual is no longer confronted only with the power of the state, but with a network of power created by both state and non-state actors. This implies that the protection traditionally afforded by rule of law standards, such as the foreseeability of coercive measures, needs to reach beyond the state-individual relation and take account of the third player in this picture: the private actors.
The Role of Private Actors and the Foreseeability Requirement
The foreseeability of fundamental rights interferences is relevant when private actors provide the public decision-maker with personal data, which is then automatically processed by public bodies. In the case of the PNR Directive, the national PIUs carry out automated risk analyses of the personal data provided by the air carriers – both by comparing it to ‘relevant databases’ and against pre-determined risk criteria – and any positive match is individually reviewed by non-automated means by the Member States’ PIUs (Article 6).
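To make the two-stage structure of Article 6 concrete, the following minimal Python sketch models it in the abstract. Everything here is a hypothetical illustration of our own: the watchlist, the risk criterion, and all function names are invented assumptions, not part of the Directive or of any real PIU system.

```python
# Hypothetical two-stage screening in the style of Article 6 PNR Directive:
# stage 1 is automated matching, stage 2 is non-automated individual review.
# All names, data structures and criteria are illustrative assumptions only.

WATCHLIST = {"P-1234"}  # stand-in for the 'relevant databases'

def risk_criterion(record):
    # Invented pre-determined risk criterion: a very short stay booked at
    # the last minute. Purely illustrative; real criteria are not public.
    return record["days_before_departure"] <= 1 and record["stay_days"] <= 2

def automated_screening(record):
    """Stage 1: automated comparison against databases and risk criteria."""
    hit_database = record["passport"] in WATCHLIST
    hit_criteria = risk_criterion(record)
    return hit_database or hit_criteria

def process(record, manual_review):
    """A positive match is never acted on automatically: it is passed on
    to individual review by non-automated means (the Article 6 safeguard)."""
    if automated_screening(record):
        return manual_review(record)  # a human decides on the hit
    return "no further action"

passenger = {"passport": "P-9999", "days_before_departure": 30, "stay_days": 14}
print(process(passenger, manual_review=lambda r: "reviewed by PIU officer"))
# prints "no further action": no database hit, criterion not met
```

The sketch also makes visible a point developed below: the real decisional weight lies in how `risk_criterion` is defined, since that definition, not the review stage, determines who is singled out in the first place.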
The requirement of foreseeability is a substantive part of the principle of legality, which, in essence, limits the exercise of public power and strives to ensure that powers are sufficiently defined for their exercise to be foreseeable to individuals and not arbitrary. In the judgement, the principle of legality is discussed under the assessment of Article 52(1) of the Charter, which concerns the legality requirements on fundamental rights limitations. According to this provision, legal rules must have a certain level of foreseeability for interferences with the rights to privacy and data protection to be “provided for by law”.
Foreseeability is a rule of law requirement that rests not on private actors, but on public actors in their exercise of power. However, the regulation and interpretation of how private actors may transfer personal data to public actors is central to this requirement, as it affects the foreseeability and legality of those public actors’ automated processing.
In general, any lack of clarity regarding the rules and safeguards applicable to processing by private actors and their collaboration with public bodies translates into a lack of clarity about what kind of data will feed the public automated processing and the subsequent decision-making. Therefore, if the outcome of semi-automated decision-making is unforeseeable due to the involvement of private actors, that becomes a rule of law problem.
Quality of Law Requirements
It is not sufficient for the Court that formal legal grounds for processing exist: tacitly following Fuller, the Court also applies a quality of law test concerned with clarity and foreseeability. This is expressed by the Court in the judgement in terms of requiring that an act permitting interferences with rights “must itself define the scope of the limitation on the exercise of the right concerned” and that the legislation must lay down “clear and precise rules governing the scope and application of the measures provided for” (paras 114, 117).
In the judgement, the Court finds the legality requirement satisfied, stating that the PNR Directive lists the PNR data concerned and provides a detailed framework for their processing. As the Directive was formulated, however, not all provisions met the Court’s requirement of clarity and precision. Examples include the provisions on what PNR data the air carriers are obliged to provide (paras 129 et seq.) and on what constitutes the ‘relevant databases’ against which PNR data may be compared (paras 182 et seq.). This led the Court to clarify those provisions itself, since the Court may specify legislation through interpretation.
The Court does not elaborate much on, or criticize, the clarity and precision of the legal basis for the collaboration with private air carriers. Instead, it mainly clarified when the General Data Protection Regulation or the Law Enforcement Directive applies (paras 77–84). The main problem, as discussed in previous research (see e.g. Purtova; Gottschalk; Brewczyńska), is that the different applicable legal regimes offer substantially different data protection standards. On a practical level, in the context of public-private collaborations, this duality of legal regimes may also lead to confusing situations in which different standards apply to the same sets of data, depending on the purpose of the processing and the type of entity carrying it out.
Another problematic aspect, looking generally at the implementation of regimes involving automated analysis of personal data, is that they rest on software that has rarely been developed solely by the public body using it. The design of software such as risk-screening algorithms, and the choices made in that process, are essential to the decision-making process. Hiring private actors for this task could be considered a form of de facto delegation, which raises questions of monitoring, foreseeability and accessibility (see Hofmann, p. 20).
Further Down the Road: The Foreseeability of Automated Processing
In the judgement, the Court considers the rules on automated processing sufficiently detailed. When the Court defines what is clear and precise, as it did here, this sets a standard for future automated processing in the European security architecture. One may question whether the rules on automated processing in the PNR Directive are clear and precise enough to be sufficiently foreseeable, considering the new challenges of opacity that come with the automation of public decision-making.
The judgement tolerates a considerable lack of foreseeability and wide discretion as regards the establishment of pre-determined risk criteria. This is central, as – together with the personal data – the risk criteria are the main components of automated risk assessments (Article 6(3)). What triggers a hit is a correlation between these two components, which is a very different foundation for decision-making than causality (see Bayamlıoğlu and Leenes). As the Court held, the extent of the interference of automated analysis of PNR data essentially depends on the pre-determined models and criteria (para 103).
The limits on the establishment of pre-determined risk criteria are framed in general terms in the judgement and the Directive: the criteria must be specific, non-discriminatory, targeted and proportionate (paras 105, 189; Article 6(4)). These generic limitations leave wide discretion to the actor that gets to define the risk indicators in non-accessible and often secret acts, precisely what Fuller sternly warned against. In the PNR regime, that actor is the national PIU. Parallels can also be drawn to the upcoming IT system ETIAS, which provides for similar automated risk screenings of visa-exempt third-country nationals, but under which the EU agency Frontex will ultimately establish the pre-determined risk criteria (Musco Eklund). While the requirement that risk criteria be non-discriminatory is not developed in depth here, it should be underlined that this requirement in itself represents a challenge for the actors defining risk criteria, as previous research has stressed the risk of discriminatory profiling (both direct, and indirect by proxy) that comes with using personal data for automated risk-profiling systems (Vavoula; Derave et al.).
Rule of Law Implications for the European Security Architecture
When balancing the individual’s rights to privacy and data protection against the public interest in security, one problematic aspect is what appears to be a rising logic of the automated European security architecture. The CJEU relies heavily on the proper functioning of subsequent manual processing to correct the inherent issues and margins of error that come with automated processing (paras 123–124). This overlooks the fact that while manual assessment is considered an additional safeguard against fully automated decision-making, it also constitutes a form of heightened surveillance of certain individuals. Not only the final manual decision of a PIU matters: the automated filtering constitutes an important de facto decision within an automated decision-making process (Binns and Veale), particularly as only those singled out by the automated processing are subject to further checks.
Full foreseeability is not reasonable to expect in a security context where complete transparency is at odds with the legislative goals and a certain degree of flexibility of rules is necessary. However, there seems to be a new logic, in this automated context, of what can justify a fundamental rights interference. It is not a person’s behavior, or any reasonable suspicion, that triggers the first interference, but simply being a passenger or, in the case of the ETIAS regime, being a third-country national from a visa-exempt country. ‘Reasonable suspicion’, in this risk-based approach, is replaced by other triggers for interferences: who you are, in the case of ETIAS, which bases risk indicators on data such as education level, age and occupation (Arts. 17(2), 20(5)), or how you travel, in the case of the PNR Directive (Annex 1).
The PNR judgment illustrates an approach under which less foreseeability will be accepted in the regulatory area of automated security and border control. This should be critically discussed from a rule of law perspective in light of automated systems’ opacity challenges, as automation arguably calls instead for a higher level of foreseeability to uphold rule of law safeguards. As the Court stated: “The need for such safeguards is all the greater where personal data are subject to automated processing” (para 117).
Foreseeability and Room for Legal Interpretation
Legal rules must be clear and precise to ensure “that situations and legal relationships remain foreseeable” (Venice Commission, Report on the Rule of Law, para 46). In the discussed judgement, however, the CJEU attributed considerable value to the role of interpretation. The Court emphasized the need to interpret the PNR Directive, as far as possible, in a way that would not affect its validity and its conformity with primary law and, in particular, with the provisions of the Charter (para 86). Furthermore, “when a directive allows the Member States discretion to define transposition measures adapted to the various situations possible, they must, when implementing those measures, not only interpret their national law in a manner consistent with the directive in question but also ensure that they do not rely on an interpretation of the directive that would be in conflict with the fundamental rights protected by the EU legal order or with the other general principles recognized by EU law” (para 87).
This approach may cast doubt on the PNR Directive’s compliance with the standards of clarity and precision. On the one hand, as discussed earlier, the Court notices several shortcomings of the PNR Directive in the Ligue des droits humains decision. On the other hand, it attempts to mitigate them by calling for a pro-Charter interpretation, thereby leaving the individuals whose privacy and personal data are concerned ‘at the mercy’ of the good will of the Member States. This includes the challenging responsibility Member States face in establishing non-discriminatory risk criteria, as mentioned above.
If we return to Fuller, the features of law that he perceives as the foundations of the internal morality of law are not an all-or-nothing affair. On the contrary, they are qualities to which law and legal systems should aspire. This implies that the rule of law can, in fact, be considered a matter of degree (Dworkin, p. 5). The level of foreseeability of law can differ and, as shown in the discussed judgement, it certainly does. The CJEU noted that the law which permits the use of coercive powers must itself define the scope of the limitation of the fundamental rights it entails. At