11 May 2023

EU Privacy and Public-Private Collaboration

Ligue des Droits Humains and Private Standard-Setting

A widely noted international trend is the delegation of core state functions, such as law enforcement, to private actors. Nowhere is this more apparent than in the development and use of security technologies. For instance, police authorities increasingly use Artificial Intelligence (AI) tools created and managed by the private sector, their use justified by claims of efficiency, speed and accuracy. This public-private collaboration harbours detrimental consequences for fundamental rights and the rule of law, in particular the principle of legality.

In this blog, I focus on three different categories of private actors’ participation in the European Union’s (EU) security policy arena. I illustrate this through a number of policy examples of coerced, voluntary and proactive forms of “cooperation” between the public and private sectors, and their effects on the rule of law. The policy outcomes which result from this public-private collaboration are not democratically accountable, and allow human rights to be superseded by private, profit-driven interests.

Legal obligations

The standard situation of public-private cooperation is one of state regulation requiring private sector compliance. In this scenario, private actors react to the legal norms imposed on them. In the context of Passenger Name Record (PNR) data, this occurred as early as the 1990s in the USA, when US authorities required airline carriers to provide personal (PNR) data on travellers for security purposes. After 2001, the USA’s insistence on the transfer of PNR data proliferated globally, and the first controversies emerged regarding the protection of personal data and its transfer to foreign states (i.e., the USA). At the heart of this debate was the transfer of personal data by companies operating in the EU to a foreign country where EU data protection rules would not apply. Despite this early controversy, the EU’s tendency to mimic US approaches in some areas of securitisation is particularly apparent here. By 2016, the EU had adopted legislation requiring carriers to transfer PNR data to destination authorities, covering flights into and out of the EU as well as intra-EU flights (although coverage of the latter was not mandatory). The resulting controversy is the subject of this broader blog series, while the impacts on fundamental rights are discussed in the third section of this blog.

The harnessing of the private sector to achieve security goals is also apparent in the area of online content governance and surveillance. The Regulation on addressing the dissemination of terrorist content online, adopted in 2021, and the proposed Child Sexual Abuse Regulation (CSAR) showcase how EU legislation furthers a form of securitisation of the internet. The objective is no longer to regulate the conduct of business as such but rather to pursue law enforcement purposes and security policy goals. The consequence is the integration of internet platforms into state security activities as central actors. For example, the CSAR is lex specialis to the Digital Services Act – the EU’s core legislative piece for regulating illegal content on intermediaries – and its legal basis is Article 114 of the Treaty on the Functioning of the European Union, which supports harmonising measures for the internal market; yet other parts of the CSAR clearly relate to the practices of law enforcement. Introducing law enforcement measures under the cover of internet regulation, in the absence of an EU legal basis for such competencies, enables a harmful privatisation of the protection of children, which is and should remain a law enforcement responsibility.

Private standard-setting

The problem of achieving law enforcement and security goals by way of internet regulation is further exacerbated by the promotion and adoption of privately developed standards and practices as implementation norms. The reasoning behind this endorsement of private actors’ standards in digital policies is that they are better placed than the legislator to know what is efficient and realistically achievable. In the content governance field, this often takes place through the prescribed deployment of risk assessment tools and proactive measures – a strategy of incentivising certain outcomes while leaving it to the private sector to define how to reach them. Although state actors draw on the private sector out of a desire to build on industry best practices, the outcome is the use of so-called technology-neutral tools that would be too controversial for the public sector to impose directly, and that pose blatant challenges to key principles of EU data protection and internet regulation.

For example, automatic content filters (also known as ‘upload filters’) and hash databases that monitor, identify and remove content have already been put in place by social media platforms. These tools have been promoted as the industry-wide solution in the fight against terrorist content and online radicalisation – an apparently simple answer to a very complex societal issue. However, it has been repeatedly demonstrated that these tools fail to assess the context of publications accurately, leading to the worrying censorship of legitimate expression. They also constitute a form of mass monitoring, contradicting the EU’s own legal values.
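To illustrate why such filters cannot weigh context, consider a deliberately simplified sketch of hash-based matching. This is not any platform’s actual system: the blocklist entry, the file contents and the use of a plain SHA-256 digest are illustrative assumptions, whereas deployed tools typically rely on shared industry hash databases and perceptual hashing.

```python
import hashlib

# Hypothetical blocklist of digests flagged as terrorist content.
# Real deployments use shared industry hash databases and perceptual hashes;
# this entry is a placeholder for illustration only.
BLOCKED_HASHES = {hashlib.sha256(b"flagged video bytes").hexdigest()}

def should_block(upload: bytes) -> bool:
    """Exact-match filtering: the decision depends only on the uploaded bytes,
    never on who posts them or why (journalism, research, satire), which is
    why such filters over-remove legitimate expression."""
    return hashlib.sha256(upload).hexdigest() in BLOCKED_HASHES

# The same footage posted by a journalist documenting abuses and by a
# propagandist yields the same digest, and thus the same blocking decision.
print(should_block(b"flagged video bytes"))  # True in both cases
```

The point is structural: whatever matching technique is used, the filter sees data, not meaning, which is why contextual errors are inherent rather than incidental.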

In the context of the ongoing negotiations on the CSAR proposal, a (dominant) part of the technology industry is attempting to seize the opportunity presented by the proposed series of new user surveillance obligations by offering its technological products as solutions and having the legislation impose them on its competitors. One of these products, the extremely controversial “client-side scanning”, is an intrusive technology that circumvents end-to-end encryption. This shows that, depending on their position in the market, private actors either put up with legal obligations and try to adapt, or exploit them to cement their dominance by seeking new market opportunities and making their products the legal standard (further discussed in the last part of this blogpost).
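To make the point about circumventing encryption concrete, here is a minimal conceptual sketch, not any vendor’s product: the hash list, the stand-in “encryption” and the reporting step are assumptions for illustration. What it shows is that the scan runs on the sender’s own device before end-to-end encryption is ever applied.

```python
import hashlib

# Hypothetical hash list pushed to the user's device by the service provider;
# the entry below is a placeholder, not a real database digest.
SCAN_LIST = {hashlib.sha256(b"known illegal image").hexdigest()}

def encrypt(plaintext: bytes) -> bytes:
    # Stand-in for end-to-end encryption; a real client would use an
    # established protocol. This placeholder is NOT real cryptography.
    return plaintext[::-1]

def scan_then_send(plaintext: bytes) -> bytes:
    # The scan happens on the device *before* encryption, so content can be
    # reported to a third party even though the transport stays encrypted.
    if hashlib.sha256(plaintext).hexdigest() in SCAN_LIST:
        print("match reported to the provider before encryption")
    return encrypt(plaintext)  # the ciphertext itself is never inspected

ciphertext = scan_then_send(b"holiday photo")        # no report, sent encrypted
ciphertext = scan_then_send(b"known illegal image")  # reported, then sent
```

In other words, the confidentiality promise of end-to-end encryption – that only sender and recipient can read the content – is undermined at the endpoint itself, which is why the technique is so contested.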

The PNR Directive placed private actors in the air transport sector in a complicated position. While some airlines collect PNR data as part of their commercial practices to improve services to regular customers, not all airlines do so directly. The obligation to transfer this data to public sector actors in countries other than the one where the data was collected (mainly the USA, Canada and Australia) meant that many companies chose to outsource the collection and transfer to other companies specialising in this service. The private sector itself did not challenge the legislation requiring it to provide access to this personal data. It acquiesced to state demands but insisted that inter-state agreements be concluded to protect private sector actors from challenges by individuals. Instead, the legal challenges in the EU context were launched by the European Parliament and, more recently, by specialised NGOs.

Voluntary Disclosure

Another form of private sector involvement in public sector demands for access to personal data for law enforcement purposes (notably in criminal investigations) is ‘voluntary cooperation’. Private actors are encouraged to disclose their clients’ personal data ‘informally’, outside the scope of the law. Europol’s 2022 SIRIUS EU Digital Evidence Situation Report reveals that direct requests for data to foreign-based online service providers under ‘voluntary cooperation’ continue to be widely used in the EU: 63% of officers indicated direct requests as their main type of request in 2021, whereas only 19% favoured judicial cooperation channels.

This approach, which is much favoured by public authorities, sidesteps the problem of fundamental rights and data protection. Indeed, voluntary data disclosure amounts to further processing of that data by the private controller for a purpose incompatible with the original one – which is expressly prohibited under the General Data Protection Regulation (GDPR). Nevertheless, practice shows that data controllers, rather than looking to their obligations under EU law, frequently merely acquiesce because the requests come from state authorities, so voluntary cooperation is assumed to be lawful. On the contrary, data disclosure to law enforcement bodies should always be regarded as a restriction of fundamental rights that must be provided for by law and satisfy the requirements of necessity and proportionality in accordance with Article 52(1) of the Charter.

Despite the blatant legal uncertainty of this practice, EU legislation reflects a growing trend towards voluntary disclosure of personal data by the private sector outside any legal basis. In the recent reform of Europol’s mandate, private actors are strongly incentivised to breach personal data protection rules by ‘cooperating’ with Europol. This creates extensive problems around transparency and accountability, while at the same time undermining fundamental rights, in particular procedural rights.

The institutions themselves are well aware of the question of lawfulness. Europol’s Data Protection Function (DPF) was consulted to assess the practice’s compliance with data protection law. To date, the various data protection authorities involved in this process have reached no final conclusions on the question.

A feedback loop between financial support and legislative agenda-setting

Another reaction of the private sector has been to proactively engage with public sector law enforcement (or other agencies) in order to take control of the policy-making agenda and to jointly shape policies in directions that benefit the private sector itself. In the security field, technological development usually precedes the establishment of a legal basis. The industry develops security solutions to sell to state authorities and, at worst, does so with the help of EU public funds. Once the system is in place, it becomes a fait accompli; the only thing missing is legal backing.

For example, the development of national PNR systems was helped along with at least €50 million from the Commission, years before an EU Directive was finally agreed upon in April 2016. The Eurosur border surveillance system was in development for at least five years before legislation was approved in 2013, with numerous EU research projects helping to put the pieces in place before a Portuguese firm won the multi-million-euro maintenance contract. The ‘smart borders’ project involving the Entry/Exit System (EES) and the European Travel Information and Authorisation System (ETIAS) followed a similar path: research projects helped develop the technology over the last decade, but the complementary legislation was only adopted years later, and the implementation deadline has by now been postponed several times.

In terms of agenda-setting, the International Civil Aviation Organisation (ICAO), a specialised agency of the United Nations that coordinates the principles and techniques of air travel, is being used as a forum for policy influence by both governments and the travel industry to promote “worldwide tracking, surveillance, and control of all individuals’ movements.” Airlines are active participants in ICAO working groups and use them to impose their own technical standards for PNR data collection and storage. By shaping the data collection processes, they minimise the burden of collecting additional data or of selecting and reformatting passenger data differently to satisfy different governments’ demands. In that way, they become norm-setters.

Lastly, Statewatch’s report on ‘the development of the EU security-industrial complex’ shows through the example of “the legislative procedure that led to the establishment of the €1.7 billion security research programme within Horizon 2020” how different corporate interests engage in the design and implementation of the EU security policy, literally co-drafting the legislation along with the EU institutions.

Consequences

The consequences for fundamental rights and the rule of law are enormous. At the most basic level, private agenda-setting and technical standard-setting constitute corporate capture of the legislative agenda and process, which should be democratic and not guided by profit-driven interests. The extensive use of EU funds transferred to the private sector through security research programmes represents a grave diversion of public resources towards harmful tech applications and uses.

In all three cases of public-private collaboration presented in this blog, the risk is that human rights are superseded by security objectives. The public-private security community sketched out above, which bridges corporate interests and government policy while benefiting from a disturbing lack of democratic accountability, should be urgently called into question.


SUGGESTED CITATION  Berthélémy, Chloé: EU Privacy and Public-Private Collaboration: Ligue des Droits Humains and Private Standard-Setting, VerfBlog, 2023/5/11, https://verfassungsblog.de/pnr-pub-pri/, DOI: 10.17176/20230511-181705-0.
