Competition law as a powerful tool for effective enforcement of the GDPR
It looks like a good week for data protection. On Tuesday, the Commission presented a new proposal for a Regulation on additional procedural rules for the GDPR, and a few hours later, the ECJ published its judgment in Case C-252/21, Meta Platforms v Bundeskartellamt (Federal Cartel Office). While the Commission’s proposal to improve enforcement in cross-border cases should probably be taken with a pinch of salt, the ECJ ruled on several points with remarkable clarity. The first reactions to the ruling showed considerable surprise; few had expected the ECJ to take such a clear stance against Meta’s targeted advertising business model. The judgment does, however, represent a consistent interpretation of the GDPR in the tradition and understanding of power-limiting data protection. Of particular interest are the Court’s comments on the legal bases of data processing and their transferability to other areas of law. The proceedings have been discussed numerous times from a competition law angle; in this article, I will focus on the key points from a data protection law perspective.
Background to the preliminary ruling procedure
The proceedings began in 2019, when the Bundeskartellamt (BKartA) issued a decision prohibiting Meta from collecting users’ off-Facebook data without consent on the basis of its general terms and conditions at the time. By “off-Facebook data”, the ECJ refers to data collected via third parties (websites and apps other than Facebook) and data collected through other online services belonging to Meta itself (Instagram, WhatsApp). Furthermore, the BKartA clarified that consent is invalid if it is a prerequisite for using the social network. The main reasoning was that Facebook abuses its dominant position in the online market, as the data processing described in the general terms and conditions does not comply with the values and principles of the General Data Protection Regulation (GDPR). The OLG Düsseldorf (Higher Regional Court of Düsseldorf) initially granted Facebook’s appeal against the decision suspensive effect; that interim decision was later overturned by the BGH (Federal Court of Justice). The OLG then referred questions on the GDPR and its application by the competition authority to the ECJ, concerning both the power of competition authorities to review and consider data protection rules and key issues of interpretation of the GDPR.
Competences: Competition authorities may take the GDPR into account
Normatively, competition authorities’ role in EU antitrust law is rooted in Art. 102 TFEU, which prohibits the abuse of a dominant position in the internal market. The Court clarified that data protection supervisory authorities and competition authorities pursue different objectives (paras. 44 f.). While preserving this division, the ECJ emphasised that, when assessing a dominant position, the compatibility of market conduct with the GDPR can also be considered in order to determine the effects on competition and consumers (para. 47). The competition authority does not take the place of the data protection supervisory authority; it neither pursues the objectives set out in Art. 51 GDPR nor monitors compliance with the GDPR.
Procedurally, this results in an obligation for data protection and competition authorities to coordinate and cooperate (para. 52), and it confirms a binding effect: if a national data protection authority, the lead supervisory authority, or the Court of Justice has already ruled on the conduct relevant to the case, the competition authority may not deviate from that assessment, although it may draw its own conclusions in the specific context of competition law (para. 56). The scope of this binding effect is difficult to assess in individual cases; competition law often presents special features that may test the competence of the competition authority. In this case, the BKartA consulted the Federal Data Protection Commissioner and the Irish data protection authority, as is always advisable in cases of doubt, thereby fulfilling its duty of loyal cooperation.
The review of the abuse of a dominant position thus opened a gateway for examining compatibility with the GDPR in this area. From a legal theory perspective, this is reminiscent of the concept of a general clause serving as a ‘gateway’ for the fundamental rights test, and it is in keeping with the idea of the unity of the legal order. From the perspective of the rule of law, it also makes sense that structural breaches of law in a cross-sectional area such as data protection play a role in other areas. After all, data protection is in many cases consumer protection (in Germany, § 2 (2) no. 11 UKlaG).
Special categories of data require special treatment
The use of Facebook and linked applications constitutes the processing of special categories of personal data where information falling under Art. 9 (1) GDPR is made available, linked or transmitted (para. 73). Such processing is prohibited unless one of the exceptions in Art. 9 (2) GDPR applies, regardless of whether the information concerns the user or another person. The Court distinguished between different forms of data processing: (1) the input of special categories of personal data by the users themselves, (2) the linking of these data to the user account and (3) the use of these data (para. 71). In addition, the Court stated, without elaborating, that it is in fact no longer possible to distinguish between different categories of data (para. 89).
Furthermore, active use of Facebook is not in itself considered to be manifestly making data public within the meaning of Art. 9 (2) (e) GDPR. The processing of sensitive data on platforms is highly problematic: others have shown that social media advertising for clinical studies enables platforms to train specialised predictive models of medical conditions covering any of the (present and future) platform users. Focussing on the user’s decision to make the data available, the ECJ found that this decision must be made voluntarily and that it must explicitly refer to an unlimited public (para. 78).
However, the Court did not go on to explicitly address the problems of the use of collective databases through predictive models fed by data from the totality of users and even non-users, which allow predictions to be made about individuals (see here, here, here and here). Nor did it explicitly address the problematic secondary use arising from the sale of these models. Nevertheless, these practices may fall under the use of data, the third form of processing identified above, which is an important step in the right direction for effective data protection. The ECJ seems to agree, stating that “in certain cases, the processing of data by accessing the website or application in question may already reveal such information without the user having to register or place an order” (para. 72). It is important to reiterate that data subjects can only decide to make their own data public, not that of other persons (para. 75). Conversely, this could mean that even where every individual has made their data public, predictions about individuals based on collective data cannot be justified.
Meta’s business model is not GDPR-compliant
It comes as no surprise to the critical data protection law community that the Court ruled Meta’s current business model incompatible with the GDPR. The Court rejected both the general applicability of the legal basis of performance of a contract (Art. 6 (1) (b) GDPR) and that of legitimate interest (Art. 6 (1) (f) GDPR).
The ECJ first clarified that the personalisation of content to this extent is not necessary in order for the network to offer its services, and that the various offers can also be used independently of each other (paras. 102 f.). This is a very clear rejection of Facebook’s business model, which operates as a social network but makes its profits through mass data processing for commercial purposes. The ECJ emphasised that the processing must be necessary for the fulfilment of the legitimate interest under Art. 6 (1) (f), and that the principles of the GDPR, such as data minimisation, must be taken into account in the interpretation of what constitutes a legitimate interest. The legitimate interest of network security put forward by Meta was recognised in principle by the ECJ, but it referred the examination of this question back to the referring court (para. 119).
The ECJ finally confirmed the perhaps unsurprising finding that individualised advertising used to finance the network does not constitute a legitimate interest that prevails over the rights of users. This is especially true in view of the fact that the processing is particularly extensive and potentially unlimited (!) and that Meta is in a position to monitor the entire private life of its users (paras. 117 f.).
I have argued before that legitimate interest is not a sufficient legal basis for the processing of personal data when it comes to the mass extraction of all kinds of data from collectively generated databases, since this legal basis requires a balancing of interests. Structural breaches of the GDPR or other legal requirements, the scale of the data collection, and the collective impact significantly diminish the legitimate business interest of data processors. In my opinion, this argument can be applied in other areas as well.
Consent: problems remain
The ECJ did not assume that voluntary consent is excluded merely because of the controller’s dominant position (paras. 140 f.). This is understandable, as the problem of consent in the digital environment stems not only from the market position of an actor, but also from the sheer flood of information. I have explained the problems of consent in digital environments here and here, backed by interdisciplinary research. In this context, the French data protection authority recently issued a decision against the real-time bidding company Criteo; constellations in which consent is given to 300 different data processors at the same time are impossible for users to oversee. Importantly, the Court emphasised that effective consent must distinguish between off-Facebook data and ‘on-Facebook data’, as users would not expect their off-Facebook data to be processed (para. 151).
A very interesting aspect of the decision, which could easily pass for a marginal note, can be found in para. 150: the Court suggested offering users the alternative of paying for the service while keeping their data private, in order to safeguard the freedom of consent. This could above all help to make users aware that the use of such services is by no means “free”.
Impact of the decision: Data is power
The really important statement of the ruling is the explicit recognition that data is power. The decision partly undermines Meta’s business model, which relies on selling targeted advertising based on the vast amounts of data it collects about its users as they use its services and browse the wider web. However, Meta is unlikely to face serious difficulties any time soon: the new app “Threads”, designed to compete with Twitter, had ten million registrations in its first few hours. It will now be interesting to see whether the decision of the OLG Düsseldorf opens the door to Art. 82 GDPR, which could expose Meta to substantial claims for damages. This should be particularly interesting in the context of the actions brought by consumer associations. Competition law is proving to be an effective tool for data protection; let’s see how the DMA shakes things up.