15 April 2024

GDPR Overreach?

The Challenges of Regulating Pay-or-Consent Models through Data Protection Law

I. Dispute over the classification under data protection law

Various institutions in Europe are currently carrying out a legal assessment of so-called “pay-or-consent” models. The term refers to online business models in which the service provider seeks compensation for its services by offering users a choice: either they pay a monetary price (“pay”) for the media or service offering (e.g. social network services, blogs, newspapers, etc.), or they allow the service to be financed by advertisers through the placement of personalised advertising, which – according to the CJEU in the Meta/Bundeskartellamt decision1) – requires the user’s consent to the processing and evaluation of their personal data (“consent”). The European Data Protection Board (EDPB) is currently preparing a statement on the compatibility of “pay-or-consent” models with the GDPR, which is due to be published at the beginning of May. After Meta introduced this model for its social networking services Facebook and Instagram in November 2023, several national data protection authorities called on the EDPB to clarify the compatibility of this model with the GDPR.2) The Information Commissioner’s Office in the UK sees the need to provide legal certainty to companies using such models and launched a public consultation in March 2024.3) The EU Commission also announced in March 2024 that it would investigate Meta’s decision under the DMA.4)

This clarification takes place under enormous political pressure from data protection activists and NGOs who attack the introduction of a “pay” model on socio-political grounds (“tax on privacy”/“price for privacy”).5) According to them, data protection law is to be used as a lever to prohibit media companies or online service providers from offering a service that is more data-minimalist than the traditional business model. Data protection authorities are therefore faced with the question of whether the GDPR should address “social justice” concerns.6)

II. Freedom to consent and equivalent alternative offers

At the core of the legal disputes is the notion of “freedom” to consent under Art. 6 (1) (a) and Art. 4 (11) GDPR. According to the now established position of the European data protection authorities, confirmed by the CJEU, consent to the processing of personal data for the placement of personalised advertising may only be deemed “free” if the controller makes the data subject and user an “equivalent” offer and thus creates freedom of choice. In light of the CJEU decision Meta v Bundeskartellamt, three points must be considered established law:

1) Even a company with market power can use a pay-or-consent model. The occasional claim that a company’s dominant market position as such reduces or precludes users’ freedom to consent is not only incorrect, but also unrealistic.

2) The CJEU has also conclusively clarified that a “pay-or-consent” model can, in principle, offer users a genuine choice, which is necessary for valid consent under the GDPR. The CJEU expressly states that “those users are to be offered, if necessary for an appropriate fee, an equivalent alternative not accompanied by such data processing operations” (para. 150). According to Art. 19 Treaty on European Union (TEU), the decisions of the CJEU are binding on all EU institutions, bodies, offices and agencies. It would be an unprecedented event if an administrative institution of the EU were to openly disregard a decision of the CJEU.

3) The CJEU has also clearly stated that it is sufficient if users are offered one equivalent alternative option. It would be incompatible with the Court’s decision if companies were required to make three, four or even more offers. While the EU legislator could stipulate in a separate regulatory instrument that a monetarily free model based on non-personalised advertising must be placed alongside a “pay-or-consent” model, data protection authorities cannot interpret the GDPR in a quasi-legislative fashion in order to pursue political preferences. From a data protection point of view, the “pay” model, which involves minimal data collection for advertising purposes, is optimal. It cannot reasonably be argued that data protection law additionally requires the provision of a model based on non-personalised advertising. Since such a model remains more intrusive than the “pay” option, the assumption that the GDPR requires the company to offer it would be incoherent.

III. Four Principles for the Interpretation of the GDPR

In the current debate, it is clear that the notion of free consent is being stretched and used as the basis for demands to regulate digital business models. This applies not only, as just mentioned, to the number of options to be offered, but above all to the design of the options: Art. 6 (1) (a) and Art. 4 (11) GDPR are treated as though they answered the question of which options must be presented to users. In many cases, the demands are not driven by concern for the protection of informational self-determination and informational privacy, but by regulatory policy preferences beyond the purpose of data protection law. This is data protection overreach that amounts to industry regulation through data protection law. The data protection authorities would exceed their powers and act ultra vires if they were to interpret the GDPR from a social justice or consumer protection policy perspective.

In response to those voices arguing for a rather loose and arbitrary interpretation of the GDPR in light of political interests, this article formulates four theses on the interpretation of the GDPR’s concept of voluntariness and thus seeks to dispel misunderstandings that are being stirred up by interested parties in the data protection debate.

The “pay-or-consent” model implies the end of the “free culture” on the internet

The legal assessment of the pay-or-consent model essentially depends on how these models fit into the socio-cultural norms of the time. If one assumes that the services of a company, be it an industrial or a digital one, are generally offered against payment, the “pay” model must be conceived as the rule and an advertising-funded, free-of-charge offer must be considered an exceptional concession. This reconstruction of market reality on the internet reflects the observation that the times in which economic activity on the internet was dominated by a “free culture” are coming to an end. The third decade of the new century marks a shift away from the earlier dominance of a “free culture” understanding of the digital economy. In large parts of the digital economy, pure pay models now prevail (media offerings such as the FT, WSJ, NYT, etc.; streaming services such as Netflix, HBO, Spotify, etc.). It is clear that the “free culture” has led to a loss of diversity, a decline in quality and the exploitation of content providers, particularly in the media sector. A free culture does not allow for high-quality value creation. In the social market economy established by the TFEU (Art. 119 TFEU), every company is free to restructure its business model, which may involve shifting from free content to pay offerings. In the current debate, it is not seriously disputed that a company is free under data protection law to base its offering on the principle of “pay or leave”.

If the company then offers an advertising-financed and monetarily free service in addition to the “service for money”, this expands the scope of action of users, who are financially better off than in the normal case even if they agree to the use of their personal data for the placement of personalised advertising. The high number of users who choose this offer indicates a preference structure that must be accepted by data protection law. The mistake made by some data protection activists is to deny the end of the “free culture” in the digital economy for individual – arbitrarily selected – economic sectors. Only if the free culture is declared the norm and normative ideal can the introduction of a pay offer be presented as a “privacy fee” or “privacy tax”. However, such a socio-cultural reconstruction of the world of digital markets cannot be derived from the GDPR. It would be unprecedented if the EDPB were to adopt such an interpretation.

Normative concept of freedom of choice

Behind Art. 6 (1) (a) and Art. 4 (11) GDPR lies a normative concept of freedom of choice. Freedom of choice does not mean that the user’s preferences are fulfilled to the greatest possible extent – or even completely. If you ask users about their preferences, you will regularly find that they would prefer not to pay at all. You would get a similar picture if you asked customers in a supermarket whether they would prefer to pay for the goods or receive them free of charge. However, individuals’ general preference for the most economical offer does not call into question the voluntary nature of agreeing to the purchase contract. A survey of user preferences would most likely also reveal that users want to provide personal data only to the smallest possible extent for the provision of personalised advertising. Again, the observation of such preferences is irrelevant under Art. 4 (11) GDPR. Data protection activists often blur the line between the voluntary nature of consent and the observation of actual or perceived preferences. Even data protection law cannot change the fact that you cannot have everything in the world – and certainly not everything at the same time. There is no sensible reason to relate the voluntariness criterion of data protection law to preferences. Rather, the key is to place users in a situation in which there is scope for decision-making. A “pay-or-consent” model opens up such a decision-making space if the price charged is not so high that it exceeds the financial capacity of the average user. The fact that a user with limited financial resources must make trade-offs (e.g. foregoing other purchases) does not call into question the voluntary nature of their decision, but points to the dilemmas of dealing with scarce (financial) resources.

No commercialisation of the GDPR

It would be a fatal mistake if data protection authorities were to interpret the criterion of the equivalence of offers on the basis of economic value considerations alone. In this case, it would be necessary to attribute an economic (utility or market) value to the data used to provide personalised advertising in the ad-financed offer and to compare this with the monetary price of the “pay” offer. This would force data protection authorities to rethink their position on the commercialisation of personal data under data protection law. In addition, accurately assessing that value in the context of commercialisation would pose significant challenges. Alternative approaches that focus on advertising revenue per customer or on the cost structure of the digital company, and that seek to derive comparative standards from these metrics, are completely incoherent in terms of data protection law insofar as they are entirely disconnected from informational self-determination and privacy protection. It is striking how NGOs and other data protection activists are suddenly arguing with questions of market fairness or with criteria of “reasonable profit” – and all on the basis of Art. 4 (11) GDPR.

If data protection authorities were to claim that the “pay” option is only fair if the price is “reasonable”7), they would effectively assume the role of price regulators of the data economy, and the GDPR would become an instrument of price control based on the concept of digital autonomy. The damage this would cause would be great.

Firstly, this would run counter to the fundamental principles of a liberal market economy recognised under EU law. The EU owes its greatest successes and its political legitimacy to its orientation towards this market liberalism. Forcing companies to change prices through the GDPR could have serious negative consequences, morally, politically and economically. The GDPR is not a planned-economy instrument that could be used to regulate the price charged in the “pay” model using criteria such as “reasonableness” or “appropriateness”.

Secondly, the EDPB not only lacks the competence to set maximum limits for the price charged in the “pay” model; it may also lack the necessary expertise in economic analysis and price intervention strategies.

Thirdly, reinterpreting the GDPR as an instrument for controlling prices would depart from the main goal of the GDPR, i.e. guaranteeing individuals’ informational self-determination and informational privacy.

Fourthly, the GDPR’s approach to protection would be de-individualized if it no longer considered the individual autonomy of the recipients of a service, but instead compared the equivalence of the revenue that a digital company generates from the placement of personalised advertising with the revenue that it generates through the monetary pricing of its service. A legal instrument that protects people would be turned into an instrument that compares key business figures.

The reinterpretation of the GDPR as an instrument of price regulation would violate the fundamental rights of companies (Art. 16 CFR) – even if it is done under the pretext of ensuring the “equivalence” of the consideration of users of a digital service. There is no need for this interventionist paternalism.

No Destruction of the General Approach of the GDPR

The voluntary nature of consent is an important notion under the GDPR. The GDPR pursues a horizontal regulatory approach, which in principle formulates identical requirements for all controllers (Art. 4 (7) GDPR): a “one size fits all” approach. The EU legislator has deliberately decided against sectoral regulatory approaches in data protection law (unlike in the Data Act, where provisions on data portability largely focus on connected devices8)). The requirements that a “pay-or-consent” model must meet in order to offer a genuine choice must therefore also be formulated uniformly for all economic sectors. What applies to media companies must also apply to social network operators, and vice versa. If data protection authorities were to attempt to formulate sector-specific requirements, they would destroy the basic architecture of the GDPR. They would also violate the right to equal treatment under Art. 20 of the Charter of Fundamental Rights. A sector-specific and discriminatory approach would be industrial policy, corporate regulation and therefore beyond any possible interpretation of the GDPR. Litigation would seem inevitable.

References

1 CJEU, July 4, 2023, C-252/21, Meta Platforms Inc. v Bundeskartellamt, EU:C:2023:537.
2 https://www.datatilsynet.no/en/news/aktuelle-nyheter-2024/request-for-an-edpb-opinion-on-consent-or-pay/; https://datenschutz-hamburg.de/news/abo-modelle-bei-grossen-online-plattformen; and https://autoriteitpersoonsgegevens.nl/actueel/ap-privacy-is-een-grondrecht-niet-alleen-voor-rijke-mensen.
3 https://ico.org.uk/about-the-ico/ico-and-stakeholder-consultations/call-for-views-on-consent-or-pay-business-models/.
4 See the statement by EU Commissioner Breton of 30 January 2024 (https://www.europarl.europa.eu/doceo/document/E-9-2023-003424-ASW_EN.html).
5 See e.g.: https://noyb.eu/en/28-ngos-urge-eu-dpas-reject-pay-or-okay-meta and https://www.accessnow.org/press-release/open-letter-to-edpb-pay-or-consent/.
6 In detail: https://eulawlive.com/weekend-edition/weekend-edition-no181/.
7 For example https://ico.org.uk/about-the-ico/media-centre/news-and-blogs/2024/03/ico-launches-consent-or-pay-call-for-views-and-updates-on-cookie-compliance-work/.
8 https://eur-lex.europa.eu/eli/reg/2023/2854.

SUGGESTED CITATION  Nettesheim, Martin: GDPR Overreach?: The Challenges of Regulating Pay-or-Consent Models through Data Protection Law, VerfBlog, 2024/4/15, https://verfassungsblog.de/gdpr-overreach/, DOI: 10.59704/de7ffdfea30b45f1.
