14 December 2021

The EU’s Proposed Platform Work Directive

A Promising Step

On 8 December 2021, the European Commission published its long-awaited draft of a Directive aimed at improving working conditions in the platform (or ‘gig’) economy. In response to the explosive growth of platform-mediated work, not least as a result of the Covid-19 pandemic, the proposal pursues three goals: first, to ensure ‘that people working through platforms have – or can obtain – the correct employment status’; second, ‘to ensure fairness, transparency and accountability in algorithmic management in the platform work context’; and third, ‘to enhance transparency, traceability and awareness of developments in platform work and improve enforcement of […] applicable rules’. In this post, we scrutinise key elements of the proposed Directive against these goals. Our tentative conclusion is positive: while there is some room for improvement during the legislative process, the framework laid down promises to tackle some of the most salient problems arising from platform work.

1. Background

The past five years have seen a sustained pushback against the gig economy’s insistence that platform workers should not enjoy access to legal protections afforded to employees. Supreme Courts in countries including France, Spain, and the UK have ruled that platforms are employers, and that their workers are duly entitled to relevant protections. Legislators are increasingly weighing in, too: a recent Spanish ‘Riders Law’ tackles both gig economy employment status and the rise of algorithmic management, which is the automation of employer functions from hiring through to firing. Platform work was also expressly included as a regulatory priority in European Commissioner Schmit’s Mission Letter in 2019, leading to the adoption of the proposed Directive.

2. The Proposal

The proposed Directive covers everyone in the EU who has or may be deemed to have an employment relationship (Art 1(2)) with a digital labour platform (Art 2(1)). The scope of personal protection is therefore broad. This is crucial to tackling platforms’ sustained avoidance efforts. In terms of substantive scope, however, the proposal is limited by reference to the definition of ‘digital labour platform’. The definition, which is adapted from that of Information Society services, would cover only organisations which match individual client requests to available workers at a distance and via electronic means. Most traditional employers would therefore fall outside its scope, as would digital economy platforms such as Airbnb, in line with the Court of Justice’s case law.

There are four substantive chapters to the proposal, covering (a) employment status, (b) algorithmic management, (c) transparency, and (d) enforcement.

a. Employment Status

Chapter II tackles employment misclassification through a presumption of employment when a platform exercises control over work along at least two axes, for example by setting upper limits on remuneration and enforcing a dress code (Art 4). The presumption, whilst rebuttable, is a welcome step in the right direction – though it is difficult to predict how it will play out across different legal systems given national procedural autonomy.

The proposal also seeks to address information asymmetries between platforms and workers, long identified as a major problem in practice: without access to reliable information about their status, and in the face of obfuscation by platforms, workers may be unaware of their rights. The proposal would therefore require information about the rebuttable presumption to be made clear and ‘publicly available’ (Art 4(3)). Member States would also be required to develop guidance to assist platforms, workers, and authorities.

b. Algorithmic Management

The gig economy was the cradle of automated management systems. The proposal requires platforms to provide workers with information about automated monitoring systems, as well as decision-making systems which ‘significantly affect’ working conditions (Art 6). It also mandates human monitoring of such systems’ impact on working conditions (Art 7), as well as human review and a written statement of reasons for significant decisions, such as decisions to suspend a worker’s account or refuse remuneration for work performed (Art 8).

These Articles build on Article 22 of the GDPR, which provides for more limited, individualised protection against automated decision-making. Indeed, the new proposal expressly notes that while the GDPR establishes a framework for data processing, more specific rules are required in the context of platform work (recital 29). Data protection pioneer Spiros Simitis argued in the 1990s that omnibus regulation is inadequate in the employment context, and Article 88 of the GDPR permits more specific rules to be adopted in this space. This Directive follows suit by adopting rules specific to algorithmic management.

Importantly, Articles 6 and 7 apply to all platform workers, including those without an employment relationship (Art 10). This is justified by the similar impact automated systems have on working individuals, regardless of their status (recital 40), and underpinned by Art 16 TFEU as a legal basis in addition to Art 153 TFEU.

c. Transparency

Digital labour platforms will be required to publish and regularly update information about the terms and conditions of their workers, and public authorities and workers’ representatives will have a right to request further information on these matters (Art 12). Public bodies will also be endowed with new powers to order platforms to disclose evidence (Art 16).

d. Remedies and Enforcement

The transparency obligations will support Member States’ obligations to ensure efficient redress (Art 13), especially in combination with the multiple enforcement channels envisaged, including representative legal actions (Art 14, mirroring the Pay Transparency proposal’s Art 13) and ‘proactive’ targeting by public bodies in cases of non-compliance (Art 4(3)(c)). The Directive also mandates protection against adverse treatment and retaliatory dismissals (Arts 17 and 18).

3. Initial Comments

The proposed Directive is a welcome and decisive step towards improved working conditions in the gig economy. It sends a clear message as regards the default approach to employment status, cutting through the labyrinthine complexity created by platforms’ ‘armies of lawyers’. It also looks beyond mere status debates, tackling a key substantive challenge: protection against the downsides of algorithmic management, from intrusive monitoring to algorithmic discrimination. By carefully tying in with recent Union legislation and legislative proposals, as well as broader developments strengthening the effective enforcement of the social acquis, it provides clear evidence of the Union’s commitment to delivering on the Social Pillar.

a. Rights at the Collective Level

The proposal envisages a significantly improved role for consultation and bargaining at the collective level. The Directive would provide representatives with access to information about automated systems (Art 6(4)), for example, in a context where workers’ representatives currently rely on highly individualised data protection rights when gathering evidence to challenge unfair algorithmic management. Such reliance is far from straightforward: in a recent Dutch case, Uber sought to argue that drivers’ data subject access requests were an abuse of rights because they had been coordinated by a union seeking to gain transparency about algorithmic management tools. The new proposal would cut through specious conflicts of this type by putting relevant information directly into representatives’ hands.

There are other provisions directly addressing collective action, as well: the proposal reaffirms collective consultation rights which are present in existing EU law (Art 9), and requires the creation of unmonitored, in-platform communication channels for labour organising (Art 15). Furthermore, on the same day the proposed Directive was published, the Commission also released draft Guidelines on collective bargaining rights for self-employed workers, for whom competition law is a barrier to collective action.

However, while the Directive is alive to the importance of promoting the role of worker representatives in the gig economy context, it does not extend beyond that sector. The Commission’s Communication explains that platform work is unusual in that there are few ‘practical opportunities for collective representation and organisation’. While it is true that the traditional factory floor may have provided more opportunities for collective engagement, many traditional workplaces are also increasingly shifting to hybrid working models, with workers broadly dispersed. Moreover, while the Commission suggests that social partners will be able to ‘initiate social dialogue on algorithmic management in the context of the new information and consultation rights’, it does not address the fact that the same rights would be equally valuable for social partners in traditional workplaces – where trade unions are already working to elevate their members’ voices on the use of algorithmic management tools.

b. A Broader Substantive Scope?

This is part of a bigger problem with the Directive: although its personal scope is broad, its substantive scope is limited. Most gig workers are covered, whilst workers outside the platform economy receive scant protection. This is unfortunate. As the Commission recognises, algorithmic management is far from unique to the platform economy. It is now present in workplaces across the socio-economic spectrum and a range of sectors: Amazon’s use of automated decision-making systems in its warehouses is widely documented, for example, while the Covid-19 pandemic has seen an uptick in employers’ use of remote monitoring tools for formerly office-based employees.

Commissioner Schmit had a specific mandate to tackle the platform economy, but the proposal could still be expanded during the legislative process. Spain’s algorithmic management rules, for example, cover all employers, whether platform-based or not.

c. More Specific Rules for the Employment Context

Nonetheless, the proposal does provide a crucial first response to the need for employment-specific rules on technology: a point which, as mentioned above, has long been recognised in the context of data protection. The proposal also leaves room for even more specific rules to be adopted at the national level, explicitly providing that the Directive should be taken as a floor of rights (Art 20). This context-sensitivity and flexibility contrasts starkly with the proposed AI Act, an omnibus law which would introduce provisions affecting employment whilst also shutting the door to regulatory innovation at the domestic level. Overall, therefore, our initial verdict is clear: as the single market turns digital, it is crucial that the Union’s social acquis keep pace. Although there is still some way to go, the proposed Directive is a welcome step towards that goal – and provides a positive indication for future protection against algorithmic management.

This blog is part of a project that has received funding from the European Research Council (ERC) under the European Union’s Horizon 2020 research and innovation programme (grant agreement no. 947806).


SUGGESTED CITATION  Kelly-Lyth, Aislinn; Adams-Prassl, Jeremias: The EU’s Proposed Platform Work Directive: A Promising Step, VerfBlog, 2021/12/14, https://verfassungsblog.de/work-directive/, DOI: 10.17176/20211215-142502-0.
