Reforming the GDPR
Dilution or Progress?
After a surge of new digital legislation over the past two years, the European Commission appears to have no intention of easing its pace in reshaping Europe’s regulatory landscape – whether for better or worse. Its current agenda includes not only the potential reopening of the AI Act, but also a reform of the GDPR (General Data Protection Regulation). While less visible, the GDPR reform could have far-reaching consequences. It could either weaken the EU’s core legal protections due to lobbying pressure or successfully modernise the regulation to meet the challenges posed by today’s data-driven technologies.
This post provides a brief overview of the proposed adjustments to the GDPR and situates them within the broader debate on regulatory reform. I argue that reform efforts should focus not merely on simplification and deregulation. They should strengthen enforcement, clarify the interrelations within EU digital law, and fix the structural problems of the GDPR as part of the broader project of advancing digital regulation.
Why is a reform of the GDPR on the table?
Current debates on regulation take place in a broader context often dominated by a successful but harmful narrative of innovation, and an overly narrow focus on competitiveness and geopolitical developments (see here, here and here). Regulation and enforcement issues feel more politicised than ever. It is no longer only the usual actors – NGOs and civil society – who are calling on regulators. In late June, Nobel Prize winners in Economics and Physics drew significant attention with an open letter urging the Commission “to resist pressure from those attacking the rules on general-purpose AI.” The proposed reform of the GDPR has not (yet?) attracted comparable public attention. The Commission currently appears to favour incremental changes rather than a sweeping overhaul.
The GDPR serves as a “backbone of the EU’s digital rulebook”, but has also been a source of intense debate and criticism from the outset. Broadly, two main sides have emerged: some say the GDPR constitutes an excessive regulatory burden on businesses and the economy; others warn that revising it risks undermining fundamental rights.
In the current political climate, shaped by lobbying pressure from powerful private actors, political narratives are increasingly aligned with the first view. The Commission, as well as the coalition treaty of the current German government, call for reducing the burden on SMEs (small and medium enterprises) from GDPR obligations and other “bureaucratic” requirements.
While easing burdens on non-commercial actors, such as associations, is arguably reasonable and legitimate (as data protection is, after all, a balancing of interests between processors and data subjects), the debate is overly fixated on simplification, efficiency, and competitiveness. These terms, however, lack intrinsic normative value. Without clear reference points, they remain empty rhetoric. One can, after all, dismantle democracies very efficiently. Unless they are linked to fundamental rights and sustainable social and political objectives, an exclusive focus on these buzzwords will ultimately benefit only those actors who already dominate the market.
Meanwhile, several structural challenges remain unresolved: the dysfunctional instrument of consent; the increasing impossibility of distinguishing between data categories, especially in the age of AI; the structural erosion of data protection principles; the lack of coordination with other legal acts; and the continuing unbridled power of global players in the digital economy, whose aggressive data extraction continues. In this context, structural solutions that supplement the individual rights of data subjects – such as actor-related obligations addressing systemic risks, as seen in the DSA – and coordinated oversight urgently require critical discussion.
What are the suggestions for reforming the GDPR?
One long-standing point of contention, dating back to the original GDPR debates in 2013, is its horizontal or “one-size-fits-all” approach: identical legal obligations apply equally to Meta and to a local kindergarten. To implement a graded regulatory approach and address the structural challenges mentioned above, various reform proposals are being discussed.
At the EU level, proposed amendments are currently at various stages. On 21 May 2025, the EU Commission proposed reforms in its fourth omnibus package. These include amendments to Article 4 GDPR defining SMEs as companies with fewer than 250 employees and annual turnover under EUR 50 million or a balance sheet total under EUR 43 million, and small mid-cap enterprises with fewer than 750 employees and annual turnover under EUR 150 million or a balance sheet total under EUR 129 million. Additionally, companies with fewer than 750 employees wouldn’t have to keep records of data processing – unless their processing is likely to pose a “high risk” to data subjects (analysed here). Further changes include amendments to the Code of Conduct under Article 40 GDPR and the certification schemes under Article 42 GDPR, tailored to the needs of SMEs and SMCs. However, these amendments are unlikely to have a major practical impact.
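The proposed size thresholds can be summarised schematically. The following sketch is purely illustrative (it is my own simplification, not legal advice, and the function names and classification logic are assumptions, not part of the proposal text); it encodes the thresholds described above:

```python
# Illustrative sketch of the size thresholds in the Commission's fourth
# omnibus package, as summarised above. Function names and the
# classification logic are my own simplification for illustration only.

def classify_company(employees: int, turnover_eur_m: float,
                     balance_eur_m: float) -> str:
    """Classify a company under the proposed Article 4 GDPR definitions."""
    # SME: fewer than 250 employees AND (turnover < EUR 50m OR balance < EUR 43m)
    if employees < 250 and (turnover_eur_m < 50 or balance_eur_m < 43):
        return "SME"
    # Small mid-cap (SMC): fewer than 750 employees AND
    # (turnover < EUR 150m OR balance < EUR 129m)
    if employees < 750 and (turnover_eur_m < 150 or balance_eur_m < 129):
        return "SMC"
    return "large"

def must_keep_records(employees: int, high_risk_processing: bool) -> bool:
    """Proposed record-keeping rule: companies with fewer than 750 employees
    are exempt unless their processing is likely to pose a high risk."""
    return employees >= 750 or high_risk_processing
```

As the sketch makes visible, the proposal ties obligations to head count and financial figures, with "high risk" as the only qualitative correction, which is precisely the point criticised below.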
Another reform proposal by Voß and Schrems goes further by focusing not only on company size but also on actual data risk and business model. They suggest that companies processing large volumes of data and/or sensitive data, or those whose primary business model relies on personal data processing, should face stricter obligations. Smaller companies, however, should be exempt from documentation requirements and the need to appoint data protection officers, thereby significantly easing their compliance burden while tightening regulations for data-intensive businesses.
A more radical proposal is presented in a draft discussion paper for a new “AI Data Protection Regulation” by Christiane Wendehorst. It also suggests a tiered model but aims to exclude “minimal-risk data activities by small-scale controllers” from the scope of the GDPR altogether. A detailed discussion of this paper is beyond this post’s scope. Notable concepts, however, include the introduction of a quasi-controller concept or a database for high-risk controllers. In my view, other suggestions, such as the proposed exceptions to GDPR requirements and data subjects’ rights for the purpose of AI training, are particularly problematic and raise serious concerns (see Articles 11 and 12 of the proposal).
Despite valid criticisms regarding the details of the reform proposals, the development of concrete proposals is, in principle, welcome. Many, myself included, have long advocated for additional risk-based and systemic solutions in data protection (see, for example, here, here and here). The prevailing tendency (particularly in Germany) to treat the GDPR as a regulatory panacea is unhelpful and ignores practical enforcement realities. In practice, supervisory authorities are resorting to stopgap solutions, such as assuming that LLMs do not process personal data. The focus, however, should remain on protecting autonomy, privacy, and data protection rights, rather than repurposing the GDPR as an “enabling” framework for generative AI and other data-intensive technologies.
What is the risk in reopening the GDPR?
The GDPR – and digital regulation more broadly – is entangled in global political tensions. Enforcement against Big Tech is often portrayed as an unfair practice and targeted approach against the US. At the same time, data protection is frequently blamed for a wide range of perceived deficiencies: from slow administrative digitalisation to Europe’s lagging AI sector.
The EU must resist such political pressure and reaffirm its commitment to a fundamental rights-based regulatory model – one that increasingly diverges from approaches in other parts of the world. As has been argued repeatedly, effective regulation fosters innovation by providing legal certainty and encouraging trustworthy, human-centred and fair technical innovation, instead of favouring the strongest, fastest or richest.
In the current political climate, reopening the GDPR under slogans like “reducing bureaucracy”, “simplification” and “deregulation” risks more than minor tweaks. It could erode accountability and fundamental rights protection. In the age of AI, the protection of fundamental rights should be strengthened, not weakened, which also means effectively enforcing the existing rules.
The protection of fundamental rights is a normative issue, not just a matter of numbers, like company size or the number of data subjects affected. Focusing only on these figures misses important normative concerns related to fundamental rights, structural power imbalances, and the integrity of democratic governance. Moreover, data processing and its implications for fundamental rights are highly contextual. A small company can process highly sensitive data of a few people, while a large enterprise may process large amounts of personal data with less relevance for fundamental rights. As already explained here, the proposed amendment of Article 30 GDPR will probably not have much effect anyway, which raises the question of whether reopening the GDPR is worth it at all.
The pressing societal problems of protecting privacy, data, autonomy and freedom do not lie primarily in regulatory burdens on SMEs, but in managing data power, tackling digital oligopolies, and enabling digital transformation in the public interest. Data protection is essential to democracy: democratic decisions require the space to choose autonomously, freely, and – if desired – in private.
Improving Enforcement
Another major unresolved issue is the enforcement of the GDPR, especially in cross-border cases. Instead of solely focusing on deregulation, enforcement should be strategically strengthened. The Irish Data Protection Authority (DPA), due to the presence of many Big Tech companies in Ireland and the GDPR’s one-stop-shop mechanism, has become a bottleneck in enforcement procedures.
In June, after two years of negotiations, the European Parliament, the Council and the Commission reached a preliminary agreement on improving cooperation between national DPAs in cross-border cases. The agreement, still pending formal adoption, introduces new timelines: cross-border investigations must be completed within 15 months, extendable by 12 months for complex cases. If the scope of investigation is clear, no objections arise, and the lead authority has relevant experience, the deadline shortens to 12 months. The new “early solution mechanism” would allow DPAs to resolve a case before triggering the standard cross-border complaint process.
Cooperation would be further streamlined by requiring a “key issues report” and allowing non-contentious cases to be resolved without formally involving other authorities, subject to a four-week objection period.
These timelines have drawn sharp criticism for being too long, potentially hindering rather than facilitating faster enforcement. Critics argue they ignore user interests and introduce more procedural burdens instead of simplifying enforcement. Indeed, it is doubtful that this reform will improve the position of affected individuals. Rigid timelines can have varied effects: speeding up or dragging out procedures.
Conclusion & Outlook
The much-invoked goal of simplifying the EU’s regulatory framework – now so complex that even experts struggle to navigate it – must not be confused with deregulation. Quality matters more than quantity, in both directions. More rules are not always better. Nor are fewer.
What is needed is a deliberative, informed, and democratic discourse on the direction of digital governance. Three points are especially important.
First: Clarify how legal instruments and different regulations relate to each other. Repeatedly stating that “the GDPR remains unaffected” does not enhance legal certainty. Clarification would benefit all stakeholders.
Second: Strengthen enforcement of digital regulation in line with the rule of law. This requires not only well-resourced and independent supervisory authorities but also systemic approaches that go beyond individual cases and are embedded in the wider regulatory landscape.
Third: Promote alternatives to Big Tech digital infrastructure. This would not only enhance European digital sovereignty but also open opportunities for SMEs. Data protection is not an obstacle to this – quite the opposite.