Hack the DSA
Four Actionable Ways to Improve DSA Reporting
The EU Digital Services Act (DSA) is one of the latest regulatory instruments adopted by the European Union to rein in the power of digital platforms and to create a “safe, predictable and trusted online environment”. It has applied to all platforms since February 2024.
The DSA introduces a wide variety of due diligence and transparency obligations for intermediary services, which increase with the size and societal impact of the service concerned. It has been widely advertised as a transparency machine. Yet, a preliminary assessment of the reports mandated by the DSA suggests that this transparency promise is failing to deliver. This conclusion emerged prominently from the “Hack the DSA” workshop organised by the Chair on Online Content Moderation on 3 October 2025 (for a full summary of the event, see here). The workshop brought together 30 DSA experts, including lawyers, researchers, and representatives of civil society organisations and regulators, who explored and analysed the various reports and documentation published under the DSA in order to discuss its concrete implementation and pool their collective intelligence.
Drawing on the workshop’s main findings and key takeaways, in this blog post we illustrate why the DSA’s promise of transparency is not living up to expectations. We first argue that this stems primarily from the fact that Very Large Online Platforms and Search Engines (VLOPSEs) are prioritising the ceremonial aspects of reporting processes at the expense of their substance. We then offer four practical recommendations for improving DSA reporting.
The DSA’s reports: a ritualistic process that prioritises form over substance
With more than 80 provisions establishing transparency obligations, the DSA formally aims at rendering previously opaque processes visible and, most importantly, accessible. The DSA’s transparency obligations require each VLOPSE to publish a variety of documents: transparency reports, reports on systemic risks, audit reports, statements of reasons… Other actors also contribute to this transparency machine, including, for instance, trusted flaggers and out-of-court dispute settlement bodies. While these documents should not be viewed as a panacea, a cross-analysis of the plethora of reports produced through the DSA architecture could be incredibly beneficial, not only for understanding how platforms operate but also for cross-verifying the content of the different reports (the number of items taken down, the reasons for the takedowns, the effective role of trusted flaggers…).
Transparency reports are intended as a fundamental component of the DSA’s success: it is through these transparency obligations that content moderation practices, and the risks stemming from platforms’ design, can be effectively scrutinised. Yet our common assessment shows that the current implementation of these obligations is not improving the general understanding of platforms’ practices and that, overall, the transparency promise of the regulation is not being kept.
One of the main problems identified during the workshop is the manner in which reports are produced and presented. They come in a variety of formats, cover different periods and frame issues inconsistently. They are also hard to digest and difficult to retrieve: locating them can be quite challenging, as they are not always readily available on platforms’ dedicated webpages (such as transparency centres). They are, moreover, published in unsuitable formats: even where reports are machine-readable, their inconsistent formats and structures create unnecessary barriers to external scrutiny.
It is therefore evident from the reports that platforms are mostly engaging with the ceremonial aspects of the reporting processes, neglecting their substance. The self-celebratory tone of some of these documents suggests, in fact, a mere box-ticking approach to compliance. This approach is emblematic of a general concern with the implementation of due diligence obligations by business enterprises, which is often reduced to superficial compliance with the process (in this case, the publication of the reports) at the expense of the outcome the obligations themselves aim to achieve.
This, in turn, has a detrimental effect on the substantive assessment of the quality and added value of these reports for determining and elucidating platforms’ content moderation practices. Not only do the reports fail to clarify platforms’ practices in detail, including the systemic risks identified, the methodologies used, the mitigation measures adopted and their effectiveness, but the lack of standardisation (for a similar analysis see also here and here) makes comparing them difficult. Standardisation does not mean that reports should necessarily follow a specific template. Such an approach may even be counterproductive, further contributing to the production of generic reports that do not sufficiently take platforms’ specificities into consideration and, instead, encourage “cosmetic compliance”. Variation and flexibility, on the contrary, could allow platforms to produce more substantive and revelatory reports depending on their activities and specific operational contexts. Some of these concerns should be alleviated by the standardised approach to transparency reports put forward in the Commission’s Guidelines on the topic. However, this will only cover one category of the reports published under the DSA.
The superficial engagement with the EU regulatory framework is also confirmed in the context of political advertising. During the workshop, it was emphasised that Meta and Google consistently remove, or wrongly label, content relating to political issues. This finding is particularly problematic in light of the fact that both companies have decided to end political, electoral and social issue advertising in the EU to avoid complying with the EU’s Transparency and Targeting of Political Advertising (TTPA) regulation.
Coupling this ceremonial engagement with reporting with evidence of content misclassification, as in political advertising, and the difficulty of effectively cross-analysing the DSA reports, it is evident that the aims of the regulation are not being met: VLOPSEs are not fully complying with their transparency obligations and are failing to make their content moderation practices accessible. The workshop therefore ultimately revealed significant shortcomings in the documents published under the DSA, which will have to be rectified if the regulation is to unleash its full potential.
Key takeaways and recommendations
To fully exploit the DSA’s transparency potential, a collaborative effort is needed to chart a way forward and address these shortcomings. Drawing on the workshop’s findings, the main recommendations emerging from this collective exercise revolve around four general themes: harmonisation, archiving, accountability for political advertising, and collaboration.
1. Harmonisation
A cross-cutting issue that needs to be addressed in future reporting is the lack of harmonisation.
In order to fully capture and harness the added value of the DSA, the methodologies, timeframes and categories used in the documentation need to be harmonised, archived and standardised. Harmonisation should not be understood as the production of a template to be followed blindly by the different stakeholders. Rather, it should be understood as a series of soft guidelines relating to timeframes and methodologies. Some common elements, particularly in the context of systemic risk reports, are in fact necessary in order to compare the documentation and assess the measures implemented by different services. This can be done while taking into account the variety of activities covered by VLOPSEs: guidance does not need to be “universal” but could be tailored to the type of service concerned, such as a marketplace, a pornographic site or a social media platform… The next step here might be to involve the various stakeholders (VLOPSEs, the Commission, and civil society) in order to establish a suitable set of standards or criteria, which are vital to effectively compare the data emerging from the reports and, in turn, scrutinise VLOPSEs’ content moderation practices.
Beyond this soft degree of harmonisation, reports should also be more accessible: they should avoid jargon-filled language and include more source data, as well as details of the assessment methods used.
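To make the comparison problem concrete, the Python sketch below illustrates the extra work that the current lack of harmonisation imposes on anyone comparing reports. Every field name, category label and figure in it is hypothetical, invented purely for this example and not taken from any real VLOPSE report. Even this toy cross-platform comparison of takedown figures only works once a hand-maintained mapping reconciles each platform’s idiosyncratic labels — precisely the step that soft, shared guidelines would make unnecessary.

```python
# Illustrative only: the labels and figures below are invented for this
# example; they are not taken from any real DSA transparency report.

# Two hypothetical transparency reports using different labels for the
# same underlying categories.
report_a = {"Hate speech": 1200, "IP infringement": 300}
report_b = {"hateful_conduct": 950, "copyright": 410}

# A hand-maintained mapping onto a shared vocabulary -- the step that
# common categories agreed across platforms would render unnecessary.
CATEGORY_MAP = {
    "Hate speech": "hate_speech",
    "hateful_conduct": "hate_speech",
    "IP infringement": "intellectual_property",
    "copyright": "intellectual_property",
}

def normalise(raw_counts):
    """Re-key platform-specific categories onto the shared vocabulary."""
    normalised = {}
    for label, count in raw_counts.items():
        shared = CATEGORY_MAP[label]
        normalised[shared] = normalised.get(shared, 0) + count
    return normalised

# Only after normalisation do the figures become comparable.
print({"platform_a": normalise(report_a), "platform_b": normalise(report_b)})
```

Multiply this by dozens of platforms, report types and reporting periods, and the case for common categories and timeframes becomes obvious.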
2. Archiving
During the hackathon, one team, including members of Open Terms Archive, decided to tackle the archiving issue and created a library of systemic risk reports. Given the risk that services discreetly amend their systemic risk reports, a third-party archive of these documents seems essential. To make the reports comparable when they are amended, Open Terms Archive offers a Markdown version of each document.
Platforms, as well as other stakeholders with reporting obligations, should publish their reports in machine-readable formats.
While the European Commission’s centralised webpage is a helpful resource for stakeholders, it merely contains links to individual platforms’ pages (some of which do not even work correctly). DSA documentation should therefore be permanently archived, in a manner that also enables easier traceability and comparability.
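As an illustration of what such an archive involves, here is a minimal Python sketch, assuming a publicly fetchable report page (the URL and file layout are hypothetical, and this is not a description of Open Terms Archive’s actual pipeline): it snapshots a document, stores it under a timestamped, content-hashed name, and diffs it against the previous snapshot so that quiet amendments become visible.

```python
# Minimal third-party archive sketch: snapshot a report, store it with a
# timestamp and content hash, and diff it against the previous snapshot.
# The URL below is hypothetical; this is NOT Open Terms Archive's pipeline.
import difflib
import hashlib
import pathlib
import urllib.request
from datetime import datetime, timezone

ARCHIVE = pathlib.Path("dsa-archive")

def snapshot(url: str, name: str) -> None:
    ARCHIVE.mkdir(exist_ok=True)
    text = urllib.request.urlopen(url).read().decode("utf-8", errors="replace")
    digest = hashlib.sha256(text.encode()).hexdigest()[:12]

    previous = sorted(ARCHIVE.glob(f"{name}-*.txt"))
    if previous and digest in previous[-1].name:
        return  # unchanged since the last snapshot

    stamp = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%SZ")
    (ARCHIVE / f"{name}-{stamp}-{digest}.txt").write_text(text, encoding="utf-8")

    if previous:  # surface what was quietly amended
        old = previous[-1].read_text(encoding="utf-8").splitlines()
        for line in difflib.unified_diff(old, text.splitlines(), lineterm=""):
            print(line)

# Hypothetical report URL, for illustration only.
snapshot("https://example.com/systemic-risk-report.html", "example-vlopse")
```

Run periodically, even this simple approach yields the traceability and comparability that the current link collections lack: every version is preserved, and changes between versions are immediately inspectable.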
3. Accountability for political advertising
With respect to political advertising, ad archiving and labelling need to be strengthened. Here again, the role of civil society is crucial: researchers and associations should archive and report misidentified content as systematically as possible.
Regulators also have a vital role to play, especially in defining clear and harmonised labelling criteria that are consistent across all platforms.
4. Collaboration
Participants left the day with renewed energy, new perspectives and a wish to continue promoting the proper implementation of the DSA. In line with the report “Putting collective intelligence to the enforcement of the DSA”, it therefore appears beneficial to repeat this kind of collaborative event. Cooperation between experts, each of whom brings different perspectives to the table, allows for a better understanding of the DSA reports and for closer monitoring. This experience demonstrates how a collective analysis of the documentation creates a dynamic and stimulating environment in which to measure and critically assess the actual implementation and effectiveness of the DSA.
Next Steps
The findings of the “Hack the DSA” workshop demonstrate that the DSA’s transparency promise has so far failed to deliver. This conclusion is not isolated, and it is further compounded by the fact that researchers’ data access, the simplest yet most critical transparency mandate under the DSA, has only been possible since 29 October 2025, following the entry into force of the delegated act on data access. Nonetheless, the regulation still has the potential to significantly improve our understanding of online content moderation practices, provided that VLOPSEs meaningfully engage and fully comply with their DSA obligations.
The transparency potential of the DSA is therefore yet to be fully realised, and closer scrutiny of its implementation by all the relevant stakeholders is essential. A crucial conclusion of our “Hack the DSA” experience is that there is considerable scope for improving the implementation of the DSA’s transparency obligations, as evidenced by the recommendations we have put forward. These recommendations are designed to address some of the current shortcomings in DSA reporting. They also aim to ensure that the reports actually offer a clearer understanding of platforms’ opaque practices and, as such, become a useful transparency tool. By mobilising collective intelligence, collaborative initiatives can play a decisive role in effectively enforcing the DSA: by revealing enforcement shortcomings, these collective workshops can strengthen the accountability value of the DSA, shedding light on VLOPSEs’ engagement and compliance with the instrument. Such initiatives also offer a refreshing way to engage with dense (and, at times, even boring!) documentation and to formulate shared recommendations for further improving DSA reporting. We can therefore only hope that more collective workshops will be organised in the future to further unpack transparency practices.