Here is what a strong Digital Services Coordinator should look like
The Digital Services Act requires EU member states to name a “Digital Services Coordinator” (DSC) to coordinate national regulators involved in platform oversight. But the DSCs are more than just “coordinators,” as they have to fulfill specific oversight tasks themselves. That is why member states should resist the temptation to build a small-scale coordinator and instead build a strong DSC with skills in data analysis, community management and flexible case-based work.
“[L]andmark rules adopted for a safer, open online environment”, the European Parliament declared on its website in the summer of 2022. Parliament had just concluded its negotiations with member states and the European Commission on the Digital Services Act (DSA). The law provides for new, EU-wide rules for platforms and online marketplaces, including Amazon, Facebook, TikTok and YouTube. The DSA is indeed a big accomplishment. To be sure, there are still clear weaknesses in the law, for example, the limited progress on the problem of “deceptive platform design”, or aspects of the “crisis mechanism” introduced shortly before the end of negotiations. Nonetheless, it is at least a recognition that self-regulation by the tech industry has often been insufficient to protect fundamental rights, ensure consumer protection and enable research.
But whether the DSA will actually create a “safer, open online environment” is still completely open. The DSA’s success depends on how well it is enforced. The best rules on paper will not achieve much if regulators are not willing or able to enforce them. An unfortunate example of this is the EU’s General Data Protection Regulation (GDPR), which has suffered from weak enforcement over the past several years. For the DSA, the EU clearly wanted to learn from the GDPR, which is reflected in the DSA’s enforcement regime.
How the DSA is supposed to be enforced
The DSA distinguishes between different types of platforms (for instance, hosting providers, search engines and online marketplaces) and between platforms of different sizes. Platforms with more than 45 million users per month in the EU are considered “very large online platforms” (VLOPs). For them, the DSA lists special due diligence requirements (Articles 33-43). Compliance with these due diligence requirements is monitored solely by the Commission (Articles 56, 65-78), with the aim of ensuring consistent, EU-wide application, in contrast to GDPR enforcement. Other rules for very large platforms are also mainly the responsibility of the Commission. For smaller intermediary service providers, national authorities in the respective member states are responsible. Unlike the GDPR, the DSA sets clear timelines for cross-border cooperation between national regulators (Articles 57-59).
In each member state, multiple authorities may be tasked with enforcing the DSA (Article 49; Recital 113). This is likely to be the case in many member states because the DSA touches on issues as diverse as consumer protection, media regulation and data protection, for which countries often have separate authorities instead of one dedicated platform regulator. However, to coordinate these agencies and ensure exchange at the EU level, there must be a single body in each member state acting as the DSC (Articles 49-51). In addition to this coordination role, the DSC must perform important oversight functions itself: It is the complaints body for all users (Article 53), it vets researchers seeking data access to platforms (Article 40) and it can certify out-of-court dispute resolution bodies (Article 21), among other things. It is also part of the newly created European Board for Digital Services (Articles 61-63), bringing together all DSCs and the Commission. While this is mainly an advisory body, the Board can also initiate proceedings against platforms of all kinds (Articles 58, 60) and recommend actions in crisis situations (Article 36).
Who will the DSCs be and what do they have to do exactly?
Legislators in the member states now face the decision of how to set up their DSCs. Will it be an entirely new body? If so, does it take over as the country’s single platform agency, merging other national regulators’ platform-related tasks into its portfolio? Or is it set up merely as a secretariat, forwarding most tasks to existing regulators? If no new body is created, which national regulator will additionally take on the role of the DSC? At least in the months after the conclusion of the DSA, it did not seem as if member states were keen on building the DSC as a centralized, all-in-one platform regulator. Rather, an existing regulator will take on the additional DSC role, allowing other authorities to potentially fulfill some DSA enforcement tasks as well. For instance, France is likely to pick its newly merged audiovisual and digital communications agency Arcom as the DSC, and Ireland is building a new Media Commission, which will be tapped as the DSC. If there is a DSA enforcement case involving questions on data protection, the French and Irish DSCs would have to coordinate with the respective national data protection agencies or forward the case to them. This could happen, for example, for questions surrounding online advertising transparency or deceptive platform design. In Germany, it is likely that a federal regulator will be named DSC, which would then have to work with state-level media authorities for DSA proceedings pertaining to media regulation. These cases highlight the coordination role the DSC has to play at the national level, ensuring information exchange between regulators.
Yet, there are other tasks that the DSC must fulfill that go beyond national coordination. Consider these hypothetical sample cases and the oversight duties they each entail:
- A DSC requests data from a VLOP to check how the platform detects and mitigates potential risks for public health emanating from its services (Articles 34(1)(d), 40(1)), for example, the sale of supposed miracle cures for COVID-19.
- A person in Italy suspects that an online service operating in Italy but based in Ireland uses deceptive design practices. The user files a complaint with the Italian DSC, which does a first evaluation and forwards it to Ireland (Article 53).
- A research team at a university has found potential DSA violations at a VLOP and alerts the DSC in their country. The DSC writes up a reasoned request for the Commission to become active (Articles 65, 66). This leads to an investigation, in which the Commission involves the DSC (Articles 67(5), 68(2), 69(7)).
- The European Board for Digital Services, which is made up of all DSCs, gives a recommendation to the Commission regarding a current crisis situation, which allows the Commission to require short-term measures from VLOPs (Article 36).
- The internal complaint handling mechanism at a VLOP does not follow DSA standards. The Commission has yet to become active on this and the local DSC is either unwilling or unable to act. Three other DSCs request an investigation via the Board, ultimately leading to a joint cross-border investigation (Articles 56(4), 58(2), 60(1)(b)).
Recommendations for building a strong DSC
The fictitious cases show that the DSC requires skills and structures not only to coordinate various national agencies but also to conduct data analyses, build a community of researchers and other stakeholders, and take part in EU-level enforcement actions. The DSA emphasizes this important role for the DSCs, too (Recital 111), and stipulates certain requirements for the DSC. It needs to be “completely independent” from political and business interests (Article 50(2), Recital 111), have certain investigatory and enforcement powers (Article 51) and adequate resources (Article 50(1), Recital 111). Considering these formal requirements and the demands of the sample cases, what could a strong DSC look like?
Independence by law and design
The legal bases to create independent regulators will vary across member states, but beyond this, the way the DSC is built can provide for some independence as well. Leadership that is well-versed in economic and societal issues related to platforms and that is not picked (only) by the government can strengthen the DSC’s standing. A transparency registry documenting meetings with lobbyists from all sides in real-time, cooling-off periods for job changes between the DSC and platforms, and strong whistleblower protections could help the DSC gain and maintain the public’s trust. Moreover, the DSC should not receive instructions from the government, should have its own budget and regularly report to parliament as well as the public.
Platform experts and data science
The DSA is a data-generating piece of legislation. It contains 20 reporting obligations for VLOPs, the Commission or DSCs, there are various transparency and evaluation reports and, crucially, DSCs and vetted researchers have the right to request data from VLOPs. Analyzing different types of data will require data science capabilities at the DSC. Various governments and regulators have begun to establish data science units, one example being the French Pôle d’Expertise de la Régulation Numérique (PEReN). The DSC should build up this expertise as well, functioning as the primary national hub for platform research. With a dedicated research budget and data science unit, it could both finance external research and conduct its own studies, especially with a long-term view that civil society research often cannot afford. Data science skills should be paired with the specific expertise needed to understand the systemic platform risks the DSA tries to tackle. For example, the DSC needs to recruit and retain talent well-versed in content moderation, human rights impact assessments, fact-checking and risk management.
Community- and capacity-building via fellowships and an expert advisory council
In addition to developing internal expertise, the DSC should foster structures and a culture that actively engages and builds a community with platform experts. One way to do this could be via fellowships. Companies and not-for-profits employ various forms of fellowships that the DSC could draw inspiration from. Some regulators use this tool as well. For instance, the UK’s Information Commissioner’s Office seeks fellows to help answer tech policy questions. Another way to tap into external expertise is creating a DSC advisory council or roundtable made up of experts from academia, civil society, business, media and maybe also platform users. These experts could help in a lot of the cases mentioned above, for example, regarding data access, complaints or cross-border investigations. There are lots of examples of advisory councils that offer good and bad practices for the DSC, as does the debate around “social media councils”. In general terms, for the advisory council not to be just a talking shop, it is necessary to clearly define its role and tasks, incorporate different perspectives and delineate its responsibilities from those of the DSC. Such a structural, continuous forum of exchange might further increase trust in the DSC’s oversight work.
Flexible, cross-regime, case-based task forces
The DSC should be designed to work in case-based project groups or task forces. This seems like a suitable set-up for many EU countries, because the topics the DSA addresses are often spread across multiple regulatory fields (for example, consumer protection and media regulation). Thus, cross-regime regulatory cooperation will be necessary, which could be done by a task force composed of those national regulators with expertise on the specific case. If the case at hand can be clearly placed within the remit of a particular regulator, enforcement would remain with this regulator and the DSC would merely serve as a forum for information exchange. In cases of overlapping or missing responsibilities for some DSA rules, the DSC would step in to oversee compliance itself. For this to work, a collaborative mindset among regulators and a well-built communication system that can connect to or is based on the information exchange system the Commission is building (Article 85) are crucial. This type of case-based approach is nothing new at all. It is common not only in private companies but also among regulators. Individual regulatory decisions – be it regarding monopolies, TV licenses or electricity grids – are, after all, “cases.” Within the Commission’s Directorate-General for Competition, “case handlers” work on files, for instance regarding antitrust or state aid. Furthermore, the Commission’s proposals on enforcing the DSA and its restructuring of the Directorate-General for Communications Networks, Content and Technology (DG CNECT) hint at a case-based approach as well.
Member states’ turn to build innovative platform regulators
When building the necessary structures for a strong DSC, member states might face legal and financial hurdles as well as political opposition. For instance, the inter-agency work envisioned for DSC task forces might not be feasible or, in fact, desired by existing agencies. Finding (and keeping) the right staff is hard anyway, but long recruitment processes in the administration and competition with big tech companies might make this more challenging. Budgetary debates might cause a stir, especially if existing agencies feel that a new “Coordinator” might take money or powers away from them. Ideally, member states would embrace these challenges and treat the development of the DSC as an opportunity to create a platform regulator that is fit to take on future tasks in this area as well, considering that other EU legislation on artificial intelligence and the data economy is pending. While a dedicated, specialized national platform regulator would be the most suitable solution, this scenario is unlikely in most member states in the short term. Rather, EU countries will each tap an existing agency as DSC. As a first step in this likely scenario, member states should bring together key national regulators with DSA oversight functions as well as academic and civil society experts to build a strong system for information exchange for the DSC. This could function as a trust-building exercise for agencies that will have to work together to enforce the DSA in the future anyway. Without this type of cooperation to ensure robust enforcement, it will be much harder to proclaim that the DSA has contributed to a “safer and open online environment.”
Amended on 10 February 2023 to reflect the Irish Media Commission’s designation as DSC.