Towards a Digital Constitution
How the Digital Services Act Shapes the Future of Online Governance
Digital Service Providers (DSPs) like Google, Facebook, and Twitter/X have become key players in the modern digital landscape, influencing social interactions, political discourse, education and research, and cultural norms. Their widespread impact, however, brings challenges such as intellectual property (IP) infringement, privacy issues, hate speech and other dangerous speech, misinformation, and political manipulation, highlighting the need for effective governance.
The European Union’s Digital Services Act (DSA) is a significant step in addressing these challenges, redefining digital platform regulations. It focuses on content moderation, user rights, and balancing regulation with innovation. The DSA aims to clarify platform responsibilities in content moderation, ensuring transparency and accountability, while protecting user rights and fostering digital market growth.
The DSA exemplifies the EU’s efforts to create a fairer, more responsible digital environment. Through the DSA, the EU appears to be advancing a process of constitutionalisation of Internet governance, an important milestone in the evolving landscape of “digital constitutionalism”. It aims to establish a unified framework of rights, principles, and governance norms for the digital space, while also contributing to the development of new governance structures and regulatory bodies dedicated to effectively safeguarding fundamental rights online. This shift from reliance on private, market-driven solutions to a democratic, fundamental rights-centered approach represents a major change in perspective. Importantly, this trend extends beyond the EU and is gaining traction in various jurisdictions worldwide. Legislative initiatives like the UK’s Online Safety Bill and Brazil’s “Fake News” Bill also reflect a move towards public governance in moderating online content. Such a multi-faceted approach to digital constitutionalism is increasingly seen as a practical response to the legitimacy crisis in privately managed online content moderation.
Digital Service Providers and Fundamental Rights: A Balancing Act
Large DSPs have transcended their roles as mere content hosts to become active shapers of public discourse and gatekeepers of information access. This transformation has significant implications for the democratic process and the exercise of fundamental rights, particularly in the realms of free speech and privacy. A striking example of the complex role DSPs play in moderating public discourse is Twitter’s January 2021 decision to suspend the account of then-U.S. President Donald Trump. This action sparked a global debate on the limits of free speech and the power of private companies over public communication channels. Similarly, Facebook’s approach to content moderation, especially during politically charged events, has raised questions about the role of DSPs in influencing electoral processes and shaping political narratives. These incidents underscore the delicate balancing act DSPs must perform between allowing open discourse and curbing misinformation and harmful content.
The legal and ethical considerations of DSPs’ content moderation policies are multifaceted. On the one hand, there is a legal imperative to adhere to national laws and regulations regarding illegal content. On the other, DSPs face ethical dilemmas when their policies intersect with issues of free expression and censorship. The European Union’s General Data Protection Regulation (GDPR) and the DSA are legislative attempts to provide a framework for addressing these challenges, aiming to ensure that DSPs operate transparently and are held accountable for their content moderation decisions.
The impact of DSPs’ content moderation policies on democratic processes and individual rights is profound. The role of DSPs in shaping public discourse and information access is a double-edged sword. While these platforms have the potential to enhance democratic engagement by providing a space for public discourse, their algorithms and moderation policies can also lead to the silencing of voices and the suppression of certain viewpoints. This has led to concerns about the “echo chamber” effect, where users are only exposed to information that reinforces their existing beliefs, and the potential for algorithmic bias, which can inadvertently marginalize certain groups.
Balancing these competing interests—ranging from freedom of expression to freedom to conduct a business, and from the right to an effective remedy to privacy and data protection—is a complex challenge that requires careful consideration of both legal and ethical dimensions. The EU’s framework for online fundamental rights forms a complex but pragmatic scaffold upon which to construct a comprehensive platform liability regime. It emphasizes the need to strike a balanced approach that respects the nuanced interplay among various fundamental rights. While the regulatory fabric laid out by the EU Charter and European Convention on Human Rights allows for the imposition of obligations on DSPs, these must be carefully calibrated to protect the ecosystem of online platforms—from large, commercial entities to smaller, non-profit players. Importantly, IP rights, although recognized, are not to be overprotected to the point of overshadowing other fundamental rights or societal interests.
Evolving Liability and Regulatory Frameworks: From E-Commerce to Digital Services
The legal frameworks governing DSPs have undergone significant evolution, mirroring the rapid development and growing influence of digital platforms in our society. The 2000 EU E-Commerce Directive marked the beginning of formalized legal regulation for online services. It set the foundation for the digital market within the EU, primarily focusing on creating a harmonized environment for electronic commerce and introducing the concept of limited liability for service providers. This directive laid the groundwork for the regulation of digital services, although it was crafted in a different era of the internet, where the roles and impacts of DSPs were considerably different from today.
The regulatory landscape has since diversified, with regions like the EU, U.S., and others adopting varying approaches. In the EU, recent legislative developments, notably the DSA, represent a paradigm shift that could potentially widen a transatlantic divide. The DSA aims to modernize the digital market’s regulatory framework, addressing contemporary challenges like online harm and platform influence. This approach contrasts with the U.S., where Section 230 of the Communications Decency Act still provides broad immunity to online platforms from liability for user-generated content, a principle that has been pivotal in the growth of these platforms but also a subject of intense debate and calls for reform.
Creating a global standard for digital governance remains a formidable challenge, given the divergent legal and cultural contexts across regions. The global internet landscape comprises various stakeholders with differing priorities and values, making the harmonization of digital laws an intricate task. This diversity often leads to conflicts of jurisdiction and enforcement, exemplifying the complexities of regulating a borderless digital space.
The shift towards more stringent regulations reflects a growing recognition of the substantial impact DSPs have on public discourse, individual rights, and market competition. The DSA, for instance, introduces more robust obligations for platforms, such as transparency in content moderation, due diligence, and increased accountability. While these regulations aim to create a safer and more trustworthy digital environment, they also pose challenges for DSPs and users. For platforms, the increased responsibility and compliance requirements could impact operational models and innovation strategies. For users, while these changes promise enhanced protection and rights, they may also lead to increased content moderation and potential overreach.
Content Moderation: The Interplay of Private Ordering and State Influence
Content moderation on digital platforms represents a complex and multifaceted challenge, intricately weaving together technology, law, and ethics. DSPs are at the forefront of this challenge, grappling with the monumental task of monitoring and moderating the vast amounts of content uploaded daily. The core of this moderation effort increasingly relies on sophisticated algorithms designed to detect and filter harmful and illegal content. However, these automated systems are not without their shortcomings. Issues of algorithmic bias and a lack of transparency have raised significant concerns, as they can inadvertently silence certain voices or amplify harmful narratives.
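To make the limits of automation concrete, the following minimal Python sketch shows the shape such a filtering pipeline often takes: a classifier score is compared against thresholds, and only the ambiguous middle band is escalated to a human reviewer. The thresholds, labels, and function names are illustrative assumptions, not any platform’s actual system, and any bias in the underlying model flows directly into the automated outcomes.

```python
from dataclasses import dataclass

# Illustrative thresholds; real systems tune these per policy area and language.
AUTO_ACTION_THRESHOLD = 0.95   # above this, content is actioned without prior human review
HUMAN_REVIEW_THRESHOLD = 0.60  # between the two thresholds, a human reviewer decides

@dataclass
class ModerationDecision:
    item_id: str
    score: float   # model confidence that the item violates policy
    outcome: str   # "auto_removed", "human_review", or "published"

def triage(item_id: str, score: float) -> ModerationDecision:
    """Route a content item based on a hypothetical classifier confidence score."""
    if score >= AUTO_ACTION_THRESHOLD:
        outcome = "auto_removed"      # high-confidence automated action
    elif score >= HUMAN_REVIEW_THRESHOLD:
        outcome = "human_review"      # ambiguous cases are escalated to a person
    else:
        outcome = "published"         # low risk, no intervention
    return ModerationDecision(item_id, score, outcome)

# Example: three items with different classifier scores.
for item_id, score in [("post-1", 0.98), ("post-2", 0.72), ("post-3", 0.10)]:
    print(triage(item_id, score))
```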
The relationship between government policies and the content moderation practices of private platforms adds a further layer of complexity. Governments worldwide, each operating within their own cultural and legal frameworks, push platforms to adhere to local laws and societal norms. This influence ranges from explicit legal requirements, such as those in the EU’s DSA, to subtler forms of political and public pressure, all of which shape content moderation policies. This interplay raises concerns about the independence of DSPs and the potential for state censorship under the guise of regulatory compliance. The challenge lies in striking a balance between safeguarding freedom of expression, a fundamental right in democratic societies, and preventing the dissemination of harmful content. In this context, platforms grapple with the ethical and technical complexities of fostering open discourse while minimizing the impact of harmful content like hate speech and misinformation on public safety and social harmony.
From a societal, public-interest perspective, users’ freedom of expression (and information) is crucial, given its role as part of “the essential foundations of [a democratic] society, one of the basic conditions for its progress and for the development of every man”. Optimal regulation in the field of platform governance must therefore first preserve users’ and citizens’ rights, since more online enforcement – and potential over-enforcement – means less access to information and less freedom of expression, and thus a shrinking space for the debate essential to democracy. The centrality of users’ rights – and the overall goal of the EU legal system to preserve those rights against invasive proactive algorithmic enforcement – was reiterated by the Grand Chamber of the CJEU in its judgment of 26 April 2022 in Case C-401/19 (Poland v Parliament and Council), which arguably acknowledges a fundamental right of users to share content online that cannot be limited by algorithmic content moderation.
The Digital Services Act: Towards a Fair and Transparent Digital Market
At its core, the DSA seeks to modernize the regulatory framework for digital services, addressing the challenges and opportunities presented by the evolving digital landscape.
The DSA is built on a foundation of key provisions that aim to reshape the way digital services operate. One of its primary objectives is to enhance transparency, particularly in areas such as content moderation and advertising. By requiring platforms to disclose how they target and amplify content, the DSA promotes a more open digital environment. Furthermore, the act introduces stringent measures against illegal content online, mandating platforms to swiftly address such issues while providing clear reporting mechanisms for users.
A pivotal aspect of the DSA is its emphasis on accountability. The legislation imposes a due diligence obligation on DSPs, making them more responsible for the content they host and the services they provide. This shift signifies a move away from the laissez-faire approach that has predominantly governed the digital sphere, marking a new era where platforms are held to higher standards of responsibility.
The potential impact of the DSA on innovation, user rights, and platform responsibilities is profound. By establishing clearer rules, the DSA offers a stable legal environment that can foster innovation and growth. For users, enhanced protections and greater transparency mean more control over their digital experiences and improved safeguarding of their rights. For platforms, and Very Large Online Platforms and Very Large Online Search Engines in particular, the DSA introduces new responsibilities and challenges, requiring them to adapt their operations to comply with stricter regulatory standards, including risk assessment and mitigation for algorithmic processes that might affect users’ fundamental rights.
The DSA has the potential to serve as a model for global digital governance. Its comprehensive approach to digital regulation addresses many of the issues that have emerged in the digital age, setting a precedent for other countries and regions. By striking a balance between protecting user rights and fostering a healthy digital economy, the DSA could influence future legislation worldwide, promoting a more harmonized approach to digital governance. However, although it introduces innovative regulatory mechanisms for platform governance, it is also an exceptionally intricate and lengthy legislative document, where preference for national oversight strategies over unified European approaches risks further complicating its harmonised implementation. Given these complexities, subsequent revisions and fine-tuning, also via delegated regulation, will inevitably be required to best protect fundamental rights in a rapidly evolving digital landscape.
Conclusion: Charting a Path Towards Digital Constitutionalism
Navigating the digital age highlights the need for effective digital governance, as discussed in this blog post. DSPs play a key role in public discourse and are central to evolving regulatory frameworks, especially in content moderation. The DSA marks a major shift in digital governance, focusing on transparency, accountability, and user protection, setting standards for DSPs to foster a safe, reliable, and innovative digital environment. Yet, digital governance is an evolving journey. The rapidly changing digital landscape presents continuous challenges and opportunities, requiring adaptable governance strategies.
The DSA, while providing a solid foundation, is only the beginning of an ongoing process of refinement and development. Looking ahead, its emphasis on fundamental rights, transparency, and regulatory oversight could guide transatlantic and global digital governance. With the DSA serving as a potential model for other nations crafting their digital strategies, we distill ten key principles rooted in its fundamental rights-centered approach. These principles not only offer a blueprint for global digital governance but also serve as a valuable reference for other jurisdictions looking to update their legal frameworks for platform liability:
(1) DSP regulation in information societies is crucial for democratic information access and expression, requiring a balance of fundamental rights to uphold democracy and the rule of law. Past DSP regulations have faced challenges in balancing competing fundamental rights.
(2) The DSA aims to balance interests while upholding rights, but its complexity and national oversight preference complicate implementation. Revisions are needed, and “digital constitutionalism” offers insights.
(3) The EU E-Commerce Directive and C-DSM Directive shaped DSP liability, with the DSA maintaining fundamental rights balance in a changing landscape.
(4) Fundamental rights balancing in the DSA should be guided by European human rights texts and case law, with international standards serving as a reference. Challenges include:
a) Avoiding additional limitations on freedom of expression, including access to information, information sharing, and artistic expression; permissible restrictions vary with the nature of the content, while hate speech and incitement to violence fall outside the scope of protection.
b) Balancing DSPs’ monitoring obligations with privacy rights by restricting wholesale data processing in monitoring and filtering, and by prioritizing access to public-interest information.
c) Ensuring due process in algorithmic enforcement by DSPs.
d) Considering the impact of disproportionate obligations on DSPs’ freedom to conduct a business, with special emphasis on Small and Medium-sized Enterprises and start-ups.
e) Balancing IP rights with the public interest and avoiding algorithmic over-enforcement, in order to safeguard creativity and expression, media pluralism, and the right to information online.
(5) The DSA modernizes the e-Commerce Directive, emphasizing ex-post moderation over proactive measures and maintaining a ban on general monitoring obligations. Exceptions to this rule should be rare, primarily for manifestly illegal content that doesn’t require independent assessment. Relying solely on automated filters for content moderation is ill-advised due to technological limitations. Adhering to the “human-in-command” principle is essential for accurate and nuanced content moderation.
(6) The DSA distinguishes between illegal and harmful content, focusing on harmonizing rules for illegal content. From a freedom of expression perspective, controversial content should not be censored simply because it may make the audience uncomfortable. Different regulatory approaches should be applied to illegal and manifestly illegal content.
a) Manifestly illegal content includes content promoting offenses against human dignity, war crimes, crimes against humanity, human trafficking, incitement to violence, acts of terrorism, and child abuse. It may also encompass content blatantly infringing on IP rights without the need for equity-based assessment. Such content should be clearly defined to avoid ambiguities.
b) For content that is illegal but not manifestly so, requiring human review for legality assessment is necessary. Further independent scrutiny should be available upon request, with consistent standards for expeditious removal within a reasonable timeframe.
c) When content is harmful but not outright illegal, complete removal may not be the best approach from a freedom of expression standpoint. Alternative strategies, such as content flagging by DSPs and users along with counter-speech mechanisms like “like” or “dislike” buttons, should be explored, and users should have more control over the type of content they engage with (a minimal sketch of this graduated approach follows).
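To tie points (a) to (c) together, here is a minimal Python sketch of how such a graduated response policy might be encoded. The category names and responses are illustrative assumptions drawn from the discussion above, not a prescribed implementation:

```python
from enum import Enum

class ContentCategory(Enum):
    MANIFESTLY_ILLEGAL = "manifestly_illegal"   # e.g. child abuse material, incitement to terrorism
    ILLEGAL = "illegal"                         # unlawful, but requires legal assessment
    HARMFUL_BUT_LEGAL = "harmful_but_legal"     # lawful yet potentially harmful content
    PERMISSIBLE = "permissible"

def graduated_response(category: ContentCategory) -> str:
    """Map a content category to a proportionate moderation response."""
    if category is ContentCategory.MANIFESTLY_ILLEGAL:
        return "expedited removal with human confirmation"
    if category is ContentCategory.ILLEGAL:
        return "human legality review before removal, independent scrutiny on request"
    if category is ContentCategory.HARMFUL_BUT_LEGAL:
        return "keep online: flagging, context labels, counter-speech, user-side controls"
    return "no intervention"

for category in ContentCategory:
    print(f"{category.value}: {graduated_response(category)}")
```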
(7) Enhanced procedural guarantees for content moderation include:
a) Increased user access to information and opt-out options.
b) Efficient notice-and-action mechanisms with procedural safeguards, enabling the swift reinstatement of unjustly removed content (see the sketch after this list).
c) Keeping notified content accessible during review, exempting DSPs from liability.
d) Transparency and human oversight in decision-making.
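As a rough illustration of the notice-and-action lifecycle described in point (7), the following Python sketch models it as a small state machine in which notified content stays accessible during review and removals can be reversed on appeal. The states and transitions are assumptions for illustration, not the DSA’s prescribed procedure:

```python
from enum import Enum, auto

class NoticeState(Enum):
    RECEIVED = auto()      # notice submitted by a user or trusted flagger
    UNDER_REVIEW = auto()  # content stays accessible while it is assessed
    REMOVED = auto()       # found illegal and taken down, with a statement of reasons
    REJECTED = auto()      # notice unfounded, content remains online
    REINSTATED = auto()    # removal overturned on appeal

# Allowed transitions of the (hypothetical) notice-and-action workflow.
TRANSITIONS = {
    NoticeState.RECEIVED: {NoticeState.UNDER_REVIEW},
    NoticeState.UNDER_REVIEW: {NoticeState.REMOVED, NoticeState.REJECTED},
    NoticeState.REMOVED: {NoticeState.REINSTATED},   # appeal path
    NoticeState.REJECTED: set(),
    NoticeState.REINSTATED: set(),
}

def advance(current: NoticeState, nxt: NoticeState) -> NoticeState:
    """Move a notice to its next state, refusing transitions the workflow does not allow."""
    if nxt not in TRANSITIONS[current]:
        raise ValueError(f"cannot go from {current.name} to {nxt.name}")
    return nxt

state = NoticeState.RECEIVED
for step in (NoticeState.UNDER_REVIEW, NoticeState.REMOVED, NoticeState.REINSTATED):
    state = advance(state, step)
    print(state.name)
```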
(8) The DSA regulates algorithmic content moderation, emphasizing fundamental rights and requiring:
a) Transparency and non-discrimination in algorithms.
b) Human review of algorithmic decisions.
c) Periodic audits and compliance oversight by independent regulators.
d) Risk assessments and mitigation protocols.
e) Yearly transparency reports on algorithmic moderation.
However, there’s room for refining the DSA’s approach to algorithmic transparency and accountability to counter the challenges of algorithmic opacity. Specific obligations could be introduced to address issues like algorithmic bias, provide clearer explanations for automated decision-making logic, ensure transparency around data sets used for algorithmic training, and establish robust redress mechanisms to handle potential harm arising from algorithmic decisions.
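As an illustration of what the record-keeping behind points (8)(c) to (e) could look like in practice, the sketch below defines a per-decision record loosely inspired by the DSA’s statement-of-reasons idea and aggregates it into the kind of summary figures a yearly transparency report or an audit might disclose. The field names and structure are assumptions for illustration, not the DSA’s actual template or any official API:

```python
from dataclasses import dataclass, asdict
from collections import Counter
import json

@dataclass
class StatementOfReasons:
    """Illustrative per-decision record; field names are assumptions, not the DSA's template."""
    decision_id: str
    ground: str              # e.g. "illegal_content" or "terms_of_service"
    automated_detection: bool
    automated_decision: bool
    human_review: bool
    redress_available: bool = True

def aggregate(records: list) -> dict:
    """Summary figures of the kind a yearly transparency report might disclose."""
    return {
        "total_decisions": len(records),
        "fully_automated": sum(r.automated_decision and not r.human_review for r in records),
        "with_human_review": sum(r.human_review for r in records),
        "decisions_by_ground": dict(Counter(r.ground for r in records)),
    }

records = [
    StatementOfReasons("d-001", "illegal_content", True, True, False),
    StatementOfReasons("d-002", "terms_of_service", True, False, True),
]
print(json.dumps(aggregate(records), indent=2))
print(json.dumps([asdict(r) for r in records], indent=2))  # auditable per-decision log
```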
(9) The DSA proposes specialized oversight bodies for monitoring DSP compliance. This oversight should be implemented according to the following guidelines:
a) A centralized EU entity for harmonized DSA implementation and policy guidelines, with a focus on fundamental rights, should operate in partnership with the EU Agency for Fundamental Rights (and, possibly, other existing regulatory authorities as well as future ones created to manage creativity online).
b) This entity should serve quasi-judicial functions in content moderation, acting as a final dispute resolution authority for borderline cases and setting precedents for DSP moderation practices. However, this resolution option should not replace users’ ability to seek recourse through an independent judiciary.
c) An Ombudsperson could represent users in these proceedings, ensuring their rights receive adequate protection.
(10) DSP obligations should be proportional and clear, avoiding impractical or ambiguous requirements that hinder business freedom and create barriers for Small and Medium-sized Enterprises. The DSA’s nuanced approach to assigning responsibilities based on size and market share should be a benchmark for future content moderation regulations.