30 August 2021

The DSA Proposal’s Impact on Digital Dominance 

One of the most pressing questions in the ongoing European debates about the Digital Services Act (DSA) proposal is that of dominance, and specifically the disproportionate power and societal impact of dominant services on online speech and the fundamental rights of users.

With the DSA proposal, the European Commission aims to provide a new regulatory framework for the responsibility of online services in the EU internal market. Specifically, the DSA departs from the self-regulatory paradigm for online service responsibilities. It sets out to overcome the existing fragmentation and regulatory gaps by defining clear and proportionate obligations for online services with regard to illegal content and content moderation practices more generally, obligations which reflect the differences in resources and societal impact among the various actors on the market. In the Commission's intention, legal clarity should, in turn, translate into a safer online environment, in which providers are held accountable and users' fundamental rights (in particular, freedom of expression, privacy and data protection, non-discrimination and the right to an effective legal remedy) are duly protected.

Thus, the DSA aims to provide a harmonized regulatory framework for addressing online harms while protecting users' fundamental rights. However, there is a risk that imposing accountability under the threat of fines might increase the power of already dominant intermediaries. This problem is particularly evident for content moderation, where the DSA framework threatens to further strengthen the role of Big Tech in determining what counts as acceptable online speech. Over the last decades, a handful of services have consolidated their position as the primary arbiters of speech and online activity. The fact that Facebook is actively calling for increased regulation, widely considered to be informed by a wish to further consolidate its power, may serve as a warning about the potential anti-competitive impacts of regulation such as the DSA.

In this contribution, we discuss the question of whether the DSA can be expected to further entrench the power of dominant services. First, we consider how the DSA impacts the relative competitiveness of dominant and smaller services, and how the economics of content moderation tend to favour very large online platforms (VLOPs). Second, we focus on a selection of provisions in the DSA and how these have the potential to entrench VLOPs’ dominance and private power, while providing suggestions for better safeguards.

The Likely Economic Impact of the DSA

We can start with the question of whether the DSA can significantly alter the power dynamics underlying innovation and competition in the market for intermediary services. While the DSA focuses on issues of liability for illegal content, and on responsibility in content moderation processes more generally, a separate proposal, the Digital Markets Act (DMA), aims to address issues of competition in digital markets. General issues of economic dominance and monopoly power will have to be addressed through competition law and the DMA.

With the DSA, a central goal of the European Commission (EC) related to competition is to overcome legal fragmentation with a set of harmonized rules and to provide “the conditions for innovative digital services to emerge and scale up in the internal market” (Recital 4). Legal fragmentation has been a problem since the adoption of the eCommerce Directive (ECD), which left crucial details to self-regulation and national law. In recent years, fragmentation has increased further as a result of legislative developments at the national level (such as the German NetzDG). While a more harmonized framework would certainly benefit dominant companies, which are well positioned to respond to a more strictly harmonized EU regulatory environment, a recurring fear is that smaller service providers might falter.

However, the EC’s Impact Assessment accompanying the DSA proposal is very positive about the benefits of overcoming fragmentation for smaller companies: they would be able to scale up their offerings in a more robust EU market. Focusing on the increased cross-border turnover resulting from harmonization, the EC estimates a cost reduction of around €400,000 per annum for a medium enterprise operating in three Member States, and of €4 million for the same company operating in ten Member States. In the EC’s view, the cost savings would be particularly beneficial to micro and small enterprises, which encounter prohibitive costs when offering services in more than two Member States.

The EC acknowledges that compliance with the DSA obligations entails additional costs for all hosting service providers, but according to its calculations these costs would be lower than those of facing a fragmented legal environment. Estimating the DSA-related expenditures is complicated, though, as these costs depend heavily on the volume of notices received by the individual service provider.

Overall, the EC’s assessment report does not warrant the conclusion that the costs of DSA compliance would be prohibitive for SMEs and/or disproportionately affect them vis-à-vis VLOPs. Notably, though, the calculations on costs and administrative burdens at company level do not exhaustively capture the economic impact of the DSA: for instance, the costs of out-of-court dispute settlement are not included, and the impact assessment’s tables only refer to the costs of notice-and-action procedures, while it is unclear whether this covers the moderation of harmful or undesirable (but not necessarily illegal) content. The impact assessment’s calculations should therefore be taken with a grain of salt.

In addition, the economic considerations in the Impact Assessment primarily relate to the first policy goal of the DSA – strengthening the Digital Single Market by removing legal fragmentation – and do not necessarily capture the broader economics underlying content moderation today. In the absence of a complete overhaul of platform governance, the economics of content moderation significantly benefit larger companies. First, the reduction of costs through legal harmonization does not per se translate into an ability of smaller actors to scale up and compete with the big ones. Content moderation by internationally operating social media services also entails significant investments in personnel (including moderators) with relevant expertise in language, politics, culture, government relations and other jurisdictional specifics, with significant efficiency gains for larger services. The possibilities of automation in detecting and addressing illegal and harmful content are likewise significantly greater for services with the largest volume of user activity and content notices. Arguably, the higher costs envisaged for VLOPs are marginal compared to their structure and turnover, and to their established risk management procedures.

In summary, addressing legal fragmentation will have some important benefits for small service providers, but it will likely benefit large service providers as well – if not more – and will not affect their ability to dominate. But of course, the question of digital dominance extends beyond the economics of content moderation to the other two key policy objectives of the DSA: addressing online harms and protecting users’ fundamental rights.

Beyond the Economic Impact

The DSA proposal keeps the basic intermediary liability safe harbor regime of the ECD in place (Articles 3-5 DSA). A harmonized safe harbor regime is likely of particular importance to smaller service providers. For larger service providers that actively moderate content, by contrast, one could ask whether a safe harbor is still warranted. While we do not support such proposals, considering their probable negative impact on fundamental rights, any initiative to condition the safe harbors on compliance with particular due diligence obligations should be restricted to dominant service providers, to prevent further harm to the competitiveness of smaller players. More importantly, in our view, there are a number of noted uncertainties about the scope of application of the DSA’s safe harbors and due diligence framework, in particular in relation to search engine, infrastructural, messaging and ancillary services. There is a risk that the legal certainty provided by the DSA will be greatest for dominant social media and marketplace services.

The DSA continues, and in some ways reinforces, the tendency to outsource primary decisions on fundamental rights and speech governance to platforms. This “privatized enforcement” phenomenon is present in many of the DSA’s procedures for tackling illegal content, including the orders to act against illegal content in Articles 8 and 9 DSA, the notice-and-takedown mechanisms in Article 14 DSA, the measures and protection against misuse in Article 20 DSA and the notification of suspicions of criminal offences in Article 21 DSA. This tendency combines with the incentives, also present in the DSA, towards over-removal of lawful content in order to avoid fines. The complaint procedure of Article 17 DSA is also noteworthy in this regard. It provides users with the ability to contest undue removals of content by online platforms. But it also places the responsibility for operating these procedures on the platforms themselves, and it does not affect the discretion of online platforms to act on the basis of their terms of service. This, in combination with the proposed regime for out-of-court dispute settlement, highlights the extent to which content moderation is steered away from judicial processes, which more robustly guarantee users’ fundamental rights.

The risk-based approach to content moderation imposed on VLOPs raises additional concerns in this regard. The risk assessment will involve a complex balancing exercise, carried out by dominant online platforms, between fundamental rights and other policy objectives (the eradication of online harm and disinformation, in particular). The only stakeholders with some leverage over these risk assessments are the European Board for Digital Services (composed of the national regulators) and the EC. Specifically, Article 27 DSA requires them to recommend best practices for VLOPs to mitigate the systemic risks identified either in the assessment under Article 26 DSA or through data access and transparency reporting. Once such balancing has been conducted and guidelines have been issued, VLOPs gain an additional line of defense for imposing their standards for content moderation on users. Additional clarity is needed on the precise focus of the risk assessments, to prevent this new framework from being captured by dynamics between dominant platforms and regulators to the detriment of users.

Article 12 of the DSA proposal obliges intermediaries to provide information on any restrictions applied for the purpose of content moderation, and to act in a diligent, objective and proportionate manner, with due regard for the fundamental rights of users. Dominant online platforms’ terms of service exert a particularly strong influence on users’ fundamental rights, shaping the boundaries of legitimate online speech globally. Within the current text, the proportionality standard could be used to give horizontal effect to freedom of expression. But Article 12 DSA could take the dominance of particular online platforms into account more explicitly. It could incorporate stricter standards for dominant services, limiting their discretion in moderating speech on matters of public concern. Given the exceptional power over speech of the VLOPs, Article 12 DSA could more generally clarify that fundamental rights apply in the horizontal relation between them and their users, and require VLOPs to follow human rights law standards for online content moderation.

Conclusion

This discussion has looked at how the problem of digital dominance is affected by the DSA proposal. While the DSA proposal tries not to aggravate this situation – and we can agree with the EC that addressing legal fragmentation helps smaller companies operating in the EU – there is no doubt that the economics of content moderation strongly benefit larger companies. Short of a restructuring of the market (through the lens of economic regulation and competition law), obligations on companies to address online harms while also protecting fundamental rights end up playing into the hands of dominant companies. Within its current scope, the DSA should be expected to include robust safeguards for users, to minimize privatized enforcement dynamics and to put more focus on the horizontal effect of fundamental rights, including to limit the discretion of dominant players. These elements could be complemented by stronger restrictions on the business models of dominant platforms (notably those based on pervasive tracking and targeting of users and on attention-maximizing algorithms), which have been linked to the spread of harmful content and to a variety of other individual and societal risks, including some of the issues identified by the DSA as “systemic risks”. While we cannot expect the DSA (considering its scope and focus) to solve the issues of dominance in content moderation, several improvements are warranted to limit the exceptional power of the big actors in content moderation and thus support a better protection of fundamental rights and key societal interests.


SUGGESTED CITATION  Buri, Ilaria; van Hoboken, Joris: The DSA Proposal’s Impact on Digital Dominance, VerfBlog, 2021/8/30, https://verfassungsblog.de/power-dsa-dma-01/, DOI: 10.17176/20210830-112903-0.


