01 November 2022

An Intersectional Lens on Online Gender Based Violence and the Digital Services Act

The European Union (EU)’s Digital Services Act (DSA) introduces novel mandatory due diligence obligations for online platforms, with additional obligations for very large online platforms (VLOPs) to address potential societal risks posed by the provision of their services, such as risks to fundamental rights, to participation in civic discourse or, as this essay will focus on, the risk of online gender based violence (OGBV). This approach of mandatory due diligence obligations reflects EU co-legislators’ recognition of the complexity of the issues the DSA aims to address. Striking a balance between protecting free expression, addressing illegal content and creating a safe online environment will be challenging. However, the DSA is ambitious in its aims; if effectively implemented, these provisions have the potential to set important standards for tackling some of the most pervasive harms of the digital ecosystem.

Efforts to address these systemic risks, and the related mechanisms for access to redress, will require the adoption of an intersectional methodology as presented by Kimberlé Crenshaw, which will be elaborated below. Without such a methodology, the risk assessments developed by online platforms, the subsequent mitigation measures, the evaluations by the European Commission and the provisions on access to and effectiveness of remedy will simply fail to provide the necessary mechanisms for those most acutely impacted by these rights violations.

Tackling Online Gender Based Violence through Due Diligence Measures

Balancing freedoms whilst addressing harms requires walking a fine line; if the DSA’s implementation gets this wrong, it risks further infringing rights, especially for those already historically marginalised within society. The obligations for VLOPs to conduct comprehensive assessments of systemic risks to fundamental rights stemming from their services (Article 34), to develop and implement mitigation measures (Article 35), and to submit to independent audits of their efforts (Article 37) may, if implemented appropriately, set a global precedent for striking this balance. After extensive advocacy from civil society and numerous amendments to the text, EU co-legislators chose to explicitly name several specific systemic risks, including negative consequences in relation to OGBV.

The inclusion of OGBV as a systemic risk within the DSA aligns with the EU’s aim to criminalise certain forms of OGBV through the draft Directive to Combat Violence against Women and Domestic Violence, published in March 2022. The Directive seeks to establish minimum criminal standards for cyber stalking, the non-consensual sharing of intimate or manipulated material, and cyber-incitement to violence or hatred. The prevalence of OGBV in Europe has only grown over the years. As documented by the European Institute for Gender Equality, 51% of young women hesitate to engage in online debates after witnessing or directly experiencing online abuse. Women of colour and non-binary people are at increased risk of experiencing OGBV.

Content moderation efforts by some online platforms to address OGBV continue to fall short. Reporting mechanisms often force users to fit their experiences into predetermined categories that fail to capture the multifaceted nature of the abuse. Content moderators are not provided with relevant, gender-sensitive training, and many who report instances of OGBV feel “left in the dark” about the outcome of their reports or are informed that their experience did not violate community standards. A handful of large online platforms committed to making substantive improvements at the UN Generation Equality Forum; however, progress is yet to be made.

OGBV exists on a spectrum and can take many forms, including actions that may not rise to the level of illegal conduct but that nevertheless have a chilling effect on women and non-binary people’s speech. The provisions of the DSA will therefore need to address the systemic risks that stem from conduct which, while not illegal, still results in abuse. This may seem challenging; however, researchers have reiterated that comprehensive coordination between legislators, online platforms and civil society to holistically analyse and address such phenomena is the best method for tackling systemic risks such as OGBV. Mandatory due diligence obligations must be accompanied by effective accountability mechanisms to ensure online platforms cannot renege on their responsibilities. Consultation with civil society, which can develop policy and enforcement recommendations, should be informed by an ever-growing body of analyses made possible by increased access to data for researchers. This is why the DSA is a concrete opportunity: the cyclical nature of the due diligence obligations provides a solid foundation for all stakeholders to consistently improve their efforts to evaluate and mitigate these systemic risks.

Intersectionality

We do not experience our lives in silos; the experiences of, for example, a woman of colour who is a member of a religious minority reflect the unique intersections of those different identities and go beyond a mere summation of the experiences of women, of people of colour, and of religious minority groups. This reality means that, for the risk assessments of online platforms to be informative and effective, and for the subsequent evaluations by the European Commission to identify any shortcomings, a comprehensive understanding of how these forms of discrimination may intersect will need to be developed; adopting an intersectional methodology is therefore the best approach. The risk assessments should accordingly be envisaged as Human Rights Impact Assessments (HRIAs): extensive, cyclical processes of identifying, understanding, assessing and addressing the adverse effects of a business’s projects or activities on the enjoyment of human rights by impacted rights-holders. This process will identify not only specific impacts but also their severity and how they may intersect with other fundamental rights violations.

Intersectionality is an analytical framework for understanding how aspects of a person’s social and political identities combine to create different modes of discrimination and privilege. Concretely, the method looks at the interconnected nature of social categorisations such as race, class and gender, which can create overlapping and interdependent systems of discrimination or disadvantage. The methodology can be applied in various settings, such as research, data analysis, and ex ante or ex post analyses of the effectiveness of a given policy or piece of legislation for specific groups. This facilitates more critical policy analysis and gives lawmakers a deeper understanding of how policy operates in different contexts, thereby leading to more progressive and inclusive legislative frameworks.

The adoption of this methodological framework within the risk assessments and the associated provisions of the DSA is indispensable to ensuring that these assessments do not become empty check-box exercises. As a first step, the assessments of the systemic risk of online abuse will need to include specific indicators related to the experience of OGBV amongst different marginalised groups, before assessing how this systemic risk intersects with others identified in the DSA, such as the risk to civic discourse. For example, online platforms conducting the assessments, and the organisations that will later audit these efforts, could ask: Are there additional variables to consider when developing content moderation mechanisms to address OGBV? Do we need to provide more opportunities for users to give context to their experiences? Do our current community standards address structural inequalities, or do they maintain them?

Gendered Disinformation

Gendered disinformation, as an example, flows from the same patriarchal context in which people experience OGBV and is often targeted at journalists, advocates and political candidates. It characterises women candidates as unqualified, as lacking the requisite knowledge, intelligence or experience for the role, or as persons who lie, are too emotional for the task, are prone to aggression or lack sanity. Once again, women of colour are more likely to be the subject of disinformation than other women or men of colour, and this disinformation is likely to include or be accompanied by racial discrimination. Gendered disinformation is therefore rooted in misogyny but can simultaneously intersect with discrimination based on racism, ableism, religious identity and more, and it poses a risk to free expression, to human dignity and to women’s participation in civic discourse, all of which are specifically identified within the risk assessment provision of the DSA.

Online platforms developing these assessments, and indeed the European Commission, which will assess and enforce these evaluations, must work in a cross-cutting manner, with consistent civil society engagement. Above all, the assessments conducted in relation to each of the identified systemic risks should be followed by subsequent analyses in which the findings are cross-referenced. For example: Do the demographics identified as most vulnerable overlap? Do the impacts of one systemic risk result in the direct experience of another? In brief, the Article 34 risk assessments should only be considered complete once the analyses of each systemic risk have been reviewed in conjunction with one another.

Residual Impact

The incorporation of an intersectional approach within the risk assessments and mitigation measures would positively impact the broader provisions governing VLOPs in the DSA. For example, efforts to improve content moderation mechanisms can avoid previously documented errors that have left women of colour at increased risk of over-enforcement while the abuse they face remains largely unaddressed by the reporting mechanisms in place. Similarly, analyses of how systemic risks impact communities differently will help the moderators managing Internal Complaint-Handling Systems (Article 20) and the entities engaged as Out-of-Court Dispute Settlement Bodies (Article 21) to reduce the risk of reaching inappropriate resolutions that fail to address the unique impacts of these experiences. Moreover, a more comprehensive understanding of how a person may have to contend with multiple sources of oppression can contribute to more equitable treatment within these bodies for marginalised communities, who continue to face discriminatory treatment or secondary victimisation, such as victim blaming, within institutional or judicial contexts.

Civil society can assist in ensuring that an intersectional approach is adopted in all these areas; the final Regulation in fact includes several specific references to the need for civil society consultation, particularly within the due diligence provisions. A concrete example is the provision on access to data (Article 40), which includes civil society organisations amongst the entities that can conduct research. Researchers may, for example, request cross-sectional data points as one way of developing comprehensive analyses of issues such as OGBV and its acute impact on marginalised communities. The research community has emphasised that, without data, it is unclear whether mitigation efforts such as blocking accounts actually make a difference to the behaviour of those posting abusive content. Access to such data, and the research subsequently developed from it, is vital, as marginalised communities already face an uphill battle in accessing justice.

Conclusion

A ‘one size fits all’ approach to the DSA’s due diligence obligations, most notably the risk assessments and the subsequent content moderation adaptations, would render the considerable effort put into defining these obligations a wasted endeavour. For those most gravely impacted by OGBV, such an approach would fail victims. The European Commission needs to reflect deeply and urgently upon how it will harmonise the vast regulatory framework it has established. In this case, the Directive to Combat Violence against Women and Domestic Violence will bring certain forms of OGBV within the purview of illegal content, and thus of the DSA’s mandatory due diligence obligations; the Commission must ensure that these combined efforts do not prove empty.

In short, the assessment of systemic risks, the subsequent mitigation measures and the mechanisms put in place to ensure access to redress must all be developed using an intersectional methodology, and stakeholder consultation must be consistent and meaningful. Concretely, a formal mechanism through which civil society can actively participate, evaluate and provide recommendations for improved enforcement and implementation should be established. The EU cannot purport to be the global regulatory leader in online content governance if it subsequently fails to enforce, and make useful, the very provisions of the DSA that make the Regulation revolutionary.


SUGGESTED CITATION  Allen, Asha: An Intersectional Lens on Online Gender Based Violence and the Digital Services Act, VerfBlog, 2022/11/01, https://verfassungsblog.de/dsa-intersectional/, DOI: 10.17176/20221101-215626-0.




