Accessing Information about Abortion
The Role of Online Platforms Under the EU Digital Services Act
The U.S. Supreme Court decision of 24 June 2022 overruled half a century of precedent supporting a constitutional right to abortion across the U.S., established in 1973 in Roe v. Wade. Essentially, Dobbs v. Jackson Women’s Health Organization left the decision on abortion to individual states. In nine of them, legislation banning or severely restricting abortion was already in place and kicked in as of July 1.1) The ruling, although astonishing, was not necessarily a surprise, as a draft had leaked a few weeks earlier. But to the surprise of many, almost immediately, Facebook and Instagram started removing posts informing users about access to abortion pills, as the Associated Press and Vice first reported.
This blog post looks closer at the online platforms’ reaction to the ruling and analyses the situation from the perspective of European law. In particular, it addresses the issue of access to information about abortion in light of the jurisprudence of the European Court of Human Rights and assesses whether the newly adopted EU instrument, the Digital Services Act, can help prevent such overreactions by platforms.
No Aiding or Abetting
The new decision of the Supreme Court has raised multiple questions not only about access to abortion but also about access to information about abortion. There are multiple attempts – at state level – to discourage any form of help for obtaining an abortion, including through restricting access to information. Texas has recently enacted legislation, known as SB 8, that enables individuals to sue a person or institution for facilitating abortions. Fears of a chilling effect on free speech surrounding information about the abortion process are not unreasonable. It is likely that other conservative states will attempt to pass similar laws. The South Carolina state Senate is considering a bill that would criminalize hosting abortion-related information online. The bill’s ban on “knowingly or intentionally aid[ing or] abet[ting]” an abortion “includes, but is not limited to knowingly and intentionally:
(1) providing information to a pregnant woman, or someone seeking information on behalf of a pregnant woman, by (…) internet (…) regarding self-administered abortions or the means to obtain an abortion, knowing that the information will be used, or is reasonably likely to be used, for an abortion;
(2) hosting or maintaining an internet website, providing access to an internet website, or providing an internet service purposefully directed to a pregnant woman who is a resident of this State that provides information on how to obtain an abortion, knowing that the information will be used, or is reasonably likely to be used for an abortion.”
According to UCLA law professor Eugene Volokh, this wording suggests that the law would not be limited to information about abortions that would be (illegally) performed in South Carolina, but would cover the communication of information about abortions generally, not just in-state. This, together with the sweeping scope of an “internet website (…) purposefully directed to a pregnant woman”, seems to Volokh “pretty clearly unconstitutional”.
As US Attorney General Merrick Garland said in a statement, “women who reside in states that have banned access to comprehensive reproductive care must remain free to seek that care in states where it is legal”. He added that “under fundamental First Amendment principles, individuals must remain free to inform and counsel each other about the reproductive care that is available in other states”. Laws such as the South Carolina bill are therefore likely to violate both the First Amendment and Section 230 of the Communications Decency Act (CDA).
Under Section 230, internet platforms are largely shielded from liability for what their users post online and, importantly, are protected against criminal liability under state laws. As Evan Greer and Lia Holland point out, Section 230 is “the last line of defense keeping reproductive health care support, information, and fundraising online”.
Online Platforms’ Moderation of Abortion-related Content
Vice reported that Meta was taking down posts offering the shipment of abortion pills within seconds after the Dobbs decision. The justification seems to be that such posts “go against [Meta’s] Community Standards on drugs” and that abortion pills constitute ‘restricted goods’ like firearms, marijuana or animals. “Content that attempts to buy, sell, trade, gift, request or donate pharmaceuticals is not allowed“, Meta spokesperson Andy Stone tweeted. Theoretically, the same policy applies to guns. It has not been explained, however, why posts offering to sell guns and weed were not removed. Interestingly, shortly before Dobbs, it was reported that Facebook follows a “10 strike rule“ allowing users to violate its prohibition on gun sales 10 times before the platform takes action. It has also been reported that posts offering the sale of abortion pills with intentional typos have not been removed.
While Meta claims that the removals of posts offering abortion pills were “instances of incorrect enforcement” subject to “correcting”, there is growing empirical evidence of platforms’ over-removal of content. The recent deletion of hundreds of posts condemning the eviction of Palestinians from the Sheikh Jarrah neighbourhood of Jerusalem shows that platforms deal with socio-legal complexities with a censoring heavy hand. In the case of posts on access to abortion, online intermediaries (and their upload filters) do not know the context and underlying facts, so the easiest and most risk-avoidant path is the “if in doubt, take it down” approach. As a result, platforms take part in a race to the bottom at the expense of freedom of expression: to be on the safe side, they comply with the most restrictive state laws and uniformly apply those standards across the platform.
Moreover, since the Dobbs decision, Facebook and Instagram have also been flagging content promoting reproductive rights. CNET found multiple examples of what appears to be content erroneously flagged: Instagram had labelled a photo of a pink cake bearing the message “pro-abortion” in white icing as “sensitive” for possibly containing “graphic or violent content”. Other examples include labelling as potentially containing “graphic or violent content” a map posted by Planned Parenthood showing where abortion is legal, and a poster promoting an animated documentary about abortion by Asha Dahya. In a post shared shortly after Dahya’s tweet, Instagram’s PR Twitter account wrote that such “sensitivity screens” are a “bug” that Instagram is “working” to fix. Even though such “soft” moderation is not the same as content removal, it allows platforms to exercise their normative judgment about the content. Using this approach, platforms nudge their users into believing that content on reproductive rights is “sensitive” or “graphic”. It also often leaves content creators unable to act against such mislabelling.
Abortion Under the ECHR
In Europe, laws on abortion vary significantly between countries. Malta and Poland, for example, have the strictest abortion laws in Europe, allowing no, or almost no, exceptions to the general ban (see examples here and here). The laws of both countries explicitly criminalise abortion (as well as facilitating abortion). At the EU level, the right to abortion is not recognized in the Charter of Fundamental Rights of the European Union. The European Parliament, however, recently issued a resolution proposing to change that.
The European Court of Human Rights (ECtHR) has addressed the topic of abortion on several occasions. In A, B and C v. Ireland, the Court held that Art. 8 of the ECHR cannot be interpreted as conferring a right to abortion. However, the ECtHR found that States have a positive obligation to create a procedural framework enabling a pregnant woman to effectively exercise her right of access to legal abortion (Tysiac v. Poland and R.R. v. Poland). In P. and S. v. Poland, the Court held that the authorities’ failure to provide access to reliable information on the conditions and procedures enabling pregnant women and girls, including victims of rape, to effectively access lawful abortion had been a violation of Art. 8 (right to respect for private and family life) of the Convention (see also A., B. and C. v. Ireland).
Access to information about abortion was prominently addressed in the 1992 ECtHR judgment Open Door and Dublin Well Woman v. Ireland. The case concerned two non-profit organizations providing counselling to pregnant women, including information concerning abortion facilities abroad. The ECtHR held that the Irish Supreme Court’s injunction restraining the organizations’ activities limited the freedom to receive and impart information under Art. 10 of the ECHR in a twofold manner. First, the injunction violated the applicants’ right to impart information. Second, it interfered with the right to receive information with respect to services which are lawful in other States and may be crucial to a woman’s health and well-being (para 72). The Court accepted that Ireland had a legitimate aim in protecting the morals valued in the country and embodied in the national legislation prohibiting abortion in force at that time. It added, however, that the State’s discretion in the field of the protection of morals is not unfettered and unreviewable (para 68). Most importantly, the ECtHR held that a continual restraint on the provision of information concerning abortion facilities abroad, “regardless of age or state of health or their reasons for seeking counselling on the termination of pregnancy” (para 73), was too broad and thus disproportionate to the aims pursued (paras 73-80). In short, a complete ban on information about abortion, irrespective of the circumstances of those seeking that information, is considered disproportionate. It therefore constituted a violation of the right to freedom of expression and access to information.
Judgments of the ECtHR, however, are directed at States. But how should we assess a situation in which online platforms voluntarily remove or restrict content informing about (access to) abortion? Could the newly adopted EU law on digital services help address this issue?
Digital Services Act to the Rescue?
The Digital Services Act (DSA), adopted by the European Parliament on 5 July 2022, aims to tackle the power of platforms and their content moderation practices. The instrument addresses removals of illegal content, as well as of content that platforms ban in their internal policies. The DSA, however, also provides protection against the over-removal of content that is neither illegal nor in violation of internal policies. Which provisions of the DSA would come into play, either to allow the removal of abortion-related information or to prevent such an effect?
Art. 8 DSA
First, it should be highlighted that the DSA does not harmonise what content or behaviour is illegal. According to Art. 2(g), ‘illegal content’ means “any information, which, in itself or in relation to an activity, including the sale of products or the provision of services, is not in compliance with Union law or the law of any Member State, irrespective of the precise subject matter or nature of that law.” This means that what is illegal depends on the national law of the Member States. Further, under the DSA, national judicial or administrative authorities, including law enforcement authorities, will be able to issue orders against content considered illegal (Art. 8). Orders to act against illegal content will have to comply with a number of conditions, including a reference to the legal basis under Union or national law for the order and a statement of reasons explaining why the information constitutes illegal content, with reference to one or more specific provisions of Union or national law in compliance with Union law.2)
In the context of access to information about abortion and abortion pills, this means that in a country where access to abortion (but also to information about abortion) is restricted, a national authority could issue an order to remove such content. National legislation prohibiting the dissemination of information about abortion would, most likely, be at odds with the ECtHR judgment in Open Door. Some countries, however, choose to ignore ECtHR judgments, so this is not an impossible scenario. The DSA would not provide any solution to this situation.
Another question is whether an order coming from a country with the strictest abortion law would have to be implemented in all EU countries. Husovec and Laguna point out that such difficult questions arise when Member States have opposing views on the legality of content or user behaviour and when the laws of two co