28 July 2022

Accessing Information about Abortion

The Role of Online Platforms Under the EU Digital Services Act

The U.S. Supreme Court decision of 24 June 2022 overruled half a century of precedent supporting a constitutional right to abortion across the U.S., established in 1973 in Roe v. Wade. Essentially, Dobbs v. Jackson Women’s Health Organization left the decision on abortion to individual states. In nine of them, legislation banning or severely restricting abortion was already in place and kicked in as of July 1.1) The ruling, although astonishing, was not necessarily a surprise, as its draft had leaked a few weeks earlier. But to the surprise of many, almost immediately, Facebook and Instagram started removing posts informing about access to abortion pills, as the Associated Press and Vice first reported.

This blog post looks closer at the online platforms’ reaction to the ruling and analyses the situation from the perspective of European law. In particular, it addresses the issue of access to information about abortion in light of the jurisprudence of the European Court of Human Rights and assesses whether the newly adopted EU instrument, the Digital Services Act, can help prevent such overreactions by platforms.

No Aiding or Abetting

The new decision of the Supreme Court has raised multiple questions not only about access to abortion but also about access to information about abortion. There are multiple attempts – at state level – to discourage any form of help for obtaining an abortion, including by restricting access to information. Texas has recently enacted legislation, known as SB 8, that enables individuals to sue a person or institution for facilitating abortions. Fears of a chilling effect on free speech surrounding information about abortion are not unreasonable. It is likely that other conservative states will attempt to pass similar laws. The South Carolina state Senate is considering a bill that would criminalize hosting abortion-related information online. The bill’s ban on “knowingly or intentionally aid[ing or] abet[ting]” an abortion “includes, but is not limited to knowingly and intentionally:

(1) providing information to a pregnant woman, or someone seeking information on behalf of a pregnant woman, by (…) internet (…) regarding self-administered abortions or the means to obtain an abortion, knowing that the information will be used, or is reasonably likely to be used, for an abortion;

(2) hosting or maintaining an internet website, providing access to an internet website, or providing an internet service purposefully directed to a pregnant woman who is a resident of this State that provides information on how to obtain an abortion, knowing that the information will be used, or is reasonably likely to be used for an abortion.”

According to UCLA law professor Eugene Volokh, this wording suggests that the law would not be limited to information about abortions that would be (illegally) performed in South Carolina, but would cover the communication of information about abortions generally, not just in-state. This, together with the sweeping scope of an “internet website (…) purposefully directed to a pregnant woman”, seems to Volokh “pretty clearly unconstitutional”.

As US Attorney General Merrick Garland said in a statement, “women who reside in states that have banned access to comprehensive reproductive care must remain free to seek that care in states where it is legal”. “Under fundamental First Amendment principles, individuals must remain free to inform and counsel each other about the reproductive care that is available in other states”, he added. Laws such as the South Carolina bill are therefore likely to violate both the First Amendment and Section 230 of the Communications Decency Act (CDA).

Under Section 230, internet platforms are largely shielded from liability for what their users post online and, importantly, are protected against criminal liability under state laws. As Evan Greer and Lia Holland point out, Section 230 is “the last line of defense keeping reproductive health care support, information, and fundraising online”.

Online Platforms’ Moderation of Abortion-related Content

Vice reported that Meta was taking down posts offering to ship abortion pills within seconds of the Dobbs decision. The justification seems to be that such posts “go against [Meta’s] Community Standards on drugs” and that abortion pills constitute ‘restricted goods’, like firearms, marijuana or animals. “Content that attempts to buy, sell, trade, gift, request or donate pharmaceuticals is not allowed“, Meta spokesperson Andy Stone tweeted. Theoretically, the same policy applies to guns. It has not been explained, however, why posts offering to sell guns and weed were not removed. Interestingly, shortly before Dobbs, it was reported that Facebook follows a “10 strike rule“ allowing users to violate its prohibition on gun sales ten times before it takes action. It has also been reported that posts offering abortion pills for sale with intentional typos were not removed.

While Meta claims that the removals of posts offering abortion pills were “instances of incorrect enforcement” subject to “correcting”, there is growing empirical evidence of platforms’ over-removal of content. The recent deletion of hundreds of posts condemning the eviction of Palestinians from the Sheikh Jarrah neighbourhood of Jerusalem shows that platforms deal with socio-legal complexities with a censoring heavy hand. In the case of posts on access to abortion, online intermediaries (and their upload filters) do not know the context and underlying facts, so the easiest and most risk-avoidant path is the “if in doubt, take it down” approach. As a result, platforms take part in a race to the bottom at the expense of freedom of expression: to be on the safe side, they comply with the most restrictive state laws and uniformly apply these standards across the platform.

Moreover, since the Dobbs decision, Facebook and Instagram have also been flagging content promoting reproductive rights. CNET found multiple examples of what appears to be erroneously flagged content: Instagram had labelled a photo of a pink cake bearing the message “pro-abortion” in white icing as “sensitive” for possibly containing “graphic or violent content”. Other examples include labelling as potentially containing “graphic or violent content” a map posted by Planned Parenthood showing where abortion is legal, and a poster promoting an animated documentary about abortion by Asha Dahya. In a post shared shortly after Dahya’s tweet, Instagram’s PR Twitter account wrote that such “sensitivity screens” are a “bug” that Instagram is “working” to fix. Even though such “soft” moderation is not the same as content removal, it allows platforms to exercise their normative judgment about the content. Using this approach, platforms nudge their users into believing that content on reproductive rights is “sensitive” or “graphic”. Often, it leaves content creators unable to act against such mislabelling.

Abortion Under the ECHR

In Europe, laws on abortion vary significantly between countries. For example, Malta and Poland have the strictest abortion laws in Europe, allowing no, or almost no, exceptions to the general ban (see examples here and here). The laws of these two countries explicitly criminalise abortion (as well as facilitating it). At the EU level, the right to abortion is not recognized in the Charter of Fundamental Rights of the European Union. The European Parliament, however, recently issued a resolution proposing to change that.

The European Court of Human Rights (ECtHR) has addressed the topic of abortion on several occasions. In A, B and C v. Ireland, the Court held that Art. 8 of the ECHR cannot be interpreted as conferring a right to abortion. However, the ECtHR found that States have a positive obligation to create a procedural framework enabling a pregnant woman to effectively exercise her right of access to legal abortion (Tysiac v. Poland and R.R. v. Poland). In P. and S. v. Poland, the Court held that the authorities’ failure to provide access to reliable information on the conditions and procedures enabling pregnant women and girls, including victims of rape, to effectively access lawful abortion violated Art. 8 (right to respect for private and family life) of the Convention (see also A, B and C v. Ireland).

Access to information about abortion was prominently addressed in the 1992 ECtHR judgment Open Door and Dublin Well Woman v. Ireland. The case concerned two non-profit organizations providing counselling to pregnant women, including information concerning abortion facilities abroad. The ECtHR held that the Irish Supreme Court’s injunction restraining the organizations’ activities limited the freedom to receive and impart information under Art. 10 of the ECHR in a two-fold manner. First, the injunction violated the applicants’ right to impart information. Second, there was an interference with the right to receive information with respect to services which are lawful in other States and may be crucial to a woman’s health and well-being (para 72). The Court accepted that Ireland had a legitimate aim in protecting the morals valued in the country and embodied in the national legislation prohibiting abortion in force at that time. It added, however, that the State’s discretion in the field of the protection of morals is not unfettered and unreviewable (para 68). Most importantly, the ECtHR held that a continual restraint on the provision of information concerning abortion facilities abroad, “regardless of age or state of health or their reasons for seeking counselling on the termination of pregnancy” (para 73), was too broad, and thus disproportionate to the aims pursued (paras 73-80). In short, a complete ban on information about abortion, irrespective of the different circumstances of the people who might be seeking this information, is considered disproportionate. It was, therefore, a violation of the right to freedom of expression and access to information.

Judgments of the ECtHR, however, are directed at States. But how should we assess a situation in which online platforms voluntarily remove or restrict content informing about (access to) abortion? Could the newly adopted EU law on digital services help address this issue?

Digital Services Act to the Rescue?

The Digital Services Act (DSA), adopted by the European Parliament on 5 July 2022, aims to tackle the power of platforms and their content moderation practices. The instrument addresses removals of illegal content, as well as of content that platforms ban in their internal policies. The DSA, however, also provides protection from the over-removal of content that is not illegal (and does not violate internal policies). Which provisions of the DSA would come into play, either to allow the removal of abortion-related information or to prevent such an effect?

Art. 8 DSA

First, it should be highlighted that the DSA does not harmonise what content or behaviour is illegal. According to Art. 2 (g), ‘illegal content’ means “any information, which, in itself or in relation to an activity, including the sale of products or the provision of services, is not in compliance with Union law or the law of any Member State, irrespective of the precise subject matter or nature of that law.” This means that what is illegal depends on the national law of the Member States. Further, under the DSA, national judicial or administrative authorities, including law enforcement authorities, will be able to issue orders against content considered illegal (Art. 8). Orders to act against illegal content will have to comply with a number of conditions, including a reference to the legal basis under Union or national law for the order, and a statement of reasons explaining why the information is illegal content, with reference to one or more specific provisions of Union or national law in compliance with Union law.2)

In the context of access to information about abortion and abortion pills, this means that in a country where access to abortion (but also to information about abortion) is restricted, a national authority could issue an order to remove such content. Arguably, national legislation prohibiting the dissemination of information about abortion would most likely be at odds with the ECtHR judgment in Open Door. Some countries, however, choose to ignore ECtHR judgments, so this is not an impossible scenario. The DSA would not provide any solution to this situation.

Another question is whether an order coming from a country with the strictest abortion laws would have to be implemented in all EU countries. Husovec and Laguna point out that such difficult questions arise when Member States have opposing views on the legality of content or user behaviour and when the laws of two countries demand contradictory actions. The DSA addresses these potential conflicts with an important limitation of territorial scope: the territorial scope of orders to act against illegal content should be clearly set out on the basis of the applicable rules of Union and national law, including the Charter, and, where relevant, general principles of international law, and it should be limited to what is strictly necessary to achieve the order’s objective.3) Beyond that, the DSA does not regulate the territorial scope of orders or their cross-border enforcement. According to Recital 31, “in particular in a cross-border context, the effect of the order should in principle be limited to the territory of the issuing Member State, unless the illegality of the content derives directly from Union law or the issuing authority considers that the rights at stake require a wider territorial scope, in accordance with Union and international law, while taking into account the interests of international comity.”4)

Art. 12 DSA

Another relevant provision of the DSA can be found in Art. 12. It states that online platforms may impose restrictions on the content provided by their users, but these restrictions have to be described in the platforms’ terms and conditions. Art. 12.1 further clarifies that the information provided in terms and conditions “shall include information on any policies, procedures, measures and tools used for the purpose of content moderation, including algorithmic decision-making, and human review as well as rules of procedure of their internal complaint handling system”.

In other words, platforms are free to decide what kind of content they do not wish to host, even if this content is not actually illegal. They have to, however, make this clear to their users. They also have to inform them of any significant change to the terms and conditions (Art. 12.1b). This is, of course, in line with the freedom to conduct a business and the platforms’ own right to freedom of expression. In the context of abortion-related information, this means that platforms could simply decide that such content is no longer welcome on the service they provide. Such a move would be consistent with their approach to the removal of abortion-related content in the post-Dobbs USA.

An interesting addition can be found in Art. 12.2 of the DSA, which says that when applying and enforcing their own restrictions, platform providers “shall act in a diligent, objective and proportionate manner”. In particular, they have to give due regard to the rights and legitimate interests of all parties involved, including the fundamental rights of their users, “such as the freedom of expression, freedom and pluralism of the media, and other fundamental rights and freedoms as enshrined in the Charter”. It is not yet fully clear how this provision will be interpreted (see also here). It does not say that certain types of content cannot be removed (or blocked). It emphasizes, instead, the importance of proper balancing between different fundamental rights and freedoms, indicating that platforms should respect the rights of their users, for example their right to access information. Art. 12.2, therefore, makes platforms (thus private entities) responsible for protecting the fundamental rights of their users and all other parties involved. This horizontal effect, however, will have to be tested in the long run. It is unclear to what extent users can appeal directly to their fundamental rights, e.g. the freedom to receive and impart information under Art. 10 of the ECHR, in a complaint procedure against a platform that restricted abortion-related content. And who would have such a (potential) right? Platforms will have to weigh “the rights and legitimate interests of all parties involved”. Does this mean only users whose content was restricted, or does it also include third parties whose right to receive information about abortion was affected as a result of a platform’s content moderation decision? Does it include the right of the general public to uphold national rules on morality?

Finally, the operationalization of this provision will be a major challenge, as platforms have to ‘act in a diligent, objective and proportionate manner’ and pay ‘due regard’ to fundamental rights not only in cases of content removal, but also when restricting the availability, visibility, and accessibility of information. For instance, how would such due regard be given when platforms use algorithmic tools to shadow-ban information about reproductive rights?

Conclusions

Dobbs’ aftermath is yet another example showing that the power of platforms, especially when used to reinforce state power, can have a disastrous effect on the exercise of fundamental rights. In the U.S., there are growing concerns about how the decision affects the right to privacy (see here and here). In the EU, the DSA was supposed to create a safe digital space where the fundamental rights of users, including freedom of expression, are protected. That is, arguably, easier said than done. Despite its ambitious goals, the Regulation is not exactly a magic cure for censorship, whether by state or private entities. Orders to remove content defined as illegal by only one state can be effectively issued. Platforms get little guidance on how to act when Member States have opposing views on the legality of content (such as aiding/abetting abortion or information about access to abortion pills). However, unlike the rules on removing terrorist content online, the DSA limits the scope of orders against illegal content under Art. 8 to the territory of the issuing State (in principle). Platforms, moreover, may choose to restrict content which is not illegal on the basis of their own terms and conditions. Although the reference to fundamental rights in Art. 12.2 is important, it is not an obligation to host any type of content. Without specifying what exactly it entails, and how to operationalize it in algorithmic content moderation decisions, the significance of Art. 12.2 may be lost in the sauce. The exact scope, the enforcement, and the balance between the different fundamental rights and interests of the parties involved, such as users’ freedom of expression and platforms’ freedom to conduct a business, will be determined in practice. It will be interesting to see how platforms operationalize these provisions and how courts interpret them.

References
1) Alabama, Arkansas, Missouri, Ohio, Oklahoma, South Carolina, South Dakota, Tennessee and Texas.
2) Article 8 (1) (i) (a) (b) of the DSA.
3) Article 8 (1) (i) (b) of the DSA.
4) Recital 31 of the DSA.

SUGGESTED CITATION  Kuczerawy, Aleksandra; Dutkiewicz, Lidia: Accessing Information about Abortion: The Role of Online Platforms Under the EU Digital Services Act, VerfBlog, 2022/7/28, https://verfassungsblog.de/accessing-information-about-abortion/, DOI: 10.17176/20220728-181720-0.


