Articulating Legitimacy through Policy Recommendations
Meta’s Oversight Board and the Constitutionalisation of Platform Governance
On 6 December 2022, Meta’s Oversight Board (OB) issued its policy advisory opinion on ‘cross-check’, a content moderation system used by the company to avoid the erroneous removal of content shared by highly influential users on its platforms. Although the OB had previously examined the system in the Trump de-platforming case, the problems with its performance became prominent only after the document leak known as the Facebook Files, disclosed by whistle-blower Frances Haugen.
The opinion showcases important considerations for the authority and independence of the OB as an expression of societal constitutionalism (Maroni, 2019; Gradoni, 2021; Golia, 2021). The OB was established following a liberal constitutional narrative, and the institution’s opinions have hinged on broader issues of governance, legitimacy and accountability within Meta. Despite the opinion’s directness in calling the company out for privileging corporate interests to the detriment of its human rights commitments, the OB’s decision rests on an underlying contradiction: it criticises policy and design choices that are replicated in the OB’s own architecture. This curtails the institution’s capacity to enhance accountability and legitimacy.
The Oversight Board: A liberal and normative constitutional framework
The establishment of the OB followed a period of intense scrutiny and legitimacy crises for Meta’s Facebook and Instagram platforms. In 2017, in the wake of the scandals concerning Facebook’s undue influence on the 2016 US presidential election and the UK referendum on Brexit, Zuckerberg pledged to address the company’s shortcomings. He promised to ‘develop the social infrastructure to give people the power to build a global community’.
The OB was established a few years later, replicating a liberal constitutional narrative for legitimacy-enhancing purposes and surrounded by multiple justifications. This ‘Supreme Court’ with policy prerogatives reflects constitutionalism insofar as it embodies the capacity of the rule of law to limit the exercise of power in a social and political environment, a defining characteristic of liberal constitutionalism (Loughlin, 2022). This approach follows what Teubner describes as the use of state constitutionalism as a historical model that influences partial societal constitutions in their processes of generalisation and re-specification (Teubner, 2017), but it limits the institution’s capacity to enhance legitimacy and transparency in platform governance.
Cross-Check: Legitimacy and Corporate Interest
‘Cross-check’ is a content moderation procedure in which content posted by the accounts of public and highly influential users is sent for additional review by human moderators instead of being handled by the usual algorithmic systems. The problem with the system lies in the fact that a high percentage of the content sent for additional review is not assessed expeditiously. Until this second review is completed, potentially violating content from highly influential users remains on the platform for longer periods. In effect, users enrolled in the cross-check procedure are indirectly granted the privilege of protection from automated decisions, along with a more careful application of the community standards that takes discretionary policies and enforcement into account.
The OB’s first investigation of the cross-check system emerged in the decision concerning the de-platforming of former US President Donald Trump (decision 2021-001-FB-FBR). There, the OB recommended that Meta be more transparent about the distinct content moderation procedures applied to political figures, stressing that different procedures may lead to substantially different outcomes. To that end, the OB recommended that Facebook disclose error rates and accuracy data demonstrating the effectiveness of ‘cross-check’. Meta did not implement this recommendation at the time, reassuring the OB that ‘cross-check’ affected only a small number of content moderation decisions.
Meta’s attempt to downplay ‘cross-check’ and its lack of transparency was undone by the disclosure of internal documents by whistle-blower Frances Haugen in September 2021. The Wall Street Journal reported that by 2020, ‘cross-check’ already covered 5.8 million users, amounting to what has been described as a ‘safe list’ of accounts of highly influential users who were able to circumvent content policy measures. Upon discovering Meta’s omission and misrepresentation of ‘cross-check’ in prior engagements, the OB issued a statement condemning the company’s conduct and demanding more extensive transparency. Following an initial meeting between OB members and Meta executives, the company formally requested an advisory opinion on the system.
Having been called upon for an advisory opinion, the OB issued a total of 32 recommendations to Meta regarding ‘cross-check’. In its assessment, the system allows violating content to remain visible and reflects the company’s misguided prioritisation of corporate interests at the expense of its human rights commitments (§78). The failure to track core metrics of the system impedes any assessment of its efficacy, particularly given the broad definitions that allow users and pages to be enrolled in the system on an equal footing, despite the need to prioritise expression that is relevant for human rights, such as that of users from minority groups and journalists reporting from conflict zones. Overall, the OB’s assessment also portrays the system as maintaining unequal access to discretionary policies and enforcement, resulting in a policy approach that tracks the company’s focus on its more lucrative markets.
Illustrating this, the OB summarises Meta’s briefing on the system, which reports that 42% of the content reviewed through ‘cross-check’ originated from the United States or Canada, while just 9% of “monthly active people” on Facebook were from this region. This discrepancy correlates with the fact that “average revenue per person” in the US and Canada is the highest in the world, three times larger than in Europe and about 12 times larger than in the Asia-Pacific region (§§98-99).
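To make the scale of that discrepancy concrete, the following minimal sketch (not drawn from the opinion itself; the variable names are illustrative) computes the overrepresentation factor implied by the two figures:

```python
# Figures cited by the OB (§§98-99): US/Canada share of cross-checked
# content versus US/Canada share of Facebook's monthly active people.
share_of_crosscheck_content = 0.42
share_of_monthly_active_people = 0.09

# If cross-check volume simply tracked the user base, this ratio would be ~1.
overrepresentation = share_of_crosscheck_content / share_of_monthly_active_people
print(f"US/Canada is ~{overrepresentation:.1f}x overrepresented in cross-check")
# -> US/Canada is ~4.7x overrepresented in cross-check
```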
Equal treatment and representation
Although this is only a preliminary assessment of the opinion, it can already be described as a landmark decision in which the OB directly addresses the tension between Meta’s commitments to human rights standards established in international law and its obligations to its shareholders. The OB recommends that the company not override its human rights commitments in order to advance corporate interests, thereby confronting the internal politics that justified the establishment of ‘cross-check’ and, perhaps, even of the OB itself.
Douek summarises the OB as an opportunity for Meta to streamline decisions, define specific content moderation policies, fend off national regulation, set the standards for platform governance, and distance itself from highly political and controversial decisions in unstable social and political environments (Douek, 2019). Beyond these justifications, it must be highlighted that the OB enhances Meta’s governance output legitimacy (Schmidt, 2020), promotes trust, and fosters users’ reliance on the company’s innovations in content moderation and user relations. These benefits are closely tied to bottom-line profits, since a social environment in which users feel safe expressing their opinions enlarges marketing audiences and revenue (Gillespie, 2020).
However, the institutional design replicated in the OB further curtails its capacity to enhance accountability and legitimacy. Echoing the disproportionate numbers of the cross-check system, the OB’s decisions have disproportionately focused on appeals and issues from European and North American users. So far, the OB has issued 30 rulings with an almost perfect balance between cases originating from the global north and the global south (16 and 14 decisions, respectively). Beyond the north/south dichotomy, however, lies a further divide: cases originating from Central and South Asia represent only 13% of the decisions issued to date, against 53% of combined decisions for the European and North American regions (33% and 20%, respectively) (Oversight Board Transparency Report, Q2 2022). When this record is contrasted with the company’s investor earnings report for the second quarter of 2022, a considerable imbalance emerges, as Southern Asia registered roughly 469.2 million active users on Facebook, against 203.7 and 269.7 million in North America and Europe, respectively. All this is to say that the poison the OB’s recommendations attempt to purge from Meta’s use of ‘cross-check’ also circulates in the external oversight institution itself.
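A back-of-the-envelope comparison, sketched below with the figures cited above, makes this imbalance explicit; the pairing of decision regions with user regions is an approximation for illustration only:

```python
# Share of OB decisions per region (OB Transparency Report, Q2 2022)
# paired with Facebook monthly active users in millions (Meta Q2 2022
# earnings report). The regional pairing is approximate.
regions = {
    "Central and South Asia": (0.13, 469.2),
    "North America": (0.20, 203.7),
    "Europe": (0.33, 269.7),
}

for region, (decision_share, users_millions) in regions.items():
    # Decision share per 100 million users: a rough representation index.
    index = decision_share / users_millions * 100
    print(f"{region}: {index:.3f}")

# Approximate output: Central and South Asia 0.028, North America 0.098,
# Europe 0.122; that is, North American and European users are roughly
# 3.5x to 4.4x more represented in OB decisions than South Asian users.
```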
A Leap or a Step for Constitutionalisation?
State constitutionalism may offer a historical model capable of influencing the constitutionalisation of governance structures in social media platforms. However, this model must also be understood from constitutionalism’s social and political foundations, given the paradox of seeking to limit totalising power by means of power itself (Teubner, 2017).
Moving from a state-based perspective to a societal one enables the institutional imagination needed to address the large-scale application of human rights in private environments. From a societal perspective of constitutionalism, it is necessary to critically address the possibilities and limits of the institutional devices developed to enhance the legitimacy of platforms in their exercise of power through content moderation.
The OB’s opinion on ‘cross-check’ represents an important step in this process of constitutionalisation. It insists on the need to enhance the proportionality of content moderation practices, with close attention to human rights and equal protection. However, this critique must also be applied internally, to the OB’s own procedure and institutional design. Meta’s platform users inhabit multiple social and political realities. Although the company addresses its users without specific consideration of their place of origin, framing its governance as transnational in nature, policy outcomes and specific interpretations of the community standards lead to different experiences for users in different regions. Engaging with this perspective in content moderation policies, and in the board itself, might be the next step towards a more accountable social network.