Policymakers and the public are increasingly concerned about a lack of transparency and accountability in content moderation. Opaque and incontestable content moderation decisions threaten freedom of expression and media freedom, and raise well-known problems of discrimination and bias. Our focus here is on how Article 20 DSA can and should be interpreted going forward. Specifically, does Article 20 require a human content moderator to review every content moderation decision on request? And should it?
On 21 June Meta and the US Department of Housing and Urban Development announced a legal settlement that will restrict Meta’s ability to offer housing advertisers some of its core ad-targeting products. It resolves (for now) a long-running case over discriminatory targeting of housing adverts. Meta is now prohibited from using certain targeting tools in this context, and has promised new tools to ensure more representative targeting. This US lawsuit should be a wake-up call for European regulators, reminding them that taking systemic discrimination seriously requires proactive regulatory reform and enforcement. The relevant provisions of the Digital Services Act (DSA) are largely symbolic.
In the context of the broader ‘techlash’ against the power and exploitative practices of major platforms, EU lawmakers are increasingly emphasising ‘European values’ and fundamental rights protection. But relying only on human rights to guide both social media law and academic criticism thereof excludes other normative perspectives that place greater emphasis on collective and social interests. This is deeply limiting, especially for critical scholarship and activism that calls on the law to redress structural inequality.