13 November 2023

A Primer on the UK Online Safety Act

The Online Safety Act (OSA) has now become law, marking a significant milestone in platform regulation in the United Kingdom. The OSA introduces fresh obligations for technology firms to address illegal online content and activity, including child sexual exploitation, fraud, and terrorism, and adds the UK to the array of jurisdictions that have recently introduced new online safety and platform accountability regulations. However, the OSA is notably short on specifics. In this post, we dissect key aspects of the OSA’s structure and draw comparisons with similar legislation, including the EU Digital Services Act (DSA). Continue reading >>
09 November 2023

We Don’t Need No Education?

Artificial Intelligence doesn't know what's 'true'. Generative AI models like chatbots, in particular, veer from the truth, i.e. “hallucinate”, quite regularly: chatbots simply invent information at least 3 percent of the time, and in some cases as much as 27 percent. Given the (future) use of such systems in nearly all domains, we might want them to follow more stringent rules of accuracy. And those truth-related rules are not the only rules for AI systems that warrant societal scrutiny; how those systems are trained will be crucial. In this blog post, we argue that a new perspective is key to tackling this challenge: “Hybrid Speech Governance”. Continue reading >>
24 October 2023

Political Microtargeting vs. Legal Oversight

Last week it became known that the EU Commission, specifically the official account of the Commissioner for Home Affairs, used microtargeting on X (formerly Twitter) to inject momentum into a stalled legislative project. The case concerns a targeted, data-driven attempt to influence the public debate surrounding the so-called “child protection regulation”, also known as “chat control” (see the reporting and analysis). The posts were meant to put pressure on member state governments in order to secure a majority for the proposal after all. This episode is remarkable even apart from the substantive debate on “chat control”: besides illustrating the systemic risks of platforms and the need for effective enforcement of platform regulation, it shows that the Commission finds itself in a tension between its roles as supervisory authority and as political actor, and that there is thus an inherent risk that it will neglect its supervisory function in favour of political goals. Continue reading >>
24 October 2023

Who Decides What Counts as Disinformation in the EU?

Who decides what counts as “disinformation” in the EU? Not public authorities, because disinformation is not directly sanctioned in the Digital Services Act (DSA) or other secondary legislation. Nor Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs), which avoid editorial decisions to maintain their legal status as intermediaries with limited liability. Instead, the delicate task of identifying disinformation is being undertaken by other private organisations whose place of administration and activity, purpose, funding and organisational structure appear problematic in terms of the legitimacy and even legality of the fight against disinformation. This blog post maps out the relevant (private) actors, namely the ad industry, fact-checking organisations and so-called source-raters. Continue reading >>
18 October 2023

A Step Forward in Fighting Online Antisemitism

Online antisemitism is on the rise. Especially since the recent terror attack by Hamas in Southern Israel, platforms like X are (mis)used to propel antisemitism. Against this backdrop, this blog post analyses the legal framework for combatting online antisemitism in the EU and the regulatory approaches taken so far. It addresses the new Digital Services Act (DSA), highlighting some of the provisions that might become particularly important in the fight against antisemitism. The DSA improves protection against online hate speech in general and antisemitism in particular by introducing procedural and transparency obligations. However, it does not provide any substantive standards against which the illegality of such manifestations can be assessed. In order to effectively reduce online antisemitism in Europe, we need to think further, as outlined in the following blog post. Continue reading >>
23 May 2023

A New European Enforcer?

As a key piece of the European Commission’s digital agenda, the Digital Services Act (DSA) is drawing a lot of attention from civil society, industry, and regulators. One particularly interesting development in that regard is the Commission’s current transformation from being the institution leading the DSA’s negotiations to the one enforcing it. This article explores the challenges faced by the Commission in this transformation. Continue reading >>
10 May 2023

Taiwan’s Participatory Plans for Platform Governance

Platform regulation is not limited to Europe or the United States. Although much of the current debate focuses on the latest news from Brussels, California, or Washington, other important regulatory ideas emerge elsewhere. One particularly consequential idea can be found in Taiwan. Simply put, Taiwan wants to democratize platform governance, if tacitly. Concretely, Taiwan wanted to establish a dedicated body that would potentially facilitate far-reaching civil society participation and enable ongoing citizen involvement in platform governance. This article explains what discourses about platform governance can learn from Taiwan and how vibrant democratic discourse shapes platform governance beyond traditional regulatory models. Continue reading >>
27 February 2023

Action Recommended

The DSA will shape the measures that social media platforms have to implement with regard to the recommendation engines they deploy to curate people’s feeds and timelines. It marks a departure from the previously narrow view of content moderation and pays closer attention to risks stemming from the amplification of harmful content and the underlying design choices. But it is too early to celebrate. Continue reading >>
13 February 2023

The Platform-Media Relationship in the European Media Freedom Act

The European Media Freedom Act proposal targets very large online platforms’ gatekeeping power over access to media content and seeks to reshape the relationship between media and platforms. By granting media organisations a special position on platforms, however, the EMFA risks changing the media’s role and its relationships with other actors in ways that run counter to the Act’s overall objective of securing media freedom. Continue reading >>
07 February 2023

Störerhaftung Is Dead, Long Live Störerhaftung

In the legal-academic debate, the first voices are reading the judgments in YouTube II and Uploaded III as holding that the BGH (German Federal Court of Justice) has abolished Störerhaftung (interferer liability) for all intermediary services. In other words, access providers, domain registrars or DNS services, for example, could now be liable as perpetrators for copyright infringements committed by their users. This reading of the two judgments on the liability of sharing platforms is not only legally far-fetched; extending the liability of neutral service providers also threatens to multiply the restrictions on fundamental rights that were already criticised in connection with Störerhaftung. Continue reading >>