Now What?
Exploring the DSA’s Enforcement Futures in Relation to Social Media Platforms and Native Advertising
After lengthy political debates, the Digital Services Act (DSA) has finally been agreed upon. Now, all attention is shifting towards what the European Union’s most meaningful platform governance reform of the past two decades will look like in practice. The question of enforcement has already been getting considerable attention, not only in academic exchanges such as the Verfassungsblog’s earlier DSA/DMA Symposium, but also in mainstream media, with the main concern being that the resources put forth by the European Commission are too humble when compared to the DSA’s far-reaching goals. Indeed, the DSA’s nature, the nature of the markets it aims to govern, as well as the plethora of stakeholders involved in platform governance make enforcement expectations more utopian than realistic.
However, the responsible digitalization of platform compliance can, at least to a certain extent, modernize and simplify market monitoring. In this short essay, I will reflect on some of the enforcement implications of the paradigm shift proposed by the DSA with respect to its framing of illegal content.
To this end, I will first discuss the definition of “illegal content” and its extension to sectoral regulation; second, I will revisit the discussion of native advertising and highlight how it currently falls into a grey and overly complicated framework between the DSA and sectoral regulation; and lastly, I will briefly explore a potential alignment solution I developed together with Prof. Anda Iamnitchi and Thales Bertaglia, which was published and presented as a paper at the 2022 ACM Conference on Fairness, Accountability, and Transparency (the full paper can be found here).
The Nature of the DSA: One for All and All for None
Unlike its predecessor, the E-Commerce Directive, the DSA actually defines “illegal content” in Article 3(h), using a broad definition:
“illegal content” means any information that, in itself or in relation to an activity, including the sale of products or the provision of services, is not in compliance with Union law or the law of any Member State which is in compliance with Union law, irrespective of the precise subject matter or nature of that law.
As elaborated in Recital 12 DSA, this definition fulfills the DSA’s goal of equating online and offline illegality. In addition, the Recital clearly states that illegal content should “be defined broadly to cover information relating to illegal content, products, services and activities”. The illustrative examples include the more traditional categories of illegal content, such as child sexual abuse material, but also emerging forms of illegal content such as the unlawful non-consensual sharing of private images or online stalking. The list of illustrative examples also includes “the sale of products or the provision of services in infringement of consumer protection law”, and Recital 68 DSA further specifies that even advertisements themselves may be illegal content.
The definition and its subsequent examples acknowledge a new paradigm of illegality: a shift from criminal illegality to content regulation. In contrast, the E-Commerce Directive focused on a narrower understanding of illegality, but also of content. For instance, its references to consumer protection dealt more with the transparency requirements relating to e-commerce transactions than with the actual concept of content illegality. This is understandable: in the 2000s, online platforms were much more specialized, which is no longer the case today, as platforms increasingly copy each other’s affordances.
Where illegal content is hosted, and in combination with the prohibition of a general monitoring obligation (Article 8 DSA), platforms may be exempted from liability for hosting such illegal activities or content if they “act expeditiously to remove or to disable access to that content” (Recital 22). In addition, platforms have the obligation to set up notification mechanisms through which any individual or entity can report illegal content (Article 16 DSA); a minimal sketch of what such a notice might contain follows the list below.
The action undertaken by the platform may be:
- voluntary (Article 7), by which platforms carry out their own investigations and measures for detecting, identifying, removing, or disabling access to illegal content; and
- mandatory (Article 9), when platforms receive an administrative or judicial order to act against illegal content.
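To make the notification mechanism concrete, here is a minimal sketch of an Article 16 notice payload. The required elements mirror Article 16(2) DSA; all field names, and the allegedViolation taxonomy, are hypothetical assumptions of mine rather than anything the Regulation prescribes.

```typescript
// Illustrative sketch of an Article 16 DSA notice payload.
// The required elements mirror Article 16(2) DSA; the field names
// and the allegedViolation taxonomy are hypothetical.

interface IllegalContentNotice {
  // (a) a sufficiently substantiated explanation of why the content is illegal
  explanation: string;
  // (b) a clear indication of the exact electronic location, e.g. the URL(s)
  locations: string[];
  // (c) the name and email address of the notifier (with exceptions for
  //     certain offences, such as child sexual abuse material)
  notifier?: { name: string; email: string };
  // (d) a statement of the notifier's good-faith belief that the notice
  //     is accurate and complete
  goodFaithStatement: boolean;
  // Hypothetical extra field: the invoked legal ground, so that consumer
  // protection violations can be reported alongside criminal content
  allegedViolation: "criminal" | "intellectual_property" | "consumer_protection" | "other";
}

// Example: reporting an undisclosed advertorial
const notice: IllegalContentNotice = {
  explanation: "Sponsored post published without any advertising disclosure.",
  locations: ["https://example.com/post/123"],
  notifier: { name: "Jane Doe", email: "jane@example.com" },
  goodFaithStatement: true,
  allegedViolation: "consumer_protection",
};
```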
This broader understanding of illegal content is a double-edged sword. On the one hand, it clarifies that every item of “information” on the Internet may be content, and that content may very well fall under a web of applicable rules from a content regulation perspective. I argue that this is a positive development, since the earlier private regulatory adaptations to the E-Commerce Directive led platforms to create visible priorities for voluntarily monitoring certain types of content (e.g., criminally illegal content), while completely ignoring others (e.g., violations of consumer protection).

For instance, in a study I conducted with Pietro Ortolani, we looked at four social media platforms (Twitter, Facebook, TikTok and Twitch) to understand what types of content users could report, and we found that the predominant categories were criminal activities and intellectual property infringements. At the time the study was conducted, users of these platforms could not complain about content that did not abide by consumer protection standards, such as information duties or unfair practices. Traders are, for instance, required to disclose details such as their address, contact information, price and withdrawal rights, and while these information duties have been central to marketplace governance, social media platforms have generally not developed any specific visual verification or affordances to communicate these details to their users. In light of the broad definition of illegal content, however, it can be argued that platforms will have to enable reporting mechanisms under Article 16 for more categories of content than they have acknowledged so far.
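To make those information duties tangible, here is a minimal sketch of an automated disclosure check. The required items mirror the trader information duties mentioned above; the TraderProfile shape and the function are purely my own illustration.

```typescript
// Hypothetical check of trader information duties derived from the
// consumer acquis; the TraderProfile shape is illustrative.

interface TraderProfile {
  identity?: string;             // trader or company name
  geographicAddress?: string;    // physical address
  contact?: string;              // telephone number or email
  priceInclTaxes?: number;       // total price including taxes
  withdrawalRightsInfo?: string; // conditions and procedure for withdrawal
}

// Returns the list of missing disclosures, if any
function missingDisclosures(profile: TraderProfile): string[] {
  const missing: string[] = [];
  if (!profile.identity) missing.push("trader identity");
  if (!profile.geographicAddress) missing.push("geographic address");
  if (!profile.contact) missing.push("contact details");
  if (profile.priceInclTaxes === undefined) missing.push("total price");
  if (!profile.withdrawalRightsInfo) missing.push("withdrawal rights");
  return missing;
}

// Example: an influencer shop page that only discloses a display name
console.log(missingDisclosures({ identity: "@influencer_merch" }));
// -> ["geographic address", "contact details", "total price", "withdrawal rights"]
```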
The DSA is thus a bridge between different types of content regulation, as can also be seen in Recital 68, which acknowledges the DSA’s complementary role vis-à-vis media and consumer regulation:
Finally, this Regulation complements the application of the [Audiovisual Media Services Directive] which imposes measures to enable users to declare audiovisual commercial communications in user-generated videos. It also complements the obligations for traders regarding the disclosure of commercial communications deriving from the [Unfair Commercial Practices Directive] (emphasis added).
On the other hand, the DSA does not really elaborate on what this complementarity will look like in practice. From an interpretation perspective, it will be interesting to see how closely courts, including the Court of Justice of the European Union, will interpret the DSA in relation to its supposed complements. Perhaps even more importantly, the complementarity of the DSA in relation to other sectoral regulation will also stumble at the enforcement level. As an example, Article 40 DSA obliges very large online platforms (VLOPs) and very large online search engines (VLOSEs) to provide “access to data that are necessary to monitor and assess compliance with this Regulation” to the European Commission or the Digital Services Coordinator of establishment, the national authority tasked with DSA oversight. However, in the case of consumer protection, national enforcement authorities also have quite wide data access rights for investigation purposes, as can be seen in the Consumer Protection Cooperation Regulation (CPC Regulation). Article 9(3)(a) CPC Regulation states that the investigation powers of competent authorities include:
the power of access to any relevant documents, data or information related to an infringement covered by this Regulation, in any form or format and irrespective of their storage medium or the place where they are stored.
Yet the picture of how these powers will be exercised in parallel or in cooperation is far from clear. References to such overlap are few and not particularly compelling (e.g., Article 57 DSA, which mentions that “where appropriate, the [national] Digital Services Coordinator receiving a request by another Digital Services Coordinator to provide specific information […] may involve other competent authorities or other public authorities of the Member State in question”).
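One way to picture the coordination problem is a purely hypothetical sketch in which a platform exposes a single data access endpoint and the requesting authority’s legal basis is made explicit in the request. Neither the DSA nor the CPC Regulation prescribes such an interface, and all names below are my own assumptions.

```typescript
// Purely illustrative: a single data access request type that makes the
// legal basis of the requesting authority explicit. Neither regulation
// prescribes such an interface.

type LegalBasis =
  | { regulation: "DSA"; article: "40"; authority: "European Commission" | "Digital Services Coordinator" }
  | { regulation: "CPC"; article: "9(3)(a)"; authority: string }; // e.g. a national consumer authority

interface DataAccessRequest {
  basis: LegalBasis;
  scope: string;              // description of the data sought
  format: "database" | "api"; // cf. Article 40(7) DSA on appropriate interfaces
  deadline: string;           // ISO date by which access must be provided
}

// Example: a national consumer authority investigating sponsored posts
const request: DataAccessRequest = {
  basis: { regulation: "CPC", article: "9(3)(a)", authority: "Belgian Ministry of Economy" },
  scope: "All sponsored posts by traders X and Y between 2022-01 and 2022-06",
  format: "api",
  deadline: "2022-09-01",
};
```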
Blind Spot under the Microscope: Native Advertising on Social Media
The main danger of this lack of clarity regarding the complementarity of the DSA and other sectoral regulation is that it will create grey areas leading to under-enforcement. Take, for instance, the influencer economy: a prime example of the array of consumer protection issues from which platform users must be protected. Since my earlier post on native advertising on this blog, a lot has happened on social media:
- In July 2022, YouTube struck a deal with the Canadian e-commerce platform Shopify, allowing YouTube users to purchase goods and services in real time as they watch content on the platform;
- As of June 2022, Twitter has made its Twitter Shops module available to all its merchants in the United States;
- In May 2022, TikTok launched an “industry-first ad solution” called Branded Missions, which internalized the previously off-platform influencer marketing supply chain by allowing brands to offer advertising tasks to creators, mediated by the social media platform. Twitch continues to use a similar setup, called the Bounty Board, launched in 2018, which allows streamers to engage in sponsorship deals without any third party other than Twitch.
These are only a few examples showing that social media is no longer a space solely dedicated to social networking and/or content delivery: owing to the booming monetization policies pursued by platforms, it has become a transactional space full of advertising and offers for products. Unfortunately, the DSA does not show the foresight to account for these market changes; accounting for them will entail at least some creative interpretation.
According to Recital 1, the DSA acknowledges “online social networks and (emphasis added) online platforms allowing consumers to conclude distance contracts with traders” as separate categories (see also Recital 13). So how are we to qualify a platform like Instagram, which is both? Leaving aside the fact that Instagram has a Checkout function in the United States, making it a straightforward marketplace for the purposes of that jurisdiction, even the European version of the app features a “Shopping” explore section, full of content from both traditional traders (e.g., companies selling to consumers) and emerging traders such as influencers. The latter are considered traders, firstly, because of the offers/invitations to treat for goods or services they provide directly to consumers (e.g., selling digital courses or merchandise), and secondly, because they are providers of commercial services in the form of advertising, to which consumers are exposed. This has led national consumer enforcement authorities such as the Belgian Ministry of Economy to ask influencers to abide by the information duties to which traders are normally subject (e.g., disclosing trader identity and physical address), as a result of the application of the Consumer Rights Directive (CRD) and the Unfair Commercial Practices Directive (UCPD).
On the basis of our earlier exploration of the definition of illegal content, I would argue that failing to fulfill transparency duties, or violating the prohibition of undisclosed advertorials, are clear violations of the European consumer acquis, and thus illegal content. However, two main problems arise in this analysis. First, such transparency obligations are likely not covered by the DSA itself: if social media platforms are not interpreted as “online platforms allowing consumers to conclude distance contracts with traders”, they will not be subject to specific obligations such as the compliance-by-design obligation enshrined in Article 31 DSA, which imposes information duties similar to the CRD. Moreover, due to the limited definition of “advertisements” in Article 3(r) DSA, influencer marketing has been largely kept out of the DSA’s framework for advertising, applicable inter alia to social media platforms (e.g., Article 26 DSA). According to Article 3(r), advertising has a remuneration element which involves the platform, which as such excludes any off-platform advertising (e.g., contracts between brands and influencers that are not mediated by the social media platform). However, looking at the examples of on-platform influencer marketing as monetization products currently pursued by platforms, some influencer marketing practices will be covered by this definition. As a result, the DSA, as a regulation designed around platform liability, will not be able to directly tackle a substantial proportion of the apparent issues.

Second, the aforementioned consumer acquis provisions were developed not for platforms, but for the traders providing the advertising or the contractual options, such as the influencers themselves. In that case, the scalability of monitoring requires policy choices that may reflect agency resources (e.g., only monitoring the biggest influencers due to visibility). Therefore, under national law, platforms, although front and center in social commerce, are not the focus of enforcement.
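Returning to the first of these problems: the on-/off-platform distinction in Article 3(r) can be expressed as a simple decision rule. The sketch below, with field names that are purely my own, checks whether a sponsored post involves platform remuneration, so that a Branded Missions- or Bounty Board-style deal would fall under the DSA’s advertising rules while a direct brand-influencer contract would not.

```typescript
// Hypothetical decision rule for the Article 3(r) DSA remuneration element.
// Field names are illustrative assumptions, not DSA terminology.

interface SponsoredPost {
  sponsored: boolean;              // the post promotes a brand's message
  remuneratedViaPlatform: boolean; // the deal is mediated/monetized by the platform
}

// Information counts as a DSA "advertisement" only if the platform itself
// is remunerated for presenting it (Article 3(r) DSA).
function isDsaAdvertisement(post: SponsoredPost): boolean {
  return post.sponsored && post.remuneratedViaPlatform;
}

// A TikTok Branded Missions deal: covered
console.log(isDsaAdvertisement({ sponsored: true, remuneratedViaPlatform: true }));  // true
// A direct brand-influencer contract concluded off-platform: not covered
console.log(isDsaAdvertisement({ sponsored: true, remuneratedViaPlatform: false })); // false
```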
Legal Compliance API: A Middle Way
As a means to standardize data access, APIs are already embraced by social media platforms, more or less in compliance with Article 40(7) DSA. Recently, YouTube opened its API to researchers, and TikTok is planning to do the same later in the year. If you are not familiar with the concept, imagine an API as a way for two or more systems to communicate with one another. In the case of the legal compliance API proposed by my co-authors and me, one communicating system would be a social media platform, and the other a number of DSA enforcement authorities, as well as other relevant public authorities, that need to coordinate on the platform’s compliance with the actions to be taken with respect to illegal content. A legal compliance API would be different from a research API: it would focus on translating legal compliance tasks into machine-readable parameters, for instance for checking whether hosted content is illegal or non-compliant with European consumer protection rules.
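As a rough illustration of the idea (the paper develops the architecture in detail; the names and shapes below are merely my own sketch, not the paper’s specification), an authority-facing compliance endpoint might look as follows:

```typescript
// Hypothetical sketch of a legal compliance API, loosely inspired by the
// proposal in our FAccT 2022 paper. Names and shapes are illustrative.

interface ComplianceQuery {
  rule: string;    // identifier of the legal rule, e.g. "UCPD:undisclosed-advertorial"
  sample?: number; // optional sample size for spot checks
  since?: string;  // ISO date limiting the period under review
}

interface ComplianceFinding {
  contentId: string; // platform-internal identifier of the item
  url: string;       // exact electronic location
  compliant: boolean;
  evidence: string;  // human-readable reasoning or detector output
}

// The interface an enforcement authority would call; the platform
// implements it over its own content store.
interface LegalComplianceApi {
  checkCompliance(query: ComplianceQuery): Promise<ComplianceFinding[]>;
}
```

The design choice worth noting is that the authority formulates its request in terms of a legal rule, while the platform bears the burden of translating that rule into queries over its own systems and returning auditable evidence.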
Article 44 DSA on standards elaborates that APIs could also be developed as voluntary standards for the submission of notices by trusted flaggers, as well as for advertising repositories (Article 39 DSA), with particular support from the European Commission and the European Board for Digital Services. Although vague, the DSA’s present references to APIs raise a concern relating to the streamlining not only of data standards, but also of the investigation practices of enforcement authorities. As Member States race into digitalization with dedicated data units, all existing authorities enforcing European Union (and national) law will have a stake in digital investigations and enforcement, including data protection authorities, media authorities, consumer authorities, and so on. If DSA enforcement does not take this into account and, within its administrative limitations, creates standards relevant only for the powers of DSA-related authorities, the result will be potentially harmful inconsistencies in digital enforcement. These will enable platforms, as Laux et al. mention, to leverage their data dominance against a crowd of uncoordinated regulators with vastly divergent capacities and practices around the implementation of European law.
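Advertising repositories are a natural candidate for such standardization, because Article 39(2) DSA already enumerates what a repository must contain, which maps readily onto a record schema. Below is a minimal sketch; the field names are my own, while the listed elements follow the Regulation as I read it.

```typescript
// Sketch of an ad repository record mirroring Article 39(2) DSA.
// Field names are hypothetical; the elements come from the Regulation.

interface AdRepositoryRecord {
  content: string;                      // (a) content of the advertisement
  onBehalfOf: string;                   // (b) person on whose behalf it is presented
  paidBy?: string;                      // (c) person who paid, if different
  period: { from: string; to: string }; // (d) period of presentation
  targetingParameters?: string[];       // (e) main targeting parameters, if targeted
  totalReach: number;                   // (g) total number of recipients reached
}

// Illustrative record
const record: AdRepositoryRecord = {
  content: "Video ad for a fitness subscription",
  onBehalfOf: "FitCo GmbH",
  period: { from: "2022-05-01", to: "2022-05-31" },
  targetingParameters: ["age 18-35", "interest: fitness"],
  totalReach: 1250000,
};
```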
It is of course highly important that any digital enforcement mechanisms that contribute to surveillance (such as market monitoring) are developed in an accountable way, taking into account the wide-reaching implications of automated decision-making, both from a systems perspective and from a procedural perspective. In my opinion, however, there is no real alternative to digital monitoring. Platforms already have the upper hand in technology and scale, and public oversight can do little to catch up with that; but it needs to at least try to rein in some of the uncontrolled platform discretion under the scrutiny of the rule of law, through well-coordinated and well-designed further digitalization.