The European Union (EU) has big plans for platform governance. The new Digital Services Act (DSA) package, delivered in late 2020, proposes new rules for digital markets, particularly on intermediary liability, while better protecting consumers and fundamental rights online (see my writing about the DSA’s general structure and relationship with the EU Consumer Acquis here). According to the Explanatory Memorandum, the DSA is intended as an umbrella instrument tackling a wide array of issues arising on digital markets (e.g. illegal content; smart contracts). One of its central concerns is the proliferation of online (targeted) advertising, which the European Parliament made clear ought to be one of the areas of reform, so as to reduce consumers’ and citizens’ dependence on, and exploitation by, algorithms.
Tackling the harms associated with online profiling and creating more transparency on data brokerage markets is a solid, necessary policy objective. Indeed, much of Big Tech platforms’ power, market dominance and other market actors’ dependence derives from their monetization of user data for advertising purposes. Yet in the past five years, the section of the advertising industry targeted by the DSA has slowly but steadily been complemented by a new form of advertising, now ubiquitous on social media. In a nutshell, it reflects a fascinating and complex new iteration of the gig economy: any Internet user can monetize their online presence by sharing multimedia content, while platforms intermediate demand, supply and sometimes payments. Many specific business models have lent their names to this new form of advertising, such as ‘influencer’ or ‘affiliate marketing’.
Figure 1 – Content monetization and business models (based on Goanta & Ranchordas; De Gregorio & Goanta)
If the bridling of harmful targeted advertising is a core objective of the DSA, the exclusion of influencer marketing is a grave oversight. Amendments introduced by the Internal Market and Consumer Protection Committee in the European Parliament may remedy this omission, but long-term, the goal must be ‘content as compliance’, in line with European consumer protection standards. Otherwise, Big Tech platforms’ sophisticated data-related business models will continue to escape encompassing regulation and hence, their power will remain unchecked.
From platform ads to human ads
When we think about digital advertising, we imagine brands around the world paying digital platforms for ad space, where they compete for user attention and engagement – an industry that can be referred to as platform ads. Brands register their ads in databases called ‘ad archives’, from where they can target selected platform demographics. The best example is Facebook’s Ad Library, where anyone can check the ads registered with Facebook for display on its platforms – an attempt by Facebook to create more transparency around its targeted advertising, especially after public incidents (e.g. Cambridge Analytica) exposed the opacity of its infrastructure. An ad’s occurrence on a timeline will always be marked as ‘Sponsored’ by Facebook.
In the past decade, however, digital advertising has been generating new business models focused on the monetization of original and authentic content, particularly on social media. Riding the increase in social media consumption, content monetization makes it profitable for Internet users not only to engage with advertising, but to become advertising.
As Google puts it, “advertising is becoming, well, less like advertising”, as the Internet has taken this industry into the “age of authenticity”, wherein resources are shifted from platform to ‘human ads’. Human ads are influencers, also called content creators, who earn revenue from social media advertising by creating authentic, relatable content for their followers. In turn, they receive money, goods or services (influencer marketing), or sales commissions (affiliate marketing). By hiring humans as ad banners, marketers and brands offer information (e.g. reviews) and explore persuasive narratives (e.g. social causes) which audiences can relate to and engage with. The popularity of such advertising approaches is undeniable. In 2021, influencer marketing is projected to reach a global market size of $13.8 billion (a 700% increase since 2016). However, the business of influence is also rapidly changing, with a plethora of new stakeholders emerging in the content monetization supply chain. Examples include influencer data analytics companies (e.g. Heepsy) and ambassador management platforms (e.g. Fohr) – new categories of advertising intermediaries connecting brands and creators on digital markets. The popularity of these new forms of digital marketing is matched by their potential risks.
The pursuit of monetization, combined with market trends towards inconspicuous “authentic” advertising, has revived a long-standing media and consumer law struggle: misleading or deceiving consumers through hidden commercials. Such undisclosed product placement or native advertising is prohibited in the European Union (e.g. via the Unfair Commercial Practices Directive), reflecting decades of regulatory reforms focused on protecting consumers from subliminal manipulation. The rationale behind this prohibition is that the law draws a line between mere commercial puffs, used to make advertising more appealing, and the deception of consumers. Other harms relating to human ads are beginning to emerge in the realm of political advertising. Commercial and political ads look the same, are posted by the same individuals, are displayed in the same digital space, to the same audiences, and raise the same transparency issues.
Human ads as the new prosumers
As highlighted above, content monetization through advertising is a new iteration of the gig economy, whereby content is shared instead of cars or apartments. From a legal perspective, to say content creators are hard to define is an understatement. Earlier iterations of the gig economy have left us with a considerable definitional debt: the inability to redefine and enforce new forms of legal personhood that reflect the granularity of transactions taking place on digital markets. Is a seller of seven items on a peer-to-peer online market a trader within the meaning of EU consumer protection law? In Kamenova, the Court of Justice of the European Union answered this question negatively, but asking courts to undertake individual tests for even 1% of all market participants is simply an impossible feat. We could argue that content creators are the new prosumers: they are not consumers (or, more generally, peers), but neither do they possess the bargaining power of other traders, such as platforms. On the contrary, they may themselves become victims of platform discretion and power, as I explored in this piece. This status has not received any statutory clarification, and therefore does nothing to reduce legal uncertainty. But why is it so important to define human ads or content creators? Clarifications in this direction are relevant for business, tax and social security purposes, and especially for determining whether consumer protection applies.
While advertising through content creation is an industry in full bloom, the way in which regulators and public authorities have tackled it so far has been ineffective for at least two reasons. First, advertising rules are a combination of (inter alia) mandatory European and national media law, consumer law (including unfair competition) and self-regulation, such as the Social Code for YouTubers set up in collaboration with the Dutch Media Authority. The enforcement landscape thus raises tensions relating to competence and sanctions. Second, the number of content creators and the scale of their supply chains are too vast to be handled systematically without more legal certainty and some automation. This is why the DSA is so central: it complements the focus on the individual creator with platform responsibilities.
Advertising and the DSA
The DSA addresses advertising in its draft Art. 24, mandating transparency for advertising displayed by platforms – the traditional ad archives discussed above. However, the proposal makes no acknowledgement whatsoever of the new advertising business models emerging from content monetization. Moreover, the scarce and inconsistent references to influencer marketing in research contracted by the Commission for earlier regulatory fitness checks reveal a considerable research gap affecting the European regulator in this policy area. In terms of political advertising, the Commission will propose new transparency measures in the third quarter of 2021, to further the goals of the European Democracy Action Plan and harmonize rules on political advertising beyond soft law initiatives such as the EU Code of Practice on Disinformation.
The IMCO Report
Of course, the DSA proposal is only the beginning of what is expected to be a lengthy and by no means dull legislative procedure. The Report of the Committee on the Internal Market and Consumer Protection (IMCO Report) already brings some much-needed amendments to the Commission’s draft DSA. The draft wording of Art. 24 DSA on online advertising transparency focuses on the role of platforms in advertising transactions to which they are a party. However, human ads are popular vehicles of advertising which fall outside this paradigm: platforms are not parties to these advertising transactions, but they do contribute to the dissemination of ad content through their architecture. The IMCO Report highlights this absence and proposes three amendments to the DSA draft:
- recital 39d, which acknowledges ‘digital influencers’ and explains that platforms should make sure remunerated content is ‘clearly identifiable’, and that contractual relationships relevant to the content ought to be disclosed;
- Art. 2(1), which adds the distinction between ‘direct’ and ‘indirect’ promotion of a message under the definition of ‘advertising’;
- Art. 13c, on online advertising transparency, proposes a number of transparency duties intended to harmonize the marking of ads and to facilitate the monitoring of platforms’ compliance with those duties.
These amendments improve the DSA draft from the perspective of consumer protection, since they acknowledge influencer marketing and bring it within the scope of advertising transparency. In this policy area, compliance translates into disclosure duties placed on platforms by the DSA. For instance, platforms could use their verification mechanisms (e.g. the blue check mark) to signal content creators’ accounts, and propose affordances (e.g. the ‘paid partnership with’ label on Instagram) which such accounts should use. This would benefit not only users, but also public authorities and researchers, who could better monitor the landscape of native advertising on social media, especially with the rise of social commerce.
Do the IMCO amendments solve all the problems of native advertising models, such as influencer marketing? Certainly not. Until we systematically clarify how to define the economic activity of influencers, the respective articles will most likely lead to diverging interpretations. Potential definitions may take inspiration from labour standards, as a group of researchers on the creator economy has highlighted in a comment submitted to the UK Parliament call for evidence on the influencer economy (which also includes a broad literature list on the content creator economy). In addition, the justification behind Art. 13c DSA shows a focus on “commercial influencer content”. But what about political ads for which influencers are hired? What should we focus on in determining the applicable rules? The nature of the transaction, which points to a commercial engagement? Or the (political) nature of the speech, which leads to more protection and less transparency owed to the same audiences who engage with the commercial communications of their favourite influencers?
These questions still need to be addressed. To provide clarity, we need to move away from the paradigm of ‘content as speech’ on social media, and into the era of ‘content as compliance’. Such a direction is, I would argue, supported by the IMCO amendments. One of the most important shifts which the DSA may very well bring is the tendency of the Commission to ‘fight fire with fire’, as my colleague Thibault Schrepel puts it. In other words, digital markets that have generated sophisticated data-related business models and industries need to be monitored at scale, using that very same data for forensic investigation purposes (some reflections on digital monitoring and dark patterns are available here). The DSA already proposes such investigations elsewhere (e.g. Art. 46 DSA). Legal compliance will therefore suffer from what currently seems an insurmountable tension between 1) aligning legal standards from a plethora of different fields that govern online content (e.g. fundamental rights, electoral law, consumer protection, criminal law); and 2) interpreting applicable standards in such a way that monitoring legal compliance at scale is not an impossible task. With these realizations in mind, it might help to see the DSA as a more modest initiative: a procedural bridge between different fields of European and national law, whose success depends on further sectoral harmonization and alignment.