24 March 2025

Regulating Social Media for Teenagers

A Definitional Minefield

Calls for social media age restrictions are growing in several EU Member States, aligning with broader discussions in Brussels about protecting consumers against ‘addictive’ online services. Recently, Australia approved a social media ban for those under 16, adding momentum to these debates. However, while such restrictions aim to protect children, defining ‘social media’ presents significant challenges that could make a ban ineffective or even counterproductive.

Any regulatory approach concerning children must prioritize their best interests (Art. 24 CFREU; Art. 3 UNCRC). Several arguments have already been made against prohibiting social media access for children under 15 or 16 years, such as the risk of isolating LGBTQIA+ kids from the communities they find online. Another argument stems from the difficulty of defining ‘social media’. Any definition of ‘social media’ risks being either too narrow, insufficiently protecting children, or too broad, restricting children’s freedom of expression and information (Art. 13 UNCRC). Therefore, other regulatory options should be explored.

Background

Politicians in the Netherlands, France, Denmark and Norway have called for an EU-wide social media prohibition for children under a certain age, often 15 years. These politicians are inspired by Australia’s recent legislative move to ban social media for those under 16. Such restrictions generally aim to protect children from social media ‘addiction’ and from online harms such as exposure to pornographic material and cyberbullying.

Even before these national initiatives, Brussels had begun addressing concerns about ‘addictive’ online services. In December 2023, the European Parliament adopted a resolution highlighting how certain tech companies use design features to maximize user engagement. The Parliament urged the European Commission to propose legislation against such practices and indicated it would propose legislative measures if the topic remained unaddressed.

In response, the Commission addressed the issue in its Fitness Check of EU consumer law on digital fairness, concluding that existing laws offer insufficient protection against addictive online services. The Fitness Check did not commit to any specific follow-up actions. However, around the time the Fitness Check was published, European Commission President Ursula von der Leyen tasked the new Commissioner for Democracy, Justice and the Rule of Law with developing a Digital Fairness Act to tackle, among other things, the addictive design of online services. Thus, addictive design is firmly on the EU’s legislative agenda.

Although the Parliament’s resolution and the Commission’s Fitness Check focus on consumers in general, they emphasize children’s particular vulnerability. As a result, national proposals for social media age restrictions will likely influence discussions on the Digital Fairness Act.

Regulatory options

There are several regulatory options available for the EU to protect children against ‘addictive’ online services. For example, instead of introducing new legislation, the EU could focus on enforcing existing rules such as the Digital Services Act (‘DSA’). The DSA already requires very large online platforms to assess and mitigate risks to children, including risks associated with the design of online interfaces that may cause ‘addictive behaviour’ (Articles 34 and 35, in conjunction with Recital 81, DSA). Another approach would be to concretize the standard of ‘professional diligence’ in the Unfair Commercial Practices Directive (‘UCPD’) to clarify that inducing ‘addictive’ behaviour does not meet that standard. A further option is to outlaw certain problematic design practices by blacklisting them in Annex I of the UCPD, which lists commercial practices that are considered unfair and therefore prohibited. Finally, banning children under a certain age from social media is yet another regulatory option.

However, before considering such a ban, several key questions must be answered. What is the precise problem that needs to be addressed? Some calls for social media age restrictions are motivated by the idea that social media is causing a mental health crisis among teens – an idea that is ‘not supported by science’. However, problematic social media use does exist: recent studies indicate that 11% of adolescents report problematic social media use. This raises the question of whether regulation should focus solely on problematic use or also on high-intensity use (reported by 32% of adolescents), which may not necessarily lead to negative health outcomes but could still be considered undesirable from a normative or paternalistic point of view. In that light, is ‘addiction’ the appropriate term to describe these problems? And ultimately, what exactly should be the goal of a legislative intervention – reducing social media use altogether or promoting healthier engagement?

Issues with defining ‘social media’

Calls for social media age restrictions may pre-empt the discussion about these crucial questions and the wide spectrum of regulatory options available for the Digital Fairness Act. Therefore, critically reviewing the legal feasibility of age restrictions is important.

Current EU law does not yet define the concept of ‘social media’. Some laws, such as the DSA and the European Electronic Communications Code, refer to ‘social networks’ in their recitals without defining the term. If the EU were to enact a ban, it would therefore have to define ‘social media’ – a task that seems very difficult, if not impossible.

A potential starting point is the Digital Services Act’s definition of an ‘online platform’: a hosting service that, at a user’s request, stores and disseminates information to the public (Art. 3(i) DSA). This definition encompasses widely recognized social media services like TikTok, Instagram, and Snapchat. However, it also covers platforms like Amazon, Google Maps, and Shein, which are not typically considered social media. The ‘online platform’ concept would thus be too broad for social media age-restriction legislation.

Academic definitions, such as those proposed by Ellison and boyd, describe social media as services where users 1) have profiles consisting of user-generated content or content provided by others, 2) can publicly form connections, and 3) can interact with streams of content provided by their connections.

However, this definition is too narrow for the goals of a social media ban for minors, as it excludes video-sharing services such as YouTube, which lacks user profiles but employs features that encourage prolonged engagement, such as autoplay. YouTube’s CEO Neal Mohan also does not see the service as a social media platform, emphasizing that it is mainly a place to connect with creators – yet its design elements make it just as engaging as traditional social media.

Ellison and boyd’s definition also excludes social media-like functionalities in messaging apps, such as Status on WhatsApp and Stories on Signal. These apps do not center on public profiles and networks and would thus not count as social media. Yet status updates on WhatsApp and stories posted on Signal are only temporarily available, and such ephemeral content may make people return to the app more often out of a fear of missing out. Furthermore, even the classic one-to-one messaging functionalities of apps like WhatsApp and Signal employ design features that may influence how much time users spend in the app. For instance, read receipts create social pressure to respond quickly. Design features of this kind might induce ‘addictive’ behaviour but would not be captured by a social media ban.

Simply broadening the definition of ‘social media’ to include messaging apps or video-sharing services would create new problems. A ban on social media would then mean that children cannot use messaging apps or video-sharing services. However, messaging apps are important media for children to exercise their right to freedom of expression and to receive information (Art. 13 UNCRC). Likewise, YouTube is an important platform for children to receive news, health information, and educational content. Banning children entirely from messaging apps or video-sharing services is therefore not in their best interests.

The Australian example

Australia’s recent Online Safety Amendment highlights the pitfalls of defining ‘social media’ in legislation. The law defines an ‘age-restricted social media platform’ as an electronic service that 1) enables online social interaction between two or more people, 2) allows users to link to, or interact with, other users, 3) allows users to post material on the service, and 4) meets other conditions (if any) set out in the legislative rules. The law also allows certain services to be excluded from this definition. Due to its broad scope, the Australian government has already assured that it would exclude messaging apps, online games, and YouTube – raising concerns that the law is now too narrow to effectively protect children.

Conclusion

Different arguments have been made against social media age restrictions in the public debate. This contribution has shown that such restrictions are also problematic due to the difficulty of defining ‘social media’ in a way that is both practical and effective. Any definition is likely to be either too narrow, failing to protect children against ‘addictive’ online services, or too broad, excluding children from online services that are important media for exercising their freedom of expression and information. Instead of banning access to entire platforms, the EU should explore regulatory interventions targeting problematic design features. Such an approach would apply across social media, messaging apps, video-sharing platforms, and games, ensuring that children are protected without unnecessarily restricting their communication and access to information. By shifting the focus to platform design rather than age-based prohibitions, policymakers can better balance child protection with fundamental rights.


SUGGESTED CITATION  Eskens, Sarah: Regulating Social Media for Teenagers: A Definitional Minefield, VerfBlog, 2025/3/24, https://verfassungsblog.de/regulating-social-media-for-teenagers/, DOI: 10.59704/fcc51b0d448c7436.

