Just One More Video…
Down the (Legal) Rabbit Hole of TikTok’s Addictive Design
On 6 February 2026, the European Commission disclosed its long-awaited preliminary findings in its investigation into whether TikTok, the social media platform used by 170 million people across the European Union, is in breach of the Digital Services Act (DSA). This marks an important step in the formal proceedings initiated in 2024. The announcement comes at a time when political and public concern about the potentially harmful impact of social media platforms is at an all-time high, prompting calls in countries across the world to “ban” children and teenagers from those spaces. Because “addictive” features are central to the concerns driving these contested calls, the DSA’s potential to change platform design is crucial.
What did the European Commission find in its TikTok investigation?
The Commission’s investigation finds that TikTok’s addictive design might violate the Digital Services Act. According to the Commission, TikTok failed to conduct an adequate risk assessment and to evaluate how addictive design features, including infinite scrolling, autoplay, push notifications and a highly personalised recommender system, could harm the physical and mental well-being of its users, in particular minors and vulnerable adults.
In particular, the Commission claims that the app’s design fuels the urge to keep scrolling and shifts users’ brains into an “autopilot mode” by continuously rewarding them with new content. The Commission also alleges that TikTok ignored certain indicators of compulsive use, such as how frequently users open the app and the amount of time minors spend on it at night. According to an EU spokesperson, TikTok is by far the platform most used after midnight by children aged 13 to 18.
The preliminary findings further accuse TikTok of implementing inadequate risk mitigation measures. According to the Commission, current measures such as screen time management and parental control tools fail to effectively reduce the risks stemming from the platform’s addictive design.
To meaningfully comply with the DSA and ultimately make the app less addictive for users, the Commission states that TikTok will have to change its basic design. The preliminary findings suggest disabling the “infinite scroll” feature, implementing effective “screen time breaks”, including at night, and adapting the recommender system.
So far, the preliminary findings, which the Commission describes as the result of “an analysis of TikTok’s risk assessments reports, internal data and documents and TikTok’s responses to multiple requests for information, a review of the extensive scientific research on this topic, and interviews with experts in multiple fields”, have not been published. This is expected to happen in the near future, once the findings have been redacted.
How does the DSA tackle the addictive design of Very Large Online Platforms?
While the Commission’s press release does not refer to the specific DSA articles it considers breached, the alleged infringement appears to be linked to Articles 34 and 35 DSA, as well as Article 28 DSA.
Articles 34 and 35 DSA are applicable to so-called Very Large Online Platforms (VLOPs). A platform is designated as a VLOP by the Commission when it reaches 45 million or more monthly active users in the EU – a threshold TikTok clearly exceeds. Article 34 requires VLOPs to undertake a yearly assessment of the systemic risks in the EU stemming from the design, functioning or use of their service. Such systemic risks include:
a) the dissemination of illegal content;
b) any actual or foreseeable negative effects for the exercise of fundamental rights (including the rights of the child);
c) any actual or foreseeable negative effects on civic discourse and electoral processes, and public security; and
d) any actual or foreseeable negative effects in relation to gender-based violence, the protection of public health and minors and serious negative consequences to the person’s physical and mental well-being.
Addictive design features, such as the ones identified by the Commission, fall within categories b) and d), especially in relation to their effects on children, as confirmed in recitals 81 and 83 of the DSA.
Following this risk assessment, Article 35 requires VLOPs to put in place reasonable, proportionate and effective mitigation measures, which are tailored to the specific systemic risks. Such measures may include, among others, adapting the design, features or functioning of the service, adapting the algorithmic and recommender system, and putting in place age verification and parental control tools. It is precisely these risk assessment and risk mitigation obligations that, in the Commission’s view, TikTok failed to adequately fulfil.
In addition, some of the features mentioned by the Commission in its press release are referred to quite extensively in the guidelines on Article 28 DSA, published by the Commission in July 2025 following a public consultation process. Article 28 DSA applies to all online platforms accessible to minors and requires them to ensure a high level of privacy, safety and security for minors. To help platforms give effect to this rather abstract obligation, the guidelines list a wide range of measures that the Commission believes are needed. Measures relevant in light of the preliminary findings include, for instance, turning off autoplay of videos and push notifications by default, which should always be turned off during core sleep hours, adapted to the age of the minor (para 57). In addition, minors must not be exposed to persuasive design features that are aimed predominantly at engagement and that may lead to extensive use or overuse of the platform or to problematic or compulsive behavioural habits. This includes the possibility of scrolling indefinitely, the automatic triggering of video content, and notifications artificially timed to regain minors’ attention (para 61). Instead, child-friendly and effective time management tools should be available to increase minors’ awareness of the time they spend on online platforms (para 61).
What is interesting in the Commission’s press release is that it refers not only to potential harm to minors, but also to vulnerable adults. The Article 28 guidelines also pick up on this, stating that platforms are encouraged to adopt the measures for the purpose of protecting all users, not just minors. This reflects discussions that have been going on for some time in the area of consumer protection, arguing that vulnerability in the digital environment might be a universal state due to its design and specific characteristics. This is also central to plans for a forthcoming Digital Fairness Act, a legislative initiative that aims to strengthen consumer protection online by addressing challenges such as addictive design and unfair personalisation practices. A proposal by the Commission is expected in 2026.
Or should we ban children from social media platforms?
The release of the Commission’s preliminary findings occurs at a time when governments across the EU are increasingly contemplating the introduction of what is commonly referred to as a ‘social media ban’. These initiatives typically involve setting a minimum age below which children should not be allowed to create an account on social media platforms. While such platforms already state in their terms and conditions that children under the age of 13 are not allowed to use their services, in practice this age limit is not adequately enforced. The recent calls for bans often propose raising this age to 15 or even 16 years. Politicians from a wide range of countries – including (but not limited to) Austria, Belgium, the Czech Republic, Denmark, Finland, France, Germany, Greece, Poland, Portugal, Slovakia and Spain – are jumping on this bandwagon amid public debates on how children’s online activities may negatively impact their well-being and mental health. An important element in those debates concerns the addictive effect of platform features. They also often allude to the social media ban for under-16-year-olds which entered into force in Australia in December 2025, although it is far too soon to tell whether this is an effective way to address the concerns.
These initiatives in EU Member States raise interesting legal questions. The Digital Services Act is a full harmonisation instrument, which means that Member States should not adopt or maintain additional national requirements relating to the matters falling within the scope of the DSA (recital 9). Although the DSA offers Member States some leeway to apply “other national legislation […] where the provisions of national law pursue other legitimate public interest objectives than those pursued” by the DSA, this leeway is limited. Relying on Article 3(4) of the E-Commerce Directive, for instance, could allow for national provisions that aim to protect minors or public health for a “given information society service”. Yet, these aims hardly qualify as other legitimate public interest objectives, as they are central to Articles 28 and 34 DSA. Moreover, in Google Ireland, Meta Platforms, TikTok v. KommAustria, the Court of Justice of the EU clarified that “general and abstract measures aimed at a category of given information society services described in general terms and applying without distinction to any provider of that category of services do not fall within the concept of measures taken against a ‘given information society service’ within the meaning of that provision” (para 58).
The way in which a Member State would formulate a ‘social media ban’ is thus important. This was confirmed by the French Council of State in its advice on the French legislative proposal, which stated that imposing the prohibition on accessing social networks on the online platforms themselves could be seen as raising difficulties under the DSA, but that imposing this obligation on minors under the age of fifteen would not contravene EU law. This follows the interpretation by the Commission, found in the minutes of a meeting of the Working Group on protection of minors of the European Board of Digital Services, that “member states can set social policy measures for minimum age access, but not additional obligations on online platforms”. Such a national minimum age should then simply be enforced by the platforms in the context of the obligations they have under the DSA, and in particular age assurance as mandated by the Article 28 guidelines.
While it could be debated whether this was a scenario originally anticipated by the EU legislator, the fact remains that the DSA aimed to harmonise the protection of minors on platforms across EU Member States. This conflicts with a scenario in which children would be allowed to engage with social media at different ages in different countries. In its November 2025 report, the European Parliament called for the establishment of a harmonised EU digital age limit for social media: 16 as the general rule, unless parents or guardians give permission, and 13 as an absolute minimum below which no child should have access to such platforms. President von der Leyen announced the creation of an expert panel tasked with developing a recommendation for an EU “digital age of majority”. According to reports, the group has been formed and its work will be launched soon. It is crucial that a decision on this issue is evidence-based and takes into account the views of children themselves.
In the debate on social media bans, arguments that a ban might negatively affect children’s rights have also been raised. Not allowing children of certain ages to be present in these spaces is often seen as a simplistic answer to a complex problem, one that creates a false sense of security. On top of that, bans take away incentives to actually make the platforms better, not only for children but for everyone. This is why the European Commission’s preliminary findings on TikTok are especially important. They send a strong signal that addictive features, which are at the heart of the concerns, are not acceptable. Focussing on platform design – an area where the DSA has genuine regulatory potential – rather than simply preventing most children from being there is arguably more sustainable in the long run.
What are the next steps?
The ball is now in TikTok’s court. It has the opportunity to review the investigation files and exercise its rights of defence. TikTok has already rejected the preliminary findings, asserting in a statement that “the Commission’s preliminary findings present a categorically false and entirely meritless depiction of our platform, and we will take whatever steps are necessary to challenge these findings through every means available to us”. If the action TikTok takes proves inadequate and the Commission’s views are ultimately confirmed, the platform could face fines of up to 6% of the global annual turnover of its parent company ByteDance. One can be sceptical about the deterrent effect of (even large) fines on big tech companies. But at this moment, the hope that strong enforcement of the DSA may succeed in changing the design of the platforms for the better remains intact.
Disclaimer: Valerie Verdoodt’s contribution to this blogpost was funded by the European Union. Views and opinions expressed are however those of the author(s) only and do not necessarily reflect those of the European Union or the European Research Council Executive Agency. Neither the European Union nor the granting authority can be held responsible for them. This work is supported by ERC grant KIDFLUENCER (101169786, 10.3030/101169786).




