Abstract
In February 2026, the European Commission took the preliminary view that TikTok may have infringed the Digital Services Act not because of the content hosted on the platform, but because of the design of the service itself. At the centre of the investigation is an experience architecture based on infinite scroll, autoplay and personalised recommender systems, capable of systematically influencing users’ behaviour.
The case reflects an emerging trend in digital law: platform liability is no longer concerned solely with what is published, but also with the way in which services are designed and with the consequences that such design may have for user behaviour.
The regulatory context and the TikTok case
In February 2026, the European Commission published its preliminary findings concerning TikTok under the Digital Services Act (DSA), the regulation governing the operation of digital platforms in the European Union.
The DSA introduces a significant shift in the way in which the responsibility of online platforms is conceived. In particular, for very large online platforms, it imposes an obligation to identify, analyse and mitigate so-called systemic risks arising from the functioning of the service, including those that may affect fundamental rights, the protection of minors, and users’ physical and mental well-being.
The core of the allegation lies primarily in Articles 34 and 35 of the DSA: on the one hand, the obligation to assess systemic risks; on the other, the obligation to adopt reasonable, proportionate and effective mitigation measures. These aspects are complemented by Article 28, concerning the protection of minors online, and Article 25, insofar as certain experience architectures may come close to the logic of dark patterns, namely interface configurations capable of steering or conditioning users’ choices.
It is within this framework that the TikTok case arises. In its preliminary findings, the Commission states that it identified a possible infringement of the DSA not so much in relation to the content hosted on the platform, but rather in relation to the platform’s design itself. The official press release refers to a set of “addictive design features”, including infinite scroll, autoplay, push notifications and a highly personalised recommender system.
According to the Commission, these features may foster a dynamic of continuous reward, encouraging continued interaction to the point of fostering an automatic mode of use, described as “autopilot mode”. In this context, behaviour is not imposed, but is progressively made more likely, to the extent that the user’s ability to voluntarily discontinue the experience may be reduced.
The Commission expressly links these dynamics to the issue of users’ well-being. In particular, it argues that TikTok may not have adequately assessed the impact of its design on physical and mental health, especially with regard to minors and vulnerable users. Specific behavioural indicators are also cited — such as use during night-time hours and high frequency of access — which are regarded as relevant signs of possible problematic use.
A further element concerns the assessment of the mitigation measures already adopted. According to the Commission, TikTok does not appear to have implemented “reasonable, proportionate and effective” measures to reduce the risks arising from its design. Tools such as screen-time controls and parental-control systems, although present, appear to introduce only a limited and easily circumvented degree of friction, and therefore to have little impact on actual patterns of use.
The preliminary conclusion is particularly significant. The Commission suggests that the issue cannot be addressed through marginal interventions, but requires a deeper review of the service architecture, even to the point of contemplating the need to “modify the basic structure of its service”.
In this respect, the TikTok case represents one of the first concrete applications of a broader principle introduced by the Digital Services Act, according to which platform responsibility no longer concerns only the content platforms host, but also the way in which they construct and steer users’ experience, thereby contributing to the conditions that make certain behaviours more likely.
Persuasive architectures and the construction of experience
In order to understand why the design of a platform may acquire legal relevance, it is necessary to analyse how it operates in concrete terms. The features identified by the Commission — infinite scroll, autoplay, push notifications and personalised recommender systems — do not operate in isolation, but form a coherent architecture designed to steer the user’s interaction over time.
Infinite scroll removes a fundamental element of traditional interfaces: the stopping cue. A newspaper reaches its final page; an episode ends with closing credits; on a traditional web page, the content eventually comes to an end. In an infinite feed, by contrast, that moment never arrives. The platform does not provide a natural stopping point: the user must create one. The consequence is significant: the cost of interruption is shifted onto the individual, making continuation of the action the default option.
Autoplay acts upon the same mechanism, further reducing the number of decisions required in order to continue. The transition between pieces of content occurs without any explicit action, transforming platform use into a continuous flow rather than a sequence of discrete choices.
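The structural difference between a bounded and an unbounded feed can be made concrete with a minimal sketch. This is an illustrative model, not TikTok’s actual implementation: in the paginated feed, advancing requires an explicit user action; in the infinite feed, the only action that requires a decision is stopping.

```python
from itertools import count

def paginated_feed(items, page_size=10):
    """Bounded feed: each page ends with a stopping cue, and
    advancing requires a deliberate user action ("next page")."""
    for start in range(0, len(items), page_size):
        yield items[start:start + page_size]
        # The iterator pauses here: nothing more is shown until
        # the user explicitly requests the next page.

def infinite_feed(recommend):
    """Unbounded feed: the next item is always ready, so there is
    no terminal page and no natural stopping point."""
    for i in count():
        yield recommend(i)  # never exhausted; stopping is up to the user

# A bounded feed of 25 items is exhausted after 3 pages.
pages = list(paginated_feed(list(range(25)), page_size=10))
print(len(pages))  # → 3

# The infinite feed yields content indefinitely.
stream = infinite_feed(lambda i: f"video_{i}")
print(next(stream), next(stream))  # → video_0 video_1
```

The asymmetry is visible in the code itself: `paginated_feed` terminates on its own, while `infinite_feed` can only be interrupted from outside.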
Push notifications introduce a different dynamic. Unlike the other features, they do not operate during use, but from outside it, reactivating attention at discontinuous moments. Their effectiveness depends not only on their frequency, but above all on the variability and unpredictability with which they are received — characteristics that make them particularly salient from a cognitive perspective and more difficult to ignore.
The recommender system constitutes the most complex layer of the architecture. By analysing past behaviour, it selects and orders content so as to maximise the likelihood of engagement. These systems optimise metrics such as time spent and interaction through predictive models that progressively learn from users’ actions.
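The feedback loop described above can be sketched in a few lines. The following is a deliberately simplified toy model (the feature names, the linear scoring and the update rule are illustrative assumptions, not TikTok’s algorithm): the profile is nudged towards the features of content the user actually watched, and candidates are then ranked by the resulting predicted engagement.

```python
from collections import defaultdict

def update_profile(profile, item_features, watch_fraction, lr=0.1):
    """Toy online learning step: shift the user-interest vector
    towards the features of items the user actually watched."""
    for feature, weight in item_features.items():
        profile[feature] += lr * watch_fraction * weight
    return profile

def rank_feed(profile, candidates):
    """Order candidates by predicted engagement: here, a simple
    dot product between the user profile and each item's features."""
    def score(item):
        return sum(profile[f] * w for f, w in item["features"].items())
    return sorted(candidates, key=score, reverse=True)

profile = defaultdict(float)
# The user watched a dance clip to the end and skipped a news clip.
update_profile(profile, {"dance": 1.0, "music": 0.5}, watch_fraction=1.0)
update_profile(profile, {"news": 1.0}, watch_fraction=0.1)

candidates = [
    {"id": "a", "features": {"news": 1.0}},
    {"id": "b", "features": {"dance": 1.0}},
    {"id": "c", "features": {"music": 1.0}},
]
print([item["id"] for item in rank_feed(profile, candidates)])  # → ['b', 'c', 'a']
```

Even this toy version exhibits the relevant property: every act of watching reshapes what is shown next, so the system and the user progressively condition one another.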
Considered individually, these features may appear to be user-experience choices aimed at efficiency or convenience. It is their integration that produces a qualitatively different effect. The resulting interaction is not organised around repeated autonomous decisions, but around a continuous flow in which opportunities for choice are reduced.
Design, therefore, does not impose behaviour. Formally, the user retains the ability to stop using the application at any time. However, that ability exists within an environment designed to steer behaviour systematically.
In the TikTok case, this level of design may constitute the functional core of the service. The user experience does not depend solely on the content displayed, but on the way in which that content is selected, organised and continuously renewed.
It is on this level that the Commission’s assessment operates. If users’ behaviour is systematically influenced by the configuration of the interface, design cannot be regarded as a neutral element. Rather, it becomes a central component in determining the effects of the service.
Neurocognitive mechanisms and reinforcement dynamics
Many people will have experienced this at least once: opening the app “just for five minutes” and finding themselves, much later, still there, scrolling through content almost without noticing. This is not necessarily a lack of willpower, nor is it random behaviour. It is a recurring pattern, encouraged by architectures designed to retain attention. But what does this depend on? Which neurocognitive mechanisms are activated?
One of the central neurocognitive mechanisms for understanding this dynamic concerns the dopaminergic system, which is involved in processes of learning, motivation and reward. Neuroscientific studies have shown that the activity of dopaminergic neurons is not linked simply to reward as such, but to the difference between expected reward and obtained reward: the so-called reward prediction error (Schultz, Dayan, Montague, A Neural Substrate of Prediction and Reward, 1997).
In other words, the brain is not stimulated only by what it receives, but above all by uncertainty as to what it might receive. If every outcome were predictable, interest would tend to decline rapidly. Where, however, the outcome is uncertain, behaviour tends to be repeated, because each new attempt might lead to a reward better than expected.
This mechanism underlies so-called variable-reward systems, which have long been studied also in relation to gambling and other repetitive behaviours. In these contexts, it is not the frequency of the reward that sustains the behaviour, but its unpredictability (see Clark, Decision-Making During Gambling: An Integration of Cognitive and Psychobiological Approaches, 2010).
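The role of unpredictability can be illustrated with a minimal simulation of the prediction-error logic described above, using a Rescorla–Wagner-style value update (V ← V + α·(r − V)). This is a didactic sketch, not a model of any specific platform: it compares a perfectly predictable reward with an uncertain reward of the same average value.

```python
import random

def simulate_rpe(reward_fn, trials=500, lr=0.1, seed=0):
    """Track the reward prediction error delta = r - V under a
    simple value update V <- V + lr * delta. Returns the mean
    absolute prediction error over the last 100 trials."""
    random.seed(seed)
    v, errors = 0.0, []
    for _ in range(trials):
        r = reward_fn()
        delta = r - v          # reward prediction error
        v += lr * delta
        errors.append(abs(delta))
    return sum(errors[-100:]) / 100

fixed = simulate_rpe(lambda: 1.0)                           # reward every time
variable = simulate_rpe(lambda: random.choice([0.0, 2.0]))  # same mean, uncertain

print(f"fixed: {fixed:.3f}, variable: {variable:.3f}")
```

With the predictable reward, the prediction error collapses towards zero and the signal extinguishes; with the uncertain reward of identical average value, the error never settles. On this toy account, it is the uncertainty, not the quantity of reward, that keeps the learning signal alive.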
In the digital environment, a potentially infinite and personalised flow of content offers not only immediate gratification, but also maintains an active cycle of expectation. Each new item of content represents a possibility — not guaranteed, but plausible — of obtaining something particularly relevant or interesting. It is this anticipatory dimension, more than the individual item of content, that sustains the continuity of interaction.
More recent evidence suggests that algorithmic personalisation may amplify these effects. A neuroimaging study analysed users’ brain activity while they watched short videos, comparing content selected by an algorithm with non-personalised content. The results show that recommended videos more intensely activate certain brain areas involved in attention, in the assessment of stimulus relevance, and in reward processes.
These findings must nevertheless be interpreted with caution. The scientific literature does not support the idea of a direct and deterministic relationship between platform use and the development of clinical addiction. Rather, it converges in showing that specific configurations of experience may increase the likelihood of repetitive behaviours, especially where individual conditions of greater vulnerability are present.
From this perspective, the relevance of design does not lie in an immediate effect on individual behaviours, but in its capacity to contribute to the formation of dynamics that become consolidated over time.
It is at this level that such configurations may become relevant in the assessment of systemic risks required by the Digital Services Act. What remains to be understood is the extent to which these architectures can be redesigned without substantially altering the very functioning of the platforms themselves.
Reviewed by: Arlo Canella
Publication date: 8 May 2026
© Canella Camaiora S.t.A. S.r.l. - All rights reserved.
Verbatim reproduction of this article, including for commercial purposes, is permitted up to 15% of its total length, provided that the source is clearly indicated. In the case of online reproduction, a link to the original article must be included. Unauthorised reproduction, or paraphrasing without attribution of the source, will be pursued legally.

Martina Di Molfetta
Graduated in Communication, Innovation, and Multimedia from the University of Pavia.
