Digital Fairness Act Unpacked: Addictive Design
Published on 5th August 2025
The public consultation on the Digital Fairness Act (DFA) has been open for almost three weeks, allowing companies, associations, and other stakeholders to contribute their perspectives to the legislative process. In the first part of our miniseries on the topics addressed by the consultation, we examined dark patterns. This second part takes a closer look at addictive designs and at how such practices are currently regulated.

What are Addictive Designs?
To date, the term “addictive designs” has no official legal definition. Unlike the term “dark patterns”, it is also not expressly mentioned in the recitals of any European legislative act. While the public consultation on the DFA addresses dark patterns and addictive designs in separate sections, it has not been legally clarified whether addictive designs are a subset of dark patterns or whether the two practices should be distinguished from one another. Notably, however, and in contrast to dark patterns, the European Commission explicitly identifies minors as a particularly vulnerable group in the context of addictive designs.
As part of the consultation, the European Commission describes addictive designs as features “that make consumers spend more time and money online than intended”.
Even without an official legal definition, some practices are consistently mentioned as examples of addictive designs. In the context of the DFA consultation, the European Commission considers the following to be examples of addictive designs:
- Infinite scrolling – where a page loads content with no distinct end;
- Ephemeral stories – content that disappears quickly;
- Autoplay – video or audio files that start playing without the user’s intervention;
- Applying penalties for disengagement – such as breaking a streak;
- Recommender systems – steered to increase the consumer’s engagement.
How are Addictive Designs currently regulated?
Addictive designs are currently not expressly regulated, which may be due, among other things, to the difficulty of distinguishing permissible user engagement from features that are problematic under consumer protection law.
Although the EU has various legal instruments that can be used to combat dark patterns, such as the Unfair Commercial Practices Directive (UCPD), the General Data Protection Regulation (GDPR), and the Digital Services Act (DSA), these provisions apply to addictive designs only to a very limited extent. Under current European regulation, the rules on dark patterns focus mainly on unfair or deceptive practices with a direct economic or data protection-related connection. Addictive designs, by contrast, primarily aim at continuously steering or retaining user attention, without necessarily having a direct link to specifically ascertainable economic consequences or (unlawful) data collection.
In detail:
- Unfair Commercial Practices Directive: Under Articles 5 to 9 of the UCPD, in particular misleading and aggressive commercial practices are considered unfair and therefore prohibited. However, this only applies if they negatively influence consumer behaviour in relation to a transactional decision. The UCPD therefore applies to addictive designs only in part: merely inducing users to stay online longer than intended is not in itself regarded as a negative influence on economic behaviour or as causing consumers to take a transactional decision they would not otherwise have taken. Where this is nevertheless the case in a specific instance, addictive design features can be considered unfair within the meaning of the UCPD (or the respective national implementing laws). However, this always requires a case-by-case determination.
- Digital Services Act: Depending on the specific circumstances, addictive designs can be addressed under Art. 25 DSA, which prohibits providers of online platforms from designing their online interfaces in a way that significantly impairs or hinders users’ ability to make free and informed decisions. However, this only applies to practices not already covered by the UCPD or the GDPR. Moreover, Art. 27 requires more transparency on “recommender systems”, which determine the order in which information is displayed to users by online platforms. Lastly, the DSA requires online platforms to specifically protect minors online (Art. 28). This obligation has been specified in the recently issued guidelines on the protection of minors. These guidelines include three key recommendations that can be regarded as tackling addictive designs in relation to minors as a particularly vulnerable user group:
- Modifying the platforms’ recommender systems to lower the risk of children encountering harmful content or getting stuck in rabbit holes of specific content, including by advising platforms to prioritise explicit signals from children over behavioural signals, as well as by empowering children to be more in control of their feeds;
- Disabling by default features that contribute to excessive use, like communication “streaks”, ephemeral content, “read receipts”, autoplay, or push notifications, as well as removing persuasive design features aimed predominantly at engagement and putting safeguards around AI chatbots integrated into online platforms;
- Ensuring that children’s lack of commercial literacy is not exploited and that they are not exposed to commercial practices that may be manipulative, lead to unwanted spending or addictive behaviours, including certain virtual currencies or loot-boxes.
- AI Act: Art. 5(1)(a) and (b) of the AI Act prohibit subliminal, manipulative, deceptive, or vulnerability-exploiting techniques if they cause, or are likely to cause, significant harm. Here too, a case-by-case assessment is required to determine whether addictive design features exceed this high threshold of significance.
- General Data Protection Regulation: Although the GDPR contains data protection principles in Art. 5(1), in particular transparency and fairness, as well as Art. 25 on data protection by design and by default, these provisions only come into play where addictive designs lead to a violation of data protection standards. This is rarely the case: automatically played videos, for example, do not affect the lawfulness of data processing. Where such a violation does arise in a specific case, the GDPR can be applied to address addictive design features. However, this requires a concrete individual examination, and the core of the allegation would then be the unlawful data processing rather than any potential addictive character.
What is the European Commission discussing?
The part of the consultation that deals with addictive designs seeks to determine whether, from the public’s perspective, existing EU regulations adequately govern addictive designs, whether additional regulatory or non-regulatory measures (e.g. guidance) are required, or whether more effective enforcement of existing rules is needed.
In the consultation, the following specific regulatory approaches are explicitly up for discussion:
- Control options: Consumers should have more control over addictive design features, e.g. the ability to switch off features they do not want or to choose the criteria for the recommendations they receive online (i.e. how the algorithm selects content for them).
- Default settings: Addictive design features should be switched off by default, allowing consumers to opt in if they wish. Alternatively, this default deactivation could be introduced for minors only, allowing them to opt in, potentially with parental approval.
- Prohibition for minors: Certain addictive design features should be prohibited outright for minors.
Assessment of the proposed regulatory approaches
The regulatory approaches considered by the Commission call for a nuanced assessment. Many of the features now classified as potentially addictive are an established part of today’s digital user experience. A blanket ban on such mechanisms would be too restrictive, as it would disproportionately curtail legitimate design strategies without necessarily raising the level of consumer protection.
Where addictive designs produce specific effects, they are already covered by existing legislative acts and can be sanctioned under them. For example, an addictive design can constitute an unfair commercial practice under the UCPD if it negatively influences consumers’ transactional decisions, or it can trigger GDPR sanctions if it involves a violation of data protection law in the processing of personal data.
Instead of introducing new regulatory obligations, European lawmakers could initially focus on non-legislative measures. Sector-specific guidelines could help clarify which design elements are to be classified as unfair, manipulative, or harmful, and under what conditions. Such clarifications would provide practical guidance without unnecessarily interfering with creative or economic freedoms. A more precise differentiation between dark patterns and addictive designs would also be a useful element of such guidelines.
The classification of addictive designs requires a factual, risk-based assessment, and blanket bans are not conducive to this. Instead, what matters is a concrete case-by-case assessment that takes into account the purpose, effect, and context of the design. Where features genuinely impair decision-making freedom or violate existing legal norms, suitable instruments are already available. In our view, consistent enforcement of these rules and the development of clear and binding guidance offer a better path to greater digital fairness.