Digital Regulation

Digital Fairness Act Unpacked: Dark Patterns

Published on 31st July 2025

Two weeks ago, the public consultation on the Digital Fairness Act (DFA) opened, allowing companies, associations, and other stakeholders to contribute their perspectives to the legislative process. As part of the consultation, the European Commission is addressing dark patterns. In this first part of our miniseries on the topics covered by the consultation, we take a closer look at dark patterns and how such practices are currently regulated.

Although the term “dark patterns” is expressly mentioned in recital 67 of the Digital Services Act (DSA), there is still no official legal definition that goes beyond dark patterns on online platforms. In the consultation, the Commission uses a definition reminiscent of the Unfair Commercial Practices Directive (UCPD) and its link to transactional decisions a consumer would not otherwise have taken. The Commission describes dark patterns as “unfair commercial practices deployed through the design of digital interfaces that can influence consumers to take decisions they would not have taken otherwise”.

Even without a precise definition, the practices regularly cited as examples of dark patterns are largely the same.[1] The list of practices to be evaluated under the DFA consultation therefore comes as no surprise (an illustrative sketch of two of these techniques follows the list):

  • Click fatigue – a technique that forces consumers to click through too many steps in order to be able to make the desired choice;
  • Creating the false impression that the consumer does not have another option apart from the (prominently featured) one that the trader prefers;
  • Nagging – repeatedly requesting or urging the consumer to make a particular choice;
  • Pressuring the consumer through urgency and scarcity claims (e.g. a countdown timer) even when the respective offer or available stock is not actually limited in time;
  • Confirm-shaming – pressuring the consumer towards a particular choice through emotive language or shaming;
  • Sneaking into the online basket – adding new products or services to the shopping basket, without the consumer’s knowledge or consent, when the consumer is about to complete a purchase;
  • Features leading to a different result than normally expected (e.g. a button labelled “cancel the contract” leads to a page showing the benefits of that contract);
  • Ambiguous language in the presentation of choices to consumers, like using double negatives;
  • Presenting choices in a leading manner, for example, prioritising an option for a given choice by using a brighter colour or larger font.
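
For readers who design or review digital interfaces, the following simplified TypeScript sketch illustrates two of the techniques above – sneaking an item into the basket and relying on preselection instead of an express opt-in. It is a hypothetical example for illustration only; all names and figures are invented and are not taken from the consultation or from any of the legal texts discussed below.

```typescript
// Hypothetical checkout model, for illustration only.

interface LineItem {
  name: string;
  priceInCents: number;
}

interface AddOn extends LineItem {
  // Whether the consumer actively chose the add-on (express opt-in).
  expresslySelected: boolean;
}

// "Sneaking into the basket" / preselection: the add-on is charged unless the
// consumer notices it and removes it – the kind of design the consultation
// lists as a dark pattern.
function totalWithPreselectedAddOn(basket: LineItem[], addOn: LineItem): number {
  const items = [...basket, addOn]; // added without the consumer's express choice
  return items.reduce((sum, item) => sum + item.priceInCents, 0);
}

// Neutral alternative: the add-on is only charged if the consumer expressly
// opted in (no pre-ticked box, no silent addition).
function totalWithExpressOptIn(basket: LineItem[], addOn: AddOn): number {
  const items = addOn.expresslySelected ? [...basket, addOn] : basket;
  return items.reduce((sum, item) => sum + item.priceInCents, 0);
}

// Example: a 5 EUR "express handling" add-on on a 20 EUR basket.
const basket: LineItem[] = [{ name: "T-shirt", priceInCents: 2000 }];
const addOn: AddOn = { name: "Express handling", priceInCents: 500, expresslySelected: false };

console.log(totalWithPreselectedAddOn(basket, addOn)); // 2500 – consumer pays without having chosen
console.log(totalWithExpressOptIn(basket, addOn));     // 2000 – add-on only charged on express opt-in
```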

The consultation seeks to determine whether additional measures should be considered to address dark patterns – such as non-regulatory measures (e.g. guidance), new binding rules, or more effective enforcement of existing rules. In this regard, the Commission specifically asks whether certain types of dark patterns require special regulatory measures, so different forms of dark patterns might ultimately be subject to a differentiated approach.

  • Non-regulatory measures (e.g. guidance)

    Guidance on dark patterns already exists. For example:

    • In relation to dark patterns in the B2C context, the Commission’s (non-binding) Notice on the interpretation and application of the UCPD[2] contains a dedicated section on how the regulatory framework on unfair commercial practices applies to practices that fall under the notion of dark patterns. In section 4.2.7, the Commission explains:
      The UCPD applies to any ‘unfair commercial practice’ that meets the requirements of the material scope of the Directive, regardless of their classification. If dark patterns are applied in the context of business-to-consumer commercial relationships, then the Directive can be used to challenge the fairness of such practices, in addition to other instruments in the EU legal framework, such as the GDPR.
    • In relation to dark patterns under the General Data Protection Regulation (GDPR), the European Data Protection Board published Guidelines on deceptive design patterns in social media platform interfaces in 2022.[3]
    • When it comes to dark patterns on online platforms, Art. 25(3) DSA provides the Commission with powers to issue guidelines on specific dark patterns and how they should be addressed under the DSA. So far, the Commission has not made use of these powers.
  • New binding rules
    There are already numerous laws within the EU that can be applied to address dark patterns.

    The Commission has expressed its interpretation of how existing laws apply in its Notice on the interpretation and application of the UCPD as well as in its behavioural study on dark patterns.[4] Some of the existing laws even mention manipulative designs explicitly or come with recitals that support such an interpretation.

    For example:
    • Unfair Commercial Practices Directive: Particularly Art. 6 and 7 on misleading practices and Art. 8 and 9 on aggressive practices, as well as several practices listed in Annex I (the blacklist), all of which can be used against practices that negatively influence consumer behaviour in relation to transactional decisions.
    • General Data Protection Regulation: Particularly the data protection principles under Art. 5(1), notably transparency and fairness, Art. 25 on data protection by design and by default, and Art. 4(11) and Art. 7 on the conditions for consent.
    • Consumer Rights Directive: Particularly Art. 6 on information obligations and Art. 8 on formal requirements, which prevent material information from being obscured or presented ambiguously. Moreover, Art. 8(2) on the labelling of order buttons aims to ensure that consumers are aware that they are entering into a binding contract that obliges them to pay. This is complemented by Art. 22, which requires express consent (opt-in without preselection) for any additional payments. Art. 27 also exempts the consumer from any obligation to provide consideration in cases of unsolicited supply or provision and sets out that the absence of a response from the consumer to such unsolicited supply does not constitute consent, which addresses the practice of sneaking items into the online basket.
    • Digital Services Act: Art. 25 and recital 67 govern dark patterns on online platforms that are not already covered by the UCPD or the GDPR (quite limited in scope, but this could apply, for example, to B2B online platforms).
    • Digital Markets Act: Particularly Art. 5(2) on gatekeepers’ obligations regarding consent, including easy withdrawal of consent, and Art. 13, which prohibits designated gatekeepers from circumventing their obligations under the DMA through behaviour of a contractual, commercial or technical nature or by any other means, which includes the use of dark patterns to unfairly steer consumer decisions.
    • Artificial Intelligence Act: Particularly Art. 5(1)(a), prohibiting subliminal, manipulative or deceptive techniques, and Art. 5(1)(b), prohibiting the exploitation of vulnerabilities.
    • Data Act: Recital 38 mentions the prohibition of the use of dark patterns in relation to third parties and data holders when they are designing their digital interfaces and defines dark patterns as “design techniques that push or deceive consumers into decisions that have negative consequences for them. Those manipulative techniques can be used to persuade users, in particular vulnerable consumers, to engage in unwanted behaviour, to deceive users by nudging them into decisions on data disclosure transactions or to unreasonably bias the decision-making of the users of the service in such a way as to subvert or impair their autonomy, decision-making and choice”. Art. 4(4) prevents data holders from making the exercise of choices or rights by the user unduly difficult, including by offering choices to the user in a non-neutral manner or by subverting or impairing the autonomy, decision-making or choices of the user via the structure, design, function or manner of operation of a user digital interface or a part thereof.
    • The Consumer Rights Directive (CRD) was also amended by the directive on financial services contracts concluded at a distance[5], which introduces a withdrawal function for all distance contracts (Art. 11a new CRD) and a ban on dark patterns on user interfaces through which financial services contracts can be concluded at a distance (Art. 16e new CRD), to the extent that these dark patterns are not already in scope of the UCPD and the GDPR. These changes to the CRD must be transposed into national law by 19 December 2025 and will apply from 19 June 2026.
  • More effective enforcement measures
    Whether existing enforcement measures are already effective is difficult to determine, as this would require identifying which past enforcement measures specifically tackled dark patterns.

    Many of the existing legal obligations can be regarded as rules against dark patterns, but they are not necessarily solely or expressly a ban on such practices. As an example: if a trader faces enforcement action because they did not properly inform consumers about material information (Art. 6, Art. 8 CRD), their practice could be considered a dark pattern in the form of ambiguous language, presenting choices in a leading manner or – when it comes to information about cancellation or withdrawal rights – a “roach motel”. At the same time, the trader could simply have failed to disclose the relevant information in a timely manner or to present it transparently. The result is a violation of the CRD – but whether this enforcement measure was also tackling dark patterns is not always clear and is not tracked accordingly.

    If there was indeed a lack of enforcement – even taking into account the aforementioned difficulty of tracking whether an enforcement measure was taken to address dark patterns – this could be for a variety of reasons. There is still no proper legal definition of dark patterns, but such a definition is also difficult to achieve, given that whether a specific practice is considered a dark pattern requires (a) a certain level of intent and (b) a (potentially) negative outcome for the consumer. A certain level of intent is required because “manipulation”, “deceit”, “steering”, “tricking”, “sneaking” and the like presuppose a conscious decision by the trader, and this element is difficult to prove. A (potentially) negative outcome for the consumer is required because not all techniques that promote a certain consumer behaviour are harmful; otherwise, all offers, marketing and advertising measures would need to be prohibited. Regulating dark patterns should raise the level of consumer protection without preventing consumers and traders from interacting with each other where these interactions have a positive outcome for both parties. It is therefore crucial that this can be determined on a case-by-case basis.

[1] E.g. in the Commission’s behavioural study on dark patterns (https://op.europa.eu/en/publication-detail/-/publication/606365bc-d58b-11ec-a95f-01aa75ed71a1/language-en), the Commission’s Notice on the interpretation and application of the UCPD (https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX%3A52021XC1229%2805%29&qid=1640961745514) or in Art. 25(3) DSA.

[2] Commission’s Notice on the interpretation and application of the UCPD, https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX%3A52021XC1229%2805%29&qid=1640961745514.

[3] EDPB, Guidelines 03/2022 on deceptive design patterns in social media platform interfaces: how to recognise and avoid them.

[4] Commission’s behavioural study on dark patterns, https://op.europa.eu/en/publication-detail/-/publication/606365bc-d58b-11ec-a95f-01aa75ed71a1/language-en.

[5] Directive (EU) 2023/2673.

* This article is current as of the date of its publication and does not necessarily reflect the present state of the law or relevant regulation.
