Digital Regulation

Digital Fairness Act Unpacked: Unfair personalisation practices

Published on 26th August 2025

For almost six weeks, the public consultation on the Digital Fairness Act (DFA) has been running, allowing companies, associations, and other stakeholders to contribute their perspectives to the legislative process. In our miniseries on the topics addressed by the consultation, we examined dark patterns, addictive designs, and specific features in digital products. This article provides further details on unfair personalisation practices and the current regulation of such practices at EU level. 


What are personalisation practices?

Personalisation practices in the digital industry are usually based on profiling techniques, i.e. tailoring offers to consumers based on their profile. The General Data Protection Regulation defines “profiling” as any form of automated processing of personal data consisting of the use of personal data to evaluate certain personal aspects relating to a natural person, in particular to analyse or predict aspects concerning that natural person's performance at work, economic situation, health, personal preferences, interests, reliability, behaviour, location or movements.

Personalisation is widely used in the digital industry for ranking recommendations or search results, for advertising, and sometimes for pricing. The European Commission is concerned about unfair practices in which personalisation exploits users’ vulnerabilities, e.g. personalised advertising practices that use special categories of personal data (i.e. sensitive data, such as racial or ethnic origin, political opinions, religious or philosophical beliefs, or health data) or that use information on consumers’ individual vulnerability (e.g. age, emotional or financial distress, mental infirmity, etc.).

How are personalisation practices currently regulated?

In its Digital Fairness Fitness Check, which was a first step towards the DFA consultation, the European Commission reached the conclusion that EU consumer law cannot be considered sufficiently effective or clear in addressing the multifaceted concerns regarding commercial personalisation.

Personalisation practices in a B2C context are, however, already subject to several scattered regulations and pieces of guidance. Although some of these regulations are broadly applicable, other requirements apply only to specific operators, such as online platforms (e.g. marketplaces, social networks, etc.) or intermediary service providers with a certain reach (i.e. very large online platforms or very large online search engines pursuant to the Digital Services Act, or gatekeepers pursuant to the Digital Markets Act), or only to specific practices (e.g. personalised pricing). This section provides an overview of the rules that already apply.

General requirements

Data Protection requirements - As personalisation practices involve the processing of personal data, they are subject to standard data protection rules under the General Data Protection Regulation (“GDPR”). In particular, the GDPR provides specific transparency requirements regarding the existence of automated decision-making, including profiling, but only when the decision produces legal effects concerning the data subject or similarly significantly affects them (Article 22). The European Data Protection Board published guidelines on this issue. The GDPR also grants data subjects the right to withdraw consent at any time, and the right to object at any time to the processing of their personal data for direct marketing purposes, including profiling to the extent it relates to such marketing.

As personalisation practices usually also involve tracking technologies, these practices will also be subject to the E-Privacy Directive.

Fairness requirements - The Unfair Commercial Practices Directive (“UCPD”) also prohibits unfair commercial practices, including misleading or aggressive practices that cause, or are likely to cause, the consumer to take a transactional decision that they would not have taken otherwise, including through the trader's exploitation of any specific misfortune or circumstance of such gravity as to impair the consumer's judgement. Traders must also assess fairness in light of whether the consumer belongs to a clearly identifiable group of consumers who are particularly vulnerable to the practice in question (vulnerable consumers). The European Commission's guidance on the UCPD also covers personalisation practices and details factors under which these practices may be found unfair.

Use of AI systems - The AI Act prohibits specific use cases of AI systems that involve the deployment of subliminal techniques, purposefully manipulative or deceptive techniques or the exploitation of vulnerabilities related to age, disability or a specific social or economic situation, which leads or is (reasonably) likely to lead to significant harm (Article 5).

Absence of discrimination - Finally, personalisation practices must not be based on any discriminatory criteria, i.e. must not discriminate based on personal characteristics such as sex, race, gender, skin colour, marital status, etc.

Price personalisation practices

The Consumer Rights Directive (“CRD”) requires traders to disclose to consumers, in a clear and comprehensible manner, that the price is personalised based on automated decision-making.

Transparency requirements on ranking and product recommendation for marketplaces and comparison operators 

For online marketplaces and traders offering consumers the possibility to search for products offered by different traders or by consumers, Articles 7(4a) of the UCPD and 6a of the CRD provide transparency requirements, notably regarding the main parameters determining the ranking of the products presented to the consumer.

Requirements applicable to online platforms, VLOP and VLOSE under the Digital Services Act (“DSA”)

The Digital Services Act provides several rules applicable to online platforms only (e.g. social networks or online platforms allowing consumers to conclude distance contracts with traders), or to very large online platforms or search engines (“VLOP” and “VLOSE” respectively, meaning online platforms or search engines having a number of average monthly active recipients of their service in the Union equal to or higher than 45 million, and which are designated as such by a decision of the European Commission).

Personalised advertising practices - The DSA imposes on online platforms some transparency requirements for each specific advertisement presented on their interface, including meaningful information about the main parameters used to determine the recipient to whom the advertisement is presented (which may include personalisation factors) and, where applicable, how to change those parameters (Art. 26(1)). Additional transparency requirements apply to VLOPs and VLOSEs (Art. 39).

Article 26(3) DSA also prohibits online platforms from presenting advertisements based on profiling using special categories of personal data within the meaning of Article 9(1) GDPR.

Protection of minors - The DSA also contains rules aimed at protecting minors. In particular, Art. 28(2) DSA bans advertising based on profiling when platforms are aware with reasonable certainty that the recipient of the service is a minor.

Recommender systems - Article 27 DSA also imposes on online platforms certain transparency requirements regarding the main parameters of their recommender systems and the available options to modify or influence them. VLOPs and VLOSEs are also required to offer at least one option for each of their recommender systems that is not based on profiling (Art. 38).

VLOP and VLOSE systemic risks - Personalisation practices, whether for advertising, ranking, pricing, or other purposes, may form part of, or contribute to, the systemic risks (e.g. actual or foreseeable negative effects on the exercise of fundamental rights, on the protection of minors, or serious negative consequences for a person’s physical and mental well-being, etc.) that very large online platforms and search engines need to assess and mitigate under the DSA.

Requirements applicable to gatekeepers under the Digital Markets Act (“DMA”)

The Digital Markets Act limits the combination of personal data for advertising purposes by gatekeepers (i.e. large digital platforms providing so-called core platform services, such as online search engines, app stores or messenger services, and designated as such by the European Commission).

Sector-specific provisions

Some personalisation practices are also regulated for specific industry players. For example, creditors and credit intermediaries must inform consumers in a clear and comprehensible manner when they are presented with a personalised offer that is based on automated processing of personal data (Art. 13 of the Consumer Credit Directive).

Are these rules already enforced?

There have already been some enforcement cases regarding personalisation practices at the EU level, on several grounds (consumer law, data protection, etc.).

In relation to consumer law enforcement, the Consumer Protection Cooperation Network found in March 2024 that Tinder, the dating app, was in breach of EU consumer laws because it did not inform users that they were being shown pricing personalised by automated means.

DSA rules have also started to be enforced: the European Commission, for example, sent LinkedIn a request for information aimed at verifying the compliance of its services with the Digital Services Act following a complaint submitted by civil society organisations. According to the complaint, LinkedIn may have provided advertisers with the possibility to target LinkedIn users based on special categories of personal data referred to in Article 9(1) of the GDPR, such as racial or ethnic origin, political opinions, religious or philosophical beliefs, or trade union membership, as revealed by users' participation in LinkedIn Groups. In June 2024, LinkedIn announced that it had fully disabled the functionality allowing advertisers to target LinkedIn users in the EU with ads based on their membership in LinkedIn Groups.

What specific measures is the European Commission considering?

The part of the consultation that deals with unfair personalisation practices seeks to determine whether, from the public’s perspective, existing EU regulations adequately govern unfair personalisation practices, whether additional regulatory or non-regulatory measures (e.g. guidance) are required, or whether more effective enforcement of existing rules is needed.

In detail, the following specific approaches are expressly up for discussion:

  • granting consumers more control over personalised advertising or pricing via opt-out and opt-in mechanisms,
  • restricting personalised advertising using certain personal data (such as sensitive data) or information on consumers’ individual vulnerability in all circumstances,
  • prohibiting personalised advertising targeting minors,
  • restricting personalised pricing based on personal data/profiling when targeting vulnerable consumers (with minors being considered as vulnerable) or in general.

In line with the Digital Fairness Fitness Check findings, the European Commission is contemplating adding new rules governing personalisation practices, including by introducing a more explicit option to receive non-personalised commercial offers instead of personalised ones, and by establishing obligations in EU consumer law equivalent to those laid down for online platforms in the DSA (described above), thereby extending the scope of such obligations to all traders.

Some of the rules that the European Commission is considering modifying are very recent and have not yet been enforced; a first step could therefore be to wait for their enforcement and only adopt new rules if enforcement cases show the need for them. Although having general rules applicable to all traders would simplify the applicable legal framework, this may not take into account all the specific characteristics of each type of operator, as the personalisation practices of digital actors present varying degrees of risk depending on their activity, size, and reach. Thus, enhanced transparency requirements and the possibility for users to opt out could be sufficient to establish a high level of protection, while not generally depriving users of personalisation practices that are useful to them.

* This article is current as of the date of its publication and does not necessarily reflect the present state of the law or relevant regulation.
