EU Digital Fairness Act unpacked: 'horizontal' issues
Published on 23rd October 2025
Section 9 of the consultation on the legislation addresses five specific measures and broader 'cross-cutting' matters
The deadline for responding to the European Commission’s consultation on the Digital Fairness Act (DFA) is fast approaching – feedback can be submitted until 24 October.
The consultation has formed part of the Commission's fitness check examining whether EU consumer law is fit for purpose in the digital environment and whether new regulation is needed to ensure digital fairness for consumers.
The Commission has been seeking views on "horizontal" issues and broader "cross-cutting matters" and the possible introduction of other measures to improve consumer protection, strengthen enforcement and enhance the functioning of the single market in the digital environment.
The Commission has been consulting on five horizontal measures designed to address overarching challenges in the digital marketplace: mandatory age verification and estimation tools; "fairness by design"; reversal of the burden of proof; amending the definition of "consumer"; and preventing the exploitation of consumers' possible temporary vulnerabilities.
These measures, if implemented, have the potential to fundamentally change the approach to consumer protection in the EU, moving beyond an information-based approach – where traders and suppliers are legally obliged to provide consumers with information about their products and services – towards a more interventionist, design-focused approach from the outset.
Mandatory age verification and estimation tools
The Commission has sought views on whether digital products accessible to minors that contain certain commercial practices should be subject to the mandatory use of age verification or age estimation tools.
The protection of minors has emerged as a core theme in the EU's digital fairness agenda. Despite existing provisions in the General Data Protection Regulation (GDPR), the Artificial Intelligence Act, and the Digital Services Act (DSA), significant gaps remain. Several Member States have called for stronger safeguards, including parental controls, restrictions on persuasive design and harmonised technical standards. There are also growing calls for minimum age requirements on social media access – albeit with varying age thresholds.
Article 28 of the DSA and its risk-based framework could serve as a reference, supporting lighter measures for low-risk contexts and stricter verification for high-risk ones such as online gambling. More broadly, there is a growing need for global technical standards and interoperability to avoid market fragmentation and create a level playing field for businesses operating across multiple jurisdictions.
'Fairness by design'
The Commission is exploring whether traders should ensure "fairness by design"; that is, implement technical and organisational measures to incorporate consumer protection considerations at all stages of product or service development.
The introduction of a horizontal "fairness by design" principle would fundamentally reshape consumer protection. Rather than relying solely on disclosure and consent mechanisms, companies would be required to embed consumer safeguards into the architecture of their digital products and services from inception.
This principle would address the concept of "digital asymmetry" – the power imbalance in the digital sphere. Digital companies control what consumers see and how information is presented, using sophisticated techniques that are too complex for the average consumer to understand or counteract. This asymmetry requires a new approach that ensures robust consumer protection while preserving consumer autonomy and decision-making capacity.
This direction echoes the work of the UK's Competition and Markets Authority, which advocates building fairness into digital products and services from the outset – shifting responsibility onto platforms rather than the consumer.
Embedding fairness by design would require businesses to take a proactive approach throughout product development. This includes conducting risk assessments at the design stage, identifying potential harms to consumers, particularly vulnerable persons, and implementing protective measures that are capable of evolving with the product. This approach has the potential to strengthen consumer trust and create potential competitive advantages while reducing regulatory and compliance risks.
Reversal of the burden of proof
With a view to strengthening the enforcement of consumer protection law, the Commission has sought views on reversing the burden of proof in cases where consumers, interested parties or authorities have disproportionate difficulty in obtaining information to prove a trader’s wrongdoing.
This proposal sits against a backdrop of enforcement challenges in the EU in the digital sphere – notwithstanding high consumer protection standards – particularly due to the complexities of digital technologies and AI. The "black box" nature of digital infrastructures, and the algorithms behind them, has meant that consumers face significant hurdles in proving unfair consumer practices. They often do not have access to relevant information about how algorithms work or how data is processed.
The proposal to reverse the burden of proof in cases of disproportionate difficulty represents a growing theme in EU consumer protection law. The new EU Product Liability Directive, for example, alleviates the burden of proof in cases of technical or scientific complexity.
Amending the definition of consumer
The Commission has sought views as to whether the current definition of a consumer should be amended to better reflect the reality of consumer behaviour in the digital environment.
The current legal standard defines the average consumer as "reasonably well-informed, observant and circumspect". However, this definition often fails to account for how consumers actually behave in digital environments, particularly vulnerable consumers, who may be influenced by "addictive" design patterns and algorithms without being fully informed or digitally literate.
That said, the Court of Justice of the EU's "average consumer" standard, together with the "average member of the group" and "vulnerable consumer" concepts, already provides a flexible framework that can be adapted to different contexts. Introducing new benchmarks could undermine legal certainty and create confusion about which standard applies in which circumstances.
Recent reforms such as the New Deal for Consumers and the DSA are still being implemented so it may be premature to recalibrate the benchmark before these reforms are better understood and demonstrate their effectiveness (or lack thereof).
Consumer education and addressing harmonisation gaps across member states could deliver more immediate benefits without the potential risks of changing a well-established definition.
Preventing exploitation of consumers' possible temporary vulnerabilities
The final proposal seeks views on whether legislation should prevent commercial practices from targeting consumers' possible vulnerabilities of a temporary or permanent nature; for example, those linked to sociodemographic, behavioural, financial or personal characteristics.
Modern technologies enable traders to detect when consumers are in a vulnerable state, such as due to grief or financial stress, and to tailor commercial practices to exploit these vulnerabilities. This might include targeting consumers based on behavioural patterns such as impulsive purchasing or addiction-prone behaviour.
The UK's new Digital Markets, Competition and Consumers Act offers a useful comparison. It has broadened the definition of vulnerable consumers to place greater responsibility on traders, with the revised definition encompassing a consumer's individual and personal circumstances such as financial situation, divorce, or job loss. This approach recognises that vulnerability is not limited to traditionally protected groups, but can affect any consumer in certain circumstances.
Reliably identifying "temporary vulnerabilities" involves complex feasibility and privacy issues, including constraints under the GDPR's provisions on profiling. This risks overreach, potentially capturing legitimate personalisation practices that benefit consumers.
A more nuanced approach might involve a risk-based, context-specific framework with appropriate safeguards that would distinguish between harmful exploitation of vulnerabilities and beneficial personalisation.
This is the concluding Insight in our “Digital Fairness Act unpacked” miniseries addressing each of the main areas of the consultation, including dark patterns, addictive designs, specific features in digital products, personalisation practices, social media influencers, digital contracts and simplification measures.