European Data Protection Board guidelines map how the DSA meets the GDPR
Published on 27th November 2025
The interaction of the regulatory regimes has broad implications for companies operating online across Europe
The General Data Protection Regulation (GDPR) and the Digital Services Act (DSA) are two of the most consequential regulations shaping Europe’s digital landscape. Their objectives intersect in meaningful ways, and their requirements can overlap. To reduce legal uncertainty and promote consistent enforcement, the European Data Protection Board (EDPB) has issued Guidelines 3/2025 clarifying its view on how these regimes interact in practice.
'Notice and action': content moderation with data minimisation
The DSA obliges hosting services to deploy notice-and-action mechanisms so users can flag illegal content. Inevitably, these systems involve processing personal data—from the notifier, the content poster and potentially third parties referenced in the content. The EDPB underscores strict adherence to data minimisation. Platforms should allow notifiers to provide basic identifiers such as name and email, but must process these only for DSA-defined purposes, including confirming receipt, communicating decisions and, where strictly necessary, including limited details in the statement of reasons provided to affected users.
These notification mechanisms should enable, but not require, identification. Controllers will need to define when disclosure of a notifier's identity is necessary and proportionate. Recital 54 of the DSA cites notices of alleged intellectual property rights infringements as an example, as in such cases identification may be necessary to assess the illegal character of the content. Even then, only the minimum data required should be disclosed, and notifiers must be informed in advance under article 13 of the GDPR. Further, the EDPB confirms that complaint handling and possible account suspensions for misuse under article 23 of the DSA do not displace individuals' rights and remedies under the GDPR.
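To illustrate what this purpose limitation can look like in practice, the sketch below models a notice record in Python. It is a simplified, hypothetical example rather than anything prescribed by the DSA or the guidelines: identification fields are optional, and notifier contact details can only be retrieved for one of the DSA-defined purposes.

```python
from dataclasses import dataclass
from typing import Optional

# DSA-defined purposes for which notifier data may be used: confirming receipt,
# communicating the decision and, where strictly necessary, the statement of reasons.
ALLOWED_PURPOSES = {"confirm_receipt", "communicate_decision", "statement_of_reasons"}

@dataclass
class Notice:
    """A notice submitted via a notice-and-action mechanism (illustrative schema)."""
    content_url: str
    reason: str
    # Identification is enabled but not required: contact fields are optional.
    notifier_name: Optional[str] = None
    notifier_email: Optional[str] = None

def notifier_contact(notice: Notice, purpose: str) -> Optional[str]:
    """Release the notifier's contact details only for a DSA-defined purpose."""
    if purpose not in ALLOWED_PURPOSES:
        raise ValueError(f"Notifier data may not be used for purpose: {purpose}")
    return notice.notifier_email
```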
Advertising transparency
The EDPB clarifies how the DSA’s advertising transparency rules intersect with the GDPR. Under article 26 of the DSA, platforms must disclose the identity of the party on whose behalf an ad is shown, who paid for it, the main parameters used to determine the recipient and how those parameters can be changed. Meanwhile, many targeted ads constitute profiling under the GDPR. This potentially triggers GDPR rules on automated decision-making and the obligation to provide meaningful information about the logic involved, coinciding with the transparency requirement on the main parameters used under the DSA.
While these frameworks overlap for profiling-based ads, the DSA’s transparency obligations are broader as they apply even to non-profiling advertising such as contextual placements.
The timing of when this information should be provided also differs. The GDPR requires information at or before data collection or before consent is obtained. The DSA, in turn, requires real-time, ad-level disclosures that are directly accessible from the ad itself – after any potential personal data processing for that particular ad has occurred. Platforms must therefore design transparency layers that satisfy different, complementary obligations at different touchpoints.
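In engineering terms, this tends to mean attaching the article 26 disclosure to each ad at the moment it is served, separately from the privacy notice given at collection. The Python sketch below, with hypothetical field and function names, is purely illustrative of what such an ad-level disclosure record might carry.

```python
from dataclasses import dataclass

@dataclass
class AdDisclosure:
    """Ad-level transparency fields under article 26 of the DSA (illustrative)."""
    advertiser: str              # the party on whose behalf the ad is presented
    payer: str                   # the party who paid for the ad, if different
    main_parameters: list[str]   # main parameters used to determine the recipient
    parameter_controls_url: str  # where the recipient can change those parameters

def serve_ad(creative: str, disclosure: AdDisclosure) -> dict:
    """Deliver the creative together with a directly accessible, real-time disclosure."""
    return {
        "creative": creative,
        "disclosure": (
            f"Advertiser: {disclosure.advertiser}; paid for by: {disclosure.payer}; "
            f"main targeting parameters: {', '.join(disclosure.main_parameters)}; "
            f"adjust your parameters at {disclosure.parameter_controls_url}"
        ),
    }
```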
The EDPB further confirms that article 26(3) of the DSA imposes an absolute prohibition on ads based on profiling using special category data as defined by the GDPR. This ban applies even where a controller could otherwise rely on a lawful basis under article 6(1) and a derogation under article 9(2) of the GDPR that would permit the use of such data. For example, serving ads based on religious beliefs inferred from geolocation (such as visits to places of worship) or shopping behaviour (such as purchases of specific food products) is prohibited under the DSA.
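A simple guard of the kind sketched below, again purely illustrative, could reject targeting parameters that map onto article 9 special categories, whether declared or inferred, before an ad is served.

```python
# Article 9 GDPR special categories (simplified labels for illustration only).
SPECIAL_CATEGORIES = {
    "racial_or_ethnic_origin", "political_opinions", "religious_beliefs",
    "trade_union_membership", "genetic_data", "biometric_data",
    "health", "sex_life_or_sexual_orientation",
}

def validate_targeting(profiling_parameters: set[str]) -> None:
    """Refuse profiling-based targeting on special category data (article 26(3) DSA).

    The prohibition applies even where a GDPR lawful basis and an article 9(2)
    derogation would otherwise be available, and covers inferred attributes such
    as religious beliefs derived from geolocation or shopping behaviour.
    """
    prohibited = profiling_parameters & SPECIAL_CATEGORIES
    if prohibited:
        raise ValueError(f"Prohibited profiling parameters: {sorted(prohibited)}")
```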
Deceptive design patterns: drawing the GDPR-DSA boundary
Article 25 of the DSA prohibits deceptive design patterns that impair users' autonomous decision-making. However, the DSA carves out from its scope designs already governed by the GDPR, shifting competence to data protection authorities. The EDPB frames a two-part test to determine whether a pattern falls within the GDPR: first, whether personal data are processed; and second, whether the pattern influences user behaviour in relation to that processing.
As such, a simple scarcity prompt like “only a few products left” falls under the DSA. But if it urges the user to “enter your email to reserve”, it implicates personal data processing and, therefore, the application of the GDPR. Deceptive designs that steer users in relation to their personal data will generally fail the GDPR's fairness requirement, the EDPB points out, because processing must be lawful, fair and transparent.
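Expressed as a decision rule, the two-part test might look like the following sketch, which is illustrative only and mirrors the examples above.

```python
def gdpr_governs_pattern(processes_personal_data: bool,
                         influences_behaviour_re_processing: bool) -> bool:
    """EDPB two-part test (illustrative): the GDPR governs a deceptive design
    pattern if personal data are processed and the pattern influences user
    behaviour in relation to that processing; otherwise article 25 DSA applies."""
    return processes_personal_data and influences_behaviour_re_processing

# "Only a few products left" on its own involves no personal data: DSA territory.
print(gdpr_governs_pattern(False, False))  # False
# "Enter your email to reserve" steers the user into providing personal data: GDPR.
print(gdpr_governs_pattern(True, True))    # True
```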
Deceptive patterns linked to addictive behaviours, such as infinite scroll, autoplay and periodic rewards, are identified by the DSA as potential systemic risks. Here too, if these patterns rely on personal data inputs, generate new personal data or shape behaviour in contexts involving personal data, the GDPR will apply.
Protecting minors: balancing age assurance and data protection
The DSA requires platforms accessible to minors to implement measures that ensure a high level of privacy, safety and security for young users. In practice, this may involve age assurance mechanisms that process personal data. The EDPB recognises that the DSA can provide a legal basis for processing under article 6(1)(c) of the GDPR, but only insofar as the controller can demonstrate that the processing is necessary and proportionate to meet the DSA obligation. The EDPB rejects open-ended data collection and demands a case-by-case necessity and proportionality assessment, setting a high technical and organisational bar for compliance.
To balance child safety with the fundamental right to privacy, the EDPB advises against mechanisms that enable unambiguous online identification, such as requiring copies of government ID. Platforms should adopt privacy-preserving approaches that minimise data, limit retention and access, and reduce the risk of function creep.
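One common privacy-preserving pattern, sketched below on a purely illustrative basis, is to retain only an over/under-age signal from whatever evidence the assurance provider checks, rather than the date of birth or an identity document itself.

```python
from datetime import date

def age_assurance_signal(date_of_birth: date, threshold: int = 18) -> dict:
    """Derive a minimal age signal (illustrative): only the over/under-threshold
    outcome is retained; the date of birth and any supporting evidence are
    discarded, limiting retention and the risk of function creep."""
    today = date.today()
    age = today.year - date_of_birth.year - (
        (today.month, today.day) < (date_of_birth.month, date_of_birth.day)
    )
    return {"over_threshold": age >= threshold, "threshold": threshold}

# Only the boolean outcome would be stored against the account, not the input.
signal = age_assurance_signal(date(2010, 5, 1))
```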
Osborne Clarke comment
What does this then mean for platforms? The EDPB’s guidance illuminates the interplay between data protection, platform regulation and consumer protection, and it makes clear that siloed compliance approaches are not advisable. The DSA and GDPR must be implemented in a coordinated way, with data protection "by design" serving as a unifying principle. Notice-and-action workflows must be tightly scoped and transparent. Design and product teams need clear criteria for identifying when platform patterns fall under the GDPR, with fairness and transparency as guardrails. Advertising systems must deliver layered, timely disclosures that satisfy both statutes, particularly for profiling. And child-safety measures must be engineered to achieve protection objectives while avoiding unnecessary identification and data collection.
This compliance model is demanding and not always straightforward to operationalise. Nevertheless, the guidelines underscore the need for integrated governance, accountability and transparency across product, legal and engineering functions. Organisations that treat data protection by design as central to DSA compliance will be better positioned to meet both regulatory regimes with consistency and confidence.