New European guidelines issued to protect minors in the digital environment
Published on 21st November 2025
Age assurance, privacy by default and safe recommender systems are no longer optional but central pillars of compliance
On 10 October, the European Commission published the "Guidelines on measures to ensure a high level of privacy, safety and security for minors online", with the aim of helping providers of online platforms comply with the Digital Services Act (DSA). Specifically, the recommendations clarify how to comply with Article 28(1) of the DSA, which requires "appropriate and proportionate measures to ensure a high level of privacy, safety and security of minors".
While the document is formally non-binding and provides a non-exhaustive list of good practices, its regulatory impact is considerable. The Commission has explicitly stated that it will apply the guidelines when assessing compliance with Article 28(1) of the DSA, which turns the document into a de facto due diligence standard. Any platform that departs from its key recommendations, such as by failing to implement privacy-by-default settings, could be considered insufficiently protective and expose itself to the sanctions provided for in the DSA.
Scope of application and fundamental principles
The obligations fall on all providers of online platforms accessible to minors. Accessibility is assessed under a criterion of reasonable awareness: a provider is caught by the obligations if it already processes data revealing users' ages (at registration, for example), if the platform appeals to minors, or if independent research demonstrates the presence of minor users.
This scope extends even to services intended for adults (such as platforms disseminating pornographic content) if they have minor users because their age assurance tools are inadequate. The Commission clarifies that a mere statement in the terms and conditions prohibiting access by minors is not sufficient; effective technical and organisational measures to prevent such access are essential.
The guidelines are based on four general principles that must be considered holistically: appropriateness and proportionality; protection of children's rights; privacy, safety and security by design; and age-appropriate design.
The best interests of the child and risk review
The ethical and legal foundation of the guidelines is Article 24 of the Charter of Fundamental Rights, which stipulates that the best interests of the child must be a primary consideration. This requires platforms to carry out a detailed, child-specific risk review. The analysis must consider the size and reach of the service, the impact of the measures on children's rights and the need to draw on the highest available standards. The review must incorporate the views of minors, seeking their participation, as well as those of guardians, representatives of other potentially affected groups and relevant experts and stakeholders. It must be documented, published and serve as the basis for a mitigation plan with metrics to monitor its effectiveness.
Age assurance mechanisms
The implementation of age assurance methods is considered an appropriate and proportionate measure only when the risks to minors are high and cannot be mitigated by less intrusive means. This includes services giving access to pornographic content or gambling, and services subject to a minimum age by law or under their own terms and conditions.
These mechanisms must be accurate, reliable, robust, non-intrusive and non-discriminatory. The Commission promotes technologies that comply with the data minimisation standard, such as the future EU Digital Identity Wallet or the prototypes developed by the Commission itself. Among methods offering an equivalent level of effectiveness, the least intrusive one must always be chosen.
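By way of illustration only, the sketch below shows one way a platform could consume a minimal age attestation that discloses nothing beyond whether a threshold is met, consistent with the data minimisation standard the guidelines promote. The attestation format, the shared-secret signature and the verify_age_attestation helper are hypothetical simplifications and do not reflect the actual interface of the EU Digital Identity Wallet or the Commission's prototypes.

```python
import hashlib
import hmac
import json

# Hypothetical shared secret with a trusted age-assurance provider. A real
# deployment would verify a public-key signature on an attestation issued by
# an identity wallet or third-party verifier rather than use a shared secret.
ISSUER_SECRET = b"demo-secret"


def verify_age_attestation(attestation: dict, signature: str) -> bool:
    """Accept a minimal 'over the threshold' claim.

    The platform never sees a date of birth, name or identity document;
    it only learns that a trusted issuer vouches for the age threshold.
    """
    payload = json.dumps(attestation, sort_keys=True).encode()
    expected = hmac.new(ISSUER_SECRET, payload, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, signature):
        return False  # not issued (or altered) by the trusted provider
    return attestation.get("over_18") is True


# Example: the attestation carries only a boolean threshold claim.
claim = {"over_18": True}
sig = hmac.new(ISSUER_SECRET, json.dumps(claim, sort_keys=True).encode(),
               hashlib.sha256).hexdigest()
print(verify_age_attestation(claim, sig))  # True
```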
Default settings and recommender systems
The guidelines flesh out the concept of age-appropriate design, requiring that services align with the developmental, cognitive and emotional needs of children. A fundamental requirement is that minors' accounts must be set to private by default to mitigate critical risks such as unwanted contact or cyberbullying. In addition, information about settings and complaint mechanisms must be presented in language and a design that are accessible and adapted to minors.
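The following sketch, using hypothetical setting names, illustrates what private-by-default could look like in practice: the most protective options are pre-selected for accounts identified as belonging to minors, and any relaxation would require a deliberate choice presented in age-appropriate terms.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class AccountSettings:
    profile_private: bool
    discoverable_in_search: bool
    direct_messages_from: str  # "everyone", "contacts" or "nobody"
    geolocation_sharing: bool


# Hypothetical defaults: for minors, the most protective option is pre-selected.
MINOR_DEFAULTS = AccountSettings(
    profile_private=True,
    discoverable_in_search=False,
    direct_messages_from="contacts",
    geolocation_sharing=False,
)

ADULT_DEFAULTS = AccountSettings(
    profile_private=False,
    discoverable_in_search=True,
    direct_messages_from="everyone",
    geolocation_sharing=False,
)


def default_settings(is_minor: bool) -> AccountSettings:
    """Return the settings applied at account creation, before any user choice."""
    return MINOR_DEFAULTS if is_minor else ADULT_DEFAULTS
```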
As regards profiling, the Commission severely restricts the use of recommender systems based on continuous monitoring. The guidelines expressly recommend avoiding profiling based on processing of behavioural data so extensive as to capture all or most of the minor's activities on the platform. This is considered a form of continuous monitoring of private life. Therefore, platforms must prioritise the use of "explicit user-provided signals" (for example, selected interests) to determine recommended content, justifying any use of implicit signals under the most rigorous criteria of minimisation and transparency.
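As a simplified sketch of what reliance on explicit signals could mean in practice, the example below ranks content solely against interests the minor has actively selected; no behavioural or engagement history is consulted. The Item structure and the overlap-based scoring are hypothetical and stand in for whatever catalogue and ranking model a platform actually uses.

```python
from dataclasses import dataclass


@dataclass
class Item:
    item_id: str
    topics: set[str]


def recommend_for_minor(declared_interests: set[str],
                        candidates: list[Item],
                        limit: int = 10) -> list[Item]:
    """Rank candidates using only explicit, user-provided signals.

    The score is the overlap between an item's topics and the interests the
    minor has declared; no watch time, clicks or other behavioural data are
    used.
    """
    scored = [(len(item.topics & declared_interests), item) for item in candidates]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [item for score, item in scored if score > 0][:limit]


# Example: a minor who selected "music" and "sport" as interests.
catalogue = [
    Item("a", {"music", "news"}),
    Item("b", {"gaming"}),
    Item("c", {"sport", "music"}),
]
print([i.item_id for i in recommend_for_minor({"music", "sport"}, catalogue)])
# ['c', 'a']
```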
24/7 moderation and emergency response
To ensure safety, the guidelines establish very high operational standards. Content moderation must be active and available 24 hours a day, seven days a week. Moderation teams are required to be well trained and to prioritise reports concerning the privacy, safety and security of minors, with mechanisms to flag urgency, without prejudice to the priority of trusted flaggers.
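As an illustration of what a prioritisation mechanism might look like, the sketch below triages incoming reports so that minor-safety reports are handled ahead of the general queue, with an urgent tier above both and trusted-flagger notices keeping their own priority. The tiers, their relative ordering and the helper names are hypothetical assumptions, not an ordering prescribed by the guidelines.

```python
import heapq
import itertools
from dataclasses import dataclass, field


@dataclass(order=True)
class Report:
    priority: int                          # lower value is handled first
    seq: int                               # tie-breaker preserving arrival order
    description: str = field(compare=False)


# Hypothetical triage tiers.
URGENT_MINOR_SAFETY = 0
TRUSTED_FLAGGER = 1
MINOR_SAFETY = 2
GENERAL = 3

_counter = itertools.count()
queue: list[Report] = []


def enqueue(description: str, priority: int) -> None:
    """Add a report to the moderation queue at the given tier."""
    heapq.heappush(queue, Report(priority, next(_counter), description))


enqueue("spam complaint", GENERAL)
enqueue("unwanted contact with a minor", MINOR_SAFETY)
enqueue("trusted flagger notice", TRUSTED_FLAGGER)
enqueue("imminent risk to a minor", URGENT_MINOR_SAFETY)

while queue:
    print(heapq.heappop(queue).description)
# imminent risk to a minor
# trusted flagger notice
# unwanted contact with a minor
# spam complaint
```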
A particularly onerous requirement is the demand that at least one employee be on call at all times to respond to urgent requests and emergencies. This obliges platforms to implement rapid escalation protocols and to ensure the availability of human review to assess content or accounts that pose an imminent risk. Platforms must also apply technical solutions (such as "hash matching") and explore the use of AI classifiers to detect known or new illicit content. They are also required to restrict the ability of other users to download or take screenshots of minors' posts.
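As a simplified illustration of the hash matching technique the guidelines refer to, the sketch below compares the hash of an upload against a set of hashes of previously identified material. Real deployments rely on curated industry hash lists and often on perceptual hashing, which tolerates re-encoding and cropping, rather than the exact match shown here; the helper names are hypothetical.

```python
import hashlib


def content_hash(data: bytes) -> str:
    """Exact cryptographic fingerprint of the uploaded bytes."""
    return hashlib.sha256(data).hexdigest()


# Hypothetical set of hashes of material already identified as illegal,
# e.g. supplied through a curated industry hash list.
known_illegal_hashes = {content_hash(b"previously identified material")}


def should_escalate(uploaded: bytes) -> bool:
    """Flag an upload for urgent human review if it matches known material."""
    return content_hash(uploaded) in known_illegal_hashes


print(should_escalate(b"previously identified material"))  # True
print(should_escalate(b"new, unknown content"))            # False
```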
Finally, the guidelines also address protection against unfair commercial practices, recognising the risks associated with virtual currencies and paid loot boxes, which exploit minors' lack of commercial literacy, and seeking to protect minors from them.
Osborne Clarke comment
The guidelines establish an operational standard for compliance with article 28 of the DSA, moving the protection of minors from the declarative level to that of decision-making in design, governance and data. The message is clear: age assurance, design by default and safe recommender systems are now pillars of compliance and not mere voluntary recommendations. Effective implementation will require substantial investments by digital platforms in technology, human resources and processes, but constitutes a necessary step towards a safer and more responsible digital ecosystem for children.