New regulation proposal on harmful online content for minors

Published on 24th June 2025

Poland's Ministry of Digital Affairs is proposing a new law aimed at limiting minors' access to harmful online content. The draft introduces a definition of "inappropriate content" and sets out its regulatory implications. What are the key insights for digital media operators, streaming platforms, internet service providers, and content creators?

 

Proposed restrictions

The definition of "inappropriate content," currently under consultation, includes any "materials presented online that depict behaviours contrary to commonly accepted social norms, deemed demoralising, and potentially harmful to the mental, physical, or social development of minors." However, at this stage, the Ministry has proposed a few exceptions to these restrictions, which include:

  • Scientific content
  • Educational materials
  • Artistic expression
  • Content already constituting a criminal offense
     

Scope of potential obligations

The proposed regulation introduces a set of duties that will have significant operational and compliance implications for entities involved in disseminating online content. Although still under consultation, the draft provisions indicate a clear regulatory direction, emphasizing proactive protection of minors in digital spaces and centering on three key aspects:

  • Content moderation mechanisms to block or restrict access to defined harmful content.
  • Age verification tools that may become mandatory.
  • Potential liability for non-compliance, including for foreign platforms targeting Polish users. 

 

Identification and blocking system

  • The draft law requires platforms and service providers to monitor and restrict access to content harmful to minors, even if not illegal under current laws.
  • This includes materials that contravene societal norms, have a demoralizing effect, or may hinder a minor’s mental, physical, or social development.
  • This introduces preventive regulation rather than punitive measures, similar to the EU Digital Services Act but focused on child protection (a simplified sketch of such a blocking check is set out below).
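
For illustration only, the sketch below shows one way a platform's access-control layer could combine internal moderation labels, the draft's proposed exceptions, and a viewer's verification status into an allow/restrict decision. Every name in it (HarmCategory, ContentItem, Viewer, decideAccess) is a hypothetical assumption made for this sketch; the draft does not prescribe any particular categories or technical design.

```typescript
// Hypothetical harm categories a platform's own moderation pipeline might assign.
// The draft law does not define these labels; they are illustrative assumptions.
type HarmCategory = "adult" | "extreme_violence" | "drug_related" | "demoralizing" | "none";

interface ContentItem {
  id: string;
  labels: HarmCategory[]; // output of human or automated moderation
  isScientific: boolean;  // the draft's proposed exceptions
  isEducational: boolean;
  isArtistic: boolean;
}

interface Viewer {
  ageVerified: boolean; // has the viewer completed an age-assurance check?
  isAdult: boolean;     // result of that check, if completed
}

type AccessDecision = "allow" | "restrict";

// Decide whether a given viewer may access a given content item.
// "Restrict" stands in for whatever blocking or limiting mechanism
// the final regulation ends up requiring.
function decideAccess(item: ContentItem, viewer: Viewer): AccessDecision {
  const isExempt = item.isScientific || item.isEducational || item.isArtistic;
  const isHarmful = item.labels.some((label) => label !== "none");

  if (!isHarmful || isExempt) {
    return "allow";
  }
  // Harmful, non-exempt content: only viewers verified as adults get through.
  return viewer.ageVerified && viewer.isAdult ? "allow" : "restrict";
}

// Example: an unverified viewer requesting violent content is restricted.
const decision = decideAccess(
  { id: "vid-42", labels: ["extreme_violence"], isScientific: false, isEducational: false, isArtistic: false },
  { ageVerified: false, isAdult: false },
);
console.log(decision); // "restrict"
```

The point the sketch makes is the default direction of travel under a preventive model: harmful, non-exempt content stays restricted unless adulthood has been positively established.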

 

Age verification and filtering mechanisms

  • The law is expected to include technical requirements for implementing two types of measures: age verification tools that prevent underage users from accessing specific categories of content (e.g. adult content, extreme violence, drug-related themes), and content filtering systems capable of recognizing and segregating harmful materials.
  • Although the exact technical standards are yet to be defined, platforms may face pressure to adopt robust, privacy-compliant solutions — for example, third-party age assurance technologies or AI-powered moderation systems (see the sketch following this list).
  • The law may also impose a duty of diligence — requiring platforms to take “reasonable and proportionate” steps to prevent minors from accessing harmful content. A failure to act could result in regulatory fines, reputational damage, and even civil liability in cases of demonstrable harm.
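
Because the technical standards are yet to be defined, any implementation detail remains speculative. The sketch below only illustrates the general shape such an integration could take, assuming the platform delegates the age check to an external provider and falls back to a minor-safe experience whenever adulthood cannot be established. The AgeAssuranceProvider interface and its check method are hypothetical assumptions, not a real vendor API.

```typescript
// Hypothetical interface for a third-party age-assurance service.
// Real providers differ; this shape is an illustrative assumption only.
interface AgeAssuranceProvider {
  // Returns an age-assurance result for a user session, or null if the
  // user has not completed (or has refused) verification.
  check(sessionId: string): Promise<{ isAdult: boolean; assuranceLevel: "basic" | "strong" } | null>;
}

type Experience = "full" | "minor_safe";

// Choose which experience to serve for a session, defaulting to the
// minor-safe mode whenever adulthood cannot be established: one possible
// reading of a "reasonable and proportionate" duty of diligence.
async function selectExperience(
  provider: AgeAssuranceProvider,
  sessionId: string,
  requiresStrongAssurance: boolean,
): Promise<Experience> {
  const result = await provider.check(sessionId);

  if (!result || !result.isAdult) {
    return "minor_safe";
  }
  if (requiresStrongAssurance && result.assuranceLevel !== "strong") {
    // Some categories (e.g. adult content) might warrant a higher assurance level.
    return "minor_safe";
  }
  return "full";
}

// Example with a stub provider that has produced no verification result yet.
const stubProvider: AgeAssuranceProvider = {
  check: async () => null,
};
selectExperience(stubProvider, "session-123", true).then(console.log); // "minor_safe"
```

Defaulting to the restricted experience when verification is missing or inconclusive is one plausible way a platform could later evidence diligence; whether it would satisfy the final wording of the law remains to be seen.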

 

Applicability to foreign service providers

Importantly, the scope of the regulation is not limited to entities physically operating in Poland. Extraterritorial application is expected — meaning that non-Polish platforms or service providers will be subject to these rules if:

  • They target Polish users (e.g., by offering Polish language content or accepting payments in PLN); or
  • Their services are accessible to minors in Poland, even without active marketing.

This aligns with the EU's broader digital regulatory framework, which increasingly holds foreign operators accountable for the online harms experienced by local users.

 

What's next?

  • The draft is currently undergoing internal government and public consultations. The Ministry has signaled openness to further negotiation of definitions and enforcement mechanisms.
  • If consensus on a broader definition of "patho-content" cannot be reached, the law may focus solely on pornography restrictions.

We will have to wait for the outcome of the legislative process.

 

Operational implications

  • Platforms must implement mechanisms to block or restrict harmful content. The law requires "reasonable and proportionate" measures to prevent minors' access, with penalties for non-compliance.
  • The law applies to foreign entities if they target Polish users or their services are accessible to Polish minors, aligning with the EU's digital regulatory framework.
     

Consequently, international platforms — especially those in the VOD, social media, and content-sharing sectors — should begin mapping their exposure to the Polish market and assessing whether their content governance and child protection frameworks are compliant with, or adaptable to, this regulatory environment.

Connect with one of our experts