Regulatory Outlook

Digital regulation | UK Regulatory Outlook April 2026

Published on 30th April 2026

Online safety and age assurance: UK government makes amendments to the Crime and Policing Bill to address online pornography | European Commission presents the final EU age verification app | UK Online Safety Act updates: Transparency reporting under the Online Safety Act | Ofcom's proposed updates to illegal harms regulatory documents: tackling self-harm and cyberflashing | Ofcom's year-two illegal harms risk assessments | Protecting women and girls online

Online safety and age assurance 

UK government makes amendments to the Crime and Policing Bill to address online pornography 

The government has announced amendments to the Crime and Policing Bill to: 

  • Make it a criminal offence to possess or publish pornography depicting incest or adults pretending to be children. Both offences would be designated as priority offences under the Online Safety Act 2023 (OSA).
  • Make senior tech executives criminally liable (with sanctions including fines or imprisonment) if their platforms fail to comply with Ofcom's enforcement decisions requiring the removal of non-consensual intimate images. 

These amendments form part of the government's mission to tackle violence against women and girls. 

European Commission presents the final EU age verification app 

In April 2026, the European Commission presented the final EU age verification app, describing it as "the most privacy-preserving and user-friendly solution". The app, which allows users to prove their age when accessing online services, is technically ready but has not yet been made available. 

In July 2025, the Commission published guidelines on the protection of minors under the Digital Services Act, which, among other things, recommend the use of age-verification technologies to restrict access to adult content. At the same time, it released the first version of an age-verification blueprint, which it provided to Member States to test and develop. In October 2025, the Commission released a second version of the blueprint. See this Regulatory Outlook for more information.  

Once available, users of the app will be able to set up an identity using a passport or ID card and, according to the Commission, verify their age without sharing any personal information, thanks to the cryptographic "zero-knowledge proof" method that the app deploys. The app works on any device and is fully open source, allowing it to be used across the world. Online platforms can, according to the Commission, "easily rely on our age verification app".

Seven frontrunner Member States – France, Denmark, Greece, Italy, Spain, Cyprus and Ireland – are piloting the app and plan to integrate it into their national eID wallets. All Member States will be able to customise the app before offering it to their citizens.  

ICO and Ofcom joint statement on the overlap between online safety and data protection in relation to age assurance 

See data law section.  

UK Online Safety Act updates  

Transparency reporting under the Online Safety Act 

Under the OSA, providers of categorised services will be required to publish annual transparency reports based on requirements that Ofcom will set out in formal transparency notices. Ofcom intends to publish its categorisation register in summer 2026.  

In July 2025, Ofcom published its Transparency Guidance, setting out its approach to transparency reporting and how it will decide what information service providers must include in their reports. It has also committed to engaging with civil society organisations, researchers and the wider public with relevant expertise throughout each transparency reporting cycle, including during the development of the transparency notices. 

In line with this commitment, it is seeking views, until 30 April 2026, from civil society organisations and the public on the information they would like service providers to publish in their transparency reports. Ofcom will use this input to inform the transparency notices it is currently developing. 

Ofcom's proposed updates to illegal harms regulatory documents: tackling self-harm and cyberflashing 

In March, Ofcom consulted on proposed changes to its codes of practice and guidance following the addition of two offences to the list of priority offences under the OSA: (i) encouraging or assisting serious self-harm; and (ii) cyberflashing (see this Regulatory Outlook for more details). The consultation closed on 24 April 2026.

Among other things, Ofcom proposed to: 

  • Merge the new priority self-harm offence with the existing offence of encouraging or assisting suicide, creating a single category of illegal harm, "suicide and self-harm". This will require updates to Ofcom's Risk Assessment Guidance and Illegal Content Codes of Practice.
  • Include cyberflashing as a new, separate type of illegal harm, with corresponding updates to the Guidance and Codes.
  • Extend existing measures under the codes, where relevant, so that they also apply to self-harm and cyberflashing.  

In practice, consolidating the suicide and self-harm offences into a single category will require providers to assess both harms together and assign one overall risk level for "suicide and self-harm", while conducting a separate risk assessment for cyberflashing. These changes mean that providers will need to revisit and update their existing risk assessments. 

Ofcom's year-two illegal harms risk assessments 

Under the OSA, service providers must assess and mitigate the risk of people in the UK encountering illegal content. Platforms likely to be accessed by children must also assess and mitigate the risk of under-18s being exposed to certain types of harmful material. Providers are obliged to review these risk assessments at least once a year and update them before making any significant change to their service's design or operation. Such reviews must also be conducted if Ofcom makes any significant changes to its regulatory documents (as above).

Services that were in-scope of the OSA when the illegal content and child protection duties came into effect last year had to complete their first illegal content and child safety risk assessments in March and April 2025 respectively. With the annual deadline rolling round again, Ofcom has issued formal requests for information to 30 providers covering 43 services, requiring them to submit their illegal content and child safety risk assessments by 31 July 2026. This forms part of Ofcom's monitoring of industry compliance. Where it has concerns about any of the risk assessments it receives, it will work with the relevant provider to address them. Failure to provide a risk assessment when required to do so may result in fines. 

Protecting women and girls online  

Ofcom published dedicated guidance on safety for women and girls online in November 2025, setting out how in-scope service providers can tackle content and activity that disproportionately affect women and girls. The guidance describes nine actions providers can take and highlights good practice in this area. See this Regulatory Outlook for more information.

In March 2026, the UK government wrote to online service providers saying that it expects all platforms to implement Ofcom's guidance "by the end of this year at the latest".

* This article is current as of the date of its publication and does not necessarily reflect the present state of the law or relevant regulation.
