Digital regulation | UK Regulatory Outlook February 2026
Published on 26th February 2026
UK: Online safety and age assurance | UK government to consult on social media ban for children | Online Safety Act updates - New priority offences; Ofcom expedites decision on measures to block non-consensual intimate images; call for evidence on a statutory report on content harmful to children; super-complaints regime; enforcement | Media Act updates - Prominence code of practice; statement on listed events regime | EU: AVMSD review | European Media Freedom Act guidelines
UK updates
Online safety and age assurance
The UK government is tightening its regulation in the online safety space, focusing particularly on protecting women and children from online harms, and has recently announced a series of new measures:
- Non-consensual intimate images. On 19 February 2026, the government announced an amendment to the Crime and Policing Bill, currently progressing through Parliament, that will require regulated service providers to take down non-consensual intimate images within 48 hours of being notified of them. Failure to comply will be subject to the Online Safety Act 2023 (OSA) penalty regime (fines of up to 10% of worldwide turnover and a potential block on the service). Following this announcement, Ofcom decided to expedite its decision on measures to block non-consensual intimate images (see below). The government will also publish guidance for internet service providers on how to block access to websites hosting this type of content, so as to capture sites that fall outside the scope of the OSA.
- AI chatbots and OSA. On 15 February 2026, the government announced plans to close the current "legal loophole" in the OSA regarding AI chatbots to ensure that providers are subject to the illegal content duties under the OSA. As Ofcom explains, chatbots are not subject to the OSA where they: (i) only allow people to interact with the chatbot itself and no other users (in other words, they are not user-to-user services); (ii) do not search multiple websites or databases when giving responses to users (in other words, they are not search services); and (iii) do not generate pornographic content. The government intends to legislate for this through amendments to the Crime and Policing Bill.
- Taking action faster. The government has also announced that, once its consultation on children's wellbeing online is completed (which includes proposals to introduce a ban on social media for under-16s – see below), it will put measures in place to ensure that it will be in a position to legislate for a ban or any other requisite safety measure (for example, restricting infinite scrolling and tackling addictive design) "within months" instead of "waiting years for new primary legislation". It plans to do this via amendments to the Children's Wellbeing and Schools Bill, which is currently progressing through Parliament.
UK government to consult on social media ban for children
Following Australia's introduction of a ban on social media access for under-16s in December 2025, several EU Member States – including France, Spain and Germany – are also now considering, or have already taken steps towards, implementing similar measures. The UK government, which initially opposed such a ban, has not announced concrete plans to follow suit but has confirmed its intention to consult on it. The consultation, expected in March 2026, will also look at:
- Ways to improve the accuracy of age assurance for children to support the enforcement of minimum age limits.
- Raising the digital age of consent, which is currently set at 13 years old (being the minimum age at which children can consent to the processing of their personal data by providers of information society services under the UK GDPR).
- Removing or limiting functionalities which drive addictive or compulsive use of social media, such as "infinite scrolling".
- Restrictions on children's use of AI chatbots.
- Options to impose age restrictions or limitations on children's use of virtual private networks (VPNs) where their use undermines safety protections provided under the OSA (and potentially elsewhere).
- Further interventions to support parents in helping their children navigate the digital landscape, including additional guidance, simplified parental controls and ensuring that data is preserved following a child's death.
On 21 January 2026, during the report stage of the Children's Wellbeing and Schools Bill, the House of Lords backed an amendment that would, among other things, ban under-16s from using social media. The Lords' amendments are due to be considered by the House of Commons (date to be announced), so it remains to be seen whether the government will accept this amendment.
In the meantime, the government has launched a "You Won't Know until You Ask" campaign, designed to provide parents with practical guidance on engaging with their children in relation to online content, including advice on safety settings and age-appropriate strategies to tackle misinformation and harmful content.
Online Safety Act updates
New priority offences
On 8 January 2026, the Online Safety Act 2023 (Priority Offences) (Amendment) Regulations 2025 came into force, creating two new priority offences under the Act: (i) making, encouraging or assisting serious self-harm; and (ii) cyber flashing (sending, sharing or threatening to share intimate images). Both were already criminal offences under the OSA, but designating them as priority offences means that regulated online platforms must now remove such content from their services when they become aware of it and take steps to prevent it from appearing in the first place.
The technology secretary has also said that creating, or requesting the creation of, purported intimate images of an adult without consent (including AI-generated deepfakes), which as of 6 February 2026 is a criminal offence under the Data (Use and Access) Act 2025, will also be made a priority offence under the OSA in due course.
Ofcom expedites decision on measures to block non-consensual intimate images
Coinciding with the government's related amendments to the Crime and Policing Bill (see above), and in response to increased pressure to accelerate online protections for women and children, Ofcom has announced that it will speed up its decision on a new safety measure it has proposed adding to its codes of practice: the use of "hash matching" to detect non-consensual intimate images and prevent them from reaching users. The proposal was outlined in a 2025 Ofcom consultation containing a collection of additional safety measures potentially to be added to the codes of practice.
Ofcom will now announce its final decision on "hash matching" in May 2026. Subject to the government securing the passage of the requisite regulations through Parliament, Ofcom expects any new code measures to come into effect in summer 2026. Ofcom's decisions on the remaining proposed safety measures will follow in the autumn.
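In broad terms, hash matching works by computing a compact fingerprint (hash) of uploaded content and comparing it against a database of fingerprints of known illegal images, blocking the upload on a match. The sketch below is a simplified illustration of that idea only, not Ofcom's proposed measure or any provider's implementation; the function and database names are hypothetical, and real deployments typically use perceptual hashes (robust to resizing and re-encoding) rather than the exact cryptographic hash used here for simplicity.

```python
import hashlib

# Hypothetical database of fingerprints of known non-consensual intimate
# images, e.g. supplied by a reporting body. Production systems generally
# use perceptual hashing so that re-encoded or resized copies still match;
# SHA-256 is used here purely to keep the sketch self-contained.
KNOWN_IMAGE_HASHES: set[str] = set()


def fingerprint(image_bytes: bytes) -> str:
    """Return a hex digest serving as the image's fingerprint."""
    return hashlib.sha256(image_bytes).hexdigest()


def should_block(image_bytes: bytes) -> bool:
    """Block the upload if its fingerprint matches a known image."""
    return fingerprint(image_bytes) in KNOWN_IMAGE_HASHES
```

The point of the technique, as reflected in Ofcom's proposal, is that matching happens automatically at upload time, preventing known images from reaching users at all rather than relying on after-the-fact takedown.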
Call for evidence on a statutory report on content harmful to children
Ofcom has opened a call for evidence on a statutory report on content harmful to children that it is required to publish under the OSA every three years. The first report is due by 26 October 2026.
The report will set out a review of the incidences of content harmful to children appearing on regulated services, building on the assessment Ofcom set out in its April 2025 Children's Register of Risks.
It is seeking evidence on:
- Incidences of content harmful to children on regulated user-to-user, search and/or combined services.
- The harm, whether physical or psychological, that children in the UK suffer or may suffer as a result of encountering such content.
- Whether it would be appropriate to amend the categories of primary priority content (such as pornographic, suicide, self-injury and eating disorder content) and priority content (such as abusive, violent or dangerous content) currently listed in the OSA and, if so, what amendments would be appropriate.
Ofcom is particularly interested in data concerning the incidences of content harmful to children collected since the children's safety duties came into force in July 2025.
The call for evidence closes on 10 March 2026.
Super-complaints regime
On 10 February 2026, Ofcom published its final statement and guidance on the online safety super-complaints regime, in force since 31 December 2025. The super-complaints regime aims to ensure that eligible entities can raise complaints with Ofcom about "systemic issues" relating to existing or emerging online harms (see this Regulatory Outlook for background).
Enforcement
In response to pressure from the government and civil society organisations, Ofcom is stepping up enforcement of the OSA, focusing primarily on making sure that adult sites have age assurance measures in place.
Since the age assurance rules under the OSA came into force in July 2025, Ofcom has launched more than 80 investigations into adult sites and has issued the first fines under the OSA, including a £1 million fine for failure to implement highly effective age assurance. Full details of Ofcom's enforcement action under the OSA are available here.
Ofcom has also joined forces with international online safety regulators through the Global Online Safety Regulators Network to highlight the importance of age checks in protecting children online.
It is also currently monitoring the measures being taken by providers of file-sharing and file-storage services that present risks of harm to UK users from image-based child sexual abuse material (CSAM), as part of its enforcement programme.
Media Act updates
Prominence code of practice
On 14 January 2026, Ofcom published for consultation a proposed Code of Practice on recommendations for connected TV providers to ensure compliance with the Media Act's prominence and availability requirements. It is also consulting on draft guidance on prominence and availability regarding the contractual arrangements between providers of designated connected TV platforms and public service broadcasting providers. The deadline for responses is 25 March 2026.
Statement on listed events regime
On 29 January 2026, Ofcom published a statement containing its proposals for definitions of certain terms under the new listed events regime under the Media Act. It also published a revised Code of Practice on listed and designated events.
It is now consulting on proposed regulations to reflect the decisions it has made, as well as on proposals to remove the conditions in relation to listed events in broadcast licences.
The deadline for responses to both consultations is 2 March 2026. According to Ofcom's implementation roadmap, the new listed events regime is due to come into force by the end of March 2026.
EU updates
AVMSD review
The European Commission has launched a public consultation to assess the impact of the Audiovisual Media Services Directive (AVMSD) and explore options for its review. The consultation is organised around four pillars: (i) scope and enforcement, (ii) audiovisual commercial communications, (iii) protection of viewers and (iv) strengthening of media diversity in the internal market.
This follows the Commission’s plans set out in its work programme 2026 and the 2030 Consumer Agenda and forms part of the Commission's commitments in the European Democracy Shield, which aims to strengthen the EU media sector.
The Commission has said it is seeking to evaluate the AVMSD in light of significant developments in the audiovisual media landscape since the directive was last revised in 2018, including the growth of influencers, new distribution technologies and the shift towards online, personalised, shorter-form and less curated formats.
The consultation has a broad scope, but three main themes emerge: (i) influencer regulation, (ii) distribution and platform fairness and (iii) regulatory simplification and interplay with horizontal regulation.
Unlike the direction of travel in the UK, the Commission does not explicitly suggest reforming the existing two-tier regulation of audiovisual media services by aligning video-on-demand rules with the stricter linear regime. There is, however, a focus on whether current rules should be "streamlined" in light of current distribution and commercial models, so similar proposals might yet emerge from responses to the consultation.
The consultation is open until 1 May 2026. Under Article 33 of the AVMSD, the Commission must present the findings of its evaluation, accompanied where appropriate by proposals for its review, by 19 December 2026.
The Commission is considering three policy options: (i) no change to the AVMSD, (ii) targeted amendments (for example, on scope to clarify applicability of the rules to new market players such as influencers and rules on prominence) or (iii) a full review and transformation into a directive or regulation.
European Media Freedom Act guidelines
The Commission has published guidelines under Article 18(1) of the European Media Freedom Act (EMFA). Article 18 of the EMFA is designed to protect freedom of expression and to counter the unjustified removal by very large online platforms (VLOPs) (as defined by the Digital Services Act) of media service providers' (MSPs) content from their services. Among other things, VLOPs are required to notify MSPs in advance when they intend to remove journalistic content and to clearly explain the reasons for their decision.
The publication of the guidelines follows the Commission's previous consultation – see this Insight for more information.