Digital regulation | UK Regulatory Outlook July 2025
Published on 23rd July 2025
UK: Ofcom OSA fees and penalties | Pornography access age checks | Ofcom online safety consultation and plans | Ofcom consults on analogue commercial radio under Media Act 2024 | EU: Accessibility Act application | Protecting media service providers on online platforms | Data access for researchers under DSA | Regulation for transparency reporting under DSA | European Commission guidelines and age verification to protect minors online under DSA

UK updates
Online safety updates
Ofcom sets out how it intends to set fees and penalties under the OSA
Ofcom has published its policy statement on implementation of the online safety fees and penalties regime under the Online Safety Act 2023 (OSA), following consultation (see this Regulatory Outlook for background). Under the OSA, Ofcom's costs for its regulatory work are to be covered by providers of regulated services, under a regime to be implemented by Ofcom and the secretary of state for science, innovation and technology. The fees and penalties regime will also set the maximum level of penalties that Ofcom can impose for breach.
Ofcom has decided on:
- Qualifying worldwide revenue (QWR). QWR will be used to assess the threshold at or above which providers will be required to pay fees, and the maximum penalty caps that Ofcom can apply where it finds a provider in breach (Ofcom can impose a penalty of up to 10% of the provider's QWR or £18 million, whichever is greater; see the illustrative sketch at the end of this item). The draft Online Safety Act 2023 (Qualifying Worldwide Revenue) Regulations 2025, which were laid before Parliament on 26 June 2025, cover how a provider's QWR is determined. Ofcom's policy statement sets out the reasoning behind its decision to define QWR as the "total amount of revenue the provider receives that is referable to relevant parts of a regulated service".
- QWR for penalty caps in case of joint and several liability. The OSA gives Ofcom the power to use a different definition of QWR for a "group of entities" in order to calculate the maximum penalty in situations where a provider and one or more undertakings within its group are jointly and severally liable for a breach. For this purpose, Ofcom has decided to define QWR as the "total of all worldwide revenues received by the provider and its group undertakings in the most recent complete accounting period, whether or not that revenue is referable to a regulated service".
- QWR threshold advice. Under the OSA, Ofcom must advise the secretary of state on setting the QWR threshold, at or above which providers of regulated services will be required to pay fees. Ofcom has advised setting it at £250 million, but considers a range of £200 million to £500 million appropriate. The secretary of state will decide the final threshold.
- Exemptions. Providers with UK-referable revenue under £10 million in a qualifying period will be exempt from notifying Ofcom of their QWR or paying fees (subject to approval by the secretary of state).
- Statement of Charging Principles (SCP). Before Ofcom can start charging fees, it must put in place an SCP setting out the principles it will apply when setting fees. Ofcom's policy statement sets out its decision on the approach it will take. Ofcom intends to consult separately on the SCP later this year.
- Notification process. Providers of regulated services are required to notify Ofcom if they are liable to pay fees. The Online Safety Act 2023 (Fees Notifications) Regulations 2025, which come into force on 14 September 2025, detail the information that providers must give Ofcom and how it should be provided. In Q3 2025, Ofcom expects to publish a consultation on additional notification guidance, which will set out further information on the notification process and the required documentation.
In its policy statement, Ofcom also sets out the timeline for implementation of the fees and penalties regime. It does not expect to issue invoices for the 2026/27 charging year until Q3 of 2026, following a four-month notification window and verification by Ofcom of providers' QWR.
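To make the interaction between QWR and the penalty cap concrete, the short sketch below computes the cap as the greater of 10% of a provider's QWR and £18 million, as described in the first bullet above. It is purely illustrative: the function name and the QWR figures are assumptions chosen for the example, and the cap in any actual case will depend on the final regulations and Ofcom's determination of the provider's QWR.

```python
def osa_penalty_cap(qwr_gbp: float) -> float:
    """Illustrative sketch only: the OSA cap is the greater of 10% of QWR or £18 million."""
    return max(0.10 * qwr_gbp, 18_000_000.0)

# Hypothetical QWR figures, not drawn from the policy statement:
print(osa_penalty_cap(250_000_000))  # 25000000.0 -> 10% of QWR exceeds the £18m floor
print(osa_penalty_cap(100_000_000))  # 18000000.0 -> the £18m floor applies
```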
Major pornography providers agree to age checks from 25 July
By 25 July 2025, all services in scope of the OSA that allow pornography on their sites or that are dedicated adult sites must implement "highly effective age assurance" to ensure that children are not able to encounter pornographic content. Ahead of this deadline, major pornography providers operating in the UK have confirmed to Ofcom that they will introduce effective checks in compliance with the OSA. Sites and apps that publish their own pornography are already required to protect children from it.
Ofcom's consultation on codes of practice for online safety
Many in-scope services are still putting into effect the first versions of Ofcom's codes of practice on illegal content and on child protection under the OSA, with the next implementation deadline, for compliance with the OSA's child protection duties, falling on 25 July 2025. Against this backdrop, Ofcom has, as previously signalled, published a new consultation proposing additional safety measures for the codes to make services safer by design.
The new consultation seeks views from stakeholders on more targeted safety measures that focus on: tackling harms at source through the use of proactive automated technologies in relation to a wider range of content; stopping illegal content going viral through the use of crisis response protocols; and strengthening protections for children through increased use of highly effective age assurance (HEAA).
Some key recommended measures from the consultation include:
- Human moderation of livestreamed content on a 24-hour basis, and HEAA to restrict other users' ability to interact with children's livestreams (for example, by posting comments or reactions or making "gifts" to the streamer).
- Extending proactive content moderation requirements for some services by stipulating the use of automated tools that will accurately, effectively and without bias detect certain illegal content and content harmful to children, where this is technically feasible. This goes beyond the original codes, which only require the use of hash matching technology to detect image-based child sexual abuse material (CSAM) and the detection of CSAM URLs.
- Requiring the use of hash matching technology to detect terrorism content and intimate image abuse content that has been shared without consent (including explicit deepfakes) to reduce reliance on reporting from users. Ofcom notes the difficulties some users (particularly child users) face when navigating reporting mechanisms, highlighting the need for automatic detection and take-down procedures.
- Internal crisis response procedures for services at risk of terrorism, hate or harassment/abuse content or foreign interference. This includes measures to be taken during a crisis as well as a requirement to conduct a post-crisis analysis.
- Requiring some services to exclude illegal content (including terrorism, hate and suicide content) from recommender systems until it has passed human moderation, in order to reduce its virality.
Stakeholders must submit responses by 20 October 2025. The updated codes are expected to be implemented in summer 2026.
Ofcom publishes update on online safety implementation plans
Ofcom has published an update on its online safety implementation plans, setting out "key milestones" for the remainder of this year and the beginning of 2026. These include:
- Final guidance on a safer life online for women and girls by the end of 2025.
- Publication later this month of a report on independent researchers' access to data from regulated online services, to enable research into online safety matters.
- Publication of the register of categorised services once legal challenges to the thresholds for categorisation (as set by the government in the Online Safety Act 2023 (Category 1, Category 2A and Category 2B Threshold Conditions) Regulations 2025) are complete.
- The launch of the super-complaints regime in early 2026.
- Publication of the following statutory reports: a report on highly effective age assurance by July 2026; a report on content harmful to children by October 2026; and a report on app stores by January 2027.
Media Act updates
Ofcom consults on local news and information on analogue commercial radio under Media Act 2024
The Media Act 2024 has changed Ofcom's functions and responsibilities in relation to regulating local analogue commercial radio. Part 5 of the Act aims to ensure the delivery of local news and information by establishing new mandates for its consistent transmission on these stations. Ofcom's consultation sets out how it proposes to implement this new framework, including the specific conditions it proposes to include in licences.
The consultation also contains proposed draft guidance on how licensees might meet the requirements of the new licence conditions.
The consultation closes on 22 September 2025.
EU updates
European Accessibility Act enters into application
On 28 June 2025, the European Accessibility Act (EAA) entered into application across all EU member states. It is focused on digital technology, but also covers physical products used in digital spaces, such as laptops, streaming sticks and payment devices. The aim of the EAA is to make "everyday" digital products and services accessible to people with disabilities or impairments on an equal basis with others. The scope of the EAA is broad, so many businesses are likely to be affected and should ensure compliance. See our Digital regulation timeline for more.
Commission seeks feedback on protecting media service providers on online platforms
The European Commission is seeking feedback on guidelines it will issue under Article 18 of the European Media Freedom Act, which became law in May 2024 and comes into general effect on 8 August 2025.
Article 18 of the Act is designed to provide safeguards for media service providers in relation to the moderation of their content by providers of very large online platforms (VLOPs) (as defined in the Digital Services Act (DSA)), when such moderation is based on the VLOP's relevant terms and conditions. In essence, it gives media providers protection from having their content removed without justification.
Article 18 will require VLOPs to notify media providers when they plan to remove their content and to explain the reasons for removal. VLOPs must also give media providers 24 hours to respond to the notification, although this timeframe can be shortened in a crisis situation.
The guidelines aim to help media providers navigate the requirements of Article 18 and assist VLOPs in implementing the Article 18 safeguards.
The consultation closes on 23 July 2025.
European Commission publishes delegated act on data access for researchers under DSA
The new delegated act under the DSA clarifies how VLOPs and very large online search engines (VLOSEs) should share internal, non-publicly available data with qualified researchers. The delegated act sets out the legal and technical requirements for such access so that researchers can assess systemic risks and mitigation measures in the EU. It also establishes a new online DSA data access portal.
Researchers must first be vetted by a Digital Services Coordinator (DSC) before they can access the data. This involves making a data access application to the relevant DSC, with access only being granted if the researcher meets certain requirements. The researcher must also disclose the funding for the project and commit to publishing the results of their research. Only data necessary to perform research on systemic risks in the EU can be requested.
The new delegated act complements DSA rules obliging VLOPs and VLOSEs to grant researchers access to publicly available data on their platforms.
It will now be scrutinised by the European Parliament and Council of the EU and will enter into force on publication in the Official Journal.
Implementing regulation outlining rules and templates for transparency reporting under DSA comes into effect
Under the DSA, intermediary services, VLOPs, and VLOSEs are required to publish transparency reports on their content moderation practices. Intermediary services must report annually, whereas VLOPs and VLOSEs must report twice a year. These obligations have been in effect since 17 February 2024.
The implementing regulation, which the Commission adopted in November 2024 and which came into effect on 1 July 2025, harmonises the format for transparency reporting, as well as the reporting periods, to ensure that providers give clear and comparable information on their content moderation practices.
The harmonised templates are available in multiple languages and come with instructions on how to complete them.
As for the reporting periods, intermediary services must report each year by the end of February. VLOPs and VLOSEs must report by the end of February for the period 1 January to 30 June, and by the end of August for the period 1 July to 31 December, each year.
European Commission publishes final guidelines and age-verification app blueprint to protect minors online under DSA
Publication of the guidelines under Article 28 of the Digital Services Act (DSA) follows public consultation, as well as research and consultation with industry and with children and young people (see this Regulatory Outlook).
Key recommendations include:
- Privacy by default – minors' accounts should be private by default.
- Recommender systems – these should be modified to lower the risk of children encountering harmful content, including by prioritising explicit signals from children (for example requests not to see certain content) over behavioural signals.
- Cyberbullying – children should be able to block and mute any user and only be added to groups with their explicit consent.
- Addictive design – functionality that promotes excessive use by children, such as "streaks", "read receipts", autoplay and push notifications, as well as design features aimed at increased engagement, should be disabled by default.
- Commercial practices – children's lack of commercial literacy should not be exploited such that they are exposed to manipulative commercial practices that can lead to addictive behaviours.
The guidelines also recommend the use of age verification technologies to restrict access to adult content, pointing to EU digital identity wallets (when they become available) and to the Commission's blueprint for age verification apps, which sets a standard for such apps.
Although the guidelines are not binding, the Commission will use them to assess compliance with Article 28(1) of the DSA.