Regulatory Outlook

Consumer Law | UK Regulatory Outlook April 2024

Published on 23 April 2024

Ofcom publishes call for evidence on additional duties for categorised services under the Online Safety Act 2023 | EU Commission publishes DSA guidelines on the mitigation of systemic risks for electoral processes


Ofcom publishes call for evidence on additional duties for categorised services under the Online Safety Act 2023

Ofcom has published a call for evidence to inform its codes of practice and guidance on the additional duties that will apply to categorised services under the Online Safety Act 2023.

This call for evidence is part of the third phase of the regulator's work to implement the online safety regime. This phase consists of:

  • identifying the service providers that are subject to additional duties;
  • consulting on draft codes and guidance on those additional duties (first on guidance on the transparency reporting regime in summer 2024, and then on the additional duties for categorised services in early 2025); and
  • publishing final codes and guidance.

This call for evidence aims to support the early 2025 consultation.

The Act introduces a system that categorises certain regulated online services as category 1, 2A or 2B services, based on their key characteristics and whether they meet certain numerical thresholds. Alongside the call for evidence, Ofcom has published its advice to the government on setting the thresholds, which was sent to the Secretary of State for Science, Innovation and Technology on 29 February 2024.

Based on this advice, the Secretary of State will set out the thresholds in secondary legislation and Ofcom will, once it has assessed the services against the final thresholds, publish a register of categorised services, as well as a list of emerging category 1 services.

Following Ofcom's advice, the Secretary of State wrote to Ofcom asking for further information on how the regulator arrived at the various assessments made in its advice.

EU Commission publishes DSA guidelines on the mitigation of systemic risks for electoral processes

The European Commission has published guidelines under the Digital Services Act (DSA) on mitigating systemic risks in relation to electoral processes, aimed at providers of Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs).

The guidelines aim to support VLOPs and VLOSEs with their compliance obligations under Article 35 of the DSA (mitigation of risks) and with other obligations relevant to elections. As well as mitigation measures, the guidelines cover best practice for the periods before, during and after electoral events.

The guidelines highlight the importance of clearly labelling political advertising, in anticipation of the new regulation on the transparency and targeting of political advertising, which came into force on 9 April 2024.

The guidelines also state that VLOPs and VLOSEs should:

  • reinforce their internal processes, including adopting an incident response mechanism during an electoral period; and
  • adopt specific mitigation measures linked to generative AI, for example by clearly labelling content generated by artificial intelligence, such as deepfakes.

View the full Regulatory Outlook

Regulatory law affects all businesses.

Osborne Clarke's updated Regulatory Outlook provides high-level summaries of important forthcoming regulatory developments to help in-house lawyers, compliance professionals and directors navigate the fast-moving business compliance landscape in the UK.


* This article is current as of the date of its publication and does not necessarily reflect the present state of the law or relevant regulation.
