Regulatory Outlook

Data Law | UK Regulatory Outlook April 2024

Published on 23rd Apr 2024

ICO publishes new fining guidance | ICO outlines 2024-2025 priorities to protect children's privacy online | Call for evidence on data adequacy and its implications for the UK-EU relationship

ICO publishes new fining guidance

The UK Information Commissioner's Office (ICO) has published new data protection fining guidance, setting out in greater detail than before the statutory framework within which the Commissioner must operate when deciding whether it is appropriate to issue a penalty notice, and the factors that will be taken into account. It also explains how the amount of any fine is determined.

The guidance does not appear to signal any major change in approach, but it does give a clearer explanation of the process the Commissioner will go through when deciding the appropriate level of a fine, providing greater transparency. It also shows just how much discretion the Commissioner has when deciding whether, and how much, to fine. It remains to be seen whether the guidance heralds any change to the ICO's approach to enforcement in practice, or results in a reduction in the number of successful appeals against its fines. See our Insight for more details.

ICO outlines 2024-2025 priorities to protect children's privacy online

The ICO has set out its 2024-2025 priorities for protecting children's personal information online and has called on social media and video-sharing platforms to do more to keep children safe. The new strategy builds on the progress achieved so far in implementing the Age Appropriate Design Code of Practice 2021 and outlines priority areas where further progress is needed, including:

  • ensuring that children's profiles are private by default, and that geolocation settings are turned off by default;
  • ensuring that profiling children for targeted advertisements is turned off by default unless there is a compelling reason to use profiling;
  • the use of children's information in, and the design of, recommender systems, since algorithmically generated content feeds can expose children to harmful content and lead them to spend more time on a platform than they otherwise would, which in turn increases the likelihood of their disclosing more personal information; and
  • ensuring that parental consent is obtained for processing the personal data of children under 13 years old.

As part of this work the ICO will:

  • gather further evidence, including by way of a call for evidence to be published in summer 2024;
  • engage with parents, carers, children and organisations to identify areas for further improvement and provide additional guidance and advice where needed; and
  • focus on the most serious risks to children's privacy rights and take enforcement action where appropriate.

Call for evidence on data adequacy and its implications for the UK-EU relationship

The House of Lords European Affairs Select Committee has launched an inquiry into the data adequacy decision granted by the EU post-Brexit and the implications of any divergence by the UK from the EU's data protection regime for the UK-EU relationship.

The European Commission granted "data adequacy" to the UK after Brexit in 2021. The adequacy decision allows for the free flow of personal data from the EU to the UK, covering commercial data under the GDPR and criminal-investigation-related data under the Law Enforcement Directive.

The decision is due to be considered for renewal by the EU, which must decide by June 2025 (although it also has the power to withdraw adequacy at any time). The inquiry will consider the existing adequacy arrangement, any challenges to the adequacy regime, the implications if the Commission were to withdraw or fail to renew the adequacy decision, and the experience of other countries with the EU's adequacy system, including their encounters with the Commission's process. The committee encourages anyone with expertise or experience in this area to submit written evidence to the inquiry by 3 May 2024.

There have already been rumblings of discontent from the EU over the UK's plans to reform its data protection regime through the Data Protection and Digital Information Bill, currently going through Parliament, but the UK government has so far insisted that the bill maintains adequacy.

The committee will hold public evidence sessions between now and June and aims to report to the House by July 2024.

Information Commissioner publishes response to Ofcom's consultation on protecting people from illegal harms online

The Information Commissioner has published its response to Ofcom's consultation on protecting people from illegal harms online, which Ofcom launched in November 2023 as part of its implementation of the new online safety regime under the Online Safety Act 2023.

In its response, the ICO says that it expects organisations to fully comply with their data protection obligations when meeting their online safety duties and highlights the importance of data protection compliance when undertaking content moderation.

Broadly speaking, the ICO agrees with Ofcom's recommended content moderation measures, but disagrees with Ofcom's assertion (in its draft codes of practice) that the privacy impact of automated scanning is minimal. The ICO believes that the privacy safeguards around the use of automated measures should also explicitly refer to data protection requirements. The ICO recently published guidance on content moderation and data protection compliance. See this Regulatory Outlook for details.

The ICO also thinks that Ofcom's draft guidance currently lacks certainty about whether content is communicated "publicly" or "privately", potentially stopping services from making a confident assessment of this question when deciding whether to use proactive technology. The ICO is concerned that this might lead to services incorrectly evaluating content as public when in fact it is private. In the ICO's view, the default position, when assessment is difficult, should be that the content is private.

As for Ofcom's risk assessment guidance, while the ICO agrees that encrypted messaging and anonymity/pseudonymity functionality are risks for illegal harm, it is concerned that the guidance could, in practice, stop services from deploying these functionalities due to perceived risks under online safety law. The ICO would therefore like the guidance to clarify that these measures are not prohibited, but do require appropriate safeguards.

The ICO also urges Ofcom to consider the data minimisation principle when finalising its guidance to ensure that services are not incentivised to process more personal data than necessary.

Ofcom is due to publish its final decisions on the draft documents and submit them to the Secretary of State for approval in autumn 2024. In the meantime, organisations within the scope of the new online safety regime should prepare for compliance, ensuring that their measures also align with data protection requirements.

UK ICO launches third consultation on generative AI

See the AI section.


View the full Regulatory Outlook

Regulatory law affects all businesses.

Osborne Clarke's updated Regulatory Outlook provides high-level summaries of important forthcoming regulatory developments to help in-house lawyers, compliance professionals and directors navigate the fast-moving business compliance landscape in the UK.


* This article is current as of the date of its publication and does not necessarily reflect the present state of the law or relevant regulation.
