Data law | UK Regulatory Outlook April 2026
Published on 30th April 2026
UK: Data (Use and Access) Act 2025: ICO publishes new guidance | Automated decision-making: ICO consultation on updated guidance | Automated decision-making: ICO sets out expectations for employer use in recruitment | ICO and Ofcom joint statement on the overlap between online safety and data protection in relation to age assurance | EU: CJEU rules that a first data access request may be refused as 'excessive' under the GDPR in exceptional circumstances
UK updates
Data (Use and Access) Act 2025: ICO publishes new guidance
Following consultations, the Information Commissioner's Office (ICO) has published new and updated guidance reflecting the changes introduced by the Data (Use and Access) Act 2025 (DUA Act) to the UK GDPR. The ICO has published:
- New guidance on recognised legitimate interest, a new lawful basis added by the DUA Act. A list of "recognised legitimate interests" (set out in Schedule 4 to the Act) includes processing for certain purposes relating to security, defence, emergencies, crime and safeguarding vulnerable individuals, as well as responding to public body requests.
- Updated guidance on the existing legitimate interests lawful basis to reflect amendments introduced by the DUA Act. The legislation introduces an expanded range of processing purposes that are more likely to qualify as "legitimate interests", including processing for the purposes of direct marketing, intra-group transmission of personal data (whether relating to clients, employees or other individuals) where that is necessary for internal administrative purposes, and network and IT system security.
- Updated guidance on the purpose limitation principle to reflect amendments introduced by the DUA Act, which introduces clearer safeguards for when information can be reused for a new purpose.
- Accompanying guidance on the compatible reuse of personal information for a purpose other than that for which it was originally collected.
Automated decision-making: ICO consultation on updated guidance
The ICO has published for consultation draft guidance on automated decision-making (ADM), including profiling, updating its existing guidance following the introduction of the DUA Act. According to the ICO, the key changes include:
- New content on how organisations can determine whether their processing falls within the scope of Article 22A of the UK GDPR, relating to solely automated decisions with significant effects.
- Clarification of the circumstances in which organisations' ability to undertake Article 22A automated decisions is restricted and the conditions which they must satisfy in those cases.
- A new section on the safeguards organisations must put in place, as well as the rights individuals have in relation to Article 22A automated decisions that affect them.
The consultation closes on 29 May 2026.
This forms part of the ICO's broader AI and biometrics strategy. The guidance is intended to inform parts of the ICO's AI and ADM code of practice, to be developed under secondary legislation committed to during the passage of the DUA Act.
Automated decision-making: ICO sets out expectations for employer use in recruitment
The ICO has also published a report examining how organisations use ADM in recruitment and setting out its regulatory expectations based on those findings. Its central finding is that many employers are likely relying on solely automated decisions – systems operating without meaningful human involvement – and that the decisions these systems take have "legal or similarly significant effects" on people. These fall within the scope of the UK GDPR's provisions on solely automated decision-making, thereby triggering a higher standard of safeguards than currently appears to be in place.
The ICO's findings suggest that:
- Employers must improve their transparency practices, ensuring candidates are adequately informed about the use of ADM in recruitment, and enabling them to make representations about and contest these decisions and request human intervention.
- Where human involvement is included, it must be applied consistently across all candidates at each hiring stage to ensure fairness and compliance.
- Employers should strengthen bias and fairness monitoring, including by asking developers about their own bias testing and conducting their own trials to verify that results minimise bias.
- The use of ADM in recruitment is likely to be an activity that requires a data protection impact assessment (DPIA) to be conducted, and the ICO would consider it to be good practice to do so.
- Where employers are currently relying on contract or consent as their lawful basis (due to the position in relation to ADM before the changes in the DUA Act), the ICO anticipates that it is now easier to rely on alternative bases, such as legitimate interests.
The ICO has written to employers that are likely to be conducting ADM in their recruitment process, setting out specific recommendations. Looking ahead, the ICO intends to revise its guidance on recruitment and selection in 2026 following the changes introduced by the DUA Act.
ICO and Ofcom joint statement on the overlap between online safety and data protection in relation to age assurance
Ofcom and the ICO have published a joint statement on age assurance, setting out what online services need to do to meet their obligations under both the Online Safety Act 2023 (OSA) and UK data protection law simultaneously.
This follows the ICO's open letter to social media and video-sharing platforms, calling on them to strengthen age assurance measures, and Ofcom's demands directed at the sites and apps most used by children, requiring them to enforce their minimum age rules and implement highly effective age checks.
The statement is aimed at any service that is likely to be accessed by children and is implementing age assurance measures because it is in scope of the OSA, UK data protection legislation, or both. It provides practical examples of which types of services need to comply with which regimes.
Key takeaways for online services:
- Both regulators view age assurance as a single, integrated issue and are aiming to take a more aligned and coordinated regulatory approach to protect children.
- Services must choose the age assurance method most appropriate to reduce risks and potential harms to children online.
- The OSA does not require services to set a minimum age. However, a user-to-user service allowing primary priority content, or a service publishing pornographic content, must use highly effective age assurance. Where a service does set a minimum age, it must apply it consistently and explain it in its terms of service.
- If a service does not use highly effective age assurance to enforce a minimum age, it must assume underage children are present and reflect this in its children's risk assessment.
- The duty to employ age assurance is not limited to services in scope of the OSA. Services that set a minimum age for use must deploy an effective age gate to prevent underage access (thereby complying with the ICO's Children's Code).
- Where a service does not set a minimum age, but is not suitable for children under a certain age, it will generally have no lawful basis for processing the personal data of users below that age – using age assurance technology is recommended to avoid this.
- Services that are suitable for children, or for children above a certain age, should use age assurance methods to help ensure that the experience is age-appropriate (in line with the Children's Code).
- Where the age assurance technology used is likely to result in a high risk to children's rights and freedoms, age assurance technologies that "give the highest possible level of certainty on a user's age" should be used.
- Both regulators agree that self-declaration alone is insufficient.
- The ICO considers profiling for age assurance an ineffective way of preventing underage users from accessing the service.
- As all age assurance methods inherently involve data processing, they must be necessary, effective for the purpose of preventing children from accessing the service, proportionate to the risks, and compliant with data protection laws.
- Circumvention risks must be addressed when complying with either regime.
- Services are not expected to deploy methods that are not technically feasible or introduce risks to rights and freedoms disproportionate to the benefits.
DRCF publishes paper on the future of agentic AI
See AI section.
EU updates
CJEU rules that a first data access request may be refused as 'excessive' under the GDPR in exceptional circumstances
In Brillen Rottler GmbH & Co. KG v TC (Case C-526/24), the Court of Justice of the EU (CJEU) delivered its judgment on the interpretation of Articles 12(5) and 82(1) of the EU GDPR, addressing the scope of the right to refuse "excessive" data access requests and the right to compensation for infringement of the right of access.
The case concerned an individual in Austria who subscribed to the newsletter of Brillen Rottler, a German optician, and thirteen days later submitted a data access request under Article 15 of the GDPR. Brillen Rottler refused the request as abusive under Article 12(5) but the individual maintained the request and claimed €1,000 in non-material damages under Article 82. The company sought a declaration that the individual was not entitled to any compensation, relying on publicly available reports indicating that he had systematically subscribed to newsletters, submitted access requests and brought compensation claims against multiple controllers for alleged infringements that he had deliberately provoked.
The CJEU held that a first access request may be refused as "excessive" under Article 12(5), but only in exceptional circumstances, with the burden falling on the controller. The controller must demonstrate both an objective element (that the purpose of the GDPR has not been achieved despite formal compliance) and a subjective element (an abusive intention, for example where the request was made not to verify lawfulness of processing but to artificially create conditions for obtaining compensation). The controller must take into account all the circumstances of the case, in particular the fact that the data subject provided personal data voluntarily, the aim of providing those data, the time elapsed before the access request, and the data subject's conduct. Publicly available information indicating a systematic pattern of access requests and compensation claims may be considered, provided it is supported by other relevant material.
The CJEU further held that the right to compensation under Article 82(1) is not limited to damage resulting from the processing of personal data, since the provision refers to "infringement of this Regulation" without qualification. Restricting it in that way would exclude infringements of Chapter III rights, such as the right of access, thereby undermining the effectiveness of Article 82.
As regards non-material damage, the CJEU confirmed that loss of control over personal data may suffice, with no minimum threshold of seriousness, but the data subject must demonstrate actual damage distinct from the bare infringement. The causal link may be broken where the data subject's own conduct (such as submitting personal data with the deliberate aim of generating a compensation claim) is the determining cause of the alleged damage.
The CJEU left it to the referring court to determine whether, in the light of all the relevant circumstances, Brillen Rottler had established that the individual made the access request with an abusive intention.
EDPB adopts Data Protection Impact Assessment template
The European Data Protection Board (EDPB) has adopted a DPIA template, which is subject to public consultation until 9 June 2026. Following the consultation, all data protection authorities will initiate the necessary steps to adopt the template either as their sole standard or as a "meta-template" to which national-specific templates will be aligned. In the meantime, organisations are encouraged to use the template and to provide feedback as part of the public consultation.
EDPB consults on guidelines on processing of personal data for scientific research purposes
The EDPB has adopted guidelines on processing of personal data for scientific research purposes, subject to consultation until 25 June 2026.
Among the issues addressed by the guidelines are the concept of scientific research within the meaning of the GDPR, the storage limitation and transparency principles, consent, legitimate interests and the processing of special categories of personal data.