Data law | UK Regulatory Outlook September 2025
Published on 25th September 2025
UK: Data (Use and Access) Act timetable | ICO consults on data protection complaints, on recognised legitimate interest and on distributed ledger technologies and blockchain | ICO publishes guidance on secure public disclosure and on online safety profiling tools | Court of Appeal case opines on requirements for GDPR infringement and compensation | EU: CJEU considers whether pseudonymised data is personal data when shared | General Court dismisses annulment action of EU-US personal data transfer framework

UK updates
Data (Use and Access) Act implementation: approximate timetable published
The Department for Science, Innovation and Technology has published an outline of its tiered timetable for implementing the Data (Use and Access) (DUA) Act 2025. In summary, while certain provisions of the DUA Act come into force automatically (by virtue of the act itself), the majority of substantive provisions depend on secondary legislation and are being introduced in four stages:
- Stage 1 – after the DUA Act received Royal Assent on 19 June 2025 – includes the commencement of technical provisions, clarifying aspects of the legal framework and measures requiring the government to publish an impact assessment, a report and a progress update on its plans for AI and copyright reform.
- Stage 2 – three to four months after Royal Assent (mid-September to mid-October 2025) – includes the commencement of most of the measures on digital verification services and those on the retention of information by providers of internet services in connection with the death of a child (see below).
- Stage 3 – approximately six months after Royal Assent (mid-December 2025) – will include the commencement of the main changes to data protection legislation plus the provisions on information standards for health and adult social care.
- Stage 4 – more than six months after Royal Assent (from mid-December 2025 onwards) – will include the rest (for example, measures on the National Underground Asset Register and the electronic registering of births and deaths).
Data (Use and Access) Act slowly grinding into effect
A series of commencement regulations have been made bringing into effect some of the DUA Act provisions.
The Data (Use and Access) Act 2025 (Commencement No 1) Regulations were made on 21 July 2025. They mainly brought into force (with effect from 20 August 2025) parts of the DUA Act which empower the government to make further provisions, rather than doing anything substantive in themselves.
However, it is worth noting that the regulations have brought into effect DUA Act section 111, which amends the Privacy and Electronic Communications Regulations (PECR) by extending the time period within which relevant service providers must notify the regulator of a personal data breach under PECR. The existing obligation to notify within 24 hours is replaced by an obligation to notify without undue delay "and, where feasible, not later than 72 hours after having become aware of it", with an explanation required where notification is not made within 72 hours.
Also brought into effect are parts of DUA Act section 117, which establishes the Information Commission (IC), the new body that will replace the existing Information Commissioner's Office (ICO). Substantive changes to the ICO/IC are not expected until the new IC's board has been appointed, probably in early 2026.
The Data (Use and Access) Act 2025 (Commencement No. 2) Regulations 2025 were made on 2 September and bring into force section 124 of the DUA Act (with effect from 30 September). This section amends the Online Safety Act 2023 (OSA) to impose a duty on Ofcom to issue notices requiring social media providers (and other regulated service providers) to retain information in relation to an investigation into the death of a child (if so ordered by a coroner or equivalent).
The Data (Use and Access) Act 2025 (Commencement No. 3 and Transitional and Saving Provisions) Regulations 2025 were made on 4 September and cover the following DUA Act sections, which amend the Data Protection Act 2018 (DPA):
- Section 79, which amends the DPA provisions on the legal professional privilege exemption for law enforcement processing relating to data subject rights. It came into force on 5 September.
- Section 88, which amends the DPA provisions on the national security exemptions, including those relating to lawful processing, data subject rights, breach notifications, transfers out of the UK etc. It came into force on 5 September.
- Sections 89 and 90, which amend the DPA provisions on joint processing by intelligence services and competent authorities. They will come into force on 17 November.
ICO consults on draft guidance on handling data protection complaints
Among the DUA Act changes are new requirements for controllers to have in place a process to facilitate data protection complaints. These requirements are to be inserted into the DPA at section 164A by the DUA Act, but are not yet in force.
In essence, they mean that organisations must:
- provide data subjects with a way of making data protection complaints to them (as a controller);
- acknowledge complaints within 30 days of receipt;
- without undue delay, take appropriate steps to respond to complaints, including making appropriate enquiries, and keeping complainants informed; and
- without undue delay, inform complainants of the outcome of their complaints.
The ICO's draft guidance aims to help businesses understand the new requirements by setting out what they must, should and could do to comply, together with examples of good practice in relation to designing the complaints process, what to do when a complaint is received and how to respond once the investigation is complete.
The consultation closes on 19 October.
ICO consults on draft guidance on recognised legitimate interest
The ICO is consulting on draft guidance on recognised legitimate interest, a new lawful basis for processing that is to be added to the UK General Data Protection Regulation (GDPR) by the DUA Act, but is not yet in force.
The new lawful basis for processing personal data is separate from the legitimate interests lawful basis we are all familiar with. It consists of five conditions (to be inserted as Annex 1 to the UK GDPR) containing pre-approved purposes that are in the public interest, covering:
- requests for data from a third-party organisation that needs it for a public task or official function;
- safeguarding national and public security;
- dealing with emergency situations;
- preventing, detecting or investigating crimes; and
- protecting vulnerable people.
If one of these applies, the organisation does not have to undertake the legitimate interest balancing test. However, the organisation must be able to justify that using the personal data is "necessary" for the particular condition.
The draft guidance explains the new lawful basis, including the differences from the legitimate interest basis, and how organisations should use it.
The ICO is also consulting on complementary draft guidance for public authorities on use of recognised legitimate interest for a public task or official function (that is, the first recognised legitimate interest condition listed above). It is aimed at assisting organisations with a public task or official function, such as public authorities, when asking other organisations to share personal data voluntarily for these purposes.
Both consultations close on 30 October.
ICO publishes guidance on disclosing documents to the public securely
This guidance aims to assist organisations, when disclosing documents to the public (whether publishing online, responding to an information request or sending a document to a customer), to avoid accidental breaches of personal information. It sets out practical steps for checking documents for hidden personal information, together with examples of accidental breaches that could happen, examples of commonly used software that can assist, and supporting videos.
Publication of this guidance follows accidental releases of highly sensitive data by the Ministry of Defence and the Police Service of Northern Ireland, and is a reminder that all organisations must have robust measures in place to protect the personal information they hold and prevent it from being inadvertently disclosed.
ICO publishes guidance on profiling tools for online safety
The aim of the guidance is to help organisations comply with data protection laws when deploying profiling tools as part of trust and safety processes, including when doing so in order to comply with obligations under the OSA. It focuses on the use of trust and safety tools that involve profiling (as defined in the UK GDPR) on user-to-user services (as defined in the OSA). However, the guidance also applies to organisations using profiling for their own purposes, outside of any OSA obligations.
Ofcom has powers under the OSA to require regulated services to use certain "proactive technologies" in certain circumstances. Proactive technologies include content identification technology, user profiling technology or behaviour identification technology. Although the definition of user profiling in the OSA differs from the definition in the UK GDPR, the ICO expects that both user profiling and behaviour identification technologies will involve profiling as defined in the UK GDPR.
ICO consults on draft guidance on distributed ledger technologies and blockchain
The ICO is consulting on draft guidance on distributed ledger technologies (DLTs).
DLT is a type of digital system that allows an electronic ledger, in which transactions are recorded, to be created, shared, added to and synchronised in real time. It allows multiple parties to maintain records simultaneously. What distinguishes a DLT from a traditional single ledger (such as a central spreadsheet or database) is the absence of a single record keeper, and the fact that the parties transacting via the ledger do not need to know or trust each other. One of the best-known DLTs is blockchain.
The draft guidance explains the technology underpinning DLTs, including blockchain, and sets out how data protection law applies to blockchains.
The consultation closes on 7 November.
Court of Appeal case opines on requirements for GDPR infringement and compensation
In Farley v Paymaster (1836) Limited (t/a Equiniti) [2025], the UK Court of Appeal has held that it was not essential for data subjects to prove disclosure of their personal data to third parties in order to plead an infringement of the UK GDPR or the DPA, and that mere fear of disclosure of data to third parties may be enough to entitle the data subjects to compensation.
In August 2019, Equiniti, the administrator of the Sussex Police pension scheme, posted out some pension recipients' annual benefit statements (ABSs) to addresses which were out of date and so incorrect. This occurred due to a process flaw in the way Equiniti dealt with the addresses. The ABSs contained personal information such as date of birth, national insurance number and pension-related details. Some envelopes were returned unopened; however, most were not recovered and it was not known whether they had been read.
The claimants brought a collective action alleging breaches of the UK GDPR and the DPA and misuse of private information for failing to keep data secure, seeking compensation for anxiety, distress, embarrassment and loss of control of their data, among other things. The High Court allowed some claims to proceed where there was an arguable case that the contents had been read, but struck out the remainder on the basis that, absent third‑party access, there had been no "real" processing of the data and thus no actionable GDPR breach.
On appeal, the appellants claimed that both the way that the data was stored by Equiniti, and the acts of printing and sending the letters to the wrong addresses, were breaches of various data protection principles including those of lawfulness, fairness, accuracy, integrity and confidentiality.
The Court of Appeal:
- Ruled that it was not essential for the appellants to allege or prove third-party disclosure in order to plead a UK GDPR and DPA breach. The court held that the concept of "processing" "embraces a great deal more than disclosure or publication" – it includes the mere recording of data – and that the appellants had therefore pleaded a reasonable basis for alleging that Equiniti's mistake involved an infringement of the GDPR.
- Considered whether there was a threshold of seriousness that the claimants needed to overcome. The court accepted that there was such a threshold in the law of misuse of private information, but not for breach of the GDPR. The court held that the appellants in this case could succeed if they prove a reasonable basis for fearing (i) that their ABS had been or would be opened and read by a third party and (ii) that this would result in identity theft or other specific adverse consequences which they feared might follow.
EU updates
CJEU considers whether pseudonymised data is personal data when shared
In European Data Protection Supervisor (EDPS) v Single Resolution Board (SRB), the Court of Justice of the EU (CJEU) found that: opinions expressed by individuals are personal data; whether pseudonymised data is personal data depends on the context and the availability of means to identify the person; and transparency duties apply to transfers of pseudonymised data.
In 2017, European Union institution SRB adopted a decision relating to Spain's Banco Popular Español. The SRB consulted with affected stakeholders (the bank's shareholders and creditors). As part of the process, the SRB provided a third party with data about opinions expressed by stakeholders relating to the valuation of the bank in pseudonymised form, having first removed information about the identity of the persons expressing the opinions. The SRB is subject to the Data Protection Regulation for EU institutions, bodies, offices and agencies (EUDPR) rather than the EU GDPR, although the relevant provisions of the EUDPR and the GDPR are the same.
Five individuals complained to the EDPS, alleging that they had not been told that their data would be submitted to third parties contrary to the SRB's privacy statement.
The case eventually made its way to the CJEU, which found that:
- There was no need to consider the content, purpose or effects of comments when determining whether the data in those comments sent to the third party "related" to identifiable individuals. It was clear that the comments conveyed the personal opinions and views of the individuals and were closely linked to their authors.
- The mere existence of additional information in the hands of the controller (here, the SRB) does not mean that pseudonymised data shared with a third party must always be treated as personal data by that third party. It depends on the context and whether the third party has reasonable access to any other information to enable identification, including information available online or that could lawfully be obtained from others. The fact that the SRB retained the additional information required to re-identify did not automatically mean that the data sent was, in the third party's hands, personal data – that assessment is contextual.
- Regarding transparency, the relevant perspective is that of the controller at the time the personal data is collected, before any transfer. Therefore, the SRB, as controller, should have told the individuals that their data would be shared with third parties, regardless of whether the third party could identify them from the pseudonymised data it had received.
The CJEU therefore set aside the General Court's decision and referred the case back to that court for reconsideration.
General Court dismisses an action for annulment of the EU-US personal data transfer framework
In Latombe v Commission, the General Court of the European Union has dismissed an action for annulment of the EU-US Data Privacy Framework. By this decision, the General Court confirmed that, when the framework was adopted, the US ensured an adequate level of protection for personal data transferred from the EU to the US, and organisations may continue transfers based on this decision.
The Commission had (on 10 July 2023) adopted the relevant adequacy decision, which allows personal data flows between the EU and US under the framework without any further safeguard being necessary.
Philippe Latombe, a French member of parliament and a commissioner of the CNIL (France's data protection regulator), brought the case in his personal capacity, asking the General Court to annul this adequacy decision.
Mr Latombe argued that the Commission infringed the Charter of Fundamental Rights and the GDPR by treating the US as providing adequate protection despite the possibility of "bulk" collection of EU personal data by US intelligence agencies, and by denying the right to an effective remedy and access to an independent tribunal, on the basis that the US Data Protection Review Court (DPRC) was not an independent and impartial tribunal but was dependent on the US government.
The General Court rejected both arguments, finding that the "bulk" collection of EU personal data by US intelligence agencies was subject to independent judicial oversight (albeit after the event) and that this was enough to meet EU law requirements as laid out in the Schrems II case – there was no requirement for prior independent authorisation. It also found that the DPRC is an independent tribunal. The court further noted that the legality of the adequacy decision could be assessed only on the basis of the facts at the time of its adoption, not in the light of any subsequent US developments.
While the case means that data transfers under the privacy framework can continue, there is an ongoing review mechanism to take account of US developments occurring since its implementation, so future challenges are possible. Mr Latombe has two months and 10 days to lodge any appeal.