GDPR for HR | March 2026

Published on 30th March 2026

Data protection in internal investigations, use of AI in DSARs and recruitment, and deleted data DSAR obligations


At a glance

  • Internal investigations carry significant data protection obligations – data professionals should shape the process, not merely review it.

  • AI is reshaping how tactical DSARs are made in employment disputes and how recruitment is conducted, attracting regulatory scrutiny across both UK and EU frameworks. 

  • A recent EU court ruling offers UK employers practical clarity on two recurring DSAR questions: deleted data and who counts as a recipient.

The investigation paradox: when data protection meets HR

'It is a capital mistake to theorise before one has data' (Sherlock Holmes, in Arthur Conan Doyle's 'A Scandal in Bohemia')

The number of grievances, whistleblowing complaints and Employment Tribunal claims involving artificial intelligence (AI)-generated content is rising sharply. Meanwhile, organisations are increasingly turning to AI for guidance on internal investigations. Three forces are converging to drive this further: whistleblowing laws requiring effective complaints mechanisms; the Data (Use and Access) Act 2025 (DUAA), which pushes organisations to formalise internal reporting and redress; and a host of new employment laws coming into play later this year. Together, they will inevitably trigger an increase in non-compliance complaints.

This raises a number of questions. Should internal investigations be considered as much a data protection issue as an HR and legal one? And should data protection professionals make the case for an investigation to be treated as a data protection event from the outset? The answer to both is yes.

Identify a lawful basis

Investigations typically rely on legitimate interests or legal obligation as their legal basis. Where special category data is involved – racial harassment, health, sexual conduct, trade union activity – an additional condition under article 9 of the UK General Data Protection Regulation (GDPR) must also be satisfied. That analysis must be conducted and recorded at the outset, not reverse-engineered afterwards.

Transparency

Complainants must be told what happens to their data without compromising the investigation itself. The subject is entitled to understand the evidence against them, but complete disclosure can undermine whistleblower protection. Neither extreme is defensible.

Scoping

This is where Holmes' warning bites hardest – and the risks are greatest. Without a clearly defined purpose, investigations drift. Trawling through emails and Teams messages can quickly exceed the original lawful basis. Roles must also be clarified early: who is the controller, and are external investigators processors or joint controllers?

When a DPIA is needed

Data protection impact assessments (DPIAs) are frequently overlooked, but they are required where investigations use AI, involve large-scale monitoring, or process sensitive data at scale. The more tech-heavy the investigation, the more likely one is needed.

Accuracy in reporting

Reports must clearly distinguish between allegations, witness accounts and established findings. Conflating these creates both legal and data protection risk.

Retention

Investigation records should not be kept indefinitely. A retention period should be defined at the outset and applied consistently, with deletion decisions documented to evidence their lawfulness.

Too much data

Theorising before data is a mistake. But collecting too much data – without a lawful basis or proper controls – is just as dangerous as collecting too little.

Data protection professionals should not be consulted after an investigation framework is built – they should help build it. That, as Holmes himself might say, is elementary.

A DSAR-pocalypse or not?

In the employment context, data subject access requests (DSARs) have become the ultimate power play and the first move in almost every dispute before anyone has set foot near a tribunal. Once a relatively niche transparency mechanism, the DSAR is now a pre-litigation weapon of choice.

In 2024-25, the Information Commissioner's Office (ICO) received nearly 43,000 complaints from individuals about how their personal data is handled, the majority relating to the right of access. This is up from just under 40,000 the previous year. The UK data regulator's figures represent the tip of the iceberg: they capture only those cases in which a formal complaint was made to the ICO. The underlying volume of DSARs submitted to organisations is considerably higher.

AI on both sides

The challenge is growing. More data than ever exists – emails, messages, call recordings, AI-generated summaries – with more and more to wade through when a DSAR lands. Employees, now often AI-assisted, have worked out that a well-timed DSAR is a highly effective pre-litigation tool, and employers can find themselves buried in DSAR-related correspondence that is increasingly legalistic, tactical and difficult to manage within the one-month deadline.

AI cuts both ways. Organisations are also increasingly using AI to summarise documents, draft communications and flag issues – often without appreciating that those AI-generated outputs can themselves contain personal data and inadvertently fall within scope of a DSAR.

An AI tool used to summarise a grievance meeting or analyse sentiment in employee communications, for instance, may produce output that constitutes personal data. Few organisations have mapped such outputs as part of their data landscape. Where a data subject discovers their information has been processed in ways they did not anticipate, the transparency concerns alone can inflame an already difficult situation. The outputs may also touch on third parties or legally privileged material.

Regulatory developments

The ICO has updated its guidance ahead of the DUAA changes to explain the new data protection complaints requirements. Some of the revisions are welcome, if largely clarificatory: there is more flexibility to pause the compliance clock and volume is now expressly recognised as a legitimate proportionality factor.

But the bar has risen in other respects. The ICO's updated guidance has now codified the findings in Harrison v Cameron & ACL [2024]. And from 19 June, every organisation will need its own structured process for handling data protection complaints.

There has been a real shift in how the ICO handles escalated DSAR-related complaints. In the majority of cases, the ICO now makes a preliminary finding based on the employee's complaint alone, rather than consulting both parties. That presents a distorted picture and means findings of non-compliance against employers are more common than they perhaps should be. Even a relatively minor finding stays on an organisation's record and a pattern of DSAR complaints today could colour how a completely unrelated data issue is treated tomorrow.

The question is no longer whether DSARs are being used tactically in employment disputes – they clearly are – but whether the framework, the regulator or the legislature will move quickly enough to rebalance it. The pressure does not ease; it just arrives through more doors. 

What to do now

Every DSAR should be treated as a potential litigation document. The line between a DSAR response and litigation disclosure is increasingly blurred. Organisations should approach every DSAR with that in mind from the outset: not as an administrative exercise but as a legal one.

Organisations will need to get their data landscape in order, including AI-generated content. Most organisations have not mapped where AI-generated outputs sit in their data landscape. If AI is being used to summarise, draft or flag, organisations need to know whether those outputs contain personal data and how they would handle them in a DSAR response, including when privilege or third-party data is involved.

Existing complaints handling processes applicable to data rights (including DSARs) will also need to be reviewed and updated to ensure compliance with the new DUAA obligations coming into force in June and the corresponding ICO guidance. 

Who let the bots out: automating the hunt for top talent

AI tools are now used by organisations at all stages of recruitment: sourcing candidates, screening applications, recording and transcribing interviews, scoring. Candidates, in turn, are using AI to prepare CVs. The result is more applicants per role, which can drive greater reliance on AI to manage the volume. The direction of travel is clear: AI use will increase the need for good governance and guardrails.

ICO focus

In the UK, AI in recruitment is a priority area for the ICO. A consensual audit of developers and providers of AI recruitment tools in 2024 resulted in a report with key recommendations; AI in recruitment remains a listed element of the ICO's AI and biometrics strategy. Why? AI use continues to be an area of particular concern for the public and that is a major factor in determining ICO priorities.

The DUAA has also introduced changes to the use of AI in recruitment, including the relaxation of restrictions on automated decision-making in a recruitment context that makes it easier to use AI for screening and selection.

Looking ahead, ICO guidance on recruitment and selection for organisations is expected this summer, alongside a statutory code of practice for organisations developing or deploying AI and automated decision-making systems.

The EU dimension

In the EU, data protection authorities share similar concerns, but with the added layer of the EU AI Act. AI systems intended to be used for recruitment or selection purposes are classified as high-risk under that regime, meaning both providers and deployers of those systems must comply with additional obligations. Other rules are also relevant: interception of communications legislation, for instance, applies where interviews are recorded.

Questions to ask

Organisations using or considering AI in recruitment should work through the following questions: 

  • Is AI being used in recruitment and, if so, at which stages?
  • How is AI use explained to prospective candidates and existing employees?
  • What degree of meaningful human intervention is built into the process?
  • How are bias and discrimination addressed – a concern from both a data protection and an employment law perspective?
  • What are the main challenges in engaging AI recruitment tool providers, including defining the parties' roles and obtaining adequate due diligence information?
  • Where both UK and EU law applies, how is the intersection between UK obligations and the EU AI Act managed?

Deleted data and DSARs: what are your obligations?

A recent ruling by the General Court of the European Union in WS v European Commission provides practical clarity on the scope of the right of access under data protection law and its implications for employers managing DSARs.

Although decided under the data protection framework applicable to EU institutions and bodies (which sits alongside but operates separately from the EU GDPR and UK GDPR), the principles established offer persuasive guidance for UK HR teams.

No obligation to recreate or recover deleted data

The applicant sought access to a range of personal data, including internal communications, system access logs, and the identities of individuals who had accessed his data. The General Court dismissed the claim, confirming that data controllers are not required to recover or recreate personal data that was lawfully deleted before a DSAR was received.

The position differs for archived or backup data. ICO guidance makes clear that employers must have processes in place to restore access to and locate personal information held in such systems. Searches need only be reasonable and proportionate, however, and it is accepted that the search mechanisms used may be less sophisticated than those applied to live systems.

Who counts as a recipient?

The General Court also clarified that employees who access personal data in the ordinary course of their duties do not automatically qualify as "recipients" for the purposes of the right of access under article 15 of the GDPR. This is a significant and helpful distinction, as it directly limits the scope of information an employer must disclose about who has accessed an individual's data.

Practical points for HR teams
  • Data that has been lawfully deleted before a DSAR is received does not need to be recovered or recreated in order to respond to the request.
  • Retention schedules should be clearly documented and consistently applied; all deletion decisions should be recorded to evidence their lawfulness.
  • Internal employees accessing data in the course of their duties are not necessarily "recipients" that must be disclosed in a DSAR response.

For further information on any of the issues raised in this edition, please contact our experts.

Related articles
GDPR for HR | December 2025
UK and EU GDPR for HR | Autumn 2025
Interested in hearing more from Osborne Clarke?

* This article is current as of the date of its publication and does not necessarily reflect the present state of the law or relevant regulation.
