Artificial intelligence

Facial recognition and data protection: new guidelines in the European Union

Published on 23rd Jun 2023

The European Data Protection Board takes a step towards a Union-wide legal framework for facial recognition technologies

In the era of increasing use of artificial intelligence-based technologies, facial recognition has become one of the most common forms of biometric identification. Europe is witnessing a concerning deployment of facial recognition technology (FRT), which is often implemented without public knowledge and without the necessary legal safeguards. The rapid advance of these highly privacy-intrusive techniques raises significant concerns and underscores the need to protect the fundamental rights of individuals.

The growing number of entities making use of FRT should prompt reflection on the risks and obligations associated with its implementation. In this regard, it is worth highlighting the recent fine imposed by the Spanish Data Protection Agency on a well-known telecommunications company for processing biometric data through facial recognition systems without first carrying out the corresponding impact assessment.

EDPB guidelines

In order to address the challenges posed by this technology, the European Data Protection Board (EDPB) published in May 2022 a first draft of the Guidelines 05/2022 on the use of facial recognition technology in the context of law enforcement. The first version was open for public consultation until 27 June 2022. The EDPB adopted the final version on 17 May 2023, having updated and added some recommendations.

The main objective of the guidelines is to guide both national and European Union lawmakers and law enforcement authorities on the application and use of facial recognition techniques. Although they focus primarily on the use of FRT systems in the framework of criminal investigations under Directive 2016/680 on the processing of personal data for the purposes of the prevention, investigation, detection or prosecution of criminal offences or the execution of criminal penalties, they also include general guidelines and clarifications.

These general provisions are relevant not only to authorities using FRT in the context of criminal investigations, but also to all entities benefiting from this technology in other areas, such as controlling access to premises, making electronic payments or accessing devices or applications, among others.

Biometric data

Firstly, the guidelines explain that FRT works on the basis of probability, analysing facial features to identify individuals automatically. It forms part of so-called biometric technology, which uses automated processes to recognise individuals based on their physical, physiological or behavioural characteristics, such as fingerprints, iris structure, voice or even gait. These characteristics are known as "biometric data", as they allow for the unique identification of a person.

The text focuses on the use of FRT for two purposes: authentication and identification. Authentication involves verifying the identity of a person by comparing a stored biometric template with their face (a one-to-one comparison), while identification aims to find a specific person by analysing their face and comparing it with those held in a database (a one-to-many comparison).
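
For readers less familiar with this distinction, the minimal sketch below illustrates the one-to-one versus one-to-many logic in Python. It assumes faces have already been converted into numerical embedding vectors by some face-encoding model; the function names and the similarity threshold are hypothetical and for illustration only, and do not form part of the EDPB guidelines.

```python
import numpy as np

def cosine_similarity(a, b):
    """Similarity score between two face-embedding vectors (1.0 means identical direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def authenticate(probe_embedding, stored_template, threshold=0.8):
    """Authentication (one-to-one): does the presented face match the claimed identity's stored template?"""
    return cosine_similarity(probe_embedding, stored_template) >= threshold

def identify(probe_embedding, database, threshold=0.8):
    """Identification (one-to-many): search a whole database of templates for the best match, if any."""
    best_id, best_score = None, threshold
    for person_id, template in database.items():
        score = cosine_similarity(probe_embedding, template)
        if score >= best_score:
            best_id, best_score = person_id, score
    return best_id  # None if no stored template scores above the threshold
```

The threshold reflects the probabilistic nature of FRT noted above: matches are never certain, only more or less likely, which is one reason the guidelines stress regular evaluation of accuracy and reliability.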

The EDPB highlights the importance of data controllers carrying out regular and systematic evaluations of these algorithmic procedures. These evaluations aim to ensure the accuracy, fairness and reliability of the results obtained from processing this biometric data. Furthermore, it emphasises that personal data used to evaluate, train and develop facial recognition techniques may only be processed to the extent that there is a sufficient legal basis and the data protection principles laid down in the regulation are complied with.

On the other hand, the EDPB confirms that the processing of biometric data constitutes an interference with the rights recognised in the Charter of Fundamental Rights of the European Union, in particular the respect for private life and the protection of personal data. On the latter point, the EDPB insists throughout the text that, in accordance with the charter, facial recognition tools should only be used where strictly necessary and proportionate, and only where that interference with fundamental rights and freedoms is provided for in a law that describes the application and conditions of use of these technologies, with the aim of fully respecting those rights and freedoms.

Directive 2016/680

Directive 2016/680 prohibits automated decision-making and profiling based on special categories of data, unless appropriate measures are implemented to protect the rights, freedoms and legitimate interests of the natural persons concerned.

Furthermore, according to the EDPB, remote biometric identification in public spaces poses serious challenges to the privacy of individuals and, by amounting to mass surveillance, contradicts the principles of a democratic society. In the same vein, on 14 June 2023, the plenary of the European Parliament adopted its negotiating position on the Artificial Intelligence Act and agreed to prohibit the use of real-time facial recognition systems by public authorities or private entities in publicly accessible spaces.

The EDPB considers that the use of FRT or similar technologies to infer a person's emotions should be prohibited, and that the processing of personal data in a law enforcement context should not rely on databases built through the indiscriminate and massive collection of personal data, for example through the scraping of photographs and facial images available on the internet.

Special categories of data?

The Spanish Data Protection Agency has long held that biometric data is not specially protected data unless it is used to uniquely identify an individual. This position differs from the one adopted by the EDPB in the guidelines, which treat biometric data as a special category of data for both identification and authentication purposes. The Spanish Data Protection Agency will therefore need to reconsider its position in order to align itself with the EDPB. This is not the case for the Catalan Data Protection Authority, which had already adopted the criteria set out by the EDPB in the guidelines.

This new interpretation could generate uncertainty in the business landscape, as entities benefiting from FRT may have to adapt their practices and policies in order to comply with the requirements applicable to the processing of special categories of personal data. This could affect, for example, the ease of operating and accessing applications, and could pose obstacles to employee authentication processes, in particular in companies that use fingerprint or facial recognition systems for time registration or access to premises. It will therefore be essential for these companies to carry out a thorough analysis of the FRT procedures they have in place in order to bring them into line with those requirements.

Osborne Clarke comment

These guidelines represent an important step towards a legal framework that provides greater legal certainty for the use of this technology in the European Union, promoting its use only where strictly necessary and proportionate and with full respect for the privacy of individuals. Only through responsible and ethical management of FRT can we safeguard the fundamental rights of individuals in a world increasingly dominated by artificial intelligence.

* This article is current as of the date of its publication and does not necessarily reflect the present state of the law or relevant regulation.
