IT and data

Virtual voice assistants' challenge to comply with data protection legislation

Published on 22nd Sep 2021

On 7 July 2021, the European Data Protection Board adopted a new version of its guidelines on virtual voice assistants, analysing the use of this technology in the market and providing recommendations on how to comply with data protection and privacy obligations.

Technological developments have enabled the integration of virtual voice assistants (or "VVAs") into all kinds of devices, leading to the processing of a massive volume of personal data that has certainly not gone unnoticed by European data protection authorities.

Currently, more than three billion smartphones have VVAs integrated by default. At the Consumer Electronics Show 2020, Google announced that more than five hundred million users were using its voice assistant, 'Google Assistant', every month. Task simplification and easy access to information through voice-based interfaces are the main incentives for users to equip themselves with VVA-integrated devices, especially in home automation, although their use in other sectors such as healthcare has increased too. For example, callbots (robots that answer calls in a personalised manner) with integrated VVAs were used for Covid-19 pre-diagnosis.

With the Guidelines on Virtual Voice Assistants (the "Guidelines"), the European Data Protection Board (the "EDPB") identifies the main data protection and privacy challenges posed by VVAs, while providing recommendations on how to comply with both the General Data Protection Regulation ("GDPR") and the e-Privacy Directive.

In the first section, the Guidelines analyse the technological background of VVAs, defining them as software capable of understanding and executing users' voice commands and of integrating with third-party components and apps (e.g. online encyclopaedias, music, banking or weather apps). It should also be noted that, while on standby, VVAs remain in active listening mode in order to detect locally (on the device itself) the expression that triggers their activation and the processing of the information received. To this end, VVAs use machine learning techniques to identify voice patterns and predict when to activate themselves, process the information received and execute users' commands.

The large number of players involved in the VVA ecosystem (from the VVA designer, who defines the functionalities, activation modalities or hardware specifications, to the VVA integrator, who adds it to its product, and the developer of apps with VVA functionalities by default) increases the risk of many parties accessing the information captured by VVAs for different purposes (e.g. VVA designers want to access data about the use of VVAs in real conditions in order to improve voice assistant performance).

The second section of the Guidelines provides recommendations on complying with the data protection and privacy obligations arising from the GDPR and the e-Privacy Directive. Before doing so, however, it addresses controversial issues such as the unauthorised processing of personal data when VVAs are activated by non-registered users or accidentally, with special emphasis on the fact that the diversity of users may entail the processing of data of vulnerable subjects, such as children or people with disabilities. It also warns that, in the context of VVAs, users' voices are considered biometric data that require special protection and are subject to stricter legal requirements for their processing (e.g. besides a legal basis under Article 6(1) of the GDPR, also one of the exceptions under Article 9(2) of the GDPR for the processing of special categories of personal data).

Among the EDPB's recommendations, those regarding compliance with the duties of information and transparency stand out. The EDPB urges data controllers to use VVAs' interactivity to provide data protection information to registered, non-registered and accidental users, insisting that this information be provided in a concise and transparent manner (e.g. in dedicated sections within global privacy policies, or by clearly indicating which data is being collected and processed and whether VVAs are capturing background sounds and information).

With regard to the lawfulness of the processing that the different actors involved may carry out with VVAs, the EDPB indicates that the appropriate legal bases are the performance of a contract with the users (to execute VVA users' requests or, in some cases, to improve VVA performance) and explicit consent (for voice identification or for user profiling for personalised content or advertising). Furthermore, it stresses that data processing purposes must not exceed the reasonable expectations of users of VVA-integrated devices.

Additionally, the EDPB indicates that the tasks of comparing and verifying the expression that activates VVAs require access to the data stored locally on the user's device; the user's consent is therefore necessary if such data is used for purposes other than executing the user's requests. On the other hand, the EDPB warns that the widespread practice among VVAs of storing personal data indefinitely, unless the user proactively deletes it, violates the storage limitation principle, and recommends checking voice anonymisation processes for the deletion of personal data and designing VVAs so that the minimum amount of user information is stored by default.

Finally, the use of voice as a means of communication in VVA environments poses new security risks. The EDPB therefore recommends that biometric recognition be activated at each use, at the user's initiative, rather than through permanent analysis of the background voices heard by VVAs.

The number of players involved in the virtual voice assistant environment makes it difficult to determine which role under the legal framework applies to each of them in terms of data protection, especially taking into account that the same player (e.g. a VVA designer) may act both as a data controller and as a data processor when processing personal data on behalf of application developers.

Finally, regarding the mechanisms for VVA users to exercise their data protection rights, and in particular the rights of access, rectification, data portability and erasure of personal data, the EDPB indicates that those rights must be made available to all three categories of users previously identified (registered, non-registered and accidental), including by means of easy-to-follow voice commands. The Guidelines also point out that data controllers should inform data subjects of their rights when the VVA is switched on or when the user's first voice request is processed.

Despite the efforts of data protection authorities to provide recommendations on complying with data protection and privacy laws, the truth is that the evolution of technology, and in particular of artificial intelligence, continues to create legal uncertainty for developers and users of virtual voice assistants.

Interested in hearing more from Osborne Clarke?


* This article is current as of the date of its publication and does not necessarily reflect the present state of the law or relevant regulation.
