Support Centre

EU: EDPB draft guidelines on virtual voice assistants

The European Data Protection Board ('EDPB') recently issued draft guidelines on the use of virtual voice assistants ('VVAs')1, meaning technologies which can understand and execute user voice commands and which are commonly associated with devices such as smartphones, tablets, and traditional computers. Odia Kagan, Partner and Chair of GDPR Compliance & International Privacy at Fox Rothschild LLP, breaks down the guidance of the EDPB and the various legal challenges which are highlighted.


What is it?

A VVA is a service that understands voice commands and executes them or mediates with other IT systems if needed.

The issue

  • VVAs act as an interface between users and their computing devices and online services such as search engines or online shops. Due to this role, VVAs have access to a huge amount of personal data, including all users' commands (e.g. browsing or search history) and answers (e.g. appointments in the agenda).
  • There are currently more than 3 billion smartphones and all of them have integrated VVAs, most of them switched on by default. The recent rise of smart speakers (147 million were sold in 2019) is bringing VVAs to millions of homes and offices.
  • The more services or features a VVA provides, and the more it is connected to other devices or services managed by other parties, the greater the amount of personal data being processed and the greater the risk of repurposing.

The key stakeholders

  • The VVA provider (or designer): responsible for the development of the VVA, designs and defines its possibilities and default functionalities.
  • The VVA application developer: as for mobile applications, creates applications extending the VVA's default functionalities.
  • The integrator: manufacturer of connected objects, who wishes to equip them with a VVA.
  • The owner: in charge of physical spaces receiving people (accommodation, professional environments, rental vehicles, etc.), who wishes to provide a VVA to their audience (possibly with dedicated applications).
  • The user.

Data subjects should be in a position to understand and identify the roles at stake and should be able to contact or act with each stakeholder as required under the General Data Protection Regulation (Regulation (EU) 2016/679) ('GDPR').

The legal roles

  • Data controllers providing VVA services and their processors have to consider both the GDPR and the e-Privacy Directive (2002/58/EC) ('the e-Privacy Directive').
  • They should clearly decide and inform data subjects on the conditions under which each of them will act and comply with the resulting roles of controllers, joint-controllers, or processors as provided for by the GDPR.
  • When several stakeholders are involved as independent controllers, they must provide a clear information notice to the data subjects, explaining the various stages and actors of the processing. In cases of joint controllership, it should be made clear whether every controller is competent to comply with all data subjects' rights or which controller is competent for which right.

Controllers should be vigilant on personal data transfers and guarantee the required level of protection throughout the processing chain, in particular when they use services located outside of the EEA.

Transparency

  • Data controllers should inform all users (registered, non-registered, and accidental users), not only the user setting up the VVA.
  • Consider how to properly inform non-registered and accidental users in order to acquire their valid consent. They should be informed at the earliest time possible and, at the latest, at the time of the processing.
  • Users should be informed of the device's current status (is it listening in a closed-loop circuit? Is it streaming to the back end?). This can be done through specific vocal acknowledgements, visible icons or lights, or displays on the device, and should be accessible to people with disabilities. Eye contact with the device should not be required, so all user feedback, including state changes, should be available in at least visual and acoustic form.
  • When using audio to inform data subjects, provide the necessary information in a concise and clear way, and allow data subjects to re-listen.
  • Information provided as a complex 30-page privacy notice does not comply with the transparency requirements set out in Articles 12 and 13 of the GDPR.
  • Do not bundle a single user account for the VVA service with other services such as email or video streaming.
  • Make users understand whether or not their use of the VVA will be linked to other processing activities managed by the VVA service provider (e.g. telecommunications, e-commerce, information technology, or web activities) apart from the strict use of the VVA, and provide controls allowing the user to choose whether or not the data will be used for profiling.
  • Be transparent about what specific aspects of the raw data are processed. For example: is meta-information contained in a voice sample collected (e.g. the stress level of the speaker), and what kind of information can a VVA derive about its surroundings, such as, but not limited to, other people in the room, music running in the background, pets, or any processing of the voice for medical, marketing, or other reasons?
  • If voice messages are to be used to inform users according to Article 13, the data controllers should publish such messages on their website so they are accessible to the users and the data protection authorities.

Data minimisation

VVA designers should consider technologies that filter out unnecessary data (such as other individuals' voices or background noise) and ensure that only the user's voice is recorded.
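
As a minimal illustration of this kind of filtering, the sketch below applies a simple energy threshold to drop low-energy audio frames (e.g. background noise) before anything leaves the device. Real VVAs use far more sophisticated voice activity detection; the frame format and threshold here are illustrative assumptions.

```python
# Illustrative sketch of client-side data minimisation: an energy
# threshold drops quiet (likely non-speech) frames on the device, so
# only frames that plausibly contain the user's voice are kept.

def frame_energy(frame):
    """Mean squared amplitude of one audio frame."""
    return sum(s * s for s in frame) / len(frame)

def minimise(frames, threshold=0.01):
    """Keep only frames whose energy exceeds the threshold."""
    return [f for f in frames if frame_energy(f) > threshold]

# Quiet background noise vs. an (exaggerated) speech frame.
noise = [0.001, -0.002, 0.001, -0.001]
speech = [0.5, -0.4, 0.6, -0.5]
kept = minimise([noise, speech, noise])
print(len(kept))  # 1 — only the speech frame survives
```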

Purpose limitation and legal basis

  • Data controllers should clearly specify their purpose(s) in relation to the context in which the VVA is used, so that they are clearly understood by the data subjects. These can include: (i) executing users' requests; (ii) improvement through training of the machine learning model and human review and labelling of voice transcriptions; (iii) user identification (using voice data); and (iv) user profiling for personalised content or advertising.
  • Voice data is inherently biometric personal data. As a result, when such data is processed for the purpose of uniquely identifying a natural person or is inherently or determined to be special category personal data, the processing must have a valid legal basis in Article 6 and be accompanied by a derogation from Article 9 GDPR.
  • It is very likely that VVA services fall into the categories and conditions identified as requiring a Data Protection Impact Assessment.
  • Consent is required for the storing or gaining of access to information for any purpose other than executing a user's request (Article 5(3) of the e-Privacy Directive). Insofar as the voice data is processed in order to execute the user's requests, it is exempted from the requirement of prior consent under the e-Privacy Directive.
  • Any personal data processing that is necessary to execute the user's request can therefore rely on the legal basis of the performance of the contract. Such processing notably includes the capture of the user's voice request, its transcription to text, its interpretation, the information exchanged with knowledge sources to prepare the reply and then, the transcription to a vocal final reply that ends the user's request.
  • Consent (under Article 6 of the GDPR) will likely also constitute the legal basis for the processing of personal data following the aforementioned processing operations.
  • It is highly unlikely that an accidental activation could be interpreted as valid consent. If a data controller becomes aware (e.g. due to quality review processes) of the accidental collection of personal data, they should verify that there is a valid legal basis for each purpose of processing of such data. Otherwise, the accidentally collected data should be deleted.
  • The use of voice data for user identification implies the processing of biometric data as defined in Article 4(14) of the GDPR. Consequently, the data controller will need to identify an exemption under Article 9 of the GDPR in addition to the identification of a legal basis under Article 6 of the GDPR. Of the exemptions listed in Article 9 of the GDPR, only data subjects' explicit consent seems applicable for this specific purpose.
  • User profiling for personalised content or advertising:
    • Where personalisation takes place in the context of a contractual relationship and as part of a service explicitly requested by the end-user (and the processing is limited to what is strictly necessary to provide this service), such processing may be based on Article 6(1)(b) of the GDPR.
    • If processing is not strictly 'necessary for the performance of a contract' within the meaning of Article 6(1)(b) GDPR, the VVA provider must, in principle, seek the consent of the data subject.
    • Regarding user profiling for advertisement, it should be noted that this purpose is never considered as a service explicitly requested by the end-user. Thus, in the case of processing for this purpose, users' consent should be systematically collected.
    • A controller that seeks consent for various different purposes should provide a separate opt-in for each purpose, to allow users to give specific consent for specific purposes. For example, users should be able to separately consent or not for the manual review and labelling of voice transcriptions or the use of their voice data for user identification/authentication.
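
The separate opt-in per purpose described above can be sketched as a per-purpose consent record; the `ConsentRecord` structure and the purpose names are hypothetical illustrations, not part of the EDPB guidance.

```python
# Illustrative sketch: one opt-in flag per processing purpose, with
# no consent assumed by default and each purpose separately revocable.
from dataclasses import dataclass, field

@dataclass
class ConsentRecord:
    user_id: str
    opt_ins: dict = field(default_factory=dict)  # purpose -> bool

    def grant(self, purpose):
        self.opt_ins[purpose] = True

    def withdraw(self, purpose):
        self.opt_ins[purpose] = False

    def allowed(self, purpose):
        # No consent recorded means no processing for that purpose.
        return self.opt_ins.get(purpose, False)

consent = ConsentRecord("user-1")
consent.grant("manual_review")           # separate opt-in per purpose
print(consent.allowed("manual_review"))  # True
print(consent.allowed("voice_id"))       # False: never bundled in
```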

Data subject rights

  • Ensure users can exercise their data subject rights using easy-to-follow voice commands.
  • Inform users that their rights have been duly factored in, by voice or by providing a written notification to the user's mobile, account, or any other means chosen by the user.

Children's data

  • When the legal basis for the processing is the performance of a contract, the conditions for processing children's data will depend on national contract laws.
  • When consent is the legal basis, explicit permission should be sought from parents or guardians to collect, process, and store children's data (voice, transcripts, etc.). Data controllers should invest in developing means for parents or guardians to control children's use of VVAs.

Data retention

  • VVAs should store data for no longer than is necessary for the purposes for which the personal data is processed. Retaining personal data until users require their deletion is not in line with the storage limitation principle.
  • The data retention periods should be tied to different processing purposes.
  • Not only do data controllers need to limit the data storage period, but also the type and quantity of data.
  • Retaining personal data indefinitely goes against the storage limitation principle. Providing data subjects with means to delete their personal data does not remove the data controller's responsibility of defining and enforcing a data retention policy.
  • VVA design needs to take into account users' controls to delete their personal data in their devices and in all remote storage systems.
  • Data controllers may need to retain personal data as evidence of a service provided to a user to comply with a legal obligation. The data retained should remain the minimum necessary to comply with such a legal obligation and for the minimum amount of time and should not be used for any other purposes without a legal basis under Article 6 GDPR.
  • If users do not consent, or withdraw their consent, to the use of their data for improvement by training machine learning systems, their data can no longer be lawfully used to train any model and should be deleted by the data controller, assuming that there is no other purpose justifying the continued retention. Nevertheless, the model previously trained using this data does not have to be deleted. Data controllers and processors should therefore apply mitigation measures to reduce the re-identification risk to an acceptable threshold, to make sure they use models which do not contain personal data.
  • Data subjects should not be nudged to keep their data indefinitely. VVA service providers should avoid making general statements on the degradation of the service after personal data is deleted.
  • After a query has been answered or a command executed, the personal data should be deleted unless the VVA designer or developer has a valid legal basis to retain them for a specific purpose.
  • Before considering anonymisation as a means of fulfilling the data storage limitation principle, VVA providers and developers should check that the anonymisation process renders the voice unidentifiable.
  • Configuration defaults should reflect these requirements by defaulting to an absolute minimum of stored user information. If these options are presented as part of a setup wizard, the default setting should reflect this, and all options should be presented as equal possibilities without visual discrimination.
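
One way to tie retention periods to processing purposes, as the bullets above require, is a purpose-keyed retention table. The purposes and periods below are illustrative assumptions, not values prescribed by the EDPB.

```python
# Illustrative sketch: retention periods keyed by processing purpose.
# Records past their purpose-specific period are purged; unknown
# purposes default to zero retention (the minimum).
from datetime import datetime, timedelta, timezone

RETENTION = {  # illustrative periods only
    "execute_request": timedelta(days=0),    # delete once answered
    "legal_obligation": timedelta(days=365), # e.g. evidence of service
}

def purge(records, now=None):
    """Keep only records still within their purpose's retention period."""
    now = now or datetime.now(timezone.utc)
    kept = []
    for purpose, created_at in records:
        limit = RETENTION.get(purpose, timedelta(0))
        if now - created_at <= limit:
            kept.append((purpose, created_at))
    return kept

now = datetime.now(timezone.utc)
records = [
    ("execute_request", now - timedelta(days=1)),    # expired
    ("legal_obligation", now - timedelta(days=30)),  # still retained
]
print(len(purge(records, now)))  # 1
```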

Security

  • VVAs are increasingly present in users' homes and smartphones. This circumstance increases the risk of mass surveillance and mass profiling. Consequently, the security measures to protect the data both in transit and at rest, in the devices and in the Cloud, should match those risks.
  • VVA designers and application developers should provide secure state-of-the-art authentication procedures to users.
  • Human reviewers should always receive the strictly necessary pseudonymised data. The legal agreements governing the review should expressly forbid any processing that could lead to the identification of the data subject.
  • If emergency calling is provided as a service through the VVA, a stable uptime should be guaranteed.
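
The requirement that human reviewers receive only pseudonymised data can be sketched as follows; the record fields and the use of a random token as a pseudonym are illustrative assumptions.

```python
# Illustrative sketch: before a transcript reaches human reviewers,
# the direct identifier is replaced with a stable random pseudonym.
# The mapping stays with the controller and is never shared with
# reviewers, so they cannot re-identify the data subject from it.
import secrets

def pseudonymise(record, mapping):
    """Return a copy of the record with the user_id pseudonymised."""
    user_id = record["user_id"]
    if user_id not in mapping:
        mapping[user_id] = secrets.token_hex(8)
    safe = dict(record)
    safe["user_id"] = mapping[user_id]
    return safe

mapping = {}  # held by the controller only
reviewer_copy = pseudonymise(
    {"user_id": "alice@example.com", "transcript": "play some jazz"},
    mapping,
)
print(reviewer_copy["user_id"] != "alice@example.com")  # True
```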

Special category data

  • Some VVA have the capability of uniquely identifying their users merely based on their voice. This process is known as voice template recognition.
  • The recognition of a user's voice on the basis of a voice template amounts to the processing of special categories of personal data within the meaning of Article 9 GDPR (processing of biometric data for the purpose of uniquely identifying a natural person). This requires explicit consent. Therefore, when obtaining users' consent, data controllers must comply with the conditions of Article 7 and recital 32 of the GDPR and should offer an alternative identification method to biometrics, with regard to the free nature of consent.
  • When the registered user configures the VVA to identify the voices of its users (or to distinguish among them), the voices of non-registered and accidental users will also be processed for the purpose of uniquely identifying them.
  • In order to avoid such collection of biometric data without the knowledge of the data subjects while allowing a user to be recognised by the assistant, solutions based on the user's data alone should be given priority. In concrete terms, this means that biometric recognition is only activated at each use at the user's initiative, and not by a permanent analysis of the voices heard by the assistant. For example, the user can say 'identification' or the assistant can ask 'do you wish to be identified' and wait for a positive response to activate biometric processing.
  • Voice templates should be generated, stored and matched exclusively on the local device, not in remote servers.
  • Due to the sensitiveness of the voiceprints, standards such as ISO/IEC 24745 and techniques of biometric template protection should be thoroughly applied.
  • If a VVA uses voice based biometric identification VVA providers should ensure that the identification is accurate enough to reliably associate personal data to the right data subjects and ensure that the accuracy is similar for all user groups by checking that there is no substantial bias towards different demographic groups.
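
A minimal sketch of on-device voice-template matching, assuming templates are fixed-length feature vectors compared by cosine similarity and kept exclusively on the local device; the vectors and threshold are illustrative, not a real biometric pipeline.

```python
# Illustrative sketch: cosine similarity between a locally stored voice
# template and a new sample. Everything runs on the device, matching
# the guidance that templates should never reach remote servers.
import math

def cosine(a, b):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def matches(template, sample, threshold=0.9):
    """True if the sample is close enough to the stored local template."""
    return cosine(template, sample) >= threshold

template = [0.2, 0.8, 0.1]  # stored on the local device only
print(matches(template, [0.21, 0.79, 0.12]))  # True: same speaker
print(matches(template, [0.9, 0.1, 0.4]))     # False: different speaker
```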

Accountability

  • Voice data can be used for accountability (e.g. to prove consent). The retention obligation for such voice data would then be dictated by the accountability requirements of the relevant specific legislation.
  • Include processing voice data in your Article 30 GDPR ledger.

Data Protection by Design and by Default

  • Consider the necessity of having a registered user for each of their functionalities.
  • Some VVA providers include third-party apps in the VVA default setup so that users can launch these apps by using specific wake-up expressions. VVAs using this third-party app deployment strategy should ensure that users also get the necessary information on the third-party processing.
  • Some VVA providers have claimed that their VVAs cannot delete all of a user's data even when requested by the data subject. VVA providers should ensure that all users' data can be erased at the user's request in accordance with Article 17 of the GDPR.

Mechanisms for data subject rights

  • Data controllers providing VVA services must allow all users, registered and non-registered, to exercise their data subject rights.
  • VVA providers and developers should facilitate data subjects' control over their data during the entire processing period.
  • The data controller should provide information on the data subject's rights at the time when data subjects switch on a VVA and, at the latest, at the time when the first user's voice request is processed.
  • Ensure that users, registered or not, can exercise any data subject rights using easy-to-follow voice commands. This includes self-service tools, such as a profile management system. At the end of the exercise process, inform the user that his/her rights have been duly factored in, by voice or by providing a written notification to the user's mobile, account, or any other means chosen by the user.

Right to access

  • The format for providing the information should enable the information to be presented in a way that is both intelligible and easily accessible.
  • A PDF file would likely not be sufficient, and neither is simply referring users to a history of their interactions with the voice assistant.

Right to rectification

  • The right to rectification applies to any opinions and inferences of the data controller, including profiling, and should take into account that the vast majority of such data is highly subjective.

Right to erasure

  • Applies to all users, registered or not.
  • Due to the inherent difficulties of anonymising voice data and the wide variety of personal data collected from, observed about, and inferred about the data subject, in this context the right to erasure can hardly be accommodated by anonymising personal datasets. As the GDPR is technology neutral and technology evolves rapidly, it nevertheless cannot be excluded that the right to erasure may be made effective through anonymisation.
  • A dashboard (or an application), widely accessible in order to ease its use, should be supplied with the voice assistant so that users can delete the history of their requests and customise the tool according to their needs.
  • In case the same personal data is processed for different processing purposes, data controllers should interpret erasure requests as a clear signal to stop processing of the data for all purposes that are not legally exempted.
  • Data processed on the basis of legitimate interests of the VVA providers should not be an exemption to the right of erasure, in particular because data subjects do not reasonably expect further processing of their personal data.

Right to data portability

  • The right to data portability should facilitate switching between different VVA providers.
  • Offer users the possibility of directly retrieving their personal data from their user area, as a self-service tool. The users should also be able to exercise this right through voice command.
  • VVA providers should provide personal data in commonly used open formats (e.g. mp3, wav, csv, gsm, etc.), along with suitable metadata used to accurately describe the meaning of the exchanged information.
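
A portability export along these lines might pair a commonly used open format (CSV) with JSON metadata describing each field. The field names and schema below are illustrative assumptions, not a format prescribed by the guidelines.

```python
# Illustrative sketch of a portability export: request history as CSV,
# plus machine-readable metadata explaining what each column means.
import csv
import io
import json

def export_user_data(rows):
    """Return (csv_text, metadata_json) for a portability request."""
    fields = ["timestamp", "transcript"]
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=fields)
    writer.writeheader()
    writer.writerows(rows)
    metadata = json.dumps({
        "format": "text/csv",
        "fields": {
            "timestamp": "UTC time of the voice request (ISO 8601)",
            "transcript": "Text transcription of the request",
        },
    })
    return buf.getvalue(), metadata

csv_text, meta = export_user_data(
    [{"timestamp": "2021-03-01T10:00:00Z", "transcript": "what's the weather"}]
)
print("transcript" in csv_text)  # True: header plus the exported row
```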

Odia Kagan Partner and Chair of GDPR Compliance & International Privacy
[email protected]
Fox Rothschild LLP, Philadelphia


1. Available at: https://edpb.europa.eu/sites/edpb/files/consultation/edpb_guidelines_022021_virtual_voice_assistants_adopted-public-consultation_en.pdf