Spain: An approach to facial recognition

Facial recognition technology ('FRT') is an area which creates novel challenges for European data protection law and in particular how certain provisions of the General Data Protection Regulation (Regulation (EU) 2016/679) ('GDPR') can be interpreted. Roger Vilanova Jou, Lawyer at PwC, discusses this topic and recent developments in the Spanish context.


At the time of writing, the EU is at a turning point in shaping the relationship between artificial intelligence ('AI') and society. Last April, the European Commission ('the Commission') proposed the first legal framework on AI ('the Proposal')1, and whilst the public debate is raging about its pros and cons, governments2 and several entities have been increasingly interested in enhancing the use of AI.

However, it would be unreasonable to claim that any operation made more efficient by the application of a new technology is always justifiable from an ethical or even a data protection perspective. In fact, some European initiatives3 even claim that we will reach a point of no return if regulators merely restrict, rather than totally ban, certain uses of AI which can lead to unlawful mass surveillance and discrimination, such as remote biometric identification4.

Indeed, the advent of AI in general and the development of FRT in particular is raising questions about where the red lines for the processing of personal data should lie, in an ongoing debate between the efficient allocation of technological resources and the fundamental rights and freedoms of citizens.

As it happens, the Spanish data protection authority ('AEPD') has expressed its concern about the proliferation of proposals received in relation to the use of FRTs in its Report 47/20215 and has already imposed a multi-million euro fine on a supermarket chain for installing a facial recognition system in contravention of personal data protection regulations6.

Considering the above, the aim of this piece is to shed some light on how authorities are analysing facial recognition in Spain from a data protection perspective. We will discuss what category of personal data is processed, the difficulties of finding a legal basis under Article 9(2) of the GDPR, and other principles and obligations that a data controller should not forget when considering the use of FRT.

The category of personal data

A recurring question when deciding to use an FRT is what category of personal data is to be processed. Despite its apparent simplicity, this question is key for several reasons: firstly, to know whether we are handling biometric data and, secondly, to determine whether we are processing sensitive data, which is crucial both for considering the exceptions set out in Article 9(2) of the GDPR and for assessing the risks involved in the processing.

On the one hand, according to the definition set forth in Recital 51 and Article 4(14) of the GDPR, images will be considered biometric data insofar as they are processed by specific technical means allowing or confirming the unique identification of a natural person (e.g. on the basis of physical, physiological, or behavioural characteristics).

On the other hand, following the Guidelines 3/2019 on processing of personal data through video devices7, published by the European Data Protection Board ('EDPB'), the processing of such biometric data should not systematically entail the processing of sensitive data under Article 9 of the GDPR if the data have not been specifically technically processed to contribute to the unique identification of an individual. Accordingly, the EDPB advises that special attention should be paid to the purpose and means of the processing8.

At this point, then, what is unique identification considering Article 9(1) of the GDPR? To the reader's possible surprise, there is an active discussion in Spain to address this question in relation to the processing of biometric data:

  • The AEPD argued in its Report 36/20209 that, in general, biometric data will only be considered sensitive data where it is subject to technical processing aimed at biometric identification (a one-to-many matching process), but not in the case of biometric authentication (a one-to-one matching process), following the definitions set forth in the Opinion 3/2012 on developments in biometric technologies10 of the Article 29 Data Protection Working Party ('WP29') and later used by the Commission's White Paper on Artificial Intelligence11.
  • However, this position is not shared by the Catalan data protection authority ('APDCAT'). According to the APDCAT's Opinion 21/202012, both biometric identification and biometric authentication can achieve the unique identification of a natural person. Therefore, it suggests that biometric data processed by either of these techniques deserves the same level of enhanced protection set forth in Article 9 of the GDPR.
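As a purely technical illustration of the one-to-one versus one-to-many distinction the authorities are debating, the following is a minimal sketch. The toy embedding vectors, the cosine-similarity metric, and the threshold are illustrative assumptions, not drawn from any authority's reports; real systems use high-dimensional embeddings and calibrated thresholds.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two face-embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

THRESHOLD = 0.9  # hypothetical decision threshold

def authenticate(probe, claimed_template):
    """One-to-one matching (authentication): is the probe the one
    person it claims to be? Compared against a single template."""
    return cosine_similarity(probe, claimed_template) >= THRESHOLD

def identify(probe, gallery):
    """One-to-many matching (identification): who, among everyone
    enrolled, is the probe? Compared against every template."""
    best_id, best_score = None, THRESHOLD
    for person_id, template in gallery.items():
        score = cosine_similarity(probe, template)
        if score >= best_score:
            best_id, best_score = person_id, score
    return best_id  # None if nobody matches

# Toy 3-dimensional "embeddings" (real embeddings have hundreds of dimensions)
gallery = {"alice": [0.9, 0.1, 0.0], "bob": [0.1, 0.9, 0.1]}
probe = [0.88, 0.12, 0.01]

print(authenticate(probe, gallery["alice"]))  # checks one claimed identity
print(identify(probe, gallery))               # searches all enrolled persons
```

The sketch makes the AEPD's original distinction concrete: `authenticate` only ever touches the single template of the claimed identity, whereas `identify` necessarily scans the templates of everyone enrolled, which is why identification is seen as the more intrusive operation.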

The above leaves some uncertainty: a restrictive interpretation of Article 9(1) of the GDPR may lead to future sanctions if the criteria change, or may even encourage the installation of FRT systems in Spain by those who misleadingly argue that they use this technology only for authentication purposes.

However, the AEPD has recognised that the distinction between biometric identification and biometric authentication is a complex issue that must be analysed on a case-by-case basis and is subject to interpretation. The AEPD therefore urges that, in case of doubt, and as long as the EDPB or the courts have not ruled on this matter, the interpretation most favourable to the protection of data subjects' rights should be applied.

Therefore, and despite the questions that this discussion may pose, under the current interpretation of the AEPD the use of FRTs will generally entail the biometric identification of a data subject, thus involving the processing of sensitive data – as concluded in various reports13.

The legal basis to process biometric data

It follows from the above that, in most cases, the processing of biometric data with FRTs would involve the processing of sensitive data, requiring a case-by-case derogation under Article 9(2) of the GDPR.

Nonetheless, finding such an exception is not always straightforward. Below we analyse the main legal grounds argued before the AEPD to legitimise the processing of personal data through FRTs under Article 9(2) of the GDPR.

Article 9(2)(g) of the GDPR: substantial public interest

The most common question the AEPD receives in relation to the use of FRTs is whether there can be a substantial public interest to justify the processing under Article 9(2)(g) of the GDPR, especially when no other legal basis can be applied.

In this regard, the Spanish authorities agree that it is not sufficient to merely invoke such an interest (please see Judgment 292/200014 and Report 47/2020); rather, it must be supported by legislation that respects the principle of proportionality. Specifically, such a law shall contain:

  • the specific substantial public interest that justifies the restriction of the right to the protection of personal data and in what circumstances it may be limited;
  • precise rules making the imposition of such a restriction and its consequences foreseeable for the data subject; and
  • appropriate technical, organisational, and procedural safeguards to prevent risks of varying probability and severity, as well as to mitigate their effects.

As the AEPD has confirmed, there is currently no regulation in the Spanish legal system on facial recognition that meets the above requirements. Therefore, the substantial public interest required by Article 9(2)(g) of the GDPR cannot currently be established, notwithstanding that the legislator may provide for it in future regulations.

Article 9(2)(b) of the GDPR: employment, social security, and social protection

In the labour sphere, however, several grounds can apparently be found for applying Article 9(2)(b) of the GDPR, from collective agreements to legal provisions such as Article 20.3 of the Spanish Workers' Statute. In particular, this article leaves the door open for employers to adopt the measures they deem most appropriate for monitoring the fulfilment of workers' obligations and duties, with due regard for their dignity (please see Report 73/2020).

Still, employers also have limitations. Whereas the use of FRTs can be perceived as particularly effective, employers must first evaluate the impact on fundamental rights and freedoms by assessing the necessity and proportionality of the processing in relation to its purpose (Article 35(7)(b) of the GDPR), as derived from the proportionality test followed by the courts (please see Judgment 14/200315 and the EDPB's Guidelines 3/2019).

In view of the above, situations where there is no less intrusive means of achieving the legitimate purpose of the processing than a facial recognition technique will be rare, so this should be analysed on a case-by-case basis. It was not without reason that the WP29 stated in its Opinion 2/2017 on data processing at work16 that, with few exceptions, employers should generally refrain from using FRT.

Article 9(2)(a) of the GDPR: what about consent?

The answer to this question will very much depend on whether the data controller is able to guarantee that explicit consent has been freely given, in compliance with the conditions set out in the GDPR. This could be the case where facial recognition is offered as an alternative to another option that does not require the use of biometric data, following the AEPD's Report 36/2020 on the use of FRT for the online evaluation of university students.

By contrast, there are scenarios where it will be very difficult to ensure that consent has been freely given in accordance with Articles 4(11) and 7(4) of the GDPR, for example where there is a power imbalance or where the data controller makes the provision of a service conditional on the processing of biometric data.

Moreover, consent might also not be appropriate in situations where there is a legal provision requiring identification by an exhaustive list of means that do not include the processing of biometric data, as envisaged in the Report 47/2021 on the use of facial recognition for know your customer purposes under Spanish anti-money laundering ('AML') and combating the financing of terrorism ('CFT') regulations.

Further considerations: principles and obligations

Notwithstanding the above, the data controller should not forget some of the most common personal data protection obligations, such as, for example:

  • informing the data subject on the processing of his or her personal data with FRTs, in compliance with Article 13 of the GDPR or, when appropriate, Article 14 GDPR;
  • putting in place data processing agreements, in accordance with Article 28 of the GDPR, to the extent that facial recognition solutions are offered by providers acting as data processors; and
  • appointing a data protection officer ('DPO') where circumstances so require, following Article 37 of the GDPR.

Moreover, the controller must also pay attention to the principles set out in Article 5 of the GDPR - with particular regard to purpose limitation, data minimisation, and limitation of the retention period - whilst considering the principle of accountability and Data Protection by Design and by Default.

In this regard, according to Article 24 of the GDPR, the data controller shall implement appropriate technical and organisational measures in order to ensure and be able to demonstrate that the processing is in compliance with GDPR, for which a corresponding risk analysis and Data Protection Impact Assessment ('DPIA') are essential to provide a level of security appropriate to the risk17.

In particular, where the DPIA shows that the processing would still entail a high risk, the data controller shall consult the competent data protection authority in accordance with Article 36 of the GDPR.

What the AEPD might say in this prior consultation may vary depending on the case. However, the Spanish authority offers some hints in Reports 36/2020 and 65/201518 in relation to the implementation of measures that might reduce the risk of the processing: the data subject should retain maximum control over his or her biometric data19.
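One way to read this recommendation is a match-on-device design in which the enrolled template never leaves equipment in the data subject's custody and the controller only ever receives the match result. The sketch below is purely illustrative: the class names, the template format, and the distance threshold are assumptions of this example, not part of the AEPD's reports.

```python
class DataSubjectDevice:
    """Hypothetical device in the custody of the data subject: the
    enrolled biometric template is stored here and never leaves it."""

    def __init__(self, enrolled_template):
        self._template = enrolled_template  # stays local to the device

    def verify(self, live_capture, threshold=2):
        """Match on the device itself; only a yes/no result is shared."""
        distance = sum(abs(a - b) for a, b in zip(self._template, live_capture))
        return distance <= threshold

class Controller:
    """The controller never sees the template, only the match outcome."""

    def grant_access(self, device, live_capture):
        return "granted" if device.verify(live_capture) else "denied"

device = DataSubjectDevice(enrolled_template=[5, 3, 8])
print(Controller().grant_access(device, live_capture=[5, 4, 8]))  # close match
print(Controller().grant_access(device, live_capture=[9, 9, 1]))  # no match
```

The design choice mirrors footnote 19: because the template is confined to the data subject's own device, a breach on the controller's side cannot expose the biometric data itself, which is one way such measures might reduce the risk of the processing.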

Additionally, the controller must note that if the use of FRTs involves automated individual decision-making, Article 22 of the GDPR should also be considered.


  • Just because an FRT is on the market or available to an organisation does not mean that it is the best solution to improve or automate a process. Before proceeding with its installation, the risks it may entail should be assessed, with particular attention to compliance with personal data protection regulations.
  • In Spain, there is still an open debate on the interpretation of Article 9(1) of the GDPR, but this should not influence the adoption of appropriate measures to ensure the processing of biometric data which, in most cases when FRTs are used, will be considered sensitive data.
  • To process such data, there must be a legal basis from Article 9(2) of the GDPR, but consent is not always an adequate solution. There are also no rules establishing a substantial public interest in Spain for now. Moreover, the proportionality of the measure will have to be considered.
  • The controller will still need to take into consideration all the obligations and principles set out in the GDPR, including the implementation of appropriate technical and organisational measures. In this regard, there will be occasions where prior consultation with the competent supervisory authority will be necessary.
  • It can be seen from the above how difficult it can be to assess the implementation of an FRT-based measure, so it will be necessary to involve the DPO, where appropriate, and to obtain advice from a privacy professional on the feasibility of such technology before it is installed.

Roger Vilanova Jou Lawyer
[email protected]
PwC, Barcelona

1. Available at:
2. In Spain, the interest in AI is no different. At the end of 2020, the Spanish Government presented the National Artificial Intelligence Strategy to reinforce the transformation of the Spanish economy and society, of which the non-binding Charter of Digital Rights from July is part (resources only available in Spanish).
3. In this regard, please also note that the European Data Protection Board ('EDPB') and the European Data Protection Supervisor ('EDPS') have called for a general ban on some uses of AI in their joint opinion on the Proposal.
4. See:;
5. Only available in Spanish at:
6. Only available in Spanish at:
7. Available at:
8. By way of example, when the purpose of the processing by a video surveillance system is to distinguish one category of people from another based on gender and age, but it cannot uniquely identify anyone, the processing would not fall under Article 9 of the GDPR.
9. Only available in Spanish at:
10. Available at:
11. Available at:
12. Only available in Catalan at:
13. See Reports 36/2020, 31/2019, 73/2020, and 47/2021, only available in Spanish.
14. Only available in Spanish at:
15. Only available in Spanish at:
16. Available at:
17. Please note that the AEPD has published a document with Audit Requirements for Personal Data Processing Activities involving AI that may help to establish specific controls derived from the peculiarities of AI-based components.
18. Only available in Spanish at:
19. For instance, by storing such data on devices in the custody of the data subject that incorporate the necessary safeguards to ensure that access to such information is not compromised.