
UK: Biometric technologies - An early warning and picture of the way forwards from the ICO

Immature, prone to bias, and in danger of discriminating against data subjects: according to the Information Commissioner's Office ('ICO'), all of these descriptors could apply to emergent biometric technologies. Kelly Hagedorn and Anna O'Kelly, from Orrick, Herrington & Sutcliffe (UK) LLP, explore the ICO's perspective on biometric technologies, their potential and risks, and the regulatory avenues for the future.


Background

In a statement published in October 2022, the ICO issued a 'warning'1 to organisations developing and offering 'emotional analysis' biometric technologies that they must consider data protection at an early stage, addressing core data protection issues of proportionality, fairness, and transparency. Emotional analysis technologies are thought by many to be a major step forwards in biometrics, promising to interpret and analyse human emotions by processing data such as gaze, facial movements and expressions, gait, heartbeat, and skin moisture, and by applying techniques such as sentiment analysis. However, the ICO also considers those technologies to be among the more problematic variants of biometric development that it will have to regulate in the future.

In a clear message that biometrics will be one of the ICO's key strategic concerns over the coming years, the statement was published alongside two new reports2 aimed at helping businesses which are developing or deploying biometric technologies.

Recognising the future

The reports highlight the twin aims of the ICO's approach to biometric technology regulation: to clarify the current regulatory landscape and to future-proof the ICO's regulation. This much is evident from the titles of the reports: 'Biometrics: insight' ('the insight report') and 'Biometrics: foresight' ('the foresight report') cover, respectively, the current state of biometric technology and regulation, and the privacy implications of biometric technologies in the near future.

Both reports respond to the need, identified by the ICO, to provide further regulatory clarification on key definitions and terminology, sector-specific regulatory issues, the data protection risks associated with biometric technologies, and the lack of perceived regulatory coherence across regulators, including the ICO, the Financial Conduct Authority ('FCA'), the Competition and Markets Authority ('CMA'), and sector-specific regulators.

The insight report

Defining biometrics

In many cases, biometric data is used to identify a natural person and is therefore special category data under Article 9 of the UK General Data Protection Regulation (Regulation (EU) 2016/679 as it forms part of UK law) ('UK GDPR'). This means that it cannot be processed unless one of the Article 9 exceptions applies. However, the insight report explicitly considers activities which use biological or behavioural data for the purposes of classifying or characterising aspects of a person (i.e. where the data may identify an individual, but is not actually being used for that purpose) and which therefore do not come under the definition of special category data (what the ICO terms 'sensitive non-special category data').

The insight report's consideration of wider use cases broadens the reach of its guidance and raises an important question as to how biometric data should be treated when it falls short of characterisation as special category data. The ICO's concern could be read as an indication that it may focus future guidance on so-called 'sensitive non-special category data' falling outside the UK GDPR special category regime.

Identifying the concerns

The ICO's focus on the data protection risks related to biometric technologies is driven by two main issues:

  • the intrinsic nature of biometric data (i.e. biometric data may be difficult, or near impossible, to alter); and
  • the high potential of biometric data to generate inaccurate or inappropriate inferences, and its susceptibility to underlying systemic bias.

The insight report identifies several current uses of biometric technologies and highlights potential data protection risks, including:

  • Bias, discrimination, and exclusion: certain technologies, such as facial recognition and emotional artificial intelligence ('AI'), may reflect underlying bias and lead to discriminatory outcomes. Other technologies, such as vasal analysis, iris analysis, and ear analysis, may increase the risk of exclusion or discrimination against disabled individuals where they are the only means of data collection.
  • Transparency: for increasingly 'frictionless' technologies (such as fingerprinting carried out through a high-resolution camera), there is a risk that biometric data may be collected and processed in a way that is not transparent to the individual.
  • Sensitivity: the ICO has expressed concern that emotional analysis and physiological analysis, such as brain analysis, sweat tracking, voice analysis, and behavioural analysis, may reveal individuals' subconscious responses and provide highly sensitive personal data without their choice.

Mapping the regulatory landscape

The insight report summarises current approaches to biometric regulation around the world, ranging from what the ICO terms as 'broad-spectrum' approaches in Brazil and China, to more prescriptive legislation implemented in France, California, and Illinois.

In particular, the ICO notes the potential benefits of broad-spectrum approaches when it comes to defining and regulating biometric data, highlighting the flexibility which comes from the Brazilian and Chinese regimes. The Chinese regime, which considers all biometric data to be sensitive data, even when it is used for the purposes of classification as opposed to identification, is singled out as having broad regulatory potential, as is the Brazilian regime, which does not precisely define biometric data, but which considers all biometric data to be sensitive.

The ICO also acknowledges that AI regulation will have an impact on the development and deployment of biometric technologies, highlighting the EU's Proposal for a Regulation of the European Parliament and of the Council Laying Down Harmonised Rules on Artificial Intelligence ('the AI Act'), likely to be adopted in late 2023, as a significant development.

The foresight report

Regulating the future

The foresight report is concerned with emergent technologies and how the ICO may tackle the key issues associated with such technologies in the future, also giving an indication of the sectors on which the ICO will concentrate its attention.

The foresight report reiterates many of the concerns of the insight report, emphasising that such concerns are likely to become more prevalent as biometric technologies advance and their deployment increases. In particular, the ICO identifies what it sees as four key issues for the future:

  • the need for further clarity and guidance on biometrics terminology;
  • the increasing use of biometrics technologies for classificatory purposes, and the potential for harm where sensitive non-special category data is not protected with additional safeguards;
  • compliance with transparency and lawfulness requirements when processing ambient data, in particular where technology allows increasingly low-friction collection; and
  • the speed of emotional AI development despite being considered a high-risk technology, whereby the concerns of the foresight report are reflected in the ICO's October statement that the ICO is 'yet to see any emotional AI technology develop in a way that satisfies data protection requirements'.

In the short term (two to three years), the ICO envisages increased use of biometrics across the banking and finance sectors, identifying the potential of voice, gait, and vasal analysis for identification and security purposes. The ICO also sees the fitness and health industries as key sectors for deployment of biometrics in the short term.

In the medium to long term (four to five years), the ICO identifies employment, primary and secondary education, and entertainment as being main areas for biometric growth. The ICO also considers early-stage 'emotional analysis technology' as an area of long-term growth.

Alongside this analysis, the ICO has identified the need for sectoral guidance on the development and use of biometrics. It seems likely that, alongside general guidance, the ICO will seek to provide such sector-specific guidance, either independently or in discussion with other relevant regulators. Given the report's sectoral focus, the ICO can be expected to work through these short- and longer-term sectors methodically in the coming years.

Picturing the way forwards

The statement and reports form part of the ICO's wider strategy toward biometric technology regulation: increased attention on assisting organisations innovating in the biometrics field was highlighted in the ICO's new three-year strategic plan ('ICO25')3. In 2023, we can expect the ICO's activity to focus on two main avenues: guidance and enforcement.

Issuing new guidance

Alongside the statement and reports, the ICO has promised new biometrics guidance in Spring 2023, with consultation open through the early months of 2023 for those who wish to assist the ICO's thinking in this area. For the time being, the statement and reports give a clear indication of the likely direction of the ICO's guidance, including:

  • reconsideration of the definitions used to regulate the biometrics field and guidelines on approaching 'sensitive non-special category data';
  • consideration of sectoral guidance, in particular in the banking, finance, health, education, and entertainment sectors; it is possible that the ICO will consider whether a coordinated approach is needed across regulators to combat perceived incoherence; and
  • consideration of emotional analysis technologies, possibly repeating the ICO's statement that such technologies are considered unlikely to meet the standards required under data protection legislation.

Aiming at enforcement

The ICO has also indicated increased enforcement activity in the biometrics field, commenting that organisations which 'fail to meet ICO expectations' in developing or deploying emotional analysis technology 'will be investigated'.

The ICO has already signalled some appetite for investigating and taking action against companies deploying biometric technologies in breach of data protection laws. In May 2022, the ICO fined Clearview AI Inc. £7.5 million for the use of biometric data derived from images of UK data subjects obtained from the internet4. The ICO alleged that Clearview used such images to generate vectors that may be used to identify individuals. In addition to the heavy fine, Clearview was also given an enforcement notice requiring it to stop processing the personal data of UK data subjects. This enforcement action is currently the subject of an appeal to the First-tier Tribunal.

The long-term view

As the insight report highlights, the ICO may be looking further afield for models of how biometrics regulation could take shape going forwards. While legislation which departs materially from the current regime set by the UK GDPR may be unlikely for the time being, it is possible that the ICO may advocate for a more flexible and risk-based regulatory regime, moving closer to those implemented in jurisdictions such as Brazil and China.

While the ICO's concentrated focus on biometrics is notable, it remains to be seen whether the ICO will agree with the statements of other stakeholders in the biometrics space, such as the Ada Lovelace Institute, whose influential Ryder Review5 argued that the biometrics 'revolution' is upon us and identified an 'urgent' need for an ambitious new legislative framework specific to biometrics.

For the time being, those developing or deploying biometrics technologies should welcome the ICO's guidance, and note the warnings given around future action and enforcement.

Kelly Hagedorn Partner
[email protected]
Anna O'Kelly Associate
[email protected]
Orrick, Herrington & Sutcliffe (UK) LLP, London


1. See at: https://ico.org.uk/about-the-ico/media-centre/news-and-blogs/2022/10/immature-biometric-technologies-could-be-discriminating-against-people-says-ico-in-warning-to-organisations/
2. See at: https://ico.org.uk/about-the-ico/research-and-reports/biometrics-technologies/
3. See at: https://ico.org.uk/media/about-the-ico/our-information/our-strategies-and-plans/ico25-strategic-plan-0-0.pdf
4. See at: https://ico.org.uk/media/action-weve-taken/mpns/4020436/clearview-ai-inc-mpn-20220518.pdf
5. See at: https://www.adalovelaceinstitute.org/wp-content/uploads/2022/06/The-Ryder-Review-Independent-legal-review-of-the-governance-of-biometric-data-in-England-and-Wales-Ada-Lovelace-Institute-June-2022.pdf
