UK: ICO enforcement action for misuse of biometric data
On 9 May 2019, the Information Commissioner's Office ('ICO') issued an enforcement notice against Her Majesty's Revenue and Customs ('HMRC') for processing biometric data in breach of the first data protection principle under the General Data Protection Regulation (Regulation (EU) 2016/679) ('GDPR'), which requires personal data to be processed lawfully, fairly and in a transparent manner. The decision is significant as it is the first enforcement action taken by the ICO in respect of biometric data. Nicola Gulrajani, Solicitor at Birketts LLP, explores what the enforcement notice issued against HMRC entails and why it was deemed necessary.
What is biometric data?
Biometric data is defined under the GDPR as:
'personal data resulting from specific technical processing relating to the physical, physiological or behavioural characteristics of a natural person, which allow or confirm the unique identification of that natural person.'
Therefore, biometric data includes data such as:
- iris scans;
- facial scans;
- hand geometry;
- voice recognition; and
- certain behavioural characteristics.
Over the last few years, there has been a significant rise in the use of biometric data for security purposes, for example, the use of fingerprint and facial scanning to unlock mobile phones and to sign in to mobile apps.
Background to the HMRC action
In January 2017, HMRC introduced a system on some of its helplines whereby a caller's voice could be used to verify their identity. The characteristics of a person's voice that can be used to identify them constitute biometric data.
By the time of the ICO's investigation, HMRC had collected the biometric data of 7 million callers through the voice authentication system.
A complaint was made to the ICO by Big Brother Watch, which alleged that HMRC's use of the voice authentication system was in breach of the GDPR's first data protection principle as it was not undertaken in a fair, lawful, and transparent manner.
Biometric data is classed as special category personal data under the GDPR. In order for the processing of special category personal data to be deemed lawful, an organisation must identify both a lawful basis for processing, under Article 6 of the GDPR, and an additional condition for processing under Article 9(2) of the GDPR.
HMRC claimed that it relied on consent as its lawful basis under Article 6 of the GDPR and on explicit consent as its condition under Article 9(2). The ICO investigation found that the consent HMRC had obtained was not valid as it did not meet the GDPR standard for valid consent. Under the GDPR, consent must be freely given, specific, informed, unambiguous and given by a statement or a clear affirmative action.
The method which HMRC had used to obtain consent was to play an automated recording informing callers that HMRC was introducing a quicker and more secure way of verifying their identity. The recording also contained a brief explanation of how the system worked and why it was beneficial. Callers were then asked to repeat the phrase 'my voice is my password.'
The ICO found that this did not meet the standard for consent required under the GDPR as callers were not provided with sufficient information about how their voice data would be used. The automated message also failed to inform callers that they were not required to sign up to the system, and did not offer a clear alternative for those who did not wish to do so.
Under the enforcement notice issued by the ICO, HMRC must:
- delete the biometric data collected by the voice authentication system for which it does not have explicit consent; and
- instruct its suppliers involved in the operation of the system to delete all biometric data processed by the voice authentication system for which HMRC does not have explicit consent.
In reaching its decision, the ICO took into account the fact that, whilst the biometric data had been unlawfully obtained, it was unlikely that any damage or distress would be caused to the individuals as a result. It also had regard to the fact that HMRC had attempted to obtain explicit consent retrospectively, but noted that there were still approximately 5.5 million records for which HMRC did not have explicit consent.
What does the ICO decision tell us?
The large fines which can be imposed under the GDPR have been well publicised. At the time of publication, the ICO has announced its intention to impose its two largest fines to date (£183 million for British Airways, followed by £99 million for Marriott International). However, the enforcement action against HMRC serves as a reminder that the ICO's powers extend beyond the ability to impose fines.
The ICO enforcement notice stated that whilst a large number of individuals had been affected by the breach, it was not a breach which was likely to cause damage or distress to them. This appears to be one of the main factors in the decision not to impose a fine on HMRC.
The enforcement notice also noted that whilst HMRC had tried to take steps retrospectively to obtain explicit consent, it had paid little regard to the data protection principles when planning and implementing the system. This is a reminder of the importance placed by the ICO on 'data protection by design and by default.' Had HMRC considered data protection from the outset when implementing this system, it seems likely that it could have made some relatively small changes to the process which would have potentially made the collection of the biometric data lawful.
Nicola Gulrajani Solicitor
Birketts LLP, Norwich