Ukraine: Regulation of facial recognition technology

All over the world, the implementation of facial recognition technology is growing quickly. There are concerns surrounding this growth, such as how personal data is being processed and whether valid consent is being obtained. Vladyslav Podolyak and Tetiana Maksymovych, from Podolyak and Partners, analyse the use and legality of facial recognition technologies in Ukraine and look at how the current legal framework regulates this technology.

At the beginning of 2019, a new analytical facial recognition system was launched in Kyiv. Previously, in San Francisco, facial recognition technology was restricted, and in Hong Kong, the implementation of facial recognition systems prompted anti-government protests against new technologies used for surveillance. These different approaches and reactions have created a climate of uncertainty.

Such a lack of legal clarity can pose risks for municipalities, governments, and businesses that are involved in the installation and the use of facial recognition technologies. Below is an outline of the concerns raised by facial recognition from a legal standpoint.

Potential stakeholders

Everyone who walks in the streets is a potential stakeholder when it comes to facial recognition. They are all data subjects with privacy rights, whose faces might be captured by smart technologies and processed. Although there is little public and professional discussion of this issue in Ukraine at the time of publication, the situation might evolve under different scenarios. No doubt, such technology could be seen as helpful. The use of video surveillance, both by public authorities and the private sector, is developing quickly as part of urban safety infrastructure. The collection and processing of personal data have become a vital component in the creation of smart cities, and artificial intelligence is increasingly being applied to enable smart city sustainability.

While these are potential positive uses of smart technology, experiences such as those in Hong Kong demonstrate that public responses can be complex and may focus on the risks of such technology.

Ukraine's legal framework

On 6 July 2010, the Ukrainian Parliament ratified the Strasbourg Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data No. 108 ('Convention 108'). The same basic principles were implemented into national law by the Law of 1 June 2010 No. 2297-VI on Personal Data Protection (as last amended) ('the Law'). Thus, Ukraine's data protection legislation is broadly aligned with Convention 108. However, both Convention 108 and the Law can seem outdated when used to regulate new and emerging technology.

In addition, the General Data Protection Regulation (Regulation (EU) 2016/679) ('GDPR') is not part of Ukraine's national legislation; however, it has a limited extraterritorial scope. For example, Ukrainian data controllers are required to be GDPR-compliant if they operate in the European Economic Area or collect data on individuals in the EU.

Privacy by Design is not widely implemented in Ukraine, nor are there corresponding obligations on manufacturers to take privacy restrictions into account when designing video systems. Therefore, privacy requirements, including those relating to the use of CCTV and facial recognition, place full accountability on data controllers and processors, rather than on manufacturers. Opt-in/opt-out concepts are also very vague and are applied on a case-by-case basis.

Video surveillance rules

Article 307(1) of the Civil Code of Ukraine of 16 January 2003 No. 435-IV provides that consent is required for video surveillance and that consent may be assumed if the recording is carried out in public areas. However, these provisions do not necessarily apply to facial recognition. This raises the issue of whether the assumption of consent for video recording should also encompass consent for facial recognition. Does this mean that special notices should be introduced to clarify that CCTV has facial recognition capabilities?

Facial recognition might be sensitive data processing

Facial recognition is normally a subsequent phase after video recording, and generally requires specific searches or image matching. There are no practical restrictions for smartphones or similar devices with embedded facial recognition features, as these devices are used privately and normally by their owners. CCTV, however, is usually operated by third parties, who are therefore data controllers.

A person's facial image should normally be regarded as biometric data, and consequently as sensitive personal data. Thus, facial recognition involves the processing of sensitive data, which is subject to specific restrictions. The processing of biometric or sensitive data is normally prohibited for data controllers, unless explicit consent is given by the individual or there is a justified reason for processing this data, such as the protection of human rights, the provision of medical care, or another similar statutory exemption.

This raises the question of whether human rights protection and public safety concerns, such as combating terrorism and law enforcement, override the requirement for explicit consent. Reportedly, with the help of new technologies it has been possible to improve the crime situation in certain cities in eastern Ukraine and in Kyiv. Last year, Mariupol, which operates the 'Safe City' programme, was recognised as one of the safest cities in Ukraine, at least partly due to the installation of smart CCTV. This smart CCTV identifies and tracks criminals or suspects; however, members of the general public are also recorded. For this surveillance system, there is initially no difference between a criminal and a law-abiding citizen: everyone is watched.

The image matching process, as part of facial recognition technology, needs separate analysis, and consequently, the clearance of the image database used for matching is another issue. Suspects are normally recognised only if their image is stored in the database used for the matching process. How are these images linked to the facial recognition database, has everyone involved consented to being included in the database, and does the purpose of data processing remain the same as it was initially?
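To illustrate why the composition of that database matters, the following is a minimal, hypothetical sketch in Python of how a matching step typically compares a captured face against enrolled records. It does not describe any system actually deployed in Ukraine; the identifiers, embeddings, threshold, and similarity measure are all assumptions made for illustration. The point it makes is legal as much as technical: only individuals whose images are already enrolled can be matched, so the lawfulness of how those images were collected and consented to sits at the centre of the analysis.

```python
import numpy as np

# Hypothetical enrolled database: person identifier -> face embedding vector.
# In a real system these vectors would come from a face-embedding model;
# here they are random placeholders for illustration only.
rng = np.random.default_rng(seed=0)
enrolled_db = {
    "suspect_001": rng.normal(size=128),
    "suspect_002": rng.normal(size=128),
}

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_face(probe: np.ndarray, database: dict, threshold: float = 0.6):
    """Return the best-matching enrolled identity and its score,
    or (None, score) if no enrolled embedding exceeds the threshold."""
    best_id, best_score = None, -1.0
    for person_id, embedding in database.items():
        score = cosine_similarity(probe, embedding)
        if score > best_score:
            best_id, best_score = person_id, score
    return (best_id, best_score) if best_score >= threshold else (None, best_score)

# A face captured from CCTV footage (again, a random placeholder embedding).
captured_embedding = rng.normal(size=128)
identity, score = match_face(captured_embedding, enrolled_db)
print(identity, round(score, 3))  # None for anyone not enrolled in the database
```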

Facial recognition is also very close to automated decision-making, which data subjects have the right to be protected from. Are data subjects, therefore, protected from facial recognition processing?

Conclusion

So far, there is no ban on the use and application of facial recognition technologies in Ukraine. Possibly, facial recognition is justified by its application in crime reduction and, in particular, in combating terrorism. However, the absence of clear legislation is preventing businesses from using this technology. Perhaps low awareness of the use of this technology gives people a feeling of safety from being identified. So far, we have not come across any litigation involving facial recognition technologies in Ukraine. Nevertheless, we consider that general 'privacy literacy' is increasing worldwide, which means that more and more individuals will have privacy concerns in the future. We need to actively consider the risks that facial recognition technology can create for privacy, data protection, and society.

Vladyslav Podolyak Managing Partner
[email protected]
Tetiana Maksymovych Associate
[email protected]
Podolyak and Partners, Kyiv