
UK: ICO issues guidance on employee monitoring - helpful on both sides of the Atlantic - part two

In the second part of this Insight article, Odia Kagan, Partner and Chair of GDPR Compliance & International Privacy at Fox Rothschild LLP, offers a more in-depth exploration of the intricacies related to compliance with the UK Information Commissioner's Office (ICO) guidance on employee monitoring. You can access part one here.


CCTV

  • If you anticipate the likelihood of capturing special category data, you must carry out a Data Protection Impact Assessment (DPIA).
  • Using audio recording, particularly where it is continuous, is considered more privacy-intrusive than purely visual recording and therefore requires much greater justification. You should switch off any capability to record audio by default and employ it only in exceptional circumstances, such as through a trigger switch. Continuous audio and video recording can be highly intrusive and is unlikely to be justifiable in most situations.
  • You should target any monitoring at areas of particular risk and confine it to areas where expectations of privacy are low.
  • If you are considering using video or audio monitoring, you must:
    • complete a DPIA, as this will help you assess whether the benefits justify the adverse impact;
    • consider why this monitoring is necessary for the intended purpose as part of your DPIA;
    • make sure you inform workers about the extent and nature of the monitoring and why you are carrying it out; and
    • ensure that you make anyone else caught by the monitoring, such as visitors or customers, aware of its operation and why you are carrying it out.
  • You should also consider the right of access. If a worker or any other person captured by the monitoring makes a Subject Access Request (SAR), you may need to be able to redact third parties from the footage.

The position is generally the same in the US, but you should be mindful of any state or local laws that may apply.

Smart CCTV

  • Using video technology with facial recognition technology comes with higher risks to data protection rights and freedoms than standard video technology. This is particularly the case if you use facial recognition to make inferences about a person's likely behavior, emotional state, or intentions. There are also concerns about the accuracy of facial recognition technologies, particularly for people from ethnic minority groups.
  • If you are considering using facial recognition technologies, you must carry out a DPIA because they present a high risk. This requirement is addressed in the California Privacy Rights Act of 2020 (CPRA) draft DPIA Regulations.

Vehicle tracking

  • If you allow workers to use the work vehicle for private use, you will rarely be able to justify monitoring during private use. To address this, consider implementing a system that the driver can disable so it does not monitor driver activity when they are not working.
  • You must inform workers and passengers of any vehicle monitoring.
  • If you use cameras or audio monitoring, you must carry out a DPIA as this type of processing would be considered high risk.
  • You should consider whether less intrusive methods could achieve your purpose and document this assessment as part of your DPIA.
  • If you are considering the use of any monitoring tool that uses analytics to make inferences, predictions, or decisions about drivers, you must carry out a DPIA as this presents a high risk.
  • Dashcams with audio recording capabilities present a higher risk, so you should switch off any capability to record audio by default. You should only trigger audio recordings in exceptional circumstances.

These principles align with the CPRA. It is also important to note that the Federal Trade Commission (FTC) has expressed concerns about biometric surveillance and geolocation monitoring, and that the new California In-Vehicle Camera law should be taken into account.

Monitoring for data leak prevention

  • You should consider the least invasive means possible when selecting solutions to protect against data loss or external threats.
  • You should complete a DPIA.
  • Keep in mind that monitoring network traffic may be high risk, particularly if you carry out an analysis of the data to make inferences about workers.
  • As an alternative to more detailed traffic monitoring, you could consider blocking suspicious incoming or outgoing traffic or redirecting the worker to a portal where they may ask for a review of the decision to block traffic.
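The blocking-and-redirect alternative described above can be sketched as a simple policy check that makes a block/allow decision without inspecting or analyzing the content of the worker's traffic. This is an illustrative sketch only; the domain names and review portal URL are hypothetical placeholders, not anything specified in the ICO guidance.

```python
# Hypothetical blocklist and review portal URL, for illustration only.
BLOCKLIST = {"known-exfil-site.example", "suspicious-host.example"}
REVIEW_PORTAL = "https://intranet.example/request-review"

def route_request(destination: str) -> str:
    """Return 'allow', or the review portal URL when the destination is
    blocked, so the worker can ask for a review of the blocking decision
    rather than having their traffic content inspected."""
    if destination in BLOCKLIST:
        return REVIEW_PORTAL
    return "allow"
```

The point of the design is that the decision uses only the destination, not the content of the communication, keeping the monitoring to the least invasive means.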

These principles remain generally consistent with CPRA requirements. Additionally, it is essential to factor in US state employee monitoring laws.

Device monitoring

  • Device activity monitoring is likely to capture excessive amounts of workers' personal data. This could potentially include special category data, such as emails about health conditions and emails to union representatives.
  • You are particularly unlikely to be able to justify capturing webcam shots or footage.
  • You must be clear about your purpose, and fully document your justification for carrying out device monitoring. This should include an assessment of less intrusive alternatives, and whenever possible, choose the less invasive means to achieve your objective.
  • You must carry out a DPIA before undertaking any processing likely to cause high risk to workers and other people's interests. Even where not mandated, you should carry out a DPIA as the process can assist with your risk assessment and planning.
  • Consider discussing the proposed device monitoring with workers or their representatives.
  • You must inform workers about device monitoring, including how you are using it for making decisions that affect them.
  • You could consider making aggregated analytics reports, which can identify trends without identifying individual workers.
  • You could consider banning the private use of work devices and blocking problematic websites. However, remember that even with such a policy in place, it would be difficult to justify accessing a worker's personal communications.
  • You should ensure that when workers are using their own personal devices for work, you are not capturing their private use of their devices.
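The aggregated-analytics suggestion above can be sketched as a report that counts activity per category and suppresses any category with too few events, so trends are visible without small counts pointing back at individual workers. The threshold value is an assumption for illustration; choose one based on your own risk assessment.

```python
from collections import Counter

MIN_GROUP_SIZE = 5  # assumed suppression threshold, for illustration only

def aggregate_report(events, min_group=MIN_GROUP_SIZE):
    """Count events per category across all workers and drop any category
    whose count falls below the threshold, so the report shows trends
    without exposing the activity of identifiable individuals.

    `events` is an iterable of (worker_id, category) pairs; worker IDs are
    discarded in the output."""
    counts = Counter(category for _worker, category in events)
    return {cat: n for cat, n in counts.items() if n >= min_group}
```

A report like this supports the trend-spotting purpose while keeping individual-level detail out of the output entirely.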

These principles remain generally consistent and important in light of the new CPRA DPIA regulations.

Biometric data for time and attendance monitoring

DPIA

  • You should consider whether there are any alternatives to using biometric data, in order to achieve your intended objectives.
  • You should document your reasons for choosing to rely on biometric data, including any consideration of other less intrusive means and why you think they are inadequate.
  • You should be clear about your purpose and why using biometric data is necessary. If a reasonable alternative option to using biometric data is possible, you should be able to justify why you don't use this method. You must document all of this in your DPIA.
  • If you are relying on biometric data for workspace access, you should provide an alternative for those who do not want to use biometric access controls, such as swipe cards or PINs. You should not disadvantage workers who choose to use an alternative method. It is likely to be very hard to justify using biometric data for access control without providing an alternative for those who wish to opt out.
  • You must carry out a DPIA whenever you intend to process biometric data to uniquely identify a worker. You must complete your DPIA before starting the processing.

This information provides valuable insight into the DPIA process, which is further detailed in the new CPRA regulations.

Transparency

You are required to inform workers about the following (a requirement that remains consistent in the US, especially where AI is involved):

  • how the system works;
  • what personal information is being collected;
  • how their information will be used; and
  • the nature and purposes of the monitoring.

Security and retention

When collecting, using, and storing biometric data, it is essential to assess the need for additional security measures. You should consider whether you need to store a copy of the underlying image or whether it is sufficient to store the biometric template. In either case, you should consider security measures, such as encryption, and organizational measures, such as access restrictions.

If you opt to store biometric templates, ensure that:

  • you don't retain them for longer than is necessary;
  • they remain accurate and you refresh them as often as considered necessary;
  • you store them in a way that does not allow for reverse engineering into the original image or identity (i.e. the biometric templates are encrypted); and
  • you don't store the biometric templates alongside other associated images or lists.
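The retention point above can be sketched as a periodic purge of template records older than the retention period. The 90-day period is an assumption for illustration; set it according to your own retention policy. Consistent with the guidance on not storing templates alongside associated images, the record here holds only an opaque token and an (assumed already encrypted) template blob.

```python
from datetime import datetime, timedelta, timezone

RETENTION = timedelta(days=90)  # assumed retention period, for illustration

def purge_expired(templates, now=None):
    """Drop biometric template records kept longer than the retention period.

    `templates` maps an opaque worker token to a tuple of
    (enrolled_at, encrypted_template_blob); names and source images are
    assumed to be stored separately, if at all."""
    now = now or datetime.now(timezone.utc)
    return {token: (enrolled_at, blob)
            for token, (enrolled_at, blob) in templates.items()
            if now - enrolled_at <= RETENTION}
```

Running a purge like this on a schedule helps demonstrate that templates are not retained for longer than necessary.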

These principles align with those outlined in the CPRA and recent FTC guidance and enforcement on data minimization and retention limitation.

Accuracy

  • You should think carefully about the accuracy of the system and its ability to correctly identify people.
  • Implement systems to quickly correct any inaccurate information so it does not negatively impact workers.
  • Assess and mitigate any bias in the system. If you have engaged another organization to provide the system, you should check it is suitable for the groups and people whose information it will capture. If the system you use results in processing which causes bias or discrimination, you are likely to breach the fairness principle. This principle is consistent with US requirements under FTC guidance, the CPRA, New York City Local Law 144 of 2021 (NYC 144), and labor and employment laws.
  • Provide manual reviews if an automatic process has resulted in a possible access denial.
  • You must give workers the option to ask for a review if they are unhappy with a decision made by solely automated processing.
  • You should quickly identify issues with workers accessing systems or buildings and give back access to workers as soon as possible.
  • You should not disadvantage workers who request manual reviews.

Accuracy is receiving increasing attention, not only from the FTC, with its focus on fairness and the avoidance of deception, but also, more recently, from the Consumer Financial Protection Bureau (CFPB).

Odia Kagan, Partner and Chair of GDPR Compliance & International Privacy
[email protected]
Fox Rothschild LLP, Philadelphia