Denmark: Datatilsynet's latest guidance on data transfers and cloud computing
In March 2022, the Danish data protection authority ('Datatilsynet') issued its 'Guidance on the use of cloud' ('the Guidance')1. Birgitte Toxværd, Partner and Attorney-at-law at Horten Advokatpartnerselskab, chronologically dissects the Guidance and its updates, and discusses key takeaways in relation to data transfers and cloud computing.
The room for uncertainty following the Court of Justice of the European Union's ('CJEU') judgment in Data Protection Commissioner v. Facebook Ireland Limited, Maximillian Schrems (C-311/18) ('Schrems II') is narrowing fast as the EU data protection authorities ('DPAs') provide clarification on data protection compliance in the field of cloud services. The Datatilsynet is also highly engaged in cloud services and has issued guidance and decisions on, e.g., EU-based hosting facilities owned by US parent companies, the legal basis for disclosure for the cloud providers' own purposes, and the rejection of Transfer Impact Assessments ('TIAs') based on the likelihood that the legal assessment is correct, rather than on legal certainty. In a landmark decision in July 2022, the Datatilsynet issued an injunction against a municipality on the use of Google Chromebooks and Google Workspace for Education in local schools.
Datatilsynet's Guidance, March 2022
The Guidance addresses the basic requirements in the EU's General Data Protection Regulation (Regulation (EU) 2016/679) ('GDPR') and transfers to third countries, including cloud service providers owned by EU companies with parent companies in third countries. The Datatilsynet has emphasised that the Guidance conforms to the views of the European Data Protection Board ('EDPB').
The Guidance elaborates on the cloud service delivery models in terms of content, allocation of responsibility, security profiles, and technical complexity. The Guidance further draws on the six steps in the EDPB's Recommendations 01/2020 on measures that supplement transfer tools to ensure compliance with the EU level of protection of personal data ('Recommendations 01/2020')2 on supplementary measures in transfers.
Furthermore, the screening process set out in Article 28(1) of the GDPR requires a detailed assessment of the guarantees for compliance with the data processing obligations of the data processor.
The tasks regarding cloud services include, e.g., mapping the full chain of data processors and international transfers, preparing risk assessments of the security of processing, preparing Data Protection Impact Assessments ('DPIAs') where required, documenting Privacy by Design and by Default, preparing TIAs, and negotiating data processing agreements to ensure a flow-down of terms to the sub-processors in the processing chain. All of this must be done before entering into the data processing agreement.
The Guidance emphasises that the TIA of the problematic legislation in the non-EEA country and adoption of supplementary measures, if required, must include all transfers in the chain of data processors, i.e. all data processors storing data, providing support, etc.
In the assessment of 'problematic legislation and case law', the Guidance permits the approach that, if many third countries are involved in the processing chain, the data controller may base its assessment of the third countries' legislation on a 'worst-case scenario', in which all third countries involved have problematic legislation and case law.
Supplementary measures adopted to mitigate problematic legislation and case law may be technical, organisational, or contractual. A technical measure highlighted in the Guidance is encryption, which must cover data both in transit and at rest. Encryption will not be an effective measure if the cloud provider carries out the encryption and holds the decryption keys, and can, under third-country legislation, be ordered to hand over the decryption keys together with the encrypted data, or simply the decrypted information, to non-EEA authorities.
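The conditions above can be expressed as a small decision sketch. This is a simplified illustration only; the parameter names are my assumptions, not terms taken from the Guidance:

```python
# Decision sketch of when encryption counts as an effective supplementary
# measure, following the conditions described above. Parameter names are
# illustrative assumptions, not official terminology.

def encryption_is_effective(covers_in_transit: bool,
                            covers_at_rest: bool,
                            provider_holds_decryption_keys: bool) -> bool:
    """Encryption must cover data both in transit and at rest, and it fails
    as a supplementary measure if the cloud provider holds the decryption
    keys (and could therefore be compelled to hand over the keys or the
    decrypted data to non-EEA authorities)."""
    if not (covers_in_transit and covers_at_rest):
        return False
    return not provider_holds_decryption_keys

# Provider-managed encryption: the keys sit with the provider, so the
# measure does not protect against compelled disclosure.
print(encryption_is_effective(True, True, provider_holds_decryption_keys=True))   # False

# Customer-managed (client-side) keys: effective, all else being equal.
print(encryption_is_effective(True, True, provider_holds_decryption_keys=False))  # True
```

The key point encoded here is that the effectiveness test turns on who controls the keys, not on the strength of the cipher itself.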
The Datatilsynet addresses an important set-up not covered by the EDPB recommendations regarding EU-based cloud providers with a US parent company without presumed transfers of personal data to the US. Still, according to the US Clarifying Lawful Overseas Use of Data Act ('the CLOUD Act'), the parent company may be ordered to 'preserve, backup, or disclose the contents of a wire or electronic communication and any record or other information pertaining to a customer or subscriber within such provider's possession, custody, or control, regardless of whether such communication, record, or other information is located within or outside of the United States'. The parent company in the US may thus order the EU subsidiary to disclose data to the parent company, which will in turn hand it over to US authorities.
The Datatilsynet finds that using an EU-based hosting facility owned by a US parent company does not in itself entail an international transfer subject to Chapter V of the GDPR for the cloud customer. This is in line with the Doctolib ruling rendered by the French Conseil d'État in 2021. However, in the Guidance, the Datatilsynet distinguishes between 'intended transfers' and 'unintended transfers', and if the provisions of the data processing agreement do not allow for international transfers, any transfer made at the initiative of the cloud provider will be deemed 'unintended'. The scenario of unintended transfers triggers the requirement to carry out a risk assessment under Article 32 of the GDPR, but not a transfer tool under Chapter V of the GDPR. Furthermore, the EU cloud provider, and not the cloud customer (i.e. the data controller), will be liable for any such disclosure and international transfer in violation of Chapter V of the GDPR as an independent data controller (Article 28(10) of the GDPR).
The risk assessment under Article 32 of the GDPR must weigh the consequences for the data subjects and the likelihood of the 'threat' materialising, i.e. of the EU subsidiary disclosing the personal data to the US authorities in infringement of the data processing agreement. After publication of the Guidance, the Datatilsynet has clarified that the risk must be mitigated to a 'low' residual risk, and not merely medium, which is a tricky threshold to meet. In practice, this will entail carrying out thorough risk analyses and documentation in close cooperation between information security and legal teams.
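The interplay between consequence and likelihood, and the 'low residual risk' threshold, can be illustrated with a minimal risk-matrix sketch. The scales, scores, and thresholds below are hypothetical examples for illustration, not the Datatilsynet's methodology:

```python
# Illustrative risk matrix: residual risk as a function of consequence and
# likelihood. The numeric scores and cut-offs are assumed for illustration.

LEVELS = {"low": 1, "medium": 2, "high": 3}

def residual_risk(consequence: str, likelihood: str) -> str:
    """Combine consequence and likelihood into a coarse residual-risk level."""
    score = LEVELS[consequence] * LEVELS[likelihood]
    if score <= 2:
        return "low"
    if score <= 4:
        return "medium"
    return "high"

def scenario_acceptable(consequence: str, likelihood: str) -> bool:
    """Per the clarified Danish guidance, the residual risk must be 'low';
    'medium' is not sufficient."""
    return residual_risk(consequence, likelihood) == "low"

# A high-consequence disclosure scenario fails even at a low likelihood of
# materialising, so further mitigation would be needed.
print(scenario_acceptable("high", "low"))  # False (3 -> medium)
print(scenario_acceptable("low", "low"))   # True  (1 -> low)
```

This also shows in miniature why the threshold is demanding: for high-consequence scenarios, only substantial mitigation of both dimensions brings the residual risk down to 'low'.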
The difference between 'unintended' and 'intended' transfers, April 2022
The distinction in the Guidance between 'unintended' and 'intended' international transfers prompted the private company Kombit A/S to request the Datatilsynet to clarify what the Datatilsynet meant by disclosing personal data 'as an infringement of the provisions of the data processing agreement' (i.e. unintended). Specifically, it asked about a school communication tool called 'Aula', which is used daily by teachers and parents in Denmark. Aula is hosted on Amazon Web Services ('AWS'). Kombit asked whether the wording in the data processing agreement with AWS would entail that the transfer was 'intended' or 'unintended'.
Kombit referred to the data processing agreement with AWS, Section 12.1, which contained the following wording:
'12.1 Regions. […] Once Customer has made its choice, AWS will not transfer Customer Data from Customer's selected Region(s) except as necessary to provide the Services initiated by Customer, or as necessary to comply with the law or binding order of a governmental body. If the Standard Contractual Clauses apply, nothing in this Section varies or modifies the Standard Contractual Clauses'.
In April 2022, the Datatilsynet informed Kombit that such a wording would be considered 'intended transfers'. Furthermore, this entailed that Kombit's customers (i.e. the municipalities as the data controllers), not only AWS, were co-liable for any non-compliant transfers to US government authorities. The Datatilsynet elaborated that the broadly worded instruction was not limited or specific as to which countries, including third countries' legislation or binding orders, were in scope of the instruction.
Who is the data exporter? June 2022
In June 2022, the Datatilsynet published another clarification to its Guidance concerning the definition of the term 'data exporter' under Chapter V of the GDPR. The term is used, but not defined, in the GDPR. The Standard Contractual Clauses ('SCCs'), which are a transfer tool issued by the European Commission for international transfers, define the exporter as the data controller or data processor transferring the personal data to a third country. If the data processor transfers the personal data to another data processor in a third country (Module 3 in the SCCs), the question arises as to which party is required to ensure compliance with Chapter V of the GDPR, including, but not limited to, preparing, and not least paying for, the TIA.
The Datatilsynet clarified that the exporting data processor is considered the data exporter under the SCCs. However, the Datatilsynet added that Article 44 of the GDPR contains the general principle for international transfers, and that both data controllers and data processors are liable for the compliance with this provision. This entails that the data controller and the exporting data processor are both liable for ensuring that a transfer tool and effective supplementary measures are in place. Therefore, the data controller will also be liable for documenting that the data processor has implemented the transfer tool and effective supplementary measures based on a TIA. In practice, this means that cloud customers must document TIAs as part of the Article 28(1) screening process of the cloud provider, alongside the screening of security measures under Article 32 of the GDPR. This stems from the accountability principle in Articles 5 and 24 of the GDPR.
Datatilsynet on Google Chromebook and Google Workspace for Education, July 2022
In July 2022, the Datatilsynet continued its cloud endeavours and issued a landmark decision on the use of cloud services. The case concerns Helsingore Municipality in Denmark, the hometown of Hamlet, and its use of Google Chromebooks and Google Workspace for Education in schools. The case started in 2019, when a parent complained to the Datatilsynet that their child had created a YouTube account for use in classes. The school had neither asked nor informed the parent about the creation of the account, which meant that the pupil's name, school name, and class were published when the pupil made comments on YouTube. Further, the pupil's login details were posted on a piece of paper on the lid of the laptop.
In September 2021, the Datatilsynet publicly criticised Helsingore Municipality for not having prepared a satisfactory risk assessment for the use of Google Chromebooks and Google Workspace for Education. The municipality was ordered to prepare a risk assessment, and the Datatilsynet also issued a warning that a DPIA should be carried out. The Datatilsynet severely criticised the non-compliant processing. Finally, the municipality was reminded of its duty to contact the parents and to facilitate the correction, anonymisation, or deletion of personal data relating to their children where the parents could not carry this out themselves.
In July 2022, the Datatilsynet issued an injunction against Helsingore Municipality to cease the use of Google Chromebooks and Google Workspace for Education as the Datatilsynet held that the mandatory GDPR documentation had not been prepared to the satisfaction of the Datatilsynet.
The municipality had handed a risk assessment over to the Datatilsynet in January 2022, but not a DPIA, as the municipality disputed that a DPIA was required based on its own assessment of the risk.
The injunction was to remain in effect until the municipality could ensure that the processing activities complied with the GDPR and that the mandatory GDPR documentation was in place. The Datatilsynet granted the municipality until 3 August 2022 to delete users and transferred data. The municipality then provided new documentation to the Datatilsynet, including a DPIA.
Furthermore, the Datatilsynet suspended all transfers of personal data to the US by use of Google Cloud EMEA Limited until the municipality could demonstrate that Chapter V of the GDPR was complied with.
Pupils in Helsingore Municipality started school in early August without Chromebooks, although the teachers were still allowed to use them. The teachers included the pupils in the classes by other means, e.g. old-fashioned books, pen, and paper. Helsingore Municipality is one of 43 municipalities in Denmark with similar issues relating to the use of Google Chromebooks.
Injunction against Helsingore Municipality and its suspension, August and September 2022
Based on the information provided by the municipality, the Datatilsynet upheld the injunction in August, but in September it suspended its own injunction. The reason for the suspension was somewhat surprising: the municipality had now acknowledged the legal issues and was working with the Datatilsynet to solve them under an Article 36 prior-consultation procedure, as a high risk remained. However, the suspension will only be in force until 3 November 2022, and if the municipality has still not been able to document compliance with the GDPR by then, the pupils will be back to using pen and paper once again.
The Datatilsynet decisions include an abundance of key takeaways about cloud services; five of those are listed here:
1. The technology stack
When an organisation provides a complex system to users, be it pupils (or employees or customers), the organisation must document the data flows within, between, and outside the systems to ensure that it knows which entity is the data controller and which is the data processor for which information. Further, in this scenario, the Datatilsynet flagged that the municipality had provided laptops to the pupils and let Google LLC act as the data controller for the data and metadata in the operating system and browser layers, while Google was the data processor in the Google Workspace application layer. The municipality must now, together with Google, document the data flows and identify the information for which the municipality and Google, respectively, are the data controller or data processor.
2. Scope of legal basis for disclosing data to Google
The municipality shared some personal data with Google based on Article 6(1)(e) of the GDPR, i.e. processing which is necessary for the performance of a task carried out in the public interest or in the exercise of official authority vested in the data controller.
The Datatilsynet found that the municipality had not identified exactly which data was shared with Google, and that the legal basis in Article 6(1)(e) could be used for the processing in tools and systems which support class teaching, as provided for in the Danish Public School Act. However, the legal basis could not be extended to cover sharing the pupils' personal data with Google for Google's own purposes. In reaching this decision, the Datatilsynet weighed in that the data subjects were children in need of special protection.
The Datatilsynet also considered the use of metadata collected directly by Google through the equipment and software, even where this data was not used for marketing purposes targeting the individual pupil directly or the classes indirectly.
For public-sector entities relying on this legal basis, there is thus a narrow (if any) scope for sharing data with cloud providers for the providers' own purposes. Cloud providers and all other types of data processors will have to review how they can update their data processing agreements and systems, in particular if they wish to retain business within the public sector.
Private-sector customers, on the other hand, may enjoy a greater scope for sharing data with data processors for their own purposes, possibly based on Article 6(1)(f) of the GDPR on legitimate interests. However, this must be assessed carefully on a case-by-case basis, e.g. considering contractual and statutory secrecy requirements.
3. Scope of risk assessments
The Datatilsynet found that the risk assessment provided by the municipality did not cover all known risk scenarios. The missing scenarios included how the applications process the personal data, how the municipality audits Google's access to personal data in the Chromebook operating system, and how personal data was separated in the interaction between Google Workspace for Education and Google's backend (i.e. which data were shared and how).
The Datatilsynet found that, by not including and documenting all risk scenarios, the municipality had not documented a GDPR-compliant processing under Articles 5(1)(a) and 5(2) of the GDPR.
In the risk assessment, the municipality had included as a risk scenario that it could not entirely rule out that Google could infringe the data processing agreement and process personal data for marketing purposes or other unintended purposes. On other occasions, the Datatilsynet has stated that infringement of the data processing agreement is not a scenario which needs to be included in a risk assessment since it is generally fair to assume that the data processor will comply with the agreement, provided that Article 28(1) and Chapter V of the GDPR are complied with, of course in addition to audit processes.
The Datatilsynet responded that a data controller may only use a data processor which provides sufficient guarantees of GDPR compliance, and that anticipating such a threat, combined with the failure to identify sufficient measures in the risk assessment, meant that the municipality had not documented that such guarantees had been provided (Articles 24 and 28(1) of the GDPR). Such risk scenarios should therefore not be included in a risk assessment.
4. Scope of DPIAs
In its first decision in September 2021, the Datatilsynet warned the municipality that it should prepare a DPIA under Article 35 of the GDPR if a high risk for the individuals was involved. The municipality argued in January 2022 that it was not required to prepare a DPIA because it had assessed the risk to the pupils as low, considering the mitigating measures.
In July 2022, the Datatilsynet turned up the volume and explained that any risk carrying a high consequence for the rights and freedoms of the data subjects requires a DPIA to be prepared, even if the likelihood of such a risk materialising is relatively low. This assessment builds on the notion of inherent high risk as laid out by the EDPB in the Article 29 Data Protection Working Party Guidelines on Data Protection Impact Assessment (DPIA) and determining whether processing is "likely to result in a high risk" for the purposes of Regulation 2016/6793 and on the list of situations requiring a DPIA to be prepared (based on Article 35(4) of the GDPR). Once again, the Datatilsynet ordered the municipality to prepare a DPIA, but this time on the back of a suspension of the transfer.
5. Scope of TIAs
Even though the municipality had restricted data storage for Google Workspace for Education to the EU, transfers to third countries could, in an earlier set-up, still take place during support situations initiated by the municipality. The Datatilsynet reiterated that transfers occurring during support situations triggered the requirements to prepare a TIA and to implement supplementary measures on top of the SCCs.
In its TIA, the municipality argued, among other things, that the selectors used in government searches under Section 702 of the Foreign Intelligence Surveillance Act of 1978 ('FISA') were not used for data in support situations, and that the communication during support was 'to a U.S. person' (i.e. Google), which rendered Section 702 inapplicable. The TIA also assessed that:
- '[t]he data importer/recipient is not subject to a higher interest from a public foreign authority in requesting access to the personal data (i.e., the data importer or potential recipient is not subject to national law facilitating mass surveillance)';
- '[h]owever, there is a high likelihood that the data accessible to the Google LLC, is per se excluded from access under Section 702 FISA because it is data that are not transmitted by it but to it for the purpose of providing a support service'; and
- '[i]n addition, Helsingore [Municipality's] personal data does not comprise personal data about "U.S. Persons" and US authorities are thus barred from accessing data under Section 702 FISA for this reason as well'.
The municipality had assessed the likelihood of the above assessment being correct at 40%.
The Datatilsynet countered all arguments and ruled that the U.S. Government would be permitted under Section 702 to collect this type of information, which concerned Danish citizens (i.e. non-US persons).
The TIA also contained a calculation of how many years would pass before the non-EEA authorities would request the personal data in scope, concluding that the likelihood was very low.
Further, the Datatilsynet rejected both methodologies used in the TIA to assess 'likelihood' and upheld the position taken by the EDPB in Recommendations 01/2020 that Chapter V must be complied with for all transfers, not almost all transfers. It is a binary assessment of gaps in fundamental rights and of closing such gaps, not one of risk or likelihood.
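The contrast between the rejected probabilistic approach and the binary test can be sketched as follows. This is a simplified illustration; the field names and the 40% figure's use as a threshold comparison are my assumptions, not drawn from the actual TIA:

```python
from dataclasses import dataclass

@dataclass
class TransferScenario:
    # Hypothetical fields for illustration only.
    legal_gap_exists: bool          # problematic third-country law applies
    gap_closed_by_measures: bool    # effective supplementary measures in place
    likelihood_of_access: float     # estimated probability of authority access

def probabilistic_tia(s: TransferScenario, threshold: float = 0.5) -> bool:
    """The rejected approach: treat a transfer as acceptable whenever the
    estimated likelihood of access falls below some threshold."""
    return s.likelihood_of_access < threshold

def binary_tia(s: TransferScenario) -> bool:
    """The position upheld by the Datatilsynet and the EDPB: either there is
    no gap in fundamental-rights protection, or the gap is effectively
    closed by supplementary measures; likelihood plays no role."""
    return (not s.legal_gap_exists) or s.gap_closed_by_measures

# A scenario resembling the municipality's 40% estimate: low assessed
# likelihood, but an open legal gap with no effective supplementary measures.
scenario = TransferScenario(legal_gap_exists=True,
                            gap_closed_by_measures=False,
                            likelihood_of_access=0.4)
print(probabilistic_tia(scenario))  # True  -- would pass under the rejected method
print(binary_tia(scenario))         # False -- fails the binary Chapter V test
```

The sketch makes the divergence concrete: under a likelihood threshold the transfer looks acceptable, while the binary test fails it for as long as the legal gap remains open.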
Hence, the Datatilsynet found that the municipality had not documented that the transfer of personal data for support purposes was legal under the GDPR (Articles 44 and 46(1)(c)).
Many law firms and organisations have relied on this methodology, but it has now been confirmed once and for all by a supervisory authority that this approach is not compliant with the GDPR.
The Datatilsynet decision on Google Chromebooks has received wide public interest, as it brings the severity of data protection compliance to a new level, ending the soft transition period from the Data Protection Directive (Directive 95/46/EC) to the GDPR. The gloves are off at the Datatilsynet, which has so far been known for its pragmatic approach to enforcement.
Considerations about the future
Preparing a compliant implementation of cloud services is both time-consuming and costly. Companies engaging in these exercises often face a choice between higher security of processing, better functionality, and lower prices on the one hand, and non-compliance with Chapter V of the GDPR on the other. When it comes to cloud services, however, it is important to convey to decision makers that they must resist temptation: having to roll back or redirect cloud decisions following an injunction will entail huge costs, as well as massive and critical interruptions to the daily business of a company or the conduct of a public authority, e.g. pupils being unable to use their laptops in school, or a company being unable to serve its customers or pay its employees.
In my experience, many customers trust that the big cloud providers have ensured legal processing on the customer's behalf; however, this is not always the case. Therefore, if I could have one wish fulfilled for our customers in light of the Datatilsynet's strict enforcement approach, it would be for cloud service providers to better facilitate their customers' legal use of cloud services. This could include more cloud service providers taking it upon themselves to provide template documentation, such as risk assessments, DPIAs, and TIAs, and rewriting template data processing agreements to exclude processing for the provider's own purposes and broad reservations for transfers of data when requested by public authorities. In that way, cloud customers would not have to spend vast amounts of time and resources negotiating standard terms and preparing documentation which is almost identical for all customers using a given cloud service.
At the end of the day, both cloud service providers and customers would benefit from such an approach, which would ensure a level playing field for businesses that strive to be legally compliant while securing long-term continuity and a competitive edge.
Birgitte Toxværd Partner and Attorney-at-law
Horten Advokatpartnerselskab, Hellerup
1. Available at: https://www.datatilsynet.dk/Media/637824108733754794/Guidance%20on%20the%20use%20of%20cloud.pdf
2. Available at: https://edpb.europa.eu/system/files/2021-06/edpb_recommendations_202001vo.2.0_supplementarymeasurestransferstools_en.pdf
3. Available at: https://ec.europa.eu/newsroom/article29/items/611236