
California: CPPA publishes draft regulations on DPIAs - what do you need to understand and take action on?

The California Privacy Protection Agency (CPPA) has released suggested draft regulations for discussion by the CPPA board before its scheduled meeting on September 8, 2023. These draft regulations address both cybersecurity audit regulations and risk assessment regulations, the latter generally known as Data Protection Risk Assessment or Data Protection Impact Assessment (DPIA). While this is only a draft for discussion, certain sections have been distinctly marked as open "for additional discussion." Notwithstanding the preliminary nature of the document, the general approach and direction of these regulations are evident.

In this Insight article, Odia Kagan, Partner and Chair of GDPR Compliance & International Privacy at Fox Rothschild LLP, provides an in-depth exploration of the draft DPIA regulations.

Within the context of DPIA regulations, there are essential elements that demand special attention:

  • The triggers for conducting a DPIA are broader than both the General Data Protection Regulation (GDPR) and the Colorado Privacy Act (CPA). These triggers include using automated employment decision tools (AEDT) for significant effects (which include employment opportunities or compensation); processing information of children under 16; processing information through monitoring in public places; and processing personal information to train artificial intelligence (AI).
  • Illustrative scenarios include using rideshare data to determine compensation, using photos and faceprints to train AI, and tracking drivers' behavior and performance.
  • The requirements for the DPIA itself largely mirror those stipulated by the CPA and GDPR, yet with additional specific requirements. While the same DPIA can be used for compliance with several laws, it is imperative to provide an addendum explaining compliance with each element. Hence companies should formulate a comprehensive template that applies across the various jurisdictions from the outset.
  • Additional requirements for AEDT or AI align with, and in certain aspects extend, the supplementary requirements outlined in the CPA, particularly concerning processing that involves certain types of profiling.
  • Apart from the obligation to provide the DPIA to the Attorney General (AG) upon request, as mandated by several other states, it is mandatory to provide an additional proactive yearly submission in a condensed form and with appropriate certification, affirming the completion of the DPIA process.

Therefore, several tasks need to be completed, which include:

  • assessing which of your activities would trigger a DPIA in California, Colorado, and the EU;
  • formulating a template that addresses the requirements across all applicable jurisdictions;
  • developing a process to conduct the DPIA internally, involving various entities such as teams, stakeholders, service providers, external consultants, and legal advisors;
  • starting gathering the information from third parties; and
  • beginning work on these tasks immediately: regulatory authorities have already made their expectations clear, and completing this process consumes considerable time and resources.

Definitions and their significance

  • AI: the definition of AI mirrors that outlined in the EU AI Act. It refers to an engineered or machine-based system designed to operate with varying levels of autonomy and that can, for explicit or implicit objectives, generate outputs such as predictions, recommendations, or decisions that influence physical or virtual environments. This includes generative AI as well.

  • Automated decision-making technology: any system, software, or process, including those based on machine learning, statistics, other data-processing techniques, or AI, that processes personal information and uses computation, in whole or in part, to make or execute a decision or to assist human decision-making. This definition differs from the one in New York City's Local Law 144 of 2021 on AEDTs.

  • Profiling: entails any form of automated processing of personal information to evaluate certain personal aspects relating to a natural person and in particular to analyze or predict aspects relating to an individual's job performance, economic status, health, personal preferences, interests, reliability, behavior, location, or movements. This definition aligns with the CPA definition, with the exception that it includes 'performance at work.'

Service providers and contractors

Service providers and contractors are required to assist businesses in conducting DPIAs. This includes providing the business with all facts necessary to conduct the risk assessment, including meaningful information about the AEDT where applicable. Moreover, service providers and contractors are prohibited from any form of misrepresentation concerning pertinent facts necessary to the risk assessment process.

The obligation of the service providers and contractors to assist with risk assessments and cybersecurity audits should be explicitly incorporated into the agreement established between the business and the service provider or contractor.

When to conduct a risk assessment

A risk assessment must be conducted before commencing the processing of personal information that presents a significant risk to consumers' privacy. This aligns with both the CPA and the GDPR.

If a specific processing begins before the regulations' effective date and continues thereafter, a DPIA must be carried out within 24 months from the said effective date.

It is permissible to perform a single risk assessment for similar sets of processing. A DPIA that was previously conducted for compliance with another law can be used, provided it meets all the requirements. However, an addendum is required to explain how the prior assessment meets the stipulated criteria. In cases where certain requirements are absent, these must be addressed in a supplement.

Importantly, the assessment should be made accessible to the AG upon request, a provision akin to the CPA. Additionally, there is a proactive annual obligation to submit two components:

  • abridged summaries of the DPIA; and
  • a certification from a designated executive affirming adherence to the requirements. This requirement stands apart from other legislation, including the GDPR.

Situations constituting significant risk

Acknowledging the triggers established in the CPA, Virginia Consumer Data Protection Act (VA CDPA), and GDPR, there are additional situations that pose a significant risk:

  • Using AEDT for significant effects, which include employment opportunities or compensation. It is noteworthy that although the CPA might seem narrower, it contains a catch-all clause for profiling resulting in 'other substantial injury to consumers,' which aligns with California's approach. This is particularly relevant because this type of profiling, excluding the employment aspect, already triggers the right to opt out in Colorado.

  • Processing information of individuals under 16 years of age, mirroring GDPR, but differing from Colorado and Virginia definitions of a 'known child,' which applies to those under 13 years of age.
  • Engaging in the processing of personal information of employees, contractors, job applicants, or students using monitoring technology.
  • Processing the personal information of consumers in publicly accessible places using technology to monitor consumers' behavior, location, movements, or actions. Publicly accessible places include shopping malls, stores, restaurants, cafes, movie theaters, amusement parks, convention centers, stadiums, gymnasiums, hospitals, medical clinics or offices, transportation depots, privately operated transit, streets, or parks. This is similar to the EU requirement except it doesn't explicitly require that the monitoring be on a large scale.
  • Processing personal information of consumers to train AI or automated decision-making technology.


Examples requiring a DPIA:

  • A rideshare provider using AEDT to distribute rides and establish fares and bonuses for its drivers. The need arises due to implications for employment opportunities and compensation.

  • Mobile dating apps sharing precise geolocation data, ethnicity details, and medical information from user profiles with their analytics provider. The necessity stems from the inclusion of sensitive personal information, even if it doesn't constitute a sale.

  • A personal budgeting application seeking to target consumers with behavioral advertising on different websites for payday loans based on their income as provided in the use of the app. The need arises due to the sharing of this information.
  • Installing in-vehicle video cameras to observe and assess the behavior and performance of drivers. This necessity arises from the monitoring of employees.
  • Grocery stores aiming to process consumers' device MAC addresses through Wi-Fi tracking to analyze their shopping patterns within their grocery stores. The need arises due to the monitoring of public spaces.
  • A technology provider endeavoring to process consumers' photographs and derive facial templates to train its facial recognition technology. The requirement emerges from the aspect of training AI.

DPIA process

The DPIA process involves stakeholders throughout the organization. This entails representatives from the product team, the business's fraud prevention team, or the compliance team. They need to ensure the accurate and transparent disclosure of all relevant information, without misrepresentation.

As required, third parties should also engage in the process. This can include service providers, contractors, providers of technological components, academics who specialize in detecting and mitigating bias in automated decision-making technology, or a subset of the consumers whose personal data is intended for processing.

If the outcome of the assessment indicates that the potential risks outweigh the benefits, the processing of data is prohibited.

Regular reassessment of the DPIA is essential, although the specific timing for these reassessments is yet to be determined. Furthermore, any significant changes in the processing activity require an immediate reassessment. These changes include alterations in the components of the DPIA, such as purpose, processing methods, data, outputs, or safeguards. Additionally, modifications in the purpose or compatibility with the original purpose, possibly arising from issues such as complaints, are also considered significant changes.

What to include in the DPIA

In preparing the DPIA, it is essential to include several elements that closely align with the CPA framework, to ensure comprehensive evaluation. These elements include:

  • summary of the processing;
  • categories of personal information processed;
  • context of the processing;
  • reasonable expectation regarding the purpose and compatibility with the context. This needs to be tied to the criteria for 'reasonable expectations' in the existing California Privacy Rights Act of 2020 (CPRA);
  • operational elements of the processing: this includes adherence to data minimization principles, protocols for data retention, the number of consumers affected, the technology to be used, and the names of all service providers or third parties with whom the information is shared (alternatively, an explanation for not disclosing their names);
  • purpose of the processing: a specific description of the purpose and how the processing achieves it;
  • benefits resulting from the processing: with specificity, including discussion of benefit probability and scale;
  • negative impacts on privacy from the processing: with specificity, encompassing the sources of impact, as well as their probability and extent. A list of harms to address is provided;
  • safeguards the business plans to implement to address the negative impacts, including an explanation of how they mitigate risks, and whether there are any residual risks; and
  • evaluation of whether the safeguards' mitigation of negative impacts outweighs the benefits.

Additional requirements for automated decision-making technology

When employing automated decision-making technology that allows for opt-out rights, it is imperative to provide clear and comprehensible explanations encompassing the following aspects:

  • why the business would like to use the AEDT, rather than manual methods, to achieve the purpose;
  • the personal information processed and used to train the AEDT;
  • the outputs and how the business will use them;
  • the steps the business has taken or plans to take to maintain the quality of personal information processed by the AEDT, including personal information used by the business to train the technology;
  • the logic of AEDT, including any underlying assumptions;
  • how the business evaluates its use of the AEDT for validity, reliability, and fairness. This should include the metrics used for assessment, why they are the appropriate metrics, how any third-party components meet these requirements (including internal assessments of the technology), whether and how alternative versions or AEDT technologies were evaluated for validity, reliability, and fairness, and reasons for their non-selection. Results of evaluations should also be included;
  • why the business has not consulted external third parties and how protection has been accounted for without such consultation;
  • the degree and specifics of human involvement in the business's use of AEDT, including detailed requirements for the qualifications of the involved personnel and how they can influence the output; and
  • any safeguards that the business plans to implement to address negative impacts on consumers' privacy specific to its use of automated decision-making technology or for data sets produced by or derived from the automated decision-making technology.

Additional requirements for processing to train AI or AEDT

Moreover, there are additional requirements for processing to train AI or AEDT, such as:

  • offering a straightforward, easily comprehensible explanation of the purposes for which individuals or entities granted access to the AI or AEDT for their independent use may use the AI or automated decision-making technology;
  • documenting in your own DPIA how you are providing this information and any safeguards in place to ensure that the AI is used for appropriate purposes by other individuals; and
  • if the AI or AEDT provided is intended for processing that itself necessitates a DPIA, all relevant information must be provided to the recipient businesses. Your own DPIA should then explicitly describe how you deliver this information to the recipients and detail your method for facilitating their DPIA processes.

Odia Kagan Partner and Chair of GDPR Compliance & International Privacy
[email protected]
Fox Rothschild LLP, Philadelphia and Seattle