
Canada: An overview of Bill C-27 and its proposed changes to PIPEDA

Canada has existing comprehensive federal private-sector privacy legislation: the Personal Information Protection and Electronic Documents Act ('PIPEDA'), which became law in 2000. Recently, changes to PIPEDA have been proposed via the draft language of Bill C-27, the Digital Charter Implementation Act 2022¹ ('Bill C-27'). Kirsten Thompson, Partner at Dentons, takes a look into the proposed changes and the impact Bill C-27 would have in areas such as penalties, artificial intelligence ('AI'), and data portability.



Under a 2001 decision by the European Commission (most recently reaffirmed in May 2018, when the General Data Protection Regulation (Regulation (EU) 2016/679) ('GDPR') came into effect), Canada is considered to provide an adequate level of protection for personal data transferred from the EU to recipients subject to PIPEDA. Canada's adequacy status ensures that data processed in accordance with the GDPR can subsequently be transferred from the EU to Canada without requiring additional data protection safeguards (for example, standard contractual clauses) or authorisation to transfer the data.

Article 45(4) of the GDPR requires the Commission, on an ongoing basis, to monitor privacy-related developments in Canada that could affect the functioning of the existing adequacy decision. Unless Canada amends PIPEDA prior to the next review (which has already begun), it is widely expected that Canada would not maintain its current adequacy status.

As a consequence of this adequacy review, and because PIPEDA has not kept pace with the development of new technologies and new uses of personal information, PIPEDA is in the process of being completely rewritten, with the proposed changes largely bringing it into line with the GDPR.

Bill C-27 would not only modify PIPEDA, but would also enact three new laws:

  • The Consumer Privacy Protection Act ('CPPA'), which is the privacy law that would replace PIPEDA.
  • The Personal Information and Data Protection Tribunal Act, which would create a new Personal Information and Data Protection Tribunal ('the Tribunal') to review the findings of the Office of the Privacy Commissioner of Canada ('OPC') and levy penalties.
  • The Artificial Intelligence and Data Act ('the AI Act'), which is framework legislation that would prohibit certain conduct in relation to AI systems that may result in serious harm to individuals or their interests. The full impact of the legislation is not yet clear as much of the detail has been left to regulations, which are not yet drafted.

Below is a summary of the distinctions between the regulation of the collection, use, and disclosure of personal information under PIPEDA and under the latest version of the CPPA².

Key differences


The CPPA would retain Canada's consent-based model (in contrast to the GDPR, under which organisations may process personal information on any of several enumerated lawful bases) and much of PIPEDA's principles-based language. However, PIPEDA's Schedule 1, which was essentially the CSA Model Code for the Protection of Personal Information, has been scrapped, and the principles from the Model Code incorporated directly into the actual language of the CPPA. This is a welcome change, as there would now be additional detail about what organisations must do to comply with the law. For instance, whereas PIPEDA Schedule 1 requires that an organisation 'implement policies and practices to give effect to the CSA Model Code principles', the proposed Section 9 language of the CPPA explicitly requires that an organisation implement and maintain a privacy management program that 'includes the policies, practices, and procedures the organisation has put in place to fulfil its obligations under this Act'. The CPPA goes on to set out the particular policies, practices, and procedures the program must cover and the factors the organisation must take into account in developing its program.

Orders, penalties, and fines

Under PIPEDA, the OPC was envisioned to be an ombudsman and lacked any significant powers – it could not issue orders directly, had no power to levy fines or penalties, and, after investigations, could issue only non-binding 'findings' instead of binding decisions. It worked through persuasion and 'naming and shaming' to nudge organisations into compliance.

The proposed CPPA reflects a more traditional regulatory approach. It proposes to grant the Privacy Commissioner broad audit and order-making powers, and to enable the Privacy Commissioner to recommend that the Tribunal impose significant administrative monetary penalties ('AMPs') on organisations that violate the key provisions (the Privacy Commissioner could not itself levy the AMPs; that falls to the Tribunal). The Privacy Commissioner would also be granted greater powers with respect to conducting inquiries and making compliance orders.

The Tribunal would be authorised to impose AMPs of up to CAD 10 million or 3% of the organisation's global gross revenues, whichever is higher. It is unclear at this time whether this means the global revenue of the Canadian entity, or the revenue of the entirety of an organisation's global operations.

The most egregious violations of the new legislation, such as failing to report breaches to the Privacy Commissioner or to maintain records of them, knowingly using de-identified information to identify an individual, or knowingly contravening a compliance order issued by the Privacy Commissioner, would constitute offences punishable, upon prosecution, by a fine of up to CAD 25 million or 5% of the organisation's global gross revenues.
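As a rough illustration only (this is not legislative text, and the revenue base is an assumed input, given the ambiguity over whether 'global gross revenues' refers to the Canadian entity or the entire corporate group), the two penalty ceilings reduce to simple maximum functions:

```python
# Illustrative sketch of how the CPPA's proposed penalty ceilings scale
# with revenue. Figures are in CAD. Which entity's revenue counts as
# 'global gross revenues' is an open question under the Bill, so the
# revenue figure here is simply an assumed input.

def amp_cap(global_gross_revenue: int) -> float:
    """Maximum administrative monetary penalty the Tribunal could impose:
    the greater of CAD 10 million or 3% of global gross revenues."""
    return max(10_000_000, global_gross_revenue * 3 / 100)

def offence_fine_cap(global_gross_revenue: int) -> float:
    """Maximum fine on prosecution for the most serious offences:
    the greater of CAD 25 million or 5% of global gross revenues."""
    return max(25_000_000, global_gross_revenue * 5 / 100)

# For an organisation with CAD 1 billion in global gross revenues,
# the percentage-based figure exceeds the fixed floor in both cases:
print(amp_cap(1_000_000_000))           # 30000000.0
print(offence_fine_cap(1_000_000_000))  # 50000000.0
```

For smaller organisations the fixed floors dominate: with CAD 100 million in revenues, the caps are simply CAD 10 million and CAD 25 million respectively.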

Private right of action

The proposed legislation would introduce a new private right of action by which an individual 'affected' by a contravention may (within two years) bring a claim against the organisation for damages for loss or injury suffered as a result of the contravention. The individual would be able to sue for a privacy violation following a finding by the Privacy Commissioner or the Tribunal that an organisation has contravened the CPPA. Note that this appears to be in addition to common law actions, and actions brought pursuant to certain provincial privacy acts which create statutory torts actionable without proof of damage.

While PIPEDA limits any action to recover compensation for a violation of privacy to the Federal Court, the CPPA would also allow aggrieved individuals to file such actions in the superior court of a province. While PIPEDA limits the ability to seek compensation for a violation through the court system to a 'complainant' who has made a complaint to the Privacy Commissioner, the CPPA broadens this to 'an individual who is affected' by a violation. It is also unclear whether an organisation is exposed to class action liability under this provision.

Enlarged Privacy Commissioner role and powers

Under the CPPA, the Privacy Commissioner would become a true regulator, and the privacy regime would be much more adversarial. As with PIPEDA, enforcement starts either with a complaint by an individual or with a complaint the Privacy Commissioner initiates of its own accord. From that point on, however, the process would change.

The first step continues to be an investigation, and the Privacy Commissioner is bound to investigate any complaint except in certain narrow circumstances; under PIPEDA, this has meant that 'nuisance' complaints consume a great deal of companies' and the Privacy Commissioner's time and resources. The CPPA sets out more circumstances than PIPEDA in which the Privacy Commissioner can decline to investigate.

After an investigation, the Privacy Commissioner may refer the matter to an inquiry. Inquiries would have many more procedural protections for fairness and due process than PIPEDA's current ad hoc system. For example, the CPPA guarantees each party a right to be heard and to be represented by counsel, and requires inquiries to follow certain rules of procedure and evidence (which the CPPA requires the Privacy Commissioner to develop).

At the end of the inquiry, the Privacy Commissioner can issue orders requiring an organisation to take, or refrain from taking, specified actions in order to come into compliance with the CPPA. Note that these orders may include the destruction of personal information, or of software or algorithms developed using unlawfully acquired personal information. As under PIPEDA, the Privacy Commissioner can also continue to name and shame violators.

The Tribunal

Currently, because the Privacy Commissioner's findings under PIPEDA are not binding, there is no right of appeal. Under PIPEDA, an organisation may seek judicial review in the Federal Court after the Privacy Commissioner has issued its findings, but as a practical matter, the threshold for judicial review is so high (and the consequence under the current PIPEDA so limited) that this is rarely done. Under PIPEDA, a complainant may, after receiving the Privacy Commissioner's finding (or upon being notified that the investigation of the complaint has been discontinued) apply to the Federal Court for a hearing in respect of any matter in respect of which the complaint was made, or that is referred to in the Privacy Commissioner's report. Such hearing is de novo, meaning the Federal Court starts fresh and makes its own findings of fact and determinations of law, based on the parties' submissions.

Bill C-27 would create the new specialised Tribunal and organisations found by the Privacy Commissioner to have violated the CPPA would have a new right to appeal the Privacy Commissioner's findings (as well as any interim or final orders) to the Tribunal. The Tribunal will be made up of three to six members, and at least three of the Tribunal members must have expertise in privacy.

The Tribunal will review the Privacy Commissioner's decision under a fairly high standard: 'correctness' for questions of law; and 'palpable and overriding error' for questions of fact or questions of mixed law and fact. Practically, this means that while organisations will now have the opportunity to appeal the Privacy Commissioner's decision, it will likely be very hard to succeed on such appeal. The Tribunal's decisions would be further subject to limited judicial review before the Federal Court.

Data portability and deletion

PIPEDA currently contains the rights of access and rectification, and a right to withdraw consent. The CPPA would expand and clarify these existing individual rights, and provide for new individual rights of data portability and deletion. Consumers would be able to require an organisation to transfer their data to another organisation, provided that the organisations are connected to a 'data mobility framework'. The mechanism for such frameworks is left to regulations that have not yet been drafted.

The CPPA would allow an individual to withdraw consent subject to similar limitations that currently exist in PIPEDA. However, unlike PIPEDA, under the CPPA an individual can also specifically request that an organisation dispose of their information; notably, disposal includes deletion and rendering the data anonymous.

If an organisation disposes of personal information at an individual's request, the CPPA would require it, as soon as feasible, to inform any service provider to which it has transferred the information of the individual's request and to ensure that the service provider disposes of the information.

Algorithmic transparency

The CPPA would require organisations to provide in their privacy policy a general account of the organisation's use of any automated decision system to make 'predictions, recommendations, or decisions about individuals that could have a significant impact on them'. On request by the individual, the organisation must provide them with an explanation of the prediction, recommendation, or decision. The explanation must indicate the type of personal information that was used to make the prediction, recommendation, or decision, the source of the information, and the reasons or principal factors that led to the prediction, recommendation, or decision.

The definition of automated decision system is broad and means any technology that assists or replaces the judgment of human decision-makers through the use of a rules-based system, regression analysis, predictive analytics, machine learning, deep learning, a neural network, or other technique. Note that unlike the GDPR, the CPPA captures all such decision-making and does not provide an exemption for human-in-the-loop systems.

Collection, use, and disclosure without consent

PIPEDA has enumerated grounds on which an organisation can collect, use, and disclose personal information without consent. The CPPA provides much-needed clarification to many of these existing grounds, but also creates new ones, largely aimed at removing friction from everyday, well-understood situations.

The CPPA would allow collection and use of personal information without consent for certain business activities where it would reasonably be expected to be collected or used to provide the service requested, for security purposes, for safety, or for other prescribed activities.

There is also a 'legitimate interest' exception to consent for collection and use, but it requires an organisation to first identify any potential adverse effects on the individual that are likely to result from the collection or use, mitigate or eliminate them, and finally weigh whether the legitimate interest outweighs any remaining adverse effects. This assessment must be documented. Note that, unlike under the GDPR, 'legitimate interest' here is narrower: it is not in and of itself a basis for processing, but rather provides only an exception to having to obtain consent.

Notably, an organisation cannot use this legitimate interest exception where the personal information is to be 'collected or used for the purpose of influencing the individual's behaviour or decisions'. On its face, this could be broad enough to include behavioural advertising; if so, an individual would have to provide express consent to have their personal information used for such purposes.

Anonymising and de-identifying data

PIPEDA does not address either technique. While guidance from the Privacy Commissioner has attempted to address issues of anonymisation and de-identification, that guidance was not consistent with evolving international norms. The CPPA would address both concepts directly. 'Anonymise' is defined as 'to irreversibly and permanently modify personal information, in accordance with generally accepted best practices, to ensure that no individual can be identified from the information, whether directly or indirectly, by any means'. The CPPA states expressly that it does not regulate anonymous data because, by definition, there is no reasonable prospect of re-identification.

To 'de-identify' data means 'to modify personal information so that an individual cannot be directly identified from it, though a risk of the individual being identified remains'. The CPPA does regulate de-identified data and generally prohibits attempts to re-identify it. The CPPA would also expressly allow organisations to use an individual's personal information without consent in order to de-identify it. In some cases, de-identified data would have to be used in place of fully identifiable personal information: for instance, in business transactions such as a merger or acquisition, in order to take advantage of the provisions under which consent to disclose personal information pursuant to such transactions is not required.

Under certain circumstances, organisations can also disclose de-identified data to public entities for socially beneficial purposes.

Codes of Practice

The CPPA would introduce the concept of 'Codes of Practice'. In addition to the requirement for a privacy management program, the CPPA would allow private organisations to establish a 'code' and internal certification programs for complying with the CPPA, subject to the Privacy Commissioner's approval. Once approved, this 'code' would effectively establish the organisation's legal compliance obligations.


The AI Act

Bill C-27 also proposes to introduce the AI Act, specifically intended to address AI systems and data. The AI Act would require organisations or individuals responsible for AI systems to, among other things, assess the systems' potential to cause 'high-impact' risks, develop mitigation plans to reduce or eliminate those risks, publicly disclose when high-impact systems are being used, and notify the Minister of Innovation, Science and Industry when a system results, or is likely to result, in 'material harm'. The penalty and AMPs regime is similar to that set out under the proposed CPPA.

Coming into force

As Bill C-27 is currently at the second reading stage, there will likely be much more debate and potential amendments as the Bill makes its way through Parliament. International developments regarding the regulation of AI in particular are likely to have a material effect on subsequent changes to the AI Act, and any regulations enacted after it becomes law. In addition, there have been calls to remove the AI Act from Bill C-27 altogether and have it separately considered at committee.

Kirsten Thompson Partner
[email protected]
Dentons, Toronto

1. Available at:
2. Note that three Canadian provinces (British Columbia, Alberta, and Québec) also have private-sector privacy laws that exist alongside PIPEDA and apply to those activities (i.e., collection, use, and/or disclosure) which occur wholly within that province. On 22 September 2021, the Québec government adopted the Act to modernize legislative provisions as regards the protection of personal information, 2021, Chapter 25 ('Bill 64'), enacting significant changes to the requirements governing the use and protection of personal information under various statutes, including notably the Act Respecting the Protection of Personal Information in the Private Sector, CQLR c P-39.1 ('the Private Sector Law'). On 22 September 2022, the first set of requirements brought in by Bill 64 came into force in Québec; the remaining requirements will come into force in increments, in September 2023 and September 2024. Bill 64 makes significant amendments to Québec's Private Sector Law, which applies to Québec-based private-sector entities as well as out-of-province companies doing business involving the personal information of Québec residents. Bill 64 is similar to Bill C-27 in that it moves the Private Sector Law closer to the GDPR. Under the amended law, administrative monetary penalties of up to CAD 10 million or 2% of worldwide turnover of the preceding year may be imposed on companies that contravene certain provisions, and fines of up to CAD 25 million or 4% of worldwide turnover of the preceding year may be imposed for several new offences created by Bill 64. Furthermore, Bill 64 also provides for a private right of action for certain acts and omissions.