Bill C-27, or the Digital Charter Implementation Act, was introduced by Innovation Minister François-Philippe Champagne in the House of Commons on Thursday.
In introducing the bill, the Liberal government says it would modernize Canada's privacy framework and introduce new rules for the development and deployment of AI. It would also ensure Canadians have confidence that their information and AI are used respectfully, while “unlocking innovation that promotes a strong economy.”
A previous effort to update Canadian privacy regulations, Bill C-11, was introduced in late 2020, but was not passed before the 2021 election was called and died when Parliament was dissolved (it is also not to be confused with the Bill C-11 making its way through its final stages in the current Parliament, which aims to regulate online content in a way similar to broadcast content).
Bill C-27 would create three new acts to regulate businesses’ use of personal data: the Consumer Privacy Protection Act, the Personal Information and Data Protection Tribunal Act and the Artificial Intelligence and Data Act.
The first two acts were part of the former Bill C-11. The Consumer Privacy Protection Act governs how personal data is collected, used and disclosed in conducting business activities. The Personal Information and Data Protection Tribunal Act establishes a tribunal that can rule on violations and gives the office of the Privacy Commissioner the ability to levy penalties.
How and when businesses are to get consent
Like privacy regulations in other jurisdictions, a big part of the Consumer Privacy Protection Act is the idea of consent for collection of personal data by businesses.
After the former Bill C-11 was introduced, former privacy commissioner Daniel Therrien criticized the proposed regulations, calling them a “step back” from existing laws, as they gave Canadians less control over how their personal data was used. This was, Therrien said, due to the rules around obtaining clear consent for data collection not requiring that Canadians have a clear understanding of what they were agreeing to, and merely prescribing what might appear in a privacy notice.
The bill’s efforts to balance privacy controls with not wanting to “hinder innovation” gave commercial interests more weight than the need for Canadians to have control over their data, Therrien said last year. This was due to vague exceptions to consent obligations that put consumer privacy at risk.
Under Bill C-27, consent can only be obtained if an organization has given the individual a description of why the personal information is being collected and used, how it is being collected and used, what exactly is being collected, the potential consequences of its collection and use, and any third parties the information may be disclosed to.
This information must be provided in plain language, and consent must be obtained before or at the time when the information is collected. Any time previously collected data is to be used for a purpose other than the one an individual agreed to, the organization must obtain consent again, and individuals must be allowed to withdraw their consent at any time.
However, Bill C-27 does lay out a number of exceptions to obtaining consent and situations where “implied consent” applies. Many of them are related to situations of emergencies, safety and the public interest, but there are also several exceptions that apply to business activities.
A “business activity” is an activity that is necessary to provide a product or service an individual has requested, or that is necessary for the organization’s information security or the safety of a product or service that the organization provides.
Implied consent is not to be used for business activities, but data can be collected without consent if “a reasonable person” would expect it to be collected or used, and the information is not used for the purpose of influencing the individual’s behaviour or decisions. Those provisions also apply if an organization has “a legitimate interest” that outweighs potential adverse effects on an individual, though the organization is obligated to identify any such effects and take reasonable measures to reduce the likelihood that they will occur.
Exceptions to consent also exist for when an organization transfers an individual’s personal information to a service provider. If the information is de-identified, it can be collected and used without consent for internal research, or for completing a business transaction. De-identification is not required if the information is used only for the purposes of the transaction, is necessary for the transaction to go through, is protected by adequate security safeguards, and is disposed of if the transaction does not proceed.
Using “de-identified” data
Organizations can also collect data if it is “de-identified,” though there are also strict guidelines around the use of this kind of data.
Under the new Act, this data cannot be used to “identify an individual person,” unless it is being done to test the effectiveness and accuracy of security safeguards and prediction models, as well as to comply with orders under the law. De-identified information can be used in other contexts, so long as an individual person is not identified.
The Act also draws a clear definition between “de-identified” and “anonymized” data. De-identified data is information that has been modified so that an individual cannot be directly identified from it, though some ability to trace it back to them remains, such as through indirect means. Anonymized data is information that has been “irreversibly and permanently” modified so that there is no ability to trace it back to an individual through any means.
The Act does not apply to fully anonymized personal information, except to establish that organizations must have clear measures in place when it comes to how the information is made anonymous and used.
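As a rough illustration of the distinction the Act draws (a sketch only, with made-up records, and not a real compliance technique), de-identified data retains some link, however indirect, back to a person, while anonymized data destroys that link entirely:

```python
# Illustrative sketch of "de-identified" vs "anonymized" data under Bill C-27.
# The records and token scheme here are hypothetical examples.

records = [{"name": "Alice", "city": "Ottawa"}, {"name": "Bob", "city": "Toronto"}]

# De-identified: direct identifiers are replaced with tokens, but a lookup
# table still exists, so tracing a record back to an individual remains
# possible through indirect means.
lookup = {}
deidentified = []
for i, rec in enumerate(records):
    token = f"user-{i}"
    lookup[token] = rec["name"]  # re-identification remains possible
    deidentified.append({"id": token, "city": rec["city"]})

# Anonymized: identifiers are dropped irreversibly and no lookup table is
# kept, so no means remains to trace a record back to an individual.
anonymized = [{"city": rec["city"]} for rec in records]
```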
Adding AI to the mix
The Artificial Intelligence and Data Act proposed in Bill C-27 is new. It aims to ensure “high-impact” AI systems are developed in a way that mitigates the risk of harm and bias, while also outlining penalties for when AI systems unlawfully obtain data or are used in a “reckless” way. It also creates an AI and Data Commissioner to monitor compliance and conduct audits.
A “high-impact system” is not expressly defined in the bill; instead, the bill gives the Governor in Council the ability to establish that criteria. However, the definition of “harm” when it comes to AI includes physical or psychological harm, damage to an individual’s property, and economic loss to an individual.
Penalties for non-compliance
The lack of an ability for the Privacy Commissioner to impose penalties has been lamented by privacy groups and Therrien, most recently in the case of Tim Hortons. Though the previous Bill C-11 would also have given the commissioner this power, Therrien said the list of contraventions eligible to be fined was “narrow” and would leave out the “most frequent and important” violations.
Under Bill C-27, there are two levels of violations: ones that carry a maximum fine of $10 million or 3% of gross global revenue (whichever is greater), and others that carry a maximum fine of $25 million or 5% of gross global revenue (again, whichever is greater). The government pointed out these are among the strongest fines among G7 countries. By comparison, the EU’s GDPR carries a maximum fine of 20 million euros or 4% of global annual revenue for its most serious violations.
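To see how the “whichever is greater” rule plays out, here is a hypothetical calculation (the revenue figures are invented examples, not from any real case):

```python
# Hypothetical illustration of Bill C-27's two-tier maximum fines.
# The caps are "whichever is greater" of a flat amount and a revenue share.

def lower_tier_cap(gross_global_revenue: float) -> float:
    """Greater of $10 million or 3% of gross global revenue."""
    return max(10_000_000, 0.03 * gross_global_revenue)

def higher_tier_cap(gross_global_revenue: float) -> float:
    """Greater of $25 million or 5% of gross global revenue."""
    return max(25_000_000, 0.05 * gross_global_revenue)

# For a company with $1 billion in gross global revenue, the percentage
# exceeds the flat floor: the caps work out to $30 million and $50 million.
revenue = 1_000_000_000
print(lower_tier_cap(revenue))   # 3% of $1B
print(higher_tier_cap(revenue))  # 5% of $1B
```

For a smaller company, say one with $100 million in revenue, 3% would be only $3 million, so the $10 million floor would apply instead.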
The lower tier of punishment is for those who fail to establish measures to manage and report on their use and monitoring of data, including failing to publicly report, in plain language, how AI systems are intended to be used and the output they generate.
The higher tier can be levied against those who use de-identified information to identify an individual person, whose AI systems cause harm, who possess or use data that was obtained in a way that contravenes the law, or who fail to report or record breaches of the Act.