Digital Charter Implementation Act, 2022

An Act to enact the Consumer Privacy Protection Act, the Personal Information and Data Protection Tribunal Act and the Artificial Intelligence and Data Act and to make consequential and related amendments to other Acts

Status

In committee (House), as of April 24, 2023

Summary

This is from the published bill. The Library of Parliament has also written a full legislative summary of the bill.

Part 1 enacts the Consumer Privacy Protection Act to govern the protection of personal information of individuals while taking into account the need of organizations to collect, use or disclose personal information in the course of commercial activities. In consequence, it repeals Part 1 of the Personal Information Protection and Electronic Documents Act and changes the short title of that Act to the Electronic Documents Act. It also makes consequential and related amendments to other Acts.
Part 2 enacts the Personal Information and Data Protection Tribunal Act, which establishes an administrative tribunal to hear appeals of certain decisions made by the Privacy Commissioner under the Consumer Privacy Protection Act and to impose penalties for the contravention of certain provisions of that Act. It also makes a related amendment to the Administrative Tribunals Support Service of Canada Act.
Part 3 enacts the Artificial Intelligence and Data Act to regulate international and interprovincial trade and commerce in artificial intelligence systems by requiring that certain persons adopt measures to mitigate risks of harm and biased output related to high-impact artificial intelligence systems. That Act provides for public reporting and authorizes the Minister to order the production of records related to artificial intelligence systems. That Act also establishes prohibitions related to the possession or use of illegally obtained personal information for the purpose of designing, developing, using or making available for use an artificial intelligence system and to the making available for use of an artificial intelligence system if its use causes serious harm to individuals.

Elsewhere

All sorts of information on this bill is available at LEGISinfo, an excellent resource from the Library of Parliament. You can also read the full text of the bill.

Votes

April 24, 2023 Passed 2nd reading of Bill C-27, An Act to enact the Consumer Privacy Protection Act, the Personal Information and Data Protection Tribunal Act and the Artificial Intelligence and Data Act and to make consequential and related amendments to other Acts

November 2nd, 2023 / 5:15 p.m.



President, Privacy and Access Council of Canada

Sharon Polsky

I think each country wants to be the first. As was questioned earlier, is that the right choice? Canada is marching forward and pushing this through, but to what benefit and, more concerning, to what harm?

When it comes to the EU and the U.K., yes, they've given these issues thought and lots of consultation, but I think it's important not to consider these pieces of legislation in isolation, because on the one hand we have robust AI regulations coming out of the same country that just passed the euphemistically named “Online Safety Act”, which requires all content to be monitored, including yours, because the Internet is global.

How do we protect anything when AI is behind the scenes? AI is used in these buildings, in airports and in shopping centres. It's everywhere already.

Yes, they have a jump on Canada. Is it the right direction? It's certainly better than what we have in Bill C-27. There is no disagreement on that, whether from today's meetings or from many of your previous witnesses. We can look to our European counterparts. They are on a better path. That's about as generous as I can get right now.

Brad Vis Conservative Mission—Matsqui—Fraser Canyon, BC

Thank you.

Ms. Polsky, what in your mind are some of the biggest gaps in Bill C-27's protection of children, beyond the sensitive information that I have already raised?

Brad Vis Conservative Mission—Matsqui—Fraser Canyon, BC

Okay.

Bill C-27 does not include a definition of “sensitive information”, yet it provides that children's data would be treated as sensitive information. Do you think it's problematic that the government did not include a definition of sensitive information for both general purposes and specifically for children?

Brad Vis Conservative Mission—Matsqui—Fraser Canyon, BC

Thank you, Mr. Chair.

Mr. Hatfield, last year OpenMedia gave Bill C-27 a failing grade of D. Referring specifically to protections for children, how would you grade the protection of children in Bill C-27?

November 2nd, 2023 / 4:50 p.m.



President, Privacy and Access Council of Canada

Sharon Polsky

Yes, if I may.

I think it's a terrific idea if the law requires that the regulator and others be fully funded so that they can actually do the job they are tasked with doing, and if it's written into AIDA when it's split out from Bill C-27 and becomes its own bill, please, so that before AI products are allowed to be put on the market—I don't care from where in the world they are—they must go through basically a testing sandbox. It's not the self-interested vendor saying, “Don't worry your pretty little head; it's not biased.” It's an independent officer of Parliament whose office will identify and test the products—confidentially, with no secrets being divulged and no IP worries on behalf of the companies—so that, the same way any other product needs to be fit for purpose before it's released on the market, AI products must be as well.

Simon-Pierre Savard-Tremblay Bloc Saint-Hyacinthe—Bagot, QC

You said that AI must continue to be adopted responsibly, because it contributes to the prosperity of our people and our economies.

What needs to be incorporated into Bill C‑27 to make that happen?

Simon-Pierre Savard-Tremblay Bloc Saint-Hyacinthe—Bagot, QC

You talked about it. You said that artificial intelligence should be used responsibly and that it is a good tool for prosperity.

What needs to be included in Bill C‑27 so that we can promote the responsible adoption of AI?

Viviane LaPointe Liberal Sudbury, ON

Thank you, Mr. Chair.

Mr. Andrey, I would be interested in hearing your thoughts on Bill C-27 and its objectives to address online misinformation and online harm.

Simon-Pierre Savard-Tremblay Bloc Saint-Hyacinthe—Bagot, QC

According to Bill C‑27 as it currently stands, who should consumers turn to if they want to contest a decision made by an automated system or obtain clarification about that decision?

Simon-Pierre Savard-Tremblay Bloc Saint-Hyacinthe—Bagot, QC

You would therefore be in favour of adding provisions to Bill C‑27 similar to those adopted in Europe and Quebec.

Simon-Pierre Savard-Tremblay Bloc Saint-Hyacinthe—Bagot, QC

Thank you, Mr. Chair.

I'd like to thank the committee for having me here today, even though it's not a committee I usually sit on. It's a pleasure to be here.

I want to thank the witnesses for their presentations.

Mr. Konikoff, if you don't mind, I'd like to talk about automated decision systems. As we know, Bill C‑27 grants a new right, namely the right for an individual to receive an explanation about the use of these systems. However, unlike Quebec's Bill 25, Bill C‑27 does not contain provisions that would allow a person to object to the use of an automated decision system or to have a review of the decisions made by such a system.

In your opinion, what are the potential repercussions for consumers and users if Bill C‑27 does not include such provisions?

Tony Van Bynen Liberal Newmarket—Aurora, ON

Thank you.

Mr. Konikoff, in your brief to the committee, you recommended deleting proposed paragraph 18(2)(d) in the consumer privacy protection act, which provides an exception to consent for “any other prescribed activity.”

Conversely, in the brief of the Office of the Privacy Commissioner of Canada on Bill C-27, the Privacy Commissioner recommends amending this provision to require that all prescribed business activities for the purposes of proposed subsection 18(2) be activities necessary to achieve a specific purpose. What do you think of the Privacy Commissioner's recommendation?

Rick Perkins Conservative South Shore—St. Margarets, NS

In other words, it's not there. There's nothing in Bill C-27 that prevents this practice from continuing, where a business says it's changing the current terms and conditions of consent and complying with the law by posting it. If anyone challenges it—even though they'll never discover it, because they don't know it happened—the business can go through an appeal process and, under the CPPA's proposed subsection 18(3) in particular, say, “Too bad. We have the right because it's in our business's legitimate interest to do so.”

John Lawford Executive Director and General Counsel, Public Interest Advocacy Centre

Thank you, Chair.

The Public Interest Advocacy Centre is a national, non-profit and registered charity that provides legal and research services on behalf of consumers—in particular, vulnerable consumers. PIAC has been active in the field of consumer privacy law and policy for over 25 years.

My name is John Lawford. I'm the executive director and general counsel. With me today is Yuka Sai, staff lawyer at PIAC.

Bill C-27 reverses 25 years of privacy law in Canada. Businesses can now assume consent, and consumers must prove abuse. If this sounds uncomfortable from an individual rights perspective, that's because it is.

Firstly, with regard to consent, the new business activities exception to consent, in proposed subsection 18(1), makes it legal for a business to make full use of your personal information without your consent, or even your knowledge. Business activities are defined so widely and tautologically in proposed subsection 18(2) that only businesses will be able to define what a business activity is. It's ridiculous. Proposed section 18 completely reverses the default of an individual's informed consent for the collection or use of personal information under PIPEDA. Do Canadians really want that?

The addition of an exception to consent and knowledge in proposed subsection 18(3), for the collection or use of additional personal information for legitimate interests, is an import from European law but without the fundamental right to privacy that it modulates in Europe.

Secondly, with regard to de-identification, under proposed section 20, consumers also lose out on opportunities to scrutinize the use of their personal information when it is de-identified. De-identify is defined as:

to modify personal information so that an individual cannot be directly identified from it, though a risk of the individual being identified remains.

It is akin to saying that to kill means to take the life of a person directly, although a chance of their remaining alive remains. It is contradictory and meaningless.

De-identification was also clearly a “use” of personal information under PIPEDA. What that use approach stops is the indiscriminate filling of databases with personal information with only the most cursory removal of tombstone information identifiers from the data. Reidentification is therefore a real risk, but even de-identified information can harm individuals when they are profiled in databases that are then used to market to them or to deny them services. Bill C-27 supercharges this outcome.

Go ahead, Yuka.

Sharon Polsky President, Privacy and Access Council of Canada

Thank you.

Thank you for inviting me to share some views about Bill C-27 on behalf of the Privacy and Access Council of Canada, an independent, non-profit and non-partisan organization that is not funded by government or by industry.

Our members in public, private and non-profit sector organizations work with and assess new technologies every day, as have I through my 30-plus-year career as a privacy adviser. For that entire time, we have all heard the same promise: Technology will provide great benefits. To an extent, it has.

We’ve also been nudged to do everything digitally, and data is now the foundation of many organizations that collect, analyze and monetize data, often without the knowledge, much less the real consent, of the people the data is about.

It's understandable that there's great support for Bill C-27, except that many of the people who support it don't like it. They figure, though, that it's taken 20 years to get this much, and we can't wait another 20 for something better to replace PIPEDA, so it's better than nothing at all.

With respect, we disagree. We do not share the view that settling for the sake of change is better than standing firm for a law that, at its heart, would definitively state that Canadians have a fundamental right to privacy. The minister's concession to add that into the bill itself and not just the preamble is very welcome.

We disagree that settling for bad law is better than nothing, and Bill C-27 is bad law because it would undermine everyone's privacy, including children's—however they're defined in each jurisdiction. It also does nothing to counter the content regulation laws that would undermine encryption, would criminalize children who try to report abuse and would make it impossible for even your private communications to be confidential, whether you consent or not.

Definition determines outcomes, and Bill C-27 starts off by defining us all as “consumers” and not as individuals with a fundamental human right to privacy. It promotes data sharing to foster commerce, jobs and taxes. It adds a new bureaucracy that would be novel among data protection authorities and would delay individuals' recourse by years. It does not require AI transparency or restrict AI use by governments, only by the private sector that has not yet been deputized by government, which then gets sheltered by our current ATIP laws.

It won't slow AI and facial recognition from infiltrating our lives further. It won't slow the monetization of our personal information by a global data broker industry already worth more than $300 billion U.S. It doesn't impose any privacy obligations on political parties. It doesn't allow for executives to be fined—only organizations that then include the fine as a line item in their financials and move on, happy that their tax liabilities have been reduced.

Bill C-27 does allow personal information to be used for research, but it doesn't limit by whom or where in the world. Big pharma using your DNA to research new medicines without your consent is just fine if it's been de-identified, even though it can be easily reidentified, and larger and larger AI datasets make that more and more likely every day.

Bill C-27 would require privacy policies to be in plain language, and that would be great if it stated the degree of granularity required, but it doesn't. It allows the same vague language and generalities we now have, yet it still doesn't allow you to control what data about you may be shared or with whom, or give you a way to be forgotten.

It lets organizations collect whatever personal information they can from you and about you, without consent, as long as they say, in their self-interested way, that it's to make sure nothing about you is a threat to their “information, system or network security”, or if they say the collection and use “outweighs any potential adverse effect” on you resulting from that collection or use, and leaves it to you to find out about and to challenge that claim.

We've all heard industry's threat that regulation will hamper innovation. That red herring was invalidated when radio didn't kill newspapers, TV didn't kill radio and the Internet didn't kill either one. Industry adapted and innovated, and tech companies already do that with each new product, update and patch.

Companies that have skirted the edge of privacy compliance can adapt and innovate and can create things that, at their core, have a genuine respect for privacy, human rights, and sound ethics and morality. They can, but in almost half a century since computers landed on desktops, most haven't. Politely asking organizations to consider the special interests of minors is lovely but hardly compelling, considering that, 20 years after PIPEDA came into force, barely more than half of the Canadian companies the OPC surveyed have privacy policies or have even designated someone to be responsible for privacy.

Those are basic and fundamental components of a privacy management program that do not take 20 years to figure out. We don't have time to wait, but we also cannot afford legislation that is inadequate before it's proclaimed, that's not aligned with Quebec's Law 25, the U.S. executive order on AI or other jurisdictions that are well ahead of Canada on this. We also can't afford something that further erodes trust in government and industry as it freely trades away the privacy rights of Canadians for the sake of commercial gain.

I will be happy to answer your questions, and we will be detailing our views in a submission to the committee. I hope you hear us.