Digital Charter Implementation Act, 2022

An Act to enact the Consumer Privacy Protection Act, the Personal Information and Data Protection Tribunal Act and the Artificial Intelligence and Data Act and to make consequential and related amendments to other Acts

Sponsor

Status

In committee (House), as of April 24, 2023


Summary

This is from the published bill. The Library of Parliament has also written a full legislative summary of the bill.

Part 1 enacts the Consumer Privacy Protection Act to govern the protection of personal information of individuals while taking into account the need of organizations to collect, use or disclose personal information in the course of commercial activities. In consequence, it repeals Part 1 of the Personal Information Protection and Electronic Documents Act and changes the short title of that Act to the Electronic Documents Act. It also makes consequential and related amendments to other Acts.
Part 2 enacts the Personal Information and Data Protection Tribunal Act, which establishes an administrative tribunal to hear appeals of certain decisions made by the Privacy Commissioner under the Consumer Privacy Protection Act and to impose penalties for the contravention of certain provisions of that Act. It also makes a related amendment to the Administrative Tribunals Support Service of Canada Act.
Part 3 enacts the Artificial Intelligence and Data Act to regulate international and interprovincial trade and commerce in artificial intelligence systems by requiring that certain persons adopt measures to mitigate risks of harm and biased output related to high-impact artificial intelligence systems. That Act provides for public reporting and authorizes the Minister to order the production of records related to artificial intelligence systems. That Act also establishes prohibitions related to the possession or use of illegally obtained personal information for the purpose of designing, developing, using or making available for use an artificial intelligence system and to the making available for use of an artificial intelligence system if its use causes serious harm to individuals.

Elsewhere

All sorts of information on this bill is available at LEGISinfo, an excellent resource from the Library of Parliament. You can also read the full text of the bill.

Votes

April 24, 2023 Passed 2nd reading of Bill C-27, An Act to enact the Consumer Privacy Protection Act, the Personal Information and Data Protection Tribunal Act and the Artificial Intelligence and Data Act and to make consequential and related amendments to other Acts

November 20th, 2023 / 5:30 p.m.

President, Privacy and Access Council of Canada

Sharon Polsky

I wouldn't go so far as saying it's fraudulent, but it's perhaps misleading. It's permitted under the current legislation and under Bill C-27, which is going to maintain the status quo of the same vague consent. That's not going to improve privacy.

René Villemure Bloc Trois-Rivières, QC

You said earlier that, despite the efforts expended on C‑27, it does nothing to protect us from those kinds of invasions of privacy. Is that not correct?

November 20th, 2023 / 5:15 p.m.

President, Privacy and Access Council of Canada

Sharon Polsky

As lawmakers, one thing you could do is not enact Bill C-27, because that's not going to make it better; it's going to make it worse.

What can we do? Is PIPEDA a comfort? No, it is not, because it's not sufficient, as Jennifer Stoddart said when she was in the final days of her role as commissioner. It could use some more teeth. How many years ago was that? It still needs some more teeth. Sure, Canadian organizations are responsible for the proper collection, use, disclosure and all the rest of it under PIPEDA, but when the information goes offshore, they lose control of it. We as Canadians have no recourse when our information is in a foreign nation and goes into the wind, or when we see things that breach our privacy, whether from Equifax, Meta, Google or any other organization.

One commission or another somewhere in the world hammers them with a multimillion-dollar fine, or hundreds of millions of dollars as a fine. They put it in their financial report as a line item, and it reduces their tax liability. That's sweet; on to the next. That's all. It's lunch money to them, and it's the company that pays, not any individual.

Sharon Polsky President, Privacy and Access Council of Canada

Thank you very much.

Thank you for inviting me to share some views about whether, and how, social media can undermine privacy, safety, security and democracy.

I am Sharon Polsky, president of the Privacy and Access Council of Canada, which is an independent, non-profit, non-partisan organization that is not funded by government or industry. It has members in the public and private sectors who routinely use social media in their personal and professional lives.

Many can recall when Google mail was introduced. It was a brilliant marketing manoeuvre that preyed on human nature. Only the chosen few who were selected to have an account could have one. The invitation accorded those few people special status among their peers. This tactic and the media attention created demand. There was no talk about downsides, risk or privacy. People just wanted to have that Google account. It was simple psychology that showed how easily people can be manipulated.

Since then, we have seen countless examples of big tech manipulating us to share the most intimate details of our existence online. Social media continues to leverage human nature, and the lucrative data broker industry is the biggest beneficiary, other than those who would manipulate us for their own benefit, whether they're companies, political parties or governments. With recent geopolitical events, it's easy to think that what people post to social media might be used to coerce, extort or manipulate, but crediting social media alone, or social media from one country or another, is short-sighted.

Online risks reflect society and come from many sources, including familiar communication and collaboration tools that many in this room probably use most days. Every one of them is a real and constant threat. Zoom, Teams, Slack, Facebook and the rest are all foreign.

It's no secret that many companies scrape data and justify their actions by saying they consider the information to be public because their AI systems were able to find it on the web. Maybe the secure location where you posted personal or confidential information, or the Ontario hospital you visited recently, has been breached and now your health condition or sensitive conversations are being sold on the dark web.

If the concern is that people who use social media might disclose information that could make them politically sensitive and at greater risk of being influenced, I look to the recording we hear every time we call our cellphone provider or most other companies that says, “This call will be recorded for training”, which typically means the training of artificial intelligence systems through machine learning. The human side of that training is done in countries around the world by individuals who have access to your sensitive information.

A Finnish tech firm recently started using prison labour to do data labelling. It goes on and on. We have no choice whether the labelling is done by someone in Alberta or in Albania. There is no control over it and there is nothing stopping a company or a government from purchasing information, because it is available largely through the data broker system. It is widely available internationally. I could go on and on.

Yes, certainly education is important. Computers have been on desktops for almost half a century. The education is not there yet, as we see big tech investing tens of billions of dollars a year in objecting to and undermining efforts to regulate the industry, with the claim that it will undermine innovation. It's a red herring that's been disproven many times throughout history.

We see dating sites that people use routinely, which are wonderful for a social life, but when things like the Canadian dating site Ashley Madison are breached, I dare say that many of their customers become politically sensitive.

If children or adults go on any website, whether it's for mental health, addiction or medical counselling, the fact that they have been there has usually already been secretly transmitted to the likes of Facebook and data brokers before they even see the results.

This isn't something Bill C-27 is going to fix, or any of the other legislation. In fact, most of the laws being introduced here and abroad will make the situation much worse for everybody, including children—especially children.

I am happy to take your questions. This is a massive endeavour, and I commend you all.

The Chair Liberal Joël Lightbound

I don't think this will be settled in 15 minutes, however optimistic you are, Mr. Perkins, so if I have your consent, colleagues, I will thank the witnesses for joining us today.

If you wish to hear us debate this motion, you may very well stay. You're more than welcome. However, if you want to leave, know that your testimony has been appreciated, and if there are things that you want to submit to committee members, please do so via the clerk. The documents will be revised as we continue the study of Bill C-27.

Thank you very much.

The Chair Liberal Joël Lightbound

Mr. Généreux has moved a motion that deals with Bill C-27, which is before the committee today, so the motion is in order.

The motion is up for debate.

Go ahead, Mr. Lemire.

Bernard Généreux Conservative Montmagny—L'Islet—Kamouraska—Rivière-du-Loup, QC

I'm going to turn to the other witnesses now.

Did any of you participate in the consultations on Bill C-11 or the bill the committee is currently studying, Bill C-27? Please nod your head if you did.

I see that no one was consulted. All right.

In light of what we've seen since we began our study a few weeks ago, no one seems to have been consulted, but the Minister of Innovation, Science and Industry says that 300 individuals and organizations were consulted after the bill was introduced. I'd like to find those individuals and organizations. I don't know where they are.

In a moment, I'll be giving notice of a motion, but I'd like to ask you a question, first, Ms. Piovesan.

Mr. Balsillie appeared before the committee, and I'm sure you read his remarks. He likened the bill to a bucket that has holes. What witnesses have told us so far seems to suggest that the bucket basically has no bottom. That's what it seems like.

You talked about the fact that the committee has heard opposing views from witnesses. Take the tribunal, for instance. Some suggested getting rid of it because we didn't need it, while others argued the opposite, that having a tribunal in the sector was important.

Given how far apart on the spectrum people's views are, do you think the bill should have been split from the beginning? We've heard from the start that the bill is almost monstrous, that it's too big, that the privacy piece and the AI piece should have been dealt with separately.

What do you think?

November 9th, 2023 / 5:05 p.m.

Co-founder and Partner, INQ Law, As an Individual

Carole Piovesan

Okay.

I participated in the national consultations on data and digital literacy, I think it was, in 2018. I participated as an innovator—as one of the innovation leads.

I did not participate in the drafting of the digital charter, nor in the white paper to reform PIPEDA that came out at that time. I have not participated in the drafting of any of these laws, neither Bill C-11 nor Bill C-27.

Bernard Généreux Conservative Montmagny—L'Islet—Kamouraska—Rivière-du-Loup, QC

Thank you, Mr. Chair.

Thank you to the witnesses as well.

Today's discussion is fascinating. I am very interested in what you have to say.

Ms. Piovesan, if I understood correctly, you helped draft Bill C-11, the predecessor to the bill before us today, Bill C-27.

Sébastien Lemire Bloc Abitibi—Témiscamingue, QC

Thank you, Mr. Chair.

Over the past few months, Ms. Piovesan, in our role as MPs, we've held a number of meetings with businesses that operate in Quebec, including small and medium-sized Quebec start-ups.

Since AIDA contains little in the way of detail yet imposes criminal liability on companies that use high-impact systems, you called Bill C‑27 an advanced draft in a podcast. You raised the issue of the criminality component.

Can you explain what the bill is missing, and why that undermines how confident and comfortable businesses are operating both in Quebec and in Canada? How should Bill C‑27 be clarified to take it from a draft bill, as you put it, to a real one?

November 9th, 2023 / 4:55 p.m.

Partner, Davies Ward Phillips & Vineberg LLP, As an Individual

Alexander Jarvie

Yes. If we were to undertake the suggestion to combine or generalize proposed sections 35 and 39, to make it a bit more like the framework in Quebec's Law 25, which begins at section 21 of that law, then it would involve what is styled in that law as a "privacy impact assessment". That isn't a concept that figures, as such, in Bill C-27, but I think it's been discussed to some extent at this committee already. It's been broadly outlined. It's understood. You're examining the disclosure in this case, or the collection.

I suggest that after seeing what kind of privacy impact it has, you do a proportionality analysis and many other things besides that. If an agreement is entered into between the parties to the exchange, it should have certain contractual assurances around how the information is to be handled throughout its life cycle for this purpose. Finally, notice should be given to the commissioner.

As I said, in Quebec's case, you actually submit the agreement to the commissioner, and then you can activate or operationalize that agreement only after 30 days, giving the Quebec commissioner time to respond, presumably. Once the commissioner has notice, they can of course simply request the agreement. They can request the privacy impact assessment and undertake any other steps. The important thing is to provide notice, so that the commissioner is aware.

November 9th, 2023 / 4:55 p.m.

Partner, Davies Ward Phillips & Vineberg LLP, As an Individual

Alexander Jarvie

I agree with many of the other witnesses here today in supporting a change to the definition of anonymization to align more with Quebec's definition and maintain some interoperability there. Given the way it's drafted now, it's an impossible standard.

The other—and I'll make reference to my opening remarks—would be to change the consent exception framework for public interest purposes. That includes proposed sections 35 and 39. I think, in this regard, we could take some inspiration from Law 25, which inserted a new framework for disclosures by private sector entities to other private sector or public sector entities. That includes undertaking a privacy impact assessment and entering into an agreement with the other party. In the case of Quebec, it's actually submitting the agreement to the Commission d’accès à l’information. In the case of Bill C-27, it's adapting the language from proposed paragraph 35(c), which suggests notice to the commissioner at the very least.

In addition to allowing for information exchanges among private sector entities, which could be beneficial, I think it could also be extended to include taking information from the public Internet. As we know, machine learning technologies, in many cases, can benefit from having access to this, provided that some appropriate guardrails are in place, as suggested.

Brad Vis Conservative Mission—Matsqui—Fraser Canyon, BC

Would it be your position that we should adopt a definition of sensitive information that is similar to the Quebec law and include it in Bill C-27?

Brad Vis Conservative Mission—Matsqui—Fraser Canyon, BC

Thank you.

You touched upon the GDPR in some of your comments as well. This question relates to a debate that's starting to form—we haven't really touched on it too much—between privacy by design and.... Unlike the European Union's GDPR, the CPPA does not contain an explicit reference to the concept of privacy by design.

In the Office of the Privacy Commissioner of Canada's submission on Bill C-27, the commissioner recommends that the CPPA require organizations to implement privacy by design measures for a product, service or initiative from the earliest stages of development.

During their appearance before the committee, however, government representatives indicated that several elements of the CPPA, such as the fact that it requires organizations to develop a privacy management program, mean that the concept of privacy by design is already embedded in the legislation.

Do we need something similar to the GDPR, where it's explicitly stated, or is the current approach of privacy management as contained in proposed section 9 going to work okay?

Brian Masse NDP Windsor West, ON

Thank you, Mr. Chair.

To continue, I think it was Mr. Young who mentioned Bill C-27 and Quebec's Law 25. Can you give us a little more background as to why it's important to have consistency there?

Also, potentially, could we inadvertently cause some damage to Quebec with regard to this bill if we don't handle this properly? I'm worried. We're looking at neutrality for Quebec at the very least, I think, as an objective, but I'm also worried about inadvertently damaging their system right now.

Perhaps you could start us off on that conversation.