Digital Charter Implementation Act, 2022

An Act to enact the Consumer Privacy Protection Act, the Personal Information and Data Protection Tribunal Act and the Artificial Intelligence and Data Act and to make consequential and related amendments to other Acts

Sponsor

François-Philippe Champagne, Minister of Innovation, Science and Industry

Status

In committee (House), as of April 24, 2023


Summary

This is from the published bill. The Library of Parliament has also written a full legislative summary of the bill.

Part 1 enacts the Consumer Privacy Protection Act to govern the protection of personal information of individuals while taking into account the need of organizations to collect, use or disclose personal information in the course of commercial activities. In consequence, it repeals Part 1 of the Personal Information Protection and Electronic Documents Act and changes the short title of that Act to the Electronic Documents Act. It also makes consequential and related amendments to other Acts.
Part 2 enacts the Personal Information and Data Protection Tribunal Act, which establishes an administrative tribunal to hear appeals of certain decisions made by the Privacy Commissioner under the Consumer Privacy Protection Act and to impose penalties for the contravention of certain provisions of that Act. It also makes a related amendment to the Administrative Tribunals Support Service of Canada Act.
Part 3 enacts the Artificial Intelligence and Data Act to regulate international and interprovincial trade and commerce in artificial intelligence systems by requiring that certain persons adopt measures to mitigate risks of harm and biased output related to high-impact artificial intelligence systems. That Act provides for public reporting and authorizes the Minister to order the production of records related to artificial intelligence systems. That Act also establishes prohibitions related to the possession or use of illegally obtained personal information for the purpose of designing, developing, using or making available for use an artificial intelligence system and to the making available for use of an artificial intelligence system if its use causes serious harm to individuals.

Elsewhere

All sorts of information on this bill is available at LEGISinfo, an excellent resource from the Library of Parliament. You can also read the full text of the bill.

Votes

April 24, 2023 Passed 2nd reading of Bill C-27, An Act to enact the Consumer Privacy Protection Act, the Personal Information and Data Protection Tribunal Act and the Artificial Intelligence and Data Act and to make consequential and related amendments to other Acts

Matthew Hatfield Executive Director, OpenMedia

Good afternoon. I'm Matt Hatfield. I'm the executive director of OpenMedia, a grassroots community of nearly 300,000 people in Canada who work together for an open, accessible and surveillance-free Internet.

I'm speaking to you today from the unceded territory of the Tsawout, Saanich, Cowichan and Chemainus nations.

What is there to say about Bill C-27? One part is long-overdue privacy reform, and your task is closing its remaining loopholes and getting the job of protecting our data done. One part is frankly undercooked AI regulation that you should take out of Bill C-27 altogether and take your time to get right. I can't address both at the length they deserve. I shouldn't have to, but we are where the government has forced us to be, so let's talk privacy.

There are some great changes in Bill C-27. These include real penalty powers for the OPC and the minister's promised amendments to entrench privacy as a human right. OpenMedia hopes this change to PIPEDA will clearly signal to the courts that our ownership of our personal data is more important than a corporation's interest in profiting off that data, but any regulatory regime is only as strong as its weakest link. It does no good for Canada to promise the toughest penalties in the world if they're easy to evade in most real-world cases. The weaknesses of Bill C-27 will absolutely be searched for and attacked by companies wishing to do Canadians harm.

That's why it's critical that you remove the consent exceptions in Bill C-27 and give Canadians the right to ongoing, informed and withdrawable consent for all use of our data. While you're fixing consent, you must also broaden Bill C-27's data rules to apply to every non-governmental body. This includes political parties, non-profit organizations like OpenMedia and vendors that sell data tools to any government body. No other advanced democracy tolerates a special exception to respecting privacy rules for the same parties that write privacy law. That's an embarrassing Canada original, and it shouldn't survive your scrutiny of this bill.

Privacy was the happier side of my comments on Bill C-27. Let's talk AI.

I promise you that our community understands the urgency of putting some rules in place on AI. Earlier this year, OpenMedia asked our community what they hoped for and were worried about with generative AI. Thousands of people weighed in and told us they believe this is a huge moment for society. Almost 80% think this is bigger than the smartphone, and one in three of us thinks it will be as big as or bigger than the Internet itself. “Bigger than the Internet” is the kind of thing you're going to want to get right, but being first to regulate is a very different thing from regulating right.

Minister Champagne is at the U.K.'s AI safety conference this week, telling media the risk is in doing too little, not too much. However, at the same conference, Rishi Sunak used his time to warn that we need to understand the impact of AI systems far more than we currently do in order to regulate them effectively, and that no regulation will succeed if countries hosting AI development do not develop their standards in close parallel. That's why the participants of that conference are working through foundational questions about exactly what is at stake and in scope right now. It's an important, necessary project, and I wish them all success with it.

If they're doing that work there, why are we here? Why has this committee been tasked with jamming AIDA through within a critical but unrelated bill? Why is Canada confident that we know more than our peers about how to regulate AI—so confident that we're skipping the basic public consultation that even moderately important legislation normally receives?

I have to ask this: Is AIDA about protecting Canadians, or is it about creating a permissive environment for shady AI development? If we legislate AI first, without learning in tandem with larger and more cautious jurisdictions, we're not going to wind up with the best protections. Instead, we're positioning Canada as a kind of AI dumping ground, where business practices that are not permitted in the U.S. or the EU can be carried out here in rights-violating and even dangerous ways. I'm worried that this is not a bug, but rather the point—that our innovation ministry is fast-tracking this legislation precisely to guarantee Canada will have lower AI safety standards than our peers.

If generative AI is a hype cycle whose products will mostly underwhelm, then this is much ado about not much and there is no need to rush the legislation. However, if even a fraction of it is as powerful as its proponents claim, failing to work with experts and our global peers on best-in-class AI legislation is a tremendous mistake.

I urge you to separate AIDA from Bill C-27 and send it back for a full public consultation. If that isn't in your power, at the very least, you cannot allow Canada to become an AI dumping ground. That's why I urge you to make the AI commissioner report directly to you, our Parliament, not to ISED. A ministry whose mandate is to sponsor AI will have a strong temptation to look the other way on shady practices. The commissioner should be charged with reporting to you yearly on the performance of AIDA and on gaps that have been revealed in it. I also urge you to mandate parliamentary review of AIDA within two years of Bill C-27's taking effect, in order to decide whether it must be amended or replaced.

Since PIPEDA reform was first proposed in 2021, OpenMedia's community has sent more than 24,000 messages to our MPs demanding urgent comprehensive privacy protections. In the last few months, we've sent another 4,000 messages asking our Parliament to take the due time to get AIDA right. I hope you will hear us on both points.

Thank you, and I look forward to your questions.

Tim McSorley National Coordinator, International Civil Liberties Monitoring Group

Thank you, Chair, and thank you for the invitation to share the perspectives of the ICLMG today regarding Bill C-27.

We're a Canadian coalition that works to defend civil liberties from the impact of national security and anti-terrorism laws. Our concerns regarding Bill C-27 are grounded in this mandate.

While we support efforts to modernize Canadian privacy laws and establish AI regulations, the bill unfortunately contains multiple exemptions for national security purposes that are unacceptable and undermine Bill C-27's stated goal of protecting the rights and privacy of people in Canada.

We have submitted a written brief to the committee with 10 recommendations and accompanying amendments. I'd be happy to speak in more detail about any of these during the question period, but for now, I'd like to make three specific points.

First, in regard to the CPPA, we are opposed to proposed sections 47 and 48 of the act, which create exceptions to consent by allowing an organization to disclose, collect or use personal information if it simply “suspects that the information relates to national security, the defence of Canada or the conduct of international affairs”. This is an incredibly low threshold for circumventing consent.

Proposed section 48 is particularly egregious. It allows an organization, on its “own initiative”, to collect, use or disclose an individual's personal information if it simply suspects that the information relates to these three areas. The concern does not even need to be connected to a suspected threat. Again, it only needs to relate, and that term is not defined in the bill.

Not only are these sections very broad, but they're also unnecessary. Other sections of the law would allow for more targeted disclosure to government departments, institutions and law enforcement agencies. For example, proposed section 45 allows an organization to proactively divulge information if it “has reasonable grounds to believe”—a much higher threshold—“that the information relates to a contravention” of a law that has been, is being or will be committed. We contrast that “reasonable grounds to believe” threshold with simply suspecting that the information “relates”.

In that regard, we find proposed sections 47 and 48 unnecessary and overly broad. We propose, then, that proposed sections 47 and 48 simply be removed from the CPPA. Barring that, we've proposed specific language in our brief that would help to establish a more robust threshold for disclosing personal information.

Second, we're deeply concerned with the artificial intelligence and data act overall. In line with other witnesses, we believe it is a deeply flawed piece of legislation that must be withdrawn in favour of a more considered and appropriate framework. We have outlined these concerns in our brief, as well as in a joint letter shared with the committee and the minister, signed by 45 organizations and experts in the fields of AI, civil liberties and human rights.

AIDA was developed without appropriate public consultation or debate. It fails to integrate appropriate human rights protections. It lacks fundamental definitions. Egregiously, it would create an AI and data commissioner operating at the discretion of the Minister of Innovation, resulting in a commissioner with no independence to enforce the provisions of AIDA, as weak as they may be.

Finally, I'd like to address an unacceptable exception for national security that is found in AIDA as well.

Canadian national security agencies have been open regarding their interest in and use of artificial intelligence tools for a wide range of purposes, including facial recognition, surveillance, border security and data analytics. However, no clear framework has been established to regulate the development or use of these tools in order to prevent serious harm.

AIDA should present an opportunity to address this gap. Instead, it does the opposite in proposed subsection 3(2), where it explicitly excludes the application of the act to:

a product, service or activity that is under the direction or control of

(a) the Minister of National Defence;

(b) the Director of the Canadian Security Intelligence Service;

(c) the Chief of the Communications Security Establishment; or

(d) any other person who is responsible for a federal or provincial department or agency and who is prescribed by regulation.

This means that any AI system developed by a private sector actor that falls under the direction or control of this open-ended list of national security agencies would face absolutely no independent regulation or oversight.

It is difficult to conceive how such a broad exemption can be justified. Under such a rule, companies could create tools for our national security agencies without undergoing any assessment or mitigation of harm or bias, creating a human rights and civil liberties black hole. What if such technology were leaked, stolen or even sold to state or private entities outside of Canada's jurisdiction? All AI systems developed by the private sector must face regulation, regardless of their use by national security agencies.

Our brief includes specific examples of the harms that this lack of regulation can cause. I'd be happy to discuss these more with the committee. Overall, if AIDA does go ahead, we believe that proposed subsection 3(2) should simply be removed.

Thank you.

Daniel Konikoff Interim Director of the Privacy, Technology & Surveillance program, Canadian Civil Liberties Association

Good afternoon. Thank you for inviting us to appear before you today.

I am the interim director of the privacy, technology and surveillance program at the Canadian Civil Liberties Association, an organization that has been standing up for the rights, civil liberties and fundamental freedoms of people in Canada since 1964.

Protecting privacy and human rights in our tech-driven present is no small undertaking. We commend the government for trying to modernize Canada's legislative framework for the digital age, and we commend the work that this committee is doing to get this legislation right.

We also acknowledge the procedural hurdles that may make it challenging for us to speak completely to Bill C-27 and its potential amendments. However, I will highlight three amendments from CCLA's written submission that we believe must be adopted to make Bill C-27 more respectful of people's rights in Canada.

First, Bill C-27 does not give fundamental rights their due and frequently puts them in second place, behind commercial interests. It has been said before, but CCLA believes it is worth emphasizing: Bill C-27 must be amended to recognize privacy as a human right, both in the CPPA and in AIDA, since privacy is something that should be respected at all points throughout data's life cycle.

This bill must also be amended to recognize our equality rights in the face of data discrimination and algorithmic bias, risks that grow exponentially as more and more data is gathered and fed into AI systems that make predictions or decisions of resounding consequence.

Privacy, data and AI legislation the world over, such as that in the European Union, already has stronger rights-based framing and protections. Canada simply needs to catch up.

Second, there are concerning gaps in Bill C-27 around the issue of sensitive information. Sensitivity is a concept that appears often throughout the CPPA; however, it is left undefined, allowing private interests to interpret its meaning as they see fit. A lot of personal information does qualify as sensitive, and although information's sensitivity often depends on context, there are special categories of information whose collection, use and disclosure carry inherent and extraordinary risks.

I want to draw your attention to one category in particular, the collection and use of which have implications for both the CPPA and AIDA, and that is biometric data.

Biometric data is perhaps the most vulnerable data we have, and its abuse can be particularly devastating to members of equity-seeking groups. Look no further than the prevalence of facial recognition technology. Facial recognition is used everywhere from law enforcement to shopping malls, and it relies on biometric information that is often collected without people's awareness and without their consent. The Right2YourFace coalition, of which CCLA is a member, has advocated for stronger legislative safeguards with respect to facial recognition and the sensitive biometric data that fuels it. Bill C-27 must be amended not only to explicitly define sensitive information and its many categories but also to unequivocally define biometric information as sensitive information worthy of special care and protection.

Third and finally, we take issue with the number of consent carve-outs in proposed section 18 of the CPPA and how these can ultimately trickle down to AIDA. These carve-outs are, by and large, an affront to meaningful consent, and so to people's right to privacy. People should be able to meaningfully consent, or decline to consent, to how private companies gather and handle their personal data. Prioritizing a company's “legitimate interest” in bypassing consumer consent over people's privacy is simply inappropriate, as is leaving room for more consent carve-outs to be added in regulations later on. Bill C-27 is, frankly, porous with these exemptions and exceptions, and these gaps come at the expense of people's privacy.

There is no shortage of concerns around this bill, and I haven't really spoken to the issues that CCLA has with AIDA's narrow conception of harm, its lack of transparency requirements and its dangerous exclusions of national security institutions whose public mandates are often performed with privately acquired artificial intelligence technologies. We address these issues in greater depth in our written submission to the committee, but I'd be happy to expand on them in questioning.

I'd also like to direct the committee's attention to our written submission, which flags some of these concerns and includes an AI regulation petition that received over 8,000 signatures.

Bill C-27 overall needs tighter provisions to prioritize people's fundamental rights. The CPPA needs to plug its gaps around information sensitivity and consent, and if AIDA is not to be scrapped outright, reset or just separated from this bill, it needs fundamental rethinking.

Thank you.

The Chair Liberal Joël Lightbound

Good afternoon, everyone. I call this meeting to order.

Welcome to meeting no. 94 of the House of Commons Standing Committee on Industry and Technology.

Today's meeting is taking place in a hybrid format, pursuant to the standing orders.

Pursuant to the order of reference of Monday, April 24, 2023, the committee is resuming consideration of Bill C‑27, An Act to enact the Consumer Privacy Protection Act, the Personal Information and Data Protection Tribunal Act and the Artificial Intelligence and Data Act and to make consequential and related amendments to other Acts.

I'd like to welcome our witnesses today: Daniel Konikoff, interim director of the Privacy, Technology & Surveillance program at the Canadian Civil Liberties Association; Tim McSorley, national coordinator at the International Civil Liberties Monitoring Group; Matthew Hatfield, executive director of OpenMedia; Sharon Polsky, president of the Privacy and Access Council of Canada; John Lawford, executive director and general counsel at the Public Interest Advocacy Centre, who is joined by staff lawyer Yuka Sai; and Sam Andrey, managing director of The Dais at Toronto Metropolitan University.

Thank you for being here today.

I'm pleased that we are able to start on time.

Without further ado, Mr. Konikoff from Canadian Civil Liberties Association, you have the floor for five minutes.

November 1st, 2023 / 5:15 p.m.



Director, Policy and Research, Council of Canadian Innovators

Laurent Carbonneau

That's definitely a big question, and I don't think I have a complete answer. I'm not sure that anyone does.

What I would say is that, in a broad sense, I think countries that do well in AI are going to be the ones that are able to develop acceptance for AI adoption and use in societies, and I think that we will have to answer those questions in some format probably sooner rather than later.

We do have a bill before Parliament right now, Bill C-27, that implements a legislative framework for regulating AI. I think there's a lot of scope there, as that comes into force and the regulations are developed, to be quite sensitive to what the future of those kinds of issues looks like.

I will applaud some of CCI's other work here. We released a road map on responsible AI leadership in, I think, early September—time has blurred this fall, as I'm sure it has for many of you—that really gets into some of these issues around public trust.

I think one thing Parliament should strongly consider moving forward is creating a parliamentary science and technology officer who would perform a function analogous to that of the Parliamentary Budget Officer, and very similar to what the sadly now-defunct Office of Technology Assessment used to do for the U.S. Congress. It would give you as parliamentarians, and the public, timely, actionable information on emerging technology and science issues that would help inform a lot of these debates and give us all a level ground for understanding them.

I think that's the kind of social infrastructure, if you will, or parliamentary infrastructure that could play a very helpful function in addressing those kinds of issues and give us, I think, a better basis to do so.

René Villemure Bloc Trois-Rivières, QC

You're talking about adequacy. As we all know, adequacy is a form of social capital.

Unless we act or perhaps completely redo Bill C‑27, do you think that social capital would be threatened?

René Villemure Bloc Trois-Rivières, QC

Thank you, Mr. Chair.

Mr. Balsillie, you discussed the difficult interoperability with Europe in the context of Bill C‑27. Could you be more specific on that subject?

René Villemure Bloc Trois-Rivières, QC

That's very interesting. Thank you.

Ms. Fortin LeFaivre, do you think that Bill C‑27 should align with Quebec's legislation and that the latter should prevail?

October 31st, 2023 / 5:20 p.m.



Founder, Centre for Digital Rights

Jim Balsillie

Bill C-27 turbocharges surveillance capitalism. I talked to Shoshana last week, and we worked through this. She is coming here in February. This turbocharge is insane.

October 31st, 2023 / 5:20 p.m.



Founder, Centre for Digital Rights

Jim Balsillie

Yes, in Europe, for adequacy—and don't assume this bill will get adequacy in Europe—the two most sensitive types of information are children's information and political party information, which were not included in Bill C-27.

There's a minimum standard in British Columbia, and under the budget bill the political parties are claiming, in a judicial review right now, that they trump that standard, which would mean effectively no oversight whatsoever. It shows that you're playing with our democratic structures, our global adequacy and what is a constitutional realm for the provinces and the federal level here. I don't know for what purpose. I don't see anything wrong with raising an appropriate standard and then putting together the proper tool kit to look after the country we all love.

René Villemure Bloc Trois-Rivières, QC

As an ethicist, I am very happy to hear you refer to norms and values that must be in harmony in order to create this act, which would be exemplary.

You've previously spoken out about the fact that political parties aren't subject to Bill C‑27. Would you please clarify that view a little further?

October 31st, 2023 / 4:55 p.m.



Founder, Centre for Digital Rights

Jim Balsillie

To contrast it with Law 25 in Quebec and its effect on Quebec, I think the strategic approach of Bill C-27 will disproportionately harm Quebec, more than any other region in Canada, for several reasons.

Number one, when you commodify social relationships and cultural properties and they can be exfiltrated and exploited, you diminish the distinct society and its control within the province.

Second, when you create ambiguities or different thresholds between the federal and provincial levels, you'll naturally have lawyers go deep into exploiting the lower threshold. You're seeing that happen with federal-provincial party data, where they're saying that the feds control federal political data even though Law 25 says that's a provincial realm, but the position of the lawyers, in a judicial review happening in British Columbia now, is that it is not true.

Third, businesses will naturally arbitrage to the lowest-standard jurisdiction. Picture a river between Quebec and another province. If there's a strict environmental rule on the Quebec side of the river and a looser one on the other side, the business will go to the looser side, even though it's all the same river.

The best way to protect Quebec, Quebec society and the Quebec economy is to make sure that every aspect of this bill is equal or superior to the principles that are in Law 25, and currently that is not the case.

René Villemure Bloc Trois-Rivières, QC

Thank you very much.

Mr. Balsillie, in your view, what are the foreseeable effects of Bill C‑27 across jurisdictions, particularly in comparison with Quebec's Law 25?

Francesco Sorbara Liberal Vaughan—Woodbridge, ON

Mr. Balsillie, I had the pleasure of sitting with you when Mr. Breton, the European commissioner, was here. I believe it was last year or something like that. I think we were talking somewhat about the issues that we're talking about today.

The Europeans have been the first movers on a lot of aspects of the new economy or industrial revolution 4.0 or 5.0—whichever clichéd term we want to use. You have brought your views here in terms of what you think is wrong and why. I respect that, of course. We all do.

In terms of what Bill C-27 intends to do in relation to the modernization of privacy and how we deal with privacy and AI, are there aspects of the bill where we are going in the right direction? Is it just absolutely going in the completely wrong direction?

Francesco Sorbara Liberal Vaughan—Woodbridge, ON

Okay.

I would like to follow up. I believe the CMA commented that Bill C-27, as it stands, would enable small and medium-sized businesses to compete in the global economy. Can you elaborate on that? Bill C-27 is a pretty in-depth bill. I almost wish I had gone to law school to understand most of it, but we're trying to get through it. Could you comment on that aspect quickly?

Then I have a follow-up question for Mr. Balsillie, if I have time.