Digital Charter Implementation Act, 2020

An Act to enact the Consumer Privacy Protection Act and the Personal Information and Data Protection Tribunal Act and to make consequential and related amendments to other Acts

This bill was last introduced in the 43rd Parliament, 2nd Session, which ended in August 2021.

Sponsor

Navdeep Bains Liberal

Status

Second reading (House), as of April 19, 2021
(This bill did not become law.)

Summary

This is from the published bill. The Library of Parliament often publishes better independent summaries.

Part 1 enacts the Consumer Privacy Protection Act to protect the personal information of individuals while recognizing the need of organizations to collect, use or disclose personal information in the course of commercial activities. In consequence, it repeals Part 1 of the Personal Information Protection and Electronic Documents Act and changes the short title of that Act to the Electronic Documents Act. It also makes consequential and related amendments to other Acts.
Part 2 enacts the Personal Information and Data Protection Tribunal Act, which establishes an administrative tribunal to hear appeals of certain decisions made by the Privacy Commissioner under the Consumer Privacy Protection Act and to impose penalties for the contravention of certain provisions of that Act. It also makes a related amendment to the Administrative Tribunals Support Service of Canada Act.

Elsewhere

All sorts of information on this bill are available at LEGISinfo, an excellent resource from the Library of Parliament. You can also read the full text of the bill.

Bernard Généreux Conservative Montmagny—L'Islet—Kamouraska—Rivière-du-Loup, QC

Thank you, Mr. Chair.

Thanks to the witnesses. Welcome to the great Liberal darkness club. This makes me feel like a dog chasing its tail. I use that metaphor because I just saw Ms. McPherson's dog on the screen.

We are all here to discuss a bill that, as Mr. Champagne announced to us three weeks ago, would be subject to eight amendments, some of which would be major.

Mr. Balsillie, earlier you said that Mr. Bains consulted you at the time about Bill C-11 and that you had made recommendations. The current minister, Mr. Champagne, tells us he has consulted 300 organizations and experts.

Ms. Vipond, you clearly weren't in that group. At any rate, many of the witnesses here probably weren't in the consulted group, since they're asking us today to hold more consultations and to make them permanent and ongoing as the bill evolves.

Mr. Balsillie, almost all the comments you've made on this bill thus far have been negative. Can you see anything anywhere in this bill that might be positive, or do you think we should simply toss it out and start over?

Based on what we have before us today, I think we've confused “privacy” with “artificial intelligence”. These are two completely different things, but we're putting everything in the same basket.

We would've liked to hear what you had to say about artificial intelligence. I'm convinced you would have liked to talk to us about that at greater length as well. So allow me to give you the floor.

October 31st, 2023 / 4:45 p.m.


Jim Balsillie Founder, Centre for Digital Rights

Not at all.

I have a bit of an advantage over everyone here in that I was in the small meeting where then Minister Bains and then Deputy Minister Knubley presented the original Bill C-11. They said that they were approaching this as some kind of balance, and I said, “Who concocted this concept of a trade-off between the two?” They, in fact, reinforce each other. It's a false dichotomy.

Bernard Généreux Conservative Montmagny—L'Islet—Kamouraska—Rivière-du-Loup, QC

Thank you, Mr. Chair.

I would like to thank my colleague for sharing his time with me.

I'd like to thank all the witnesses for being here.

Dr. Geist, I'd like to talk about the way this bill, formerly Bill C‑11, has been presented over the past two years. We know that amendments were requested and that the minister didn't really listen, because the new version is no better. So here we are, 18 months later, and you are having to testify about this bill.

During this whole process, which is set to last several months, we will be meeting with about 100 witnesses. How do you feel about this process, when we haven't had access to the eight amendments put forward by the minister, other than the few lines we've been able to get so far? I'm asking because you talked about this earlier.

I'd like you to speak as a witness. I'm not necessarily asking you to speak on behalf of others, but at the very least I'd like people to understand the process we are currently in, which I consider to be skewed. How can you or any of the witnesses who will appear possibly give your opinions on the content of a bill without access to the amendments?

Brad Vis Conservative Mission—Matsqui—Fraser Canyon, BC

Thank you, Mr. Chair.

Thank you to our witnesses here today.

I'm somewhat concerned about this bad bill before us today.

With Bill C-11, the Government of Canada had an opportunity to enshrine the fundamental right to privacy for children, to define what a minor is, to define perhaps an age of consent and do a whole bunch of stuff to ensure that children were protected. That bill died on the Order Paper.

Then, we had Bill C-27 when this Parliament opened up again. The minister again had an opportunity to enshrine the fundamental right for children to protect their privacy in some of the actions they may take online. Then the government had the opportunity to define what sensitive information is—likely in the context of a child. They had an opportunity to define what a socially beneficial purpose was in the context of a child.

The minister came before us a few weeks ago. He said, “I have this bill. It's going to do so much work to protect children, but we have to amend it.” Then we had to put a motion forward to get a copy of those amendments. We're here today. I am not going to relent on this until we have more clarification and I hear from as many witnesses as possible to ensure that children's rights are protected.

My question is open-ended. I'll start with you, Mr. Geist. What clauses of the bill do you believe need to be amended to ensure that a child's fundamental right to privacy and their online actions are not used in a way that will compromise them as adults, or at a future point in their lives?

Sébastien Lemire Bloc Abitibi—Témiscamingue, QC

Thank you, Mr. Chair.

I'd like to thank all the witnesses.

Mr. Bennett, in your February 12, 2021, submission to the public consultations on Bill C‑11, you distinguished between the concepts of interoperability and harmonization. I believe this is particularly germane to the subject before us, because these two concepts can be confused. You showed the difference between the two with an example I'd like to quote:

For instance, the processes for doing PIAs should be interoperable between the federal government and the provinces. If an organization does a PIA under the authority of one law, it may need the assurance that the PIA will also be acceptable in another jurisdiction. But that does not necessarily mean the harmonization or convergence of rules.

First, can you provide us with a definition of these two distinct concepts?

Second, can you tell us whether the provisions of Bill C‑27 promote the interoperability of processes among the various levels of government or rather the harmonization of rules?

October 26th, 2023 / 4:20 p.m.


Dr. Brenda McPhail Acting Executive Director, Master of Public Policy in Digital Society Program, McMaster University, As an Individual

I think there will always be differences of opinions as to whether definitions are sufficiently stringent or overly weak.

What would address our concerns? There are three categories of concerns that we have around de-identified and anonymized information. The first is that the definition has been weakened between Bill C-11 and the current iteration, Bill C-27. The past definition included indirect identifiers. You can identify me by my name, but you can also identify me if you have a combination of my postal code, my gender and a few other factors about me. To truly de-identify information to an adequate standard where re-identification is unlikely, I believe—and my co-submitters believe—that the definition should include indirect identifiers.

To some degree, that definition has been weakened because Bill C-27 includes the addition of a new category of information: anonymized information. The problem with that new category is that technical experts agree it is extremely difficult to achieve perfect and effective anonymization, and by taking anonymized information out of the scope of the bill, we remove the ability of the Office of the Privacy Commissioner of Canada to inspect the processing that has happened and to ensure that it has been done to a reasonable standard.

Like some of the witnesses you heard from—who would disagree with me about whether or not definitions should be stronger or weaker—I think we all agree on the reality that when personal information is processed, whether it is used to create de-identified information or anonymized information, there should be some checks and balances to make sure that the companies doing it are doing it to a reasonable standard that is broadly accepted. The way to achieve that is by including the ability within the bill for the Office of the Privacy Commissioner to inspect that processing and give it a passing grade, should that be necessary.

The last piece of concern we have with anonymization, which makes that scrutiny even more important, is that the bill conflates anonymization with deletion. It was introduced to great fanfare when this bill was put forward that individuals would now have a right to request deletion of their personal information from the companies with which they deal.

That right, I believe, is rendered moderately illusory. Certainly members of the public would not expect that if they ask for their information to be deleted, an organization could say, yes, they'll do that, and then simply anonymize the information and continue to use it for their own purposes. If we are going to allow anonymized information to be equivalent to deletion, again, it's incredibly important that we are 100% certain that the equivalency is real and valid, that truly no individual can be identified from that information and that it's not going to harm them in its use after they've explicitly exercised their right to ask for deletion.

October 26th, 2023 / 4:10 p.m.


Dr. Teresa Scassa Canada Research Chair in Information Law and Policy, Faculty of Law, Common Law Section, University of Ottawa, As an Individual

Thank you.

I have concerns about both the CPPA and the AIDA. Many of these have been communicated in my own writings and in the report submitted to this committee by the Centre for Digital Rights. My comments today focus on the consumer privacy protection act. I note, however, that I have very substantial concerns about the AI and data act, and I would be happy to answer questions on that, as well.

Let me begin by stating that I am generally supportive of the recommendations of Commissioner Dufresne for the amendment of Bill C‑27, as set out in his letter of April 26, 2023 to the chair of this committee.

I will address three other points.

The minister has chosen to retain consent as the backbone of the CPPA, with specific exceptions to consent. One of the most significant of these is the “legitimate interest” exception in proposed subsection 18(3). This allows organizations to collect or use personal information without knowledge or consent if it is for an activity in which an organization has a legitimate interest. There are guardrails: The interest must outweigh any adverse effects on the individual; it must be one that a reasonable person would expect; and the information must not be collected or used to influence the behaviour or decisions of the individual. There are also additional documentation and mitigation requirements.

The problem lies in the continuing presence of “implied consent” in proposed subsection 15(5) of the CPPA. PIPEDA allowed for implied consent because there were circumstances where it made sense and there was no legitimate interest exception. However, in the CPPA, the legitimate interest exception does the work of implied consent. Leaving implied consent in the legislation provides a way to get around the guardrails in proposed subsection 18(3). An organization can opt for the implied consent route instead of legitimate interest. It will create confusion for organizations that might struggle to understand which is the appropriate approach. The solution is simple: Get rid of implied consent. I note that implied consent is not a basis for processing under the GDPR. Consent must be expressed, or processing must fall under another permitted ground.

My second point relates to proposed section 39 of the CPPA: an exception to an individual's knowledge and consent where information is disclosed to a potentially very broad range of entities for “socially beneficial purposes”. Such information need only be de-identified—not anonymized—making it more vulnerable to re-identification. I question whether there is social licence for sharing de-identified rather than anonymized data for these purposes. I note that proposed section 39 was carried over verbatim from Bill C-11, when “de-identified” was defined to mean what we now understand as anonymized. Permitting disclosure for socially beneficial purposes is a useful idea, but proposed section 39, especially with the shift in meaning of “de-identified”, lacks necessary safeguards.

First, there is no obvious transparency requirement. If we are to learn anything from the ETHI committee's inquiry into PHAC's use of Canadians' mobility data, transparency is fundamentally important. At the very least, there should be a requirement that written notice of data sharing for socially beneficial purposes be given to the Privacy Commissioner of Canada. Ideally, there should also be a requirement for public notice. Further, proposed section 39 should provide that any sharing be subject to a data-sharing agreement, which should also be provided to the Privacy Commissioner. None of this is too much to ask where Canadians' data are conscripted for public purposes. Failure to ensure transparency and a basic measure of oversight will undermine trust and legitimacy.

My third point relates to the exception to knowledge and consent for publicly available personal information. Bill C-27 reproduces PIPEDA's provision on publicly available personal information, providing in proposed section 51 that “An organization may collect, use or disclose an individual's personal information without their knowledge or consent if the personal information is publicly available and is specified by the regulations.” We have seen the consequences of data scraping from social media platforms in the case of Clearview AI, which used scraped photographs to build a massive facial recognition database. The Privacy Commissioner takes the position that personal information on social media platforms does not fall within the “publicly available personal information” exception.

Not only could this approach be upended in the future by the new personal information and data protection tribunal, but it could also easily be modified by new regulations. Recognizing the importance of proposed section 51, former Commissioner Therrien recommended amending it to add that the publicly available personal information be “such that the individual would have no reasonable expectation of privacy.” An alternative is to incorporate the text of the current regulations specifying publicly available information into the CPPA, revising them to clarify scope and application in our current data environment. I would be happy to provide some sample language.

This issue should not be left to regulations. The amount of publicly available personal information online is staggering, and it is easily susceptible to scraping and misuse. It should be clear and explicit in the law that personal data cannot be harvested from the Internet, except in limited circumstances set out in the statute.

Finally, I add my voice to those of so many others in saying that data protection obligations set out in the CPPA should apply to political parties. It is unacceptable that they do not.

Thank you.

Dr. Brenda McPhail Acting Executive Director, Master of Public Policy in Digital Society Program, McMaster University, As an Individual

Thank you, Mr. Chair and members of the committee, for inviting me here today to speak to the submission authored by Jane Bailey, professor at the faculty of law of the University of Ottawa; Jacquelyn Burkell, professor at the faculty of information and media studies at Western University; and myself, currently the acting executive director of the public policy and digital society program at McMaster University.

It is a privilege to appear before you on this omnibus bill, which needs significant improvement to protect people in the face of emerging data-hungry technologies.

I will focus on part 1 and very briefly on part 3 of the bill in these initial remarks, and I welcome questions on both.

Privacy, of course, is a fundamental underpinning of our democratic society, but it is also a gateway right that enables or reinforces other rights, including equality rights. Our written submission explicitly focuses on the connection between privacy and equality, because strong, effective privacy laws help prevent excessive and discriminatory uses of data.

We identified eight areas where the CPPA falls short. In these remarks, I will focus on four.

First of all, privacy must be recognized as a fundamental human right. Like others on this panel, while we welcome the amendment suggested by Minister Champagne, we would note that proposed section 12 in particular also requires amendment so that the analysis to determine whether information is collected or used for an appropriate purpose is grounded in that right.

Bill C-27 offers a significant improvement over PIPEDA in explicitly bringing de-identified information into the scope of the law, but it has diminished the definition from the predecessor bill, Bill C-11, by removing the mention of indirect identifiers. The bill also introduces a new category, anonymized information, which is deemed out of the scope of the act, in contrast to the superior approach taken by Quebec. Given that even effective anonymization of personal data fails to address the concerns about social sorting that sit at the junction of privacy and equality, all data derived from personal information, whether identifiable, de-identified or anonymized, should be subject to proportionate oversight by the OPC, simply to ensure that it's done right.

Third, proposed subsection 12(4) weakens requirements for purpose specification. It allows information collected for one purpose by organizations to be used for something else simply by recording that new purpose any time after the initial collection. How often have you shared information with a business and then gone back a year later to see if it had changed its mind about how it's going to use it? At a minimum, the bill needs constraints that limit new uses to purposes consistent with the original consensual purpose.

Finally, the CPPA adds a series of exceptions to consent. I'll focus here on the worst: the legitimate interest exception in proposed subsection 18(3), which, unlike my colleagues, I believe should be struck from the bill. It is a dangerously permissive exception that allows collection without knowledge or consent if the organization that wants the information decides its mere interest outweighs adverse impacts on an individual.

This essentially allows collections for organizational purposes that don't have to provide benefits to the customer. Keeping in mind that the CPPA is the bill that turns the tap for the AIDA on or off, this exception opens the tap and then takes away the handle. Here, I would commend to you the concerns of the Right2YourFace coalition, which flags this exception as one in which organizations may attempt to justify and hide their use of invasive facial recognition technology.

Turning to part 3 of Bill C-27, the AIDA received virtually no public consultation prior to being included in Bill C-27, and that lack of feedback has resulted in a bill that is fundamentally underdeveloped and prioritizes commercial over public interests. The bill, by focusing only on high-impact systems, leaves systems that fail to meet the threshold unregulated. AI can impact equality in nuanced ways not limited to systems that may be obviously high-impact, and we need an act that is flexible enough to also address bias in those systems in a proportionate manner.

A recommender system is mundane these days, yet it can affect whether we view the world with tolerance or prejudice from our filter bubble. Election time comes to mind as a time when that cumulative impact could change our society. Maybe that should be in, and maybe it should be out. We just haven't had the public conversation to work through the range of risks, and it's a disservice to Canadians that we're reduced to talking about amendments to a bad bill in the absence of a shared understanding of the full scope of what it needs to do and what it should not do.

Practically, we nonetheless make specific recommendations in our brief to include law enforcement agencies in scope, to create independent oversight and to amend the definitions of harm and bias. We further support the recommendations submitted by the Women's Legal Education & Action Fund.

I would be very happy to address all of these recommendations during the question period.

Thank you.

Bernard Généreux Conservative Montmagny—L'Islet—Kamouraska—Rivière-du-Loup, QC

Thank you, Mr. Chair.

I also thank the witnesses.

Mr. Therrien, you were the Privacy Commissioner when the former Bill C‑11 was tabled. You had proposed amendments and stated that the bill was a step backwards from what existed at the time.

Your successor proposed 15 amendments, which you say you agree with. However, the government only retained five of them. Of the 10 it did not keep, which ones do you think should fundamentally be included in the current bill?

October 24th, 2023 / 3:55 p.m.


Daniel Therrien Lawyer and Former Privacy Commissioner of Canada, As an Individual

I characterized Bill C-11 as a step backwards. I think Bill C-27 is a step forward. Some recommendations that I had made as commissioner were accepted—not all, and not some that I consider essential, which I spoke to.

Rick Perkins Conservative South Shore—St. Margarets, NS

However, Bill C-11 was tabled—

Rick Perkins Conservative South Shore—St. Margarets, NS

Thank you, Mr. Chair. Thank you, witnesses.

My first series of questions are to Mr. Therrien.

You were the Privacy Commissioner during the development of the replacement for the Privacy Act in the last Parliament, Bill C-11, and presumably in the run-up to the development of this one. The current Privacy Commissioner was here last week and said essentially that he personally wasn't the commissioner who was consulted on it.

This is a critical bill because it's a complete replacement of the Privacy Act. It's not an amendment.

I'll start by asking you if, in the development of Bill C-11, the Minister of Industry of the day—I believe it was Mr. Bains—consulted with you before the bill was tabled in Parliament.

Daniel Therrien Lawyer and Former Privacy Commissioner of Canada, As an Individual

Thank you, Mr. Chair.

Thank you, committee members, for inviting me to participate in your study.

I am here as an individual, but my experience as the federal privacy commissioner from 2014 to 2022 will certainly be reflected in my remarks.

To begin, let me say I agree with my successor, Philippe Dufresne, that the bill before you is a step in the right direction, but that it is necessary to go further in order to properly protect Canadians. I also agree with the Office of the Privacy Commissioner's 15 recommendations for amending Bill C‑27, with some nuances on audits, remedies and appeals. The government has taken up, at least in part, a good number of the recommendations I had made regarding Bill C‑11, the predecessor to Bill C‑27. Among those that were not accepted is the application of privacy law to political parties.

I am very pleased that a consensus appears to have emerged among political parties to recognize in the law that privacy is a fundamental right. I applaud parliamentarians for that decision. The question now becomes how to best translate into law the principle with which you now all agree.

Minister Champagne suggests amending the preamble and the purpose clause of the CPPA. These are steps in the right direction, but they are not sufficient. You should also amend two operative clauses: proposed section 12 of the act on “appropriate purposes”, and proposed section 94, which provides for administrative monetary penalties for certain violations of the law. Without these amendments, the law would still give greater weight to commercial interests than to privacy, which is a fundamental right. This does not appear to be your intent.

Based on my reading of parliamentary debates, it also seems to me there's consensus around the idea that privacy and economic growth through innovation are not in a zero-sum game. The question is generally not on deciding which should prevail—privacy protection or innovation—as both can and should be pursued at the same time. It is only in rare cases that it will not be possible. In those cases, privacy as a fundamental right should take precedence.

Proposed section 12 of the CPPA does not, in my view, faithfully translate this consensus. Rather, it upholds the traditional approach, which is that privacy and economic goals are conflicting interests that must be balanced without considering that privacy is a fundamental right. This may have made sense under the current act's purpose clause, but it will no longer make sense if the CPPA's purpose clause recognizes privacy as a fundamental right, as is currently proposed.

Proposed section 12 is central to the exercise that commercial organizations, the Privacy Commissioner and ultimately the courts will have to go through in order to determine the factual context of each case and the weight given to privacy and commercial interests.

Section 12 as drafted gives more weight to economic interests. It does that in several ways.

The first is through the terminology it uses. It refers to “business needs” and does not refer to privacy as a right, fundamental or otherwise.

When the proposed section does refer to privacy, in paragraphs (2)(d) and (e), it is as an element to consider in achieving business goals, with privacy losses to be mitigated where possible, that is, where business goals can be achieved at comparable cost and with comparable benefits.

Nowhere is it mentioned that privacy protection is an objective at least equally as important as economic goals. On the contrary, the focus is on economic goals, and privacy loss as something to be mitigated, where possible, in the pursuit of those goals.

I have provided you with my proposals for amending section 12, and they would be consistent with the amendments proposed to section 5.

With respect to sanctions, all violations of section 12, including the appropriate purposes clause at subsection (1), should potentially lead to administrative monetary penalties. Without sanctions, recognizing privacy as a fundamental right would be a pious wish, without real consequences.

I would go further and recommend that all violations of the CPPA should be subject to these penalties. This would align Canada with most other jurisdictions.

I have a few words on the Artificial Intelligence and Data Act. That part of Bill C-27 is brief, even skeletal, and leaves a lot of room for regulations. While I understand why some are concerned with this, I think this approach is defensible, given the fact that AI technology is relatively nascent and is certainly evolving very quickly; however, the lack of precision in AIDA, in my opinion, requires that certain fundamental principles and values be recognized in the act itself. First and foremost, the act should recognize the importance of protecting fundamental rights, including the right to privacy, in the development and implementation of AI systems.

Finally, some of you expressed concerns in an earlier meeting with the difficulty of detecting violations of the law and the potential value of proactive audits to facilitate detection. As commissioner, I had recommended proactive audits, and I still believe they are a necessary part of an effective enforcement regime. This is particularly true in the case of AI.

Thank you. I would be pleased to take your questions later.

October 19th, 2023 / 4:10 p.m.


Philippe Dufresne Privacy Commissioner of Canada, Office of the Privacy Commissioner of Canada

Those would be the top priority, starting with the notion of a privacy impact assessment for generative AI. To me, that is a major shortcoming.

If you look at AIDA and if you look at the minister's proposed amendments to AIDA, you see a lot of discussion about risk mitigation, identifying risk and managing risk. This is absolutely essential and critical. However, we need to do this for privacy as well as for non-privacy harms. I'm very much insisting on this.

The other important recommendation, which I would say is the top priority, is making sure that fines are available for violations of the “appropriate purposes” provision, proposed section 12. This is the key central provision. This is at the heart of the bill in a way, but there are no fines for that. That, in my view, should be corrected. It's easily corrected by adding it to the list of breaches.

Other comparable legislation, like Quebec's, for instance, simply covers “a violation of the law”. The whole law is there. It's all covered. This approach instead lists offences, and Bill C-11 had even more omissions. It's been corrected to some extent, but it needs to be corrected further.

I talked about algorithmic transparency. It is an important element, especially at this time in AI. Again, we can manage that by providing guidance to industry, so it's something that's workable, but I think Canadians need to understand what is going on with their data and how decisions are made about them. If we limit it to matters that have significant impact, we're creating debates and limiting the transparency that Canadians deserve.

That is—

October 17th, 2023 / 5:35 p.m.


Mark Schaan Senior Assistant Deputy Minister, Strategy and Innovation Policy Sector, Department of Industry

Just as a quick refresher, the government had already begun this effort, in some ways, with the consultations that we led on the overall development of the innovation and skills plan. That identified data and digital as an important pillar of work related to the functioning of the modern economy, so a second open public consultation on data and digital was held on specific pillars, one of which related to privacy and trust. That netted a significant amount of feedback, which then resulted in the digital charter and its 10 principles.

When the digital charter was released, a subsequent consultation was held on specific proposals related to the modernization of the Personal Information Protection and Electronic Documents Act, or PIPEDA. That feedback then informed what ultimately became the bill—first Bill C-11, and then this bill.

Since this bill was tabled, in June 2022, we've had more than 300 discussions with key stakeholders across the continuum to make sure that we continue to understand their views on the bill and on related issues they consider important.