Digital Charter Implementation Act, 2022

An Act to enact the Consumer Privacy Protection Act, the Personal Information and Data Protection Tribunal Act and the Artificial Intelligence and Data Act and to make consequential and related amendments to other Acts

Sponsor

Status

In committee (House), as of April 24, 2023

Summary

This is from the published bill. The Library of Parliament has also written a full legislative summary of the bill.

Part 1 enacts the Consumer Privacy Protection Act to govern the protection of personal information of individuals while taking into account the need of organizations to collect, use or disclose personal information in the course of commercial activities. In consequence, it repeals Part 1 of the Personal Information Protection and Electronic Documents Act and changes the short title of that Act to the Electronic Documents Act. It also makes consequential and related amendments to other Acts.
Part 2 enacts the Personal Information and Data Protection Tribunal Act, which establishes an administrative tribunal to hear appeals of certain decisions made by the Privacy Commissioner under the Consumer Privacy Protection Act and to impose penalties for the contravention of certain provisions of that Act. It also makes a related amendment to the Administrative Tribunals Support Service of Canada Act.
Part 3 enacts the Artificial Intelligence and Data Act to regulate international and interprovincial trade and commerce in artificial intelligence systems by requiring that certain persons adopt measures to mitigate risks of harm and biased output related to high-impact artificial intelligence systems. That Act provides for public reporting and authorizes the Minister to order the production of records related to artificial intelligence systems. That Act also establishes prohibitions related to the possession or use of illegally obtained personal information for the purpose of designing, developing, using or making available for use an artificial intelligence system and to the making available for use of an artificial intelligence system if its use causes serious harm to individuals.

Elsewhere

All sorts of information on this bill is available at LEGISinfo, an excellent resource from the Library of Parliament. You can also read the full text of the bill.

Votes

April 24, 2023 Passed 2nd reading of Bill C-27, An Act to enact the Consumer Privacy Protection Act, the Personal Information and Data Protection Tribunal Act and the Artificial Intelligence and Data Act and to make consequential and related amendments to other Acts
April 24, 2023 Passed 2nd reading of Bill C-27, An Act to enact the Consumer Privacy Protection Act, the Personal Information and Data Protection Tribunal Act and the Artificial Intelligence and Data Act and to make consequential and related amendments to other Acts

October 25th, 2023 / 5:45 p.m.

Privacy Commissioner of Canada, Offices of the Information and Privacy Commissioners of Canada

Philippe Dufresne

I would say a few things. One is that we've issued a declaration with my federal, provincial and territorial colleagues called “Putting best interests of young people at the forefront of privacy and access to personal information”. It's available on our website. We give a number of recommendations and expectations for organizations about making sure that they're protecting children and the best interests of the child and that they're treating their information appropriately.

In terms of what people should do—and that's something we've said in our data-scraping statement with my international colleagues—ask yourself if you are comfortable sharing this much information. Do you know enough about the settings and the protections that are there? Is this something you want to potentially see forever?

In Bill C-27, there's a new proposed section to dispose of information, especially for minors. That's good, but whenever you're putting a picture of your children online, ask yourself if you want to take the risk. Have you put the privacy settings in a strong enough way? Are you sharing this with the whole world? If you don't understand enough about what the organization is doing and you find its privacy policy to be complex, I always encourage everyone to ask the organization.

Ask for more information. When stores ask for your birthday, ask them why they want to know your birthday when you're buying jewellery or any kind of item. Why do they need that information?

It's getting that reflex of not just saying, “Yes, sure, I'll give it to you.”

Marilyn Gladu Conservative Sarnia—Lambton, ON

Very good. Thank you.

I want to turn my attention to digital technology and Bill C-27.

One concern that's been raised is people worrying about deepfakes, this generative AI that will make anybody look like they're saying or doing things they didn't.

Did you provide any recommendations to the minister or do you have any thoughts on how to fix that?

October 25th, 2023 / 5:35 p.m.

Privacy Commissioner of Canada, Offices of the Information and Privacy Commissioners of Canada

Philippe Dufresne

Yes, of course.

Under the Privacy Act, the public sector's obligations are less stringent than the private sector's. Departments are required to show that the information is used for purposes related to their respective mandates. For example, they have to show that they have a legal mandate to do X, so they can do it.

Some obligations are more specific, like those at issue in the Canada Post case. When an organization uses information indirectly, the obligation threshold is greater. It has to ask for permission. The first major consideration when a public organization uses information is whether the activity is relevant to its mandate.

We think it's important to impose the obligations of necessity and proportionality, in keeping with international principles and practices in the private sector. The idea is to consider what information the organization is collecting and for what purpose. It's a bit similar to the analysis for human rights under the Charter. Is the organization's purpose important enough? Will the measure achieve the purpose? Has the organization done everything possible to minimize the use of the information in achieving its purpose?

We underscored those principles in our report on the pandemic, and we apply them. While we realize they aren't binding, we apply them and use them to inform our recommendations. We've been able to draw some useful lessons. On the whole, the government adheres to the principles. Occasionally, we're of the view that there should have been more information on how the organization assessed the discarded options, but that, on balance, its decision was justifiable.

It's a standard that encourages decision-makers to ask questions about what they're doing and whether they are minimizing the risks. That's more or less what we are asking.

One of my major recommendations for Bill C‑27 is to require organizations to conduct audits and privacy impact assessments, or PIAs. It's about considering what the risks are and which measures can minimize them.

PIAs are good for privacy, and they're good for Canadians.

October 25th, 2023 / 5:25 p.m.

Privacy Commissioner of Canada, Offices of the Information and Privacy Commissioners of Canada

Philippe Dufresne

We explicitly recommended that the term “profiling” be included in the definitions. When organizations use an algorithm, when they infer things from your personal information and, then, use that to build profiles, there are consequences, and they need to be taken into account and regulated. Both Quebec's law and the European regulation refer to the term “profiling”. My office recommended it be explicitly included in Bill C-27.

October 25th, 2023 / 5:20 p.m.

Privacy Commissioner of Canada, Offices of the Information and Privacy Commissioners of Canada

Philippe Dufresne

Yes, Quebec's law 25 definitely has more teeth than existing federal laws, simply because it grants the power to issue orders. Quebec's access to information authority, the Commission d'accès à l'information, or CAI, can issue binding orders and impose heavy fines, similar to the European model under the General Data Protection Regulation. That makes it a more robust piece of legislation on that front. It lays out proactive obligations.

Hopefully, Bill C‑27 will make its way successfully through Parliament and bring federal laws more up to date in that regard. It's not exactly the same as law 25, but it comes close: it provides the power to issue orders and to impose fines, as well as proactive obligations for companies. I think it's a good model, following in the footsteps of Europe and Quebec. I think, federally, we can get there.

To answer your question about working with the CAI, I can report that we do indeed work very closely with Quebec and all the provinces and territories.

I was in Quebec City in September for the annual gathering of federal, provincial and territorial privacy commissioners, which the CAI hosted. We had some very important and useful conversations. We put out two resolutions, including on the protection of young people's privacy. They are joint statements reflecting principles that all the commissioners have agreed upon, despite the legislative differences between the jurisdictions. In this way, the commissioners are trying to make things easier for companies by flagging common elements across the different regimes. My office carries out joint investigations with provinces that have regimes similar to the federal government's, so Quebec, Alberta and British Columbia. We worked together on the investigations into TikTok, ChatGPT and Tim Hortons.

Our collaborative work is not only extensive, but also very useful. We are able to make sure that we are on the same page across the country.

October 25th, 2023 / 5:20 p.m.

Privacy Commissioner of Canada, Offices of the Information and Privacy Commissioners of Canada

Philippe Dufresne

Certainly. There are sometimes discussions about that.

As for the cases you're referring to, sometimes a department tells us it's already doing what we recommend. In the case of the pandemic, we also carried out an assessment of proportionality and necessity, which is not mandatory under the Privacy Act, but which we feel should be. We put forward that analysis.

It's a dialogue. We are always given the reasons for refusal, and dialogue is established.

Some breaches are more serious than others. The really worrying situations are those where there has actually been a major breach or a major consequence, combined with a complete refusal to follow our recommendation. That can undermine trust.

I feel the power to issue orders is important. When an officer of Parliament makes a recommendation to an organization and the latter refuses to implement it, the situation is not satisfactory. I believe there must be sufficient justifications given. If we had the power to issue orders, this wouldn't be a problem. We'd issue them when necessary. With that said, in my opinion, they should only ever be used exceptionally.

The same applies to fines. In Bill C‑27, we would add the possibility of imposing significant financial penalties on organizations. I think this is very important, for the same reason again: to create incentives. The idea is not to use them often, but...

October 25th, 2023 / 5:15 p.m.

Privacy Commissioner of Canada, Offices of the Information and Privacy Commissioners of Canada

Philippe Dufresne

I think it's absolutely essential to modernize this act. We also need to modernize the privacy law that deals with the private sector. That law is 20 years old, so it's older than Facebook and social media. It is positive that Bill C‑27 aims to modernize the law with respect to the private sector. I look forward to seeing this bill move forward.

In addition, I hope that a bill to modernize the act for the public sector will soon follow. The Minister of Justice had said, when Bill C‑27 was tabled, that the public sector privacy bill would follow. Consultations were held with First Nations and Indigenous peoples on certain implications. The Department of Justice published a report on these consultations—I believe it was in September. The work is ongoing. In my opinion, the solution is to move forward with Bill C‑27. The model passed in this legislation can then be adapted to the public sector, as needed. That could be beneficial.

Among our proposals, we suggest that there should be an increasing number of public-private partnerships and that the government should work hand in hand with the industry. At present, we have two laws with different requirements for government and the private sector. This is not optimal, and it creates problems in terms of interoperability. I entirely agree with you that this is becoming important.

In the meantime, the law applies, and our office will continue to implement it to the best of our ability. In fact, this is a message that my counterparts from the G7 countries and I conveyed when we were in Tokyo last summer. At that meeting, we talked about artificial intelligence. To address people's concerns, we said we needed laws on artificial intelligence. There are already some—privacy laws, for instance. They exist and they are enforced.

I've also launched an investigation into ChatGPT, to confirm whether or not it is compliant with the legislation. Tools do exist, but they absolutely must be modernized. We will be there to support Parliament.

October 25th, 2023 / 5:10 p.m.

Privacy Commissioner of Canada, Offices of the Information and Privacy Commissioners of Canada

Philippe Dufresne

I think we have to hold public discussions, be transparent and have obligations to be transparent.

The phenomenon you're describing has accelerated even more with artificial intelligence. We may think we know our personal information will be used by such and such an entity. However, do we really know what anyone can conclude about us based on that information? What inferences can be drawn? Sometimes postal codes or tastes in music, for example, can help someone deduce a person's sexual orientation, income level and so on. People don't know all that.

I recommended that Bill C‑27 provide for a transparency obligation so that, when a decision about someone is reached with the help of artificial intelligence, that person can request an explanation in every case. However, the current version of the bill provides that a general account need be provided only in cases where the decision would have a significant impact on the individuals concerned. I recommended that that part be deleted because, for the moment, I think it's better to encourage more transparency rather than less.

We have to try to find pleasant ways to explain this. One of my mandates is to try to acquire tools. We provide a lot of information on our website, and we try to explain it all as best we can, but I think we can do better.

We also have to talk about children, because I think the message has to be adapted to suit the audience.

Philippe Dufresne Privacy Commissioner of Canada, Offices of the Information and Privacy Commissioners of Canada

Good afternoon, Mr. Chair.

Good afternoon, members of the committee.

I am pleased to be here today to discuss my 2022‑23 Annual Report to Parliament, which highlights the important work that my office is doing to protect and promote the fundamental right to privacy in a time of unprecedented technological change.

It is encouraging to see this continued focus on the importance of privacy, as it impacts virtually all aspects of our lives.

Many of the public interest issues that you are seized with as parliamentarians—children's rights, online safety and cybersecurity, democratic rights, national security, equality rights, ethical corporate practices and the rule of law—all have privacy implications and, I would argue, all depend on strong privacy protections.

In this digital era, as you will see from some of the work and investigations my office has conducted this year, routine activities of daily life—for example, socializing online, using mobile apps, getting packages delivered or going to the checkout counter—can also raise privacy issues.

Since my appointment as Privacy Commissioner in June 2022, I've identified strategic priorities for my office that helped frame our work over the past year and that will guide the way ahead. These include addressing the privacy impacts of the fast-moving pace of technological advancements—especially in the world of artificial intelligence and generative AI—protecting children's privacy, and maximizing the OPC's impact in fully and effectively promoting and protecting the fundamental right to privacy.

To support these priorities, this past year we have engaged extensively with our domestic and international counterparts to identify and undertake collaborative opportunities.

We have also continued to advocate domestically for the modernization of Canada's privacy laws. I was honoured to appear before the Standing Committee on Industry and Technology last week in the context of their study of Bill C‑27, the digital charter implementation act, 2022, where I made 15 key recommendations needed to improve and strengthen the bill. I was pleased to see a number of them endorsed by Minister Champagne in the form of amendments that will be put forward to the committee, and I look forward to the work of Parliament in reviewing this important bill.

I will now turn to some of our compliance work from the last year.

We accepted 1,241 complaints under the Privacy Act, representing an increase of 37% over the previous year, and 454 under the Personal Information Protection and Electronic Documents Act, or PIPEDA, a 6% increase over the year before.

One of the public sector investigations highlighted in this year's report involved Canada Post's Smartmail marketing program. Our investigation revealed that Canada Post builds marketing lists with information gleaned from the envelopes and packages that it delivers to homes across Canada. It makes these lists available to advertisers for a fee. We found this contravened the Privacy Act, as it was done without the knowledge and consent of Canadians. We recommended that Canada Post stop its practice of using and disclosing personal information without first seeking authorization from Canadians. As a possible solution to remedy this matter, we recommended that Canada Post send a mail notice to Canadians to inform them of this practice and indicate an easy way for Canadians to opt out.

Until the tabling of my annual report, which made this decision public, Canada Post did not agree to implement this solution. After the report was made public, Canada Post issued a statement that it would review its policies. I expect Canada Post to comply with the Privacy Act and I look forward to hearing from them on the next steps to resolve this matter.

The report also highlights some of our private-sector investigations from last year, including our investigation of Home Depot's sharing with a social media company of the personal information of customers who opted for an electronic receipt instead of a printed one at checkout.

Home Depot has since stopped that practice and implemented my office's recommendations. This case underscored the importance of businesses obtaining meaningful consent to share customers' personal information.

Another important area of our work is addressing breaches in the public and private sectors.

We remain concerned about possible under-reporting of breach incidents in the public sector. The number of reported breaches fell by 36% to 298 last year, and only one of those reports involved a cyber-attack. This compares to 681 breach reports from the private sector, of which 278 were cyber-related.

We also engage in groundbreaking policy work, provide advice and guidance to organizations in both the public and private sectors on privacy matters of public interest and importance, and continue to provide advice to Parliament.

We know that privacy matters to Canadians more today than ever before and that they are concerned about the impact of technology on their privacy. Our latest survey of Canadians found that 93% have some level of concern about protecting their personal information and that half do not feel that they have enough information to understand the privacy implications of new technologies. This is why the work of my office to deliver concrete results that have meaningful impacts for Canadians and privacy in Canada is so important.

In closing, I would like to thank this committee for its work over the years, including the many reports and recommendations in the field of privacy. I cite them often. We certainly consider and consult them very often, and I know that Canadians do as well.

I look forward to continuing our efforts to ensure that privacy rights are respected and prioritized by government institutions and businesses alike, and to position Canada as a global leader on privacy.

I would now be happy to answer your questions.

Sébastien Lemire Bloc Abitibi—Témiscamingue, QC

Thank you, Mr. Chair.

Mr. Therrien, in the context of Bill C‑27 and, more specifically, in the context of artificial intelligence, I would like to hear your opinion on industry self-regulation standards. That is, I would say, the new approach that is being put forward, both in Europe and by Mr. Champagne as a temporary or transitional measure. Can we trust industry to regulate itself?

Francesco Sorbara Liberal Vaughan—Woodbridge, ON

Bill C-27, in my humble view, is a groundbreaking piece of legislation. I'll use that term. I think it is groundbreaking in terms of the update it's providing to the act and to privacy.

Mr. Therrien, you're fully versed on privacy issues relating to Canadians. When I think of this bill and I think of my constituents back in the city of Vaughan in my riding of Vaughan—Woodbridge, I would tell them how their privacy is being protected and not being protected on a very granular basis. I would use layman's terms. What would you tell me to tell them in terms of your view of Bill C-27?

October 24th, 2023 / 5:10 p.m.

Partner, Canadian Anonymization Network

Adam Kardash

I'll be brief.

My view—and we've thought about this quite carefully—is that there is no public policy rationale for the political parties' processing of personal information not to be subject to a privacy legislative regime. The only question that I think is open is what the appropriate instrument would be and whether that would go into the CPPA. I think there's some validity to the proposition that it might be a separate instrument. My personal view is that it was something that was missing in Bill C-27. It could have been in there.

Right now, if you compare the privacy protections that are set out in Bill C-27 under the CPPA to the current protections afforded to individuals in respect to the processing of personal information by political parties, you see that they're not even in the same universe. You would just have to post a privacy statement. There's no security breach notification requirement. There are no access rights and no consent rules. It goes on and on. There are no rights of express redress. There's no independent ombudsman who would oversee and take complaints, investigate, etc.

I think this is something that is incredibly important and I'm very thankful to you, Mr. Masse, for bringing that up.

October 24th, 2023 / 5 p.m.

Lawyer and Former Privacy Commissioner of Canada, As an Individual

Daniel Therrien

I think that the CPPA brings us much closer to where we ought to be in 2023. With the new implementation of artificial intelligence, part 3 of Bill C-27 is an attempt to align Canada's legislation with that new technology.

There's no perfect solution in all of these situations. There are people who think that the artificial intelligence act is so skeletal as to be meaningless, and there's some merit to this. I think it's okay for where we are today.

One virtue of the legislation before you is that it continues with the consent model in the many circumstances in which consent can possibly be given, while also recognizing that there are important limits to the consent model, such as the exceptions for legitimate interests and socially beneficial purposes. The missing piece, I think, is that these additional flexibilities, which reflect the current use of technology, have to be implemented within a rights protection framework.

Although the minister's latest amendments bring us a bit closer, we are still quite a way from where we ought to be, and that is why I emphasized that proposed sections 12 and 94, particularly on penalties, are important, because what's the value of having a recognition of privacy as a fundamental right if there is no penalty when you breach that principle?

October 24th, 2023 / 4:15 p.m.

Lawyer and Former Privacy Commissioner of Canada, As an Individual

Daniel Therrien

In common parlance, when people post personal information on a social media platform and allow certain other people to see it, one might think that this information becomes public. Importantly in this context, one might also think that companies and commercial organizations could use this information as public, rather than personal, information. However, the current law provides that this information remains personal and cannot be used by companies, except in accordance with the law.

I think this is a good aspect of the current law, and the fact that nothing in the current text of Bill C‑27 changes this is a good thing.

October 24th, 2023 / 4:10 p.m.

Lawyer and Former Privacy Commissioner of Canada, As an Individual

Daniel Therrien

I was telling Mr. Perkins that clause 15 of Bill C‑27 will probably need to be amended. Section 6.1 of the current act sets out certain requirements for consent to be considered valid, including the notion that the person giving consent must be able to understand the purposes and consequences of disclosing the information. This terminology does not exist in Bill C‑27 and I believe it would be much better to retain the current wording.