Evidence of meeting #94 for Industry, Science and Technology in the 44th Parliament, 1st Session. (The original version is on Parliament’s site, as are the minutes.)

A recording is available from Parliament.

Also speaking

Daniel Konikoff  Interim Director of the Privacy, Technology & Surveillance program, Canadian Civil Liberties Association
Tim McSorley  National Coordinator, International Civil Liberties Monitoring Group
Matthew Hatfield  Executive Director, OpenMedia
Sharon Polsky  President, Privacy and Access Council of Canada
John Lawford  Executive Director and General Counsel, Public Interest Advocacy Centre
Yuka Sai  Staff Lawyer, Public Interest Advocacy Centre
Sam Andrey  Managing Director, The Dais, Toronto Metropolitan University

4:20 p.m.

Bloc

Simon-Pierre Savard-Tremblay Bloc Saint-Hyacinthe—Bagot, QC

Thank you, Mr. Chair.

I'd like to thank the committee for having me here today, even though it's not a committee I usually sit on. It's a pleasure to be here.

I want to thank the witnesses for their presentations.

Mr. Konikoff, if you don't mind, I'd like to talk about automated decision systems. As we know, Bill C‑27 grants a new right, namely the right for an individual to receive an explanation about the use of these systems. However, unlike Quebec's Bill 25, Bill C‑27 does not contain provisions that would allow a person to object to the use of an automated decision system or to have the decisions made by such a system reviewed.

In your opinion, what are the potential repercussions for consumers and users if Bill C‑27 does not include such provisions?

4:20 p.m.

Interim Director of the Privacy, Technology & Surveillance program, Canadian Civil Liberties Association

Daniel Konikoff

That is a great question.

There are tremendous implications to not having these transparency requirements. You mentioned Quebec, but I can also turn to the GDPR, which does not deal explicitly with AI but has implications for it, given the use of data in AI systems. The GDPR contains the right not to be subject to a decision based solely on automated processing. I think that could serve as a template for inclusion in AIDA, along with clearer transparency requirements not only for systems that are high impact, but for systems that are....

4:20 p.m.

Bloc

Simon-Pierre Savard-Tremblay Bloc Saint-Hyacinthe—Bagot, QC

You would therefore be in favour of adding provisions to Bill C‑27, provisions similar to those adopted in Europe and Quebec.

4:20 p.m.

Interim Director of the Privacy, Technology & Surveillance program, Canadian Civil Liberties Association

4:20 p.m.

Bloc

Simon-Pierre Savard-Tremblay Bloc Saint-Hyacinthe—Bagot, QC

According to Bill C‑27 as it currently stands, who should consumers turn to if they want to contest a decision made by an automated system or obtain clarification about that decision?

4:20 p.m.

Interim Director of the Privacy, Technology & Surveillance program, Canadian Civil Liberties Association

Daniel Konikoff

I beg your pardon. Can you repeat that?

4:20 p.m.

Bloc

Simon-Pierre Savard-Tremblay Bloc Saint-Hyacinthe—Bagot, QC

According to Bill C‑27 in its current form, who should consumers turn to if they want to contest a decision made by an automated system? Is there a body, organization or authority they can turn to?

4:20 p.m.

Interim Director of the Privacy, Technology & Surveillance program, Canadian Civil Liberties Association

Daniel Konikoff

No, I don't think so. That is a challenge for consumers.

4:20 p.m.

Bloc

Simon-Pierre Savard-Tremblay Bloc Saint-Hyacinthe—Bagot, QC

Would anyone else like to add anything?

4:20 p.m.

Managing Director, The Dais, Toronto Metropolitan University

Sam Andrey

Yes. If it has a significant impact, you have the ability to request information, but there is no ability to appeal, challenge or have a human review it, as you say other jurisdictions have done.

4:20 p.m.

Bloc

Simon-Pierre Savard-Tremblay Bloc Saint-Hyacinthe—Bagot, QC

So that's a problem, in your view.

4:20 p.m.

Managing Director, The Dais, Toronto Metropolitan University

4:20 p.m.

Bloc

Simon-Pierre Savard-Tremblay Bloc Saint-Hyacinthe—Bagot, QC

Mr. Konikoff, what do you think is preventing us from making both the public and private sectors subject to the provisions of the bill? Do we need to go further to ensure data belonging to Quebeckers and Canadians is protected?

4:20 p.m.

Interim Director of the Privacy, Technology & Surveillance program, Canadian Civil Liberties Association

Daniel Konikoff

We should go further to protect that information. Yes, I absolutely don't think this bill goes far enough. We've laid out some possibilities to firm that up.

4:20 p.m.

Bloc

Simon-Pierre Savard-Tremblay Bloc Saint-Hyacinthe—Bagot, QC

Mr. McSorley, as you know, last month, the Minister of Innovation, Science and Industry presented a voluntary code of conduct for responsible development and management of advanced generative AI systems.

At a time when technologies are evolving rapidly, is self-regulation the best solution? Do you think it's realistic to think that companies, guided by some invisible hand, will simply regulate themselves?

4:20 p.m.

National Coordinator, International Civil Liberties Monitoring Group

Tim McSorley

Very simply, no. Self-regulation is not appropriate or adequate.

I think this is a clear example of what my colleague Mr. Hatfield was saying about rushing to regulate rather than really understanding the system. The entire consultation process around regulating generative AI was done in a rushed manner. It felt like a response to concerns that there hadn't been public consultation about AIDA in general, and we have concerns about that.

We don't have any data right now to know how companies are reacting to the self-regulation, but it's clearly insufficient.

4:25 p.m.

Bloc

Simon-Pierre Savard-Tremblay Bloc Saint-Hyacinthe—Bagot, QC

Ms. Polsky, in a document dated April 14, 2023, the Privacy and Access Council of Canada states that the AI legislation being proposed by the European Union will likely become the global standard for general-purpose generative AI systems.

Why do you think the law being proposed by the European Union could become the global standard?

4:25 p.m.

President, Privacy and Access Council of Canada

Sharon Polsky

We have seen with the GDPR how quickly a comprehensive regulation with extraterritorial applicability became the global standard. It set the bar high. Yes, a lot of organizations and companies whined and complained and said that it was going to cost them a lot of money to comply with the law, but they did it anyhow.

We could see the same thing if Canada takes an approach to AI and privacy regulation that has teeth.

I remember that when Jennifer Stoddart declined to be reappointed, she said PIPEDA could use a little more teeth. That was tough talk, and PIPEDA needs it. We can do it, but there needs to be political will.

4:25 p.m.

Liberal

The Chair Liberal Joël Lightbound

Thank you very much.

I'll now give the floor to Mr. Masse.

4:25 p.m.

NDP

Brian Masse NDP Windsor West, ON

Thank you, Mr. Chair.

Thank you to our witnesses.

I will start with Mr. Lawford. If others are...I will check in.

In particular, I'm curious about your view of the privacy tribunal. I'm hearing what you're saying in terms of the overall message, but at the same time, if there was progress on some of these elements.... The one new factor is the new privacy tribunal, and there are those who are for and against it. We would love to have your opinion to start.

4:25 p.m.

Executive Director and General Counsel, Public Interest Advocacy Centre

John Lawford

We base our opposition to this on the fact that the Privacy Commissioner presently does investigations and, although they are sometimes slow, the results are, in our opinion, fair.

We looked at the Competition Tribunal debacle this year with Rogers and Shaw, and the use of that extra step, if you will, by a company that felt like dragging out a process or winning.... We can't see any likelihood that companies using personal information won't take that extra step and go to the tribunal to challenge every commissioner decision. That could well add two years to every decision that goes against a company. You could say that presently you can go from the Privacy Commissioner's decision to the Federal Court, but you have to re-prove the case there.

It seems like an unnecessary step. When you add that to our concern that you can't bring a class action until all the proceedings are done, including in front of the tribunal, it will discourage class actions. We believe that some private enforcement does change the behaviour of companies when there are egregious privacy violations.

Our concern is that this just sets up a structure that is an extra step and may well be less favourable to complainants, much as the Competition Tribunal is to the Commissioner of Competition.

4:25 p.m.

NDP

Brian Masse NDP Windsor West, ON

Yes. On that, officials said to me that it couldn't happen under this act. Then we have had other testimony saying that it could happen. What's your opinion on that?

That case against the Competition Bureau is nothing short of outrageous. It undermines the whole point of the Competition Bureau and basically has the public subsidizing Rogers in many different ways. At any rate, what's your take on that possibility?

4:25 p.m.

Executive Director and General Counsel, Public Interest Advocacy Centre

John Lawford

We have a lot of concerns about that, especially since, under the initial draft, the privacy tribunal would have only one privacy expert on it.

The Privacy Commissioner presently has enough expertise to make a proper administrative decision, and then we have the courts above that if you want to say there's a problem. That's a much more efficient and predictable way to deal with this than creating a quasi-court. Quasi-courts tend to have quasi-judges on them, and you get quasi-decisions like the one we had with Rogers, so we would prefer to avoid that.

4:30 p.m.

NDP

Brian Masse NDP Windsor West, ON

I'm not aware of any other country that has a privacy tribunal.

One thing I would like to ask before I turn it over to some other guests is this: What's your view on the private right of action that the United States has? What's the importance of that versus what we don't have here?

November 2nd, 2023 / 4:30 p.m.

Executive Director and General Counsel, Public Interest Advocacy Centre

John Lawford

In the United States, of course, they don't have a comprehensive privacy law. We're lucky to have PIPEDA here, so a lot of our issues don't have to go to court. However, for those very difficult situations or widespread privacy violations, at least the threat of a class action can focus the minds of the larger corporations. We think that it's a good tool to use and to preserve in this act.

I will just say that, for PIPEDA, there was a private right of action with an amount per privacy violation, which was never proclaimed into force because of lobbying from the industry. It could have made things better these last few years, but keeping that possibility open is a priority of ours, yes.