Evidence of meeting #111 for Industry, Science and Technology in the 44th Parliament, 1st Session. (The original version is on Parliament’s site, as are the minutes.)

A recording is available from Parliament.

Also speaking

Momin M. Malik, Ph.D., Data Science Researcher, As an Individual
Christelle Tessono, Technology Policy Researcher, University of Toronto, As an Individual
Jim Balsillie, Founder, Centre for Digital Rights
Pierre Karl Péladeau, President and Chief Executive Officer, Quebecor Media Inc.
Jean-François Lescadres, Vice-President, Finance, Vidéotron ltée
Peggy Tabet, Vice-President, Regulatory Affairs, Quebecor Media Inc.

5:15 p.m.

Liberal

Ryan Turnbull Liberal Whitby, ON

I'm out of time.

Thanks, Mr. Chair.

5:15 p.m.

Liberal

The Chair Liberal Joël Lightbound

Thank you, Mr. Turnbull. I'm sorry, but your time is up.

Mr. Garon, the floor is yours.

5:15 p.m.

Bloc

Jean-Denis Garon Bloc Mirabel, QC

Thank you, Mr. Chair.

I want to thank the witnesses for joining us.

Mr. Balsillie, I may have misunderstood you. I think that you said that it could be challenging for a government to regulate new and emerging technology and that it could be difficult—perhaps impossible—for a government to identify existential threats, particularly when faced with high‑risk or high‑impact artificial intelligence algorithms.

Suppose we remove part 3 of the bill, which concerns artificial intelligence, and hold further consultations. You said that there was a lack of consultation. What difference would that make to the government's ability to properly regulate this technology?

5:15 p.m.

Founder, Centre for Digital Rights

Jim Balsillie

Thank you for the question.

First of all, I would say it doesn't have democratic legitimacy if it hasn't involved all stakeholders, and that hasn't happened yet.

The second thing about this is that, as I said, the existential risk is gaslighting to take you away from the near-term risks, which the other witnesses are drawing our attention to, and that's a real tactic.

I would say—and this is most critical and has been part of my journey in learning this—that you'll notice there's been a tremendous effort to stay away from rights by those who don't want effectiveness. We are in a new era, and if we were writing our charter of rights today, we would incorporate these kinds of rights for an information age: the rights to dignity, privacy and thought, and the rights not to be subject to misinformation or manipulation.

I think you have to get the core pieces right, and those involve determining which human rights matter up front, how we work with those within the context of real harm that is happening, and how not to be gaslit on things that take us away from what the real issue is. Businesses use the tactic of gaslighting and confusing people to keep them away from the root issues.

5:20 p.m.

Bloc

Jean-Denis Garon Bloc Mirabel, QC

In recent decades, we've seen the globalization of culture and the faster flow of information. This has been a cultural issue for Quebec, for example. Culture is becoming more homogenous around the world. I think that, here in Ottawa, artificial intelligence regulation is being treated as a strictly regulatory and technological issue. It's as if the federal government alone were responsible for regulating modernity. Yet cultural issues, at least in Quebec, play a key role at the provincial level, which we refer to as the national level.

What role should the provinces and Quebec play in regulating artificial intelligence? Shouldn't there be more consultation and greater involvement of Quebec, for example, in this regulatory exercise?

5:20 p.m.

Founder, Centre for Digital Rights

Jim Balsillie

Yes. I've always taken a cross-cutting effects and rights approach to it, not a technological one, so I agree with those who frame it that way. Beware of those who think the answer to technology issues is more technology.

I think the place that is going to be hurt the most by far by AIDA and Bill C-27 is Quebec. They have by far the most to lose, because they've set a higher bar—an appropriate bar—with Law 25, yet clearly this law is lower. Which one is in charge? Also, if you notice, it's ambiguous, and you know the federal government is going to win, but corporations are going to arbitrage away from Quebec. It's like when pollution laws are easier on one side of the river than the other, so you just move across the river. I think you'll lose. If you don't do strong laws, we all lose, but Quebec will lose the most.

Absolutely. Social, cultural, economic, security: this is the mediation realm of the contemporary era. It's extremely important, and I think the provinces should be given tremendous accord on this, and that should be clarified in this bill. However, your primary protection is raising the standard of this bill so that, at a minimum, it meets Law 25.

5:20 p.m.

Bloc

Jean-Denis Garon Bloc Mirabel, QC

We must avoid a race to the bottom. I understand.

You spoke about transparency. The committee has often discussed transparency, such as a person's right to know that they're dealing with an artificial intelligence image or algorithm, for example. However, if I understood you correctly, transparency isn't enough. It shows only our powerlessness against an algorithm. You said something that I found intriguing. If I understood you correctly, individuals must have the opportunity to challenge the fact that they're dealing with an artificial intelligence model.

What individual has the technological and financial ability to challenge these types of models? Is this proposal realistic for the average person, for example?

5:20 p.m.

Founder, Centre for Digital Rights

Jim Balsillie

Well, I think that can apply to everything, but as an example, the CCLA challenged elements of recent government actions and was successful. If you don't even have the window to do it, then it can never happen. It's to have not only transparency but contestability. Yes, then there's an issue of resources to do it, but that's where we can ask the question of a strong civil society for Canada to deal with this.

I will say again that we will be playing whack-a-mole forever in this if we do not get the fundamental rights up front, because that's going to frame what is the moral imperative here. You could have rights to protect culture as a fundamental right: put it in there, make it explicit and have it referenced throughout the document. Then it's unambiguous what trumps here, but if you notice, those things are transactional and are not really addressed.

5:20 p.m.

Bloc

Jean-Denis Garon Bloc Mirabel, QC

I'll ask you a question that I put to another witness. I would like to hear your answer.

High‑risk and high‑impact models have been defined as models that threaten the health, safety or integrity of individuals. These definitions seem to overlook models that threaten the integrity of minority cultures or cultural diversity, for example. If I understood you correctly, this may fall under the definition of a high‑impact or high‑risk artificial intelligence model.

5:25 p.m.

Founder, Centre for Digital Rights

Jim Balsillie

Sure, you can go at communities. That was one of my comments—communities and groups, first nations, visible minorities—but again, if you get the human rights and a right not to be discriminated against, it gets it right up front, and then, when it happens downstream, that becomes the inalienable reference point of the courts, so be careful.

My strong advice is to focus on the root cause rather than the effects, because the effects are always going to move, but fundamental rights are fundamental rights. I like this idea of culture and sovereign culture being a fundamental right. Otherwise, Quebec is at risk of homogenization and being steamrolled unless you enshrine it. This is your chance.

5:25 p.m.

Liberal

The Chair Liberal Joël Lightbound

Thank you, Mr. Garon.

Mr. Masse, the floor is yours.

February 14th, 2024 / 5:25 p.m.

NDP

Brian Masse NDP Windsor West, ON

Thank you, Mr. Chair.

Thank you to our witnesses.

I'm going to spend the first part of my time addressing a document that I'm getting from the public record. It came to our attention today. It's from the Assembly of First Nations. In it, they talk about the process:

The first problem with the legislation is the way it has come to stand before the Committee. The legislation was crafted without the due “consultation and cooperation” of First Nations as is the minimum requirement outlined in Article 19 of UNDRIP, which reads in full,

States shall consult and cooperate in good faith with the [I]ndigenous peoples concerned through their own representative institutions in order to obtain their free, prior and informed consent before adopting and implementing legislative or administrative measures that may affect them.

Then, in the conclusion—hopefully, we'll get a response to this committee about this, Mr. Chair, because I would like a formal response from the minister—they say that the minister has not consulted with first nations specifically for that.

I would like to move a motion that this committee write the minister to confirm whether or not first nations—and which first nations—have been consulted in this process. I would hope that the motion would be supported by my colleagues.

5:25 p.m.

Liberal

The Chair Liberal Joël Lightbound

Sure, Mr. Masse.

5:25 p.m.

NDP

Brian Masse NDP Windsor West, ON

I move that the committee write the Minister of Innovation and request confirmation of whether or not first nations, including the Assembly of First Nations, have been consulted about this legislation, and other first nations that may have been consulted as well.

5:25 p.m.

Liberal

The Chair Liberal Joël Lightbound

Okay, everybody has heard the terms of the motion. It's relatively straightforward.

I'm looking around the room to see if we have consent.

(Motion agreed to [See Minutes of Proceedings])

There seems to be no disagreement on that, so thank you, Mr. Masse.

5:25 p.m.

NDP

Brian Masse NDP Windsor West, ON

Thank you, Mr. Chair.

I appreciate my colleagues for that.

I'll move to my questioning. I'll start with Mr. Balsillie.

I want to thank you for the work that you've done on this and many other files. I've been here for a while, and you've appeared several times in front of committees. It has been helpful.

With regard to some of the concerns that you've expressed, I do want to understand the difference that you might want for the data commissioner to be independent from the Privacy Commissioner and the Competition Bureau.

There is work being done with regard to the Privacy Commissioner in this legislation. My concern is that if we don't get that right, then there's no point in doing the second part. Maybe you can add a little bit of information there about how we make the data commissioner much more independent or robust, because you are correct that the challenges that the Privacy Commissioner and the Competition Bureau face are because the legislation they have to work under is not sufficient, in my opinion.

5:25 p.m.

Founder, Centre for Digital Rights

Jim Balsillie

Yes, thank you for that.

What I was trying to say is that this commissioner needs to be independent of ISED and have more powers than the competition commissioner or the Privacy Commissioner, who have been asking for more power. They do not set the standard; they themselves want a higher standard. As I've also said, who came up with this idea of a tribunal? Who pulled that out, and what the heck is that for? It just weakens the courts and creates a middle process.

Also, I think it's worth having a discussion about whether AI should be integrated with the Privacy Commissioner. That question has never been asked. Data and AI hang out together. They're not separate. Privacy is always at play there, and we have an existing regulator who wants to have that authority and whom we have the ability to build with.

If I were designing this, I would start the consultation again on AIDA. I would not include the tribunal. I would ask whether this commissioner should be within the Office of the Privacy Commissioner, with enhanced powers and resources. We already have a running system, and we just need to fix the text of Bill C-27, including the consultation with the first nations.

We have a winning path here that isn't expensive and delayed, yet it was all just thrown out there without really thinking.

5:30 p.m.

NDP

Brian Masse NDP Windsor West, ON

Let me make sure I got this right. You're suggesting that there's nothing stopping us right now—I never thought of it this way—from actually creating an AI commissioner now and then having that almost be part of the process going forward on how to do AI. We could actually have the AI commissioner's office set up and running, and then finish this part of the legislation.

5:30 p.m.

Founder, Centre for Digital Rights

Jim Balsillie

Well, put it within.... There's nothing saying that you can't run it within the Office of the Privacy Commissioner and extend its mandate and resources. It has parliamentary direct reporting that is well established and well respected.

By the way, all of these issues of adequacy and so on that we're looking for build upon the Privacy Commissioner's work, so this idea of adequacy in Europe is a living document that's actually contextualized on case decisions, principally from our courts and our Privacy Commissioner. The idea that these are separate structures and that you want parallel, fragmented...never did make sense to me. I don't know what.... Just give the powers to the Privacy Commissioner. Get rid of that silly tribunal. Fix the provisions of Bill C-27 so that they're actually like the GDPR. Have proper consultations on AIDA. If you do that, you're on your way.

I have an expression that I use: Life's hard enough, so don't make the easy things hard.

5:30 p.m.

NDP

Brian Masse NDP Windsor West, ON

Do I have any more time, Mr. Chair? I think I'm out.

5:30 p.m.

Liberal

The Chair Liberal Joël Lightbound

I think so, too. Thank you. I forgot to start the timer, so I'm glad you're so respectful of the rules, Mr. Masse. I appreciate that.

Mr. Vis, the floor is yours for five minutes.

5:30 p.m.

Conservative

Brad Vis Conservative Mission—Matsqui—Fraser Canyon, BC

Thank you, Mr. Chair.

In your testimony, Ms. Tessono, you referenced the European Union's approach, which is different from the Canadian approach, and you talked about prohibitions based on thresholds. Can you clarify for this committee the difference between the approach being taken by the European Union and the high-impact systems outlined by the minister in his letter to the committee on November 28?

5:30 p.m.

Technology Policy Researcher, University of Toronto, As an Individual

Christelle Tessono

Thank you for your question.

The amendment to AIDA proposed by the minister would call for a class of systems that would be considered high-impact, and the class of systems would be subject to a schedule, which would be updated through regulations, if my memory is correct.

The European Union, in contrast, has, in its law, explicit systems that are considered unacceptable. These include social scoring, the use of real-time biometric identification systems, the creation of facial recognition databases compiled from information scraped online, emotion recognition systems and so on.

We don't have that level of specificity in the proposed amendments, even though we have a class. To me, the thresholds that are created by the European Union are stronger because they create requirements for systems that are not considered high-impact in Canada.

Just to clarify, in Canada there are systems that could cause harm and that are excluded. Those systems are in the scope of the EU AI Act, and they will be subject to requirements. Europeans will have stronger protections with respect to systems that are not in scope in Canada.

5:30 p.m.

Conservative

Brad Vis Conservative Mission—Matsqui—Fraser Canyon, BC

During our last panel, some of the witnesses from the big tech companies criticized the high-impact systems classification proposed in Canada with respect to the moderation of content and the obligations it might place on various companies operating in the AI sphere.

Do you have any comments on that?

5:30 p.m.

Technology Policy Researcher, University of Toronto, As an Individual

Christelle Tessono

Yes. I think this is an issue that reflects the lack of conversations between Canadian Heritage and Justice, which are handling the online harms bill, and the people who are handling AIDA. I don't know if they're talking to each other, but the fact that there are already concerns from industry actors speaks to the importance of having collaboration among different departments in the country.

I cannot speak in too much detail about the earlier iterations of the online harms bill, but what I can say is that we need infrastructure whereby collaboration across departments is fostered.