Evidence of meeting #106 for Industry, Science and Technology in the 44th Parliament, 1st Session. (The original version is on Parliament's site, as are the minutes.)

A recording is available from Parliament.

Also speaking

Todd Bailey, Chief Intellectual Property Officer and General Counsel, Scale AI, As an Individual
Gillian Hadfield, Chair, Schwartz Reisman Institute for Technology and Society, University of Toronto, As an Individual
Wyatt Tessari L'Allié, Founder and Executive Director, AI Governance and Safety Canada
Nicole Janssen, Co-Founder and Co-Chief Executive Officer, AltaML Inc.
Catherine Gribbin, Senior Legal Adviser, International Humanitarian Law, Canadian Red Cross
Jonathan Horowitz, Legal Adviser, International Committee of the Red Cross, Regional Delegation for the United States and Canada, Canadian Red Cross

12:20 p.m.

Prof. Gillian Hadfield

That information would only be available to governments. It would not be published. It would not be put on the Internet. There would have to be serious security around that information. It would be treated as confidential information and commercially secret information.

12:20 p.m.

Liberal

The Chair Liberal Joël Lightbound

Thank you very much.

Mr. Masse.

12:20 p.m.

NDP

Brian Masse NDP Windsor West, ON

Thank you, Mr. Chair.

Mr. Bailey, you might be best placed to answer this, given your background in private business. Some have called for governments to build supercomputers. The United Kingdom is doing that, and there are calls for Canada to have the physical capacity to actually outpace the private market. What are your thoughts on that?

It seems we are almost headed into a computing arms race with the private sector if the U.K. goes through with its project, which it is funding massively. Other states are considering it, and Canada probably will as well. A price tag of up to a billion dollars has been suggested or floated out there. Just look at Parliament Hill: I think it was Pat Martin who once said to just add another zero and another number to the project.

At any rate, could you reflect a bit on that situation?

12:20 p.m.

Chief Intellectual Property Officer and General Counsel, Scale AI, As an Individual

Todd Bailey

Sure.

I'm not sure that government has the capacity to do that, and that's not necessarily a criticism. I don't know that it's the government's job to build supercomputers. What the government should be doing is facilitating private industry to do those things and perhaps setting out rules. Supercomputers may not be as important as cloud capacity. We do see some attempts, not at the national level but with SOSCIP in Ontario, for example, at creating computing resources that are available. This is part of helping Canadian industry grow as well.

12:20 p.m.

NDP

Brian Masse NDP Windsor West, ON

I'll just throw this out there as I don't really have much more time, but we also have to consider whether this is going to affect our trade agreements.

For example, if the U.K. does this and it enters into a market decision with the private sector.... We don't have a trade agreement with them right now, but for the United States and others, if we're getting into these types of operations, are they going to be consistent with our current trade agreements?

12:20 p.m.

Chief Intellectual Property Officer and General Counsel, Scale AI, As an Individual

Todd Bailey

There's an arms race right now, so to speak, with these large models, but common sense tells you that when things get too big and too expensive, innovation takes you in the other direction.

I don't know if anyone is familiar with small language models, for example. They're actually much better at targeted tasks than large language models. You can compare it to someone who has to try to know everything in the world versus someone who is an expert in a certain thing. Even within this approach, which I know mirrors the EU act, the current focus on general-purpose technology is a little bit misguided, because you can get some of these harms, and maybe even worse, with smaller models.

One of the things Professor Hadfield mentioned is that the U.S. executive order puts a threshold on it: it applies only to bigger models. I mentioned earlier that if you look at dry counties in the southern U.S., for example, when you get to the border of a county, you'll find all the liquor stores you could ever want. Regulation will affect industry and push it in a certain way. What we want to do is set up regulations that direct innovation in the direction we want.

12:20 p.m.

NDP

Brian Masse NDP Windsor West, ON

Thank you.

Thank you, Mr. Chair.

12:20 p.m.

Liberal

The Chair Liberal Joël Lightbound

Thank you very much, Mr. Masse.

Mr. Généreux.

January 29th, 2024 / 12:20 p.m.

Conservative

Bernard Généreux Conservative Montmagny—L'Islet—Kamouraska—Rivière-du-Loup, QC

Thank you, Mr. Chair.

Thank you to the witnesses.

The amendments being proposed by the minister include, in particular, high-impact artificial intelligence systems and their various uses, which are divided into classes. I don't know whether you had access to the document, but the uses set out in class 4 include moderation of content on on‑line communications platforms, search engines or social media. The important word here is “moderation”. Essentially, it means that the department could monitor what was happening on social media, search engines and so forth.

Do you think that is going too far, in a bill such as this? In Canada, we adopted Bill C‑11, which has passed into law and allows the CRTC to undertake those audits and determine who can and cannot publish something on social media.

12:20 p.m.

Founder and Executive Director, AI Governance and Safety Canada

Wyatt Tessari L'Allié

I understand why we want to include that, because it's true that there are social repercussions. Personally, I think that this may be going too far, but the advantage of adding that to the schedule is that a decision can be made later as to what needs to be included or not and adjustments can be made through regulatory amendments. For that reason, I'm neither for nor against it.

12:25 p.m.

Conservative

Bernard Généreux Conservative Montmagny—L'Islet—Kamouraska—Rivière-du-Loup, QC

What do you think, Mr. Bailey?

12:25 p.m.

Chief Intellectual Property Officer and General Counsel, Scale AI, As an Individual

Todd Bailey

Social media is not an area of expertise for me.

All I would say is that I don't know of a lot of social media platforms that are based here in Canada, so this would be an exercise in regulating foreign companies and so on. We know that's difficult to do, but to me, it makes sense. Social media is a big part of Canadian life and it makes sense that the AI that's shaping the way that traffic goes would be high impact.

12:25 p.m.

Conservative

Bernard Généreux Conservative Montmagny—L'Islet—Kamouraska—Rivière-du-Loup, QC

New terms, such as “deepfake”, are constantly cropping up in the area of artificial intelligence. The bill uses current vernacular, but the new reality of AI will bring new terminology and new expressions that aren't necessarily included here.

Isn't it overly restrictive, in some way? Personally, I think we should take that out.

12:25 p.m.

Founder and Executive Director, AI Governance and Safety Canada

Wyatt Tessari L'Allié

If changes can be made via regulations, it's not an issue. Indeed, these decisions can be made via regulations, instead of being set out in the statute. If the legislation is flexible, it'll allow adjustments to be made as this field evolves.

12:25 p.m.

Conservative

Bernard Généreux Conservative Montmagny—L'Islet—Kamouraska—Rivière-du-Loup, QC

Right. Thank you.

Mr. Bailey, I want to come back to you.

In your opening remarks, you talked about something I think is important: the widening productivity gap in Canada. Clearly, Canada has productivity problems, notably in the industrial sector; it is not adapting to new technologies as fast as other countries are. You alluded to AI when you talked about that.

Can regulations such as the ones we are discussing hurt us or, on the contrary, help us implement new technologies? What's your opinion on that?

12:25 p.m.

Chief Intellectual Property Officer and General Counsel, Scale AI, As an Individual

Todd Bailey

My opinion, as other witnesses before this committee have said, is that Canadian industry is very slow to make these sorts of decisions in the first place. Companies see this regulation now, and they don't want to do things that are going to get them into trouble, so for sure there is some hesitation. In this country, we talk about the risk that AI is going to take people's jobs. At the same time, we talk about how we don't have enough people to do the jobs that we need done.

What AI offers industry especially—we're not talking about ChatGPT necessarily—is the ability to effectively mechanize the repetitive, low-paying jobs that no one wants, and to help upskill those people into the higher-paying jobs where we need human intelligence, human empathy and all that sort of thing.

What I want to continue to convey is that this regulation is trying to address a lot of harms, but at the same time, we need help. We don't want to create another barrier that keeps Canadian industries from coming off the sidelines and adopting AI to solve some of the problems they have.

12:25 p.m.

Conservative

Bernard Généreux Conservative Montmagny—L'Islet—Kamouraska—Rivière-du-Loup, QC

You mentioned earlier in your opening remarks that current legislation already applies to AI.

12:25 p.m.

Conservative

Bernard Généreux Conservative Montmagny—L'Islet—Kamouraska—Rivière-du-Loup, QC

So, why should we add a bill such as this to existing legislation? Don't you feel it would be yet another obstacle slowing the implementation of new technologies in Canada?

12:25 p.m.

Chief Intellectual Property Officer and General Counsel, Scale AI, As an Individual

Todd Bailey

I think it's important for this law not to layer on top of the laws that we already have.

If you look at President Biden's executive order, you can see that it lists the various departments of government where AI already touches workers, privacy and so on. We know these things already. Where I see an AI commissioner playing a role—I may have mentioned this earlier—is not as an enforcer but as a coordinator helping these various departments. Anyone who is involved in technology understands the steep learning curve you have all climbed on AI.

If every time a business, an academic or anyone walks into a room, they have to educate government again to get back up to that level, it would be very helpful to have someone within the machinery of government who understands those issues and is even able to raise them—to reach out to Health Canada or other parts of government and say, “Hey, here's an issue, here's what it involves, and this is what you need to do,” and so on.

12:25 p.m.

Conservative

Bernard Généreux Conservative Montmagny—L'Islet—Kamouraska—Rivière-du-Loup, QC

I think it was Ms. Hadfield who suggested earlier having a Canadian AI agency. Is that what you are referring to?

12:30 p.m.

Chief Intellectual Property Officer and General Counsel, Scale AI, As an Individual

Todd Bailey

Yes, that's correct.

In my mind, it's not patterned after the Privacy Commissioner, which is more of a police officer, you might say. Rather, there's not one AI, and AI is not affecting just one department. The question was asked about whether it should sit in ISED. It's technology, so that makes sense as the place, but technology is very difficult to understand and it filters across all the rest of government. It makes sense that there's a quarterback somewhere who is able to sort of see the broader...and help coordinate.

12:30 p.m.

Conservative

Bernard Généreux Conservative Montmagny—L'Islet—Kamouraska—Rivière-du-Loup, QC

Mr. Chair, I see that the representative from the Red Cross has raised his hand. I don't know whether it relates to the questions I'm asking.

12:30 p.m.

Liberal

The Chair Liberal Joël Lightbound

Mr. Horowitz.

12:30 p.m.

Legal Adviser, International Committee of the Red Cross, Regional Delegation for the United States and Canada, Canadian Red Cross

Jonathan Horowitz

Thank you for the opportunity.

I just want to return to the question about content and harmful information that can appear on social media or elsewhere.

This is not to comment necessarily on how the bill would manage that issue or how Canada's domestic law would manage that issue, but our emphasis is simply that, if this bill intends to mitigate or prevent harm that can be caused by or through AI systems, there should be an explicit recognition that misinformation and disinformation can cause the types of harms that are listed in the definition, under proposed subsection 5(1), of what constitutes “harm” in the bill.

Immediately what comes to mind is both physical and psychological harms. With respect to the humanitarian assistance community, misinformation and disinformation can lead to the prevention or disruption of the provision of life-saving humanitarian assistance. Of course, in certain contexts social media platforms may cause harm through active child soldier recruitment, through threats of spreading violence to terrorize civilian populations, and so on and so forth.

We just want to ensure that the bill clarifies the risks that can arise from misinformation and disinformation. They should be included among those things you are trying to regulate, mitigate or prevent.

Thank you.