Evidence of meeting #99 for Industry, Science and Technology in the 44th Parliament, 1st Session. (The original version is on Parliament’s site, as are the minutes.)

A recording is available from Parliament.

Also speaking

Barry Sookman  Senior Counsel, McCarthy Tétrault, As an Individual
Elizabeth Denham  Chief Strategy Officer, Information Accountability Foundation
Kristen Thomasen  Assistant Professor, Peter A. Allard School of Law, University of British Columbia, Women's Legal Education and Action Fund
Geoffrey Cape  Chief Executive Officer, R-Hauz, As an Individual
Andrée-Lise Méthot  Founder and managing partner, Cycle Capital, As an Individual

4:25 p.m.

NDP

Brian Masse NDP Windsor West, ON

Thank you, Mr. Chair.

I'll start with Mr. Sookman, who's here, and then I'll go to our two virtual witnesses.

With regard to the process, one of the things that's been difficult is that we still don't have some of the amendments. That was brought up at the beginning of the meeting here.

Mr. Sookman, I'm kind of curious. When you were preparing to come here today, how well could you be prepared for testimony on the full bill itself and whether or not it's your opinion that...? Will you submit more documents or information later on, or will the committee have to circle around again to our original witnesses? We've tried to compartmentalize these things the best that we can. In fact, we split the bill for voting purposes into two sections, but it's still one bill here.

I'm just curious about a witness coming here and the process that we've engaged in. Tell us what you think.

Second, what are we going to have to do once we get the other part of the bill in front of us?

4:25 p.m.

Senior Counsel, McCarthy Tétrault, As an Individual

Barry Sookman

Mr. Masse, I don't envy the predicament you're in. In fact, the predicament you're in is a microcosm of the whole public who's interested in the regulation of artificial intelligence.

It was quite clear from the minister's statement that there were amendments. We haven't seen the amendments on AIDA. All we have right now is the letter of the minister, which describes in a very amorphous, open-ended form what the first focus of the government's going to be, but it's quite clear that there's no definition of what the factors are for what will be “high-impact”, and there are no criteria for future systems.

The reality is that we still have no idea. What really concerns me is that, at the last minute—you know, even if it's next week—we're going to get draft amendments, and those amendments are probably only going to be half of the amendments, because they're probably only going to relate to the pieces that were in the minister's letter. Everything else is going to be, I think, saved for clause-by-clause.

When you look at the work that's gone on around the world trying to come up with an appropriate regulatory framework, it has taken years. The British have really studied this issue and, as Ms. Denham said, have taken the view that what's needed is a hub-and-spoke, decentralized regulatory framework.

The committee may get amendments, and you're going to get a couple of weeks to review something that should have taken a year or years to evaluate. Also, we're still in the process of finding out what the Europeans are doing and exactly what the U.S. Congress is going to do. We are making a big mistake, I believe, if we think that we can get dropped amendments, do a thorough analysis of them and make policy for the country that's going to affect jobs, the protection of the public and innovation for decades.

In my view, whatever comes out is not going to give this committee enough time to study it, and my strong recommendation would be to step back. The government's already said that it's going to take two years to do the regulations. They cannot go ahead with this part. Do the study and introduce a proper bill. We won't lose any time, but we'll get something that's thought through and debated by the public and Parliament.

4:30 p.m.

NDP

Brian Masse NDP Windsor West, ON

Thank you for that.

I'm glad you mentioned Ms. Denham. I do want to move on to a question for her.

You mentioned a little bit about where Canada's privacy record is in the world. I was glad to hear you mention Ms. Stoddart. She really set the standard in many respects for a lot of different things that we saw.

Can you give us more detail on how we can reclaim where we're at with regard to our international reputation on privacy and protection for workers? I always thought this was an advantage for investment in Canada as well. I know that some businesses are a little concerned with some of the things that are potentially in this, but at the same time, if we have clear, transparent rules that are consistent, it could be of benefit.

Could you please give us a little road map on how we get back?

4:30 p.m.

Chief Strategy Officer, Information Accountability Foundation

Elizabeth Denham

Yes, Jennifer Stoddart is a personal hero of mine. I worked closely with her in Ottawa as her assistant commissioner.

One of the benefits of Canada is that Canada has been a hard-working member of the OECD, so Jennifer Stoddart was the chair of the privacy and security committee of the OECD for many years. I think she was very influential in that role in bringing together various members of the OECD and others around the world.

As a Canadian working in the U.K., I was able to chair the Global Privacy Assembly, which is the group that brings together 135 privacy authorities from around the world. Again, that was an influential post because it took me to G7 meetings and meetings with the ministers of industry, technology and trade. The privacy commissioners can fulfill a really important diplomatic and bridge-building role around the world.

We have a great example in our current privacy commissioner. He's very well regarded already in international circles, but I think the investment has to go to influence other places and countries around the world. Given the fact that data knows no borders and we're all dealing with the same big companies, there needs to be some collaboration and co-operation.

I can see that Canada can continue to play that role but not when our laws are so 20th century. The update of the laws is so important because, with the Privacy Commissioner in Canada, it's not sustainable to have ombudsman and recommendation-only powers when data is the greatest asset of the 21st century.

4:30 p.m.

NDP

Brian Masse NDP Windsor West, ON

I know I'm out of time, but I'll come back to you later, Ms. Thomasen, so you're not left out.

Thank you, Mr. Chair.

I'll come back to you in my second round, Ms. Thomasen.

Thank you.

4:30 p.m.

Liberal

The Chair Liberal Joël Lightbound

Thank you, Mr. Masse.

Mr. Généreux, the floor is yours.

4:30 p.m.

Conservative

Bernard Généreux Conservative Montmagny—L'Islet—Kamouraska—Rivière-du-Loup, QC

Thank you, Mr. Chair.

Thanks to the witnesses. I'm going to ask my questions in quick succession, and witnesses may answer them after that.

Mr. Sookman, the comments you've made from the outset lead me to believe that we're working backwards. We've heard that many times since we began our study. Do you think there are any positive or useful aspects that should be retained? Are there any that we should develop further?

Earlier you talked about the tribunal. Among the various opinions we've heard to date, certain individuals believe that the tribunal should exist, while others, on the contrary, believe we should immediately go to an appellate court. I would like to hear your opinion on that.

Ms. Denham, attendees at the conference that was held in the U.K. a few weeks ago came to a certain consensus that there should be a voluntary code. In fact, I believe we've already signed the agreement regarding such a code.

Will that voluntary code replace certain bills in certain countries? Will some countries back off on certain elements that they've already put in place to make room for the voluntary code?

I'll let you go first, Mr. Sookman.

4:35 p.m.

Senior Counsel, McCarthy Tétrault, As an Individual

Barry Sookman

Thank you very much for those questions.

When I looked at the minister's letter, which provides some guidance, what I was very pleased to see were the new areas that are proposed to be regulated. We don't know how they're going to be regulated, but there was a proposal to add provisions dealing with discrimination in the courts and administrative tribunals, and I think that's very important.

There was also something about introducing some guardrails in criminal investigations. That can help prevent police officers from doing investigations that might be discriminatory. Those sorts of public things are very important.

There was also something about regulation of services. Again, we have no idea whether this is intended to be public or private, but the EU has proposed to ensure non-discrimination in emergency services, for example, which is very important.

However, what we didn't see...and this is a problem. All the authority is still with the minister. When you look at what is proposed in the letter, you see various areas that are intended to be regulated. You have courts, peace officers, human rights and content moderation, which should be the purview of the justice minister. You have issues relating to employment, which should be the purview of the labour minister. You have issues related to health regulation, which should be within the Department of Health.

I could go on and on, but I think one thing that could be done would be to give numerous ministers the power to make regulations in their areas. That would help make this decentralized.

I know I'm out of time, so I won't get to the tribunal question.

4:35 p.m.

Conservative

Bernard Généreux Conservative Montmagny—L'Islet—Kamouraska—Rivière-du-Loup, QC

I want to make sure that Ms. Denham can respond to my question, please.

4:35 p.m.

Chief Strategy Officer, Information Accountability Foundation

Elizabeth Denham

Yes, your question was about the outcome of the AI Safety Summit that was held several weeks ago at Bletchley Park.

I think, first of all, there was the fact that an international declaration on the risks that generative AI brings to society was signed by 25 countries, including China. The second piece was an agreement to agree on a multilateral agreement.

When you say it's a voluntary agreement, right now it is, although I think a multilateral agreement on how to test high-risk AI systems is a step in the right direction. However, it absolutely is voluntary at this point, and I think what we're seeing is a very fast-moving regulatory environment that's trying to sprint to keep up with the technology.

4:35 p.m.

Conservative

Bernard Généreux Conservative Montmagny—L'Islet—Kamouraska—Rivière-du-Loup, QC

Ms. Thomasen, what do you think of the answers that we've received to date regarding this bill?

This is a very general question, but I think it's important for all the witnesses to have an opportunity to express their general opinion of the bill.

We've known from the start that certain aspects should have been treated separately and that we shouldn't have put everything into a single bill. What do you think of that?

4:35 p.m.

Assistant Professor, Peter A. Allard School of Law, University of British Columbia, Women's Legal Education and Action Fund

Dr. Kristen Thomasen

Yes, thank you.

You asked about any positive aspects of the bill, and we did emphasize, in our written submission, some aspects of the bill that we think are very commendable. Dealing with discriminatory bias is important. The recognition that psychological harm is an aspect of harm is important and commendable. Taking a regulatory approach to give clarity to how AI systems can be operated and what obligations and transparency requirements exist for companies using AI, I think, is all very important.

The companion document that accompanies the legislation contains a lot of very important points, as well, and perspectives that we don't see reflected in the draft legislation itself, which we noted, in our submission, is a missed opportunity. There's a discussion about collective rights in the companion document that is crucial when we're talking about AI systems because of the way these systems work on large quantities of data, drawing out inferences based on an assessment of a large group of people. The idea that harm can actually materialize at the collective level, rather than solely at the personal level, is something that the law needs to acknowledge. This would be relatively new for our laws. It wouldn't be brand new because there are areas of law that recognize collective rights, of course. However, it's something that we're going to have to see recognized more and more, and I think that exists in the companion document. That should be integrated into this bill if it's going to go forward.

I would just say, very generally, that a lot of what we're talking about highlights that we actually need to step back when we think about AI regulation in Canada. The AIDA did not benefit from the consultation that I think would have been useful in advance of its drafting. It could take a much more holistic approach. Mr. Sookman has highlighted some of this. Ms. Denham has highlighted some of this. There are many considerations that have to go into how we would establish a framework for regulating AI in Canada that we don't see here and that I think are going to be difficult to integrate, solely through textual amendments, into what we have in front of us.

4:40 p.m.

Conservative

Bernard Généreux Conservative Montmagny—L'Islet—Kamouraska—Rivière-du-Loup, QC

Thank you very much.

4:40 p.m.

Liberal

The Chair Liberal Joël Lightbound

Thank you.

MP Van Bynen, the floor is yours.

4:40 p.m.

Liberal

Tony Van Bynen Liberal Newmarket—Aurora, ON

Thank you, Mr. Chair.

I'd like to go back to Ms. Denham and take advantage of her international experience.

Bill C-27 creates a new artificial intelligence and data act, which appears to be based on at least part of the European Union's artificial intelligence act, which also proposes a risk-based framework for artificial intelligence systems.

How do you compare those two pieces of legislation?

4:40 p.m.

Chief Strategy Officer, Information Accountability Foundation

Elizabeth Denham

The EU AI act is comprehensive in terms of its scope. It's a comprehensive act, so it applies to the public sector, the third sector and the private sector. It's a product-safety statute, so it doesn't give individuals any rights. What it does require is that companies categorize the AI system they are procuring or developing. It has to be slotted into one of the risk categories. There is prohibited AI, high-risk AI and low-risk AI, and it's up to the company to determine the risk they're creating for others. Then, according to that risk, due diligence, accountability, transparency and oversight are tied to the level of risk.

To give you an example, there is a group of prohibited AI uses. One of them is live facial recognition technology used by police in public spaces. The EU has already decided that's prohibited. There are many low-risk.... Chatbots, for example, may be considered a low AI or algorithmic risk.

What companies and governments need to do is prove they have a comprehensive AI data governance program in place and an inventory of all AI systems in use, and then stand ready to demonstrate this to AI regulators across the EU.

What the EU has is first-mover advantage, in terms of a comprehensive AI law. That's what it has. This doesn't mean the rest of the world is going to copy and paste their approach. That said, any company outside the EU that is directing services to citizens and organizations in the EU will be subject to the EU law. That means the world is paying attention to what the EU is doing, in the same way they did with the GDPR. There is first-mover advantage there.

I think what the U.S. is doing is extremely interesting. It's difficult to get anything through Congress these days. We know that. Instead, there is an executive order on AI, which requires all government agencies—the supply chains and procurement in every single agency, be it Health and Human Services or the Department of Defense—to comply with AI principles. They also have to stand ready to demonstrate that they are being responsible. I think that is going to be hugely influential but quite different from the approach the EU is taking.

4:45 p.m.

Liberal

Tony Van Bynen Liberal Newmarket—Aurora, ON

The other aspect you raised earlier is with respect to teeth and enforcement regarding violations of some of the statutes. I was particularly interested in your reference to the stop processing order and the disgorgement of data order. If I recall correctly, there was recently a $1.2-billion fine, which seemed to be heralded as a huge fine. However, for a violating organization, this could simply be considered the cost of doing business—

4:45 p.m.

Chief Strategy Officer, Information Accountability Foundation

Elizabeth Denham

It could be a rounding error.

4:45 p.m.

Liberal

Tony Van Bynen Liberal Newmarket—Aurora, ON

—in some cases.

Could you expand a little more on those two concepts—the stop processing order and the disgorgement of data order? I think these are elements that need to be seriously considered in order to give regulators the effect they need to deal with these global enterprises.

4:45 p.m.

Chief Strategy Officer, Information Accountability Foundation

Elizabeth Denham

I can't recall at this moment whether the CPPA has a power in there for the commissioner to order the deletion of data that may have been gathered illegally or without proper consent.

I can't remember if the commissioner has that power, but it is certainly something that is necessary in the modern world. It could be much more effective in changing business practices, rather than fining a large data platform, a social media platform, hundreds of millions of dollars. If the data is collected illegally, and if the data is used in a significant contravention of the act, requiring a company to delete that data has an enormous effect.

I think of a case that I had when I was the commissioner in the U.K. One government department had illegally collected the biometric data of claimants. They had to provide voice prints, which is biometric data and which is sensitive information under the GDPR. The order was that the government department had to delete all of that data and start over again. That was more significant. Among peer departments in the government, that lesson was learned really quickly. Rather than fine the government department for collecting data illegally, the data had to be destroyed.

You'll see that the U.S. Federal Trade Commission has acted in several cases around data deletion and data disgorgement. At the end of the day, companies want sustainable business practices. They want assurances that they're doing the right thing. That's the lens through which we should be looking in Canada at the powers of the Privacy Commissioner, for example.

4:45 p.m.

Liberal

Tony Van Bynen Liberal Newmarket—Aurora, ON

I've run over time. I'm wondering if you could send us some examples of that for consideration and review.

Thank you.

4:45 p.m.

Chief Strategy Officer, Information Accountability Foundation

Elizabeth Denham

Absolutely.

4:45 p.m.

Liberal

The Chair Liberal Joël Lightbound

Go ahead, Mr. Sookman.

4:45 p.m.

Senior Counsel, McCarthy Tétrault, As an Individual

Barry Sookman

If you look at proposed subsection 93(2) of the CPPA, the commissioner has very broad powers to order compliance with the act, including “take measures to comply with this Act”, “stop doing something that is in contravention of this Act”, and so forth.

If there is an entity that is in contravention, the commissioner has the right to effectively get an injunction and have that injunction enforced by the court. That would include when an entity has data that should not have been collected. That could invoke the data deletion requirement. I believe the commissioner will have those powers.

4:45 p.m.

Liberal

The Chair Liberal Joël Lightbound

Thank you very much, Mr. Sookman.

Mr. Lemire, the floor is yours.