Evidence of meeting #19 for Access to Information, Privacy and Ethics in the 44th Parliament, 1st Session. (The original version is on Parliament’s site, as are the minutes.)

A recording is available from Parliament.

Also speaking

Owen Larter  Director, Responsible Artificial Intelligence Public Policy, Microsoft
Mustafa Farooq  Chief Executive Officer, National Council of Canadian Muslims
Rizwan Mohammad  Advocacy Officer, National Council of Canadian Muslims

4:15 p.m.

Conservative

Damien Kurek Conservative Battle River—Crowfoot, AB

Sure.

4:15 p.m.

Chief Executive Officer, National Council of Canadian Muslims

Mustafa Farooq

I think the reality is that we still don't know—and to the best of my knowledge, this committee has not been told—whether CSIS uses facial recognition technology. That's a knowledge gap Canadians deserve to have answered. Depending on those answers, we may also want to think about what penalties for non-disclosure would look like.

4:15 p.m.

Conservative

Damien Kurek Conservative Battle River—Crowfoot, AB

We obviously spent a lot of time in this committee on the governmental implications. I'm curious, though, if you have any further thoughts about the private applications. We all use some FRT, probably—I'm making assumptions here—with the face recognition to log into our phones and whatnot. We all use a little bit of FRT in some sense.

Would you have any comments on not just the public implications of the use of this technology but in terms of the private application as well, whether it be on technology like personal electronic devices or otherwise, such as within stores and that sort of thing?

4:15 p.m.

Chief Executive Officer, National Council of Canadian Muslims

Mustafa Farooq

Unfortunately, we're not experts in the area of how this could be looked at from a consumer or a corporate perspective, but I will say that there are significant concerns within our communities generally about how large tech companies are taking in this data, how it's being used, and how it's being sold and given potentially to authoritarian regimes. I'm not saying this about any particular tech company, but certainly those are concerns we're hearing broadly from our community.

4:15 p.m.

Conservative

Damien Kurek Conservative Battle River—Crowfoot, AB

I know I'm getting very close here, and maybe I'll simply ask you this. You mentioned judicial benchmarks as your suggestion. If you would have further information that you could provide to this committee as to what you feel an appropriate judicial benchmark would be for the application of FRT, for example, in a law enforcement context, certainly this committee would appreciate that. Thank you very much.

With that, my time is up. Thank you to the witnesses.

4:15 p.m.

Conservative

The Chair Conservative Pat Kelly

Thank you.

Next we go to Mr. Bains for five minutes.

May 5th, 2022 / 4:15 p.m.

Liberal

Parm Bains Liberal Steveston—Richmond East, BC

Thank you, Mr. Chair, and thank you to our guests who are joining us today.

My question is for the gentleman joining us from the National Council of Canadian Muslims.

We've heard from witnesses before this committee that agencies have been using facial recognition technology. You also mentioned in your comments that in places like Vancouver and other parts of the country, if there are rallies or things like that where people are gathered, the technologies are being used. Someone mentioned that the VPD was also using it in British Columbia.

I'm curious about that. To your knowledge, to what level are these agencies using this technology? I will have follow-up questions after that.

4:20 p.m.

Chief Executive Officer, National Council of Canadian Muslims

Mustafa Farooq

I wouldn't want to speak on behalf of any particular agency, obviously. To the best of my knowledge, after complaints were brought forward to the Vancouver police specifically, that jurisdiction has now imposed a moratorium on FRT.

However, we know that is not a universal standard. When folks say they are doing something, we know that there continue to be concerns around whether they are actually doing it.

4:20 p.m.

Liberal

Parm Bains Liberal Steveston—Richmond East, BC

Thank you.

Have you been engaged by any of these Canadian police authorities about facial recognition technology?

4:20 p.m.

Chief Executive Officer, National Council of Canadian Muslims

Mustafa Farooq

Other than very peripheral conversations, no.

4:20 p.m.

Liberal

Parm Bains Liberal Steveston—Richmond East, BC

Have you put forward recommendations for improving Canada's legal framework for governing artificial intelligence technology? Have you made submissions?

4:20 p.m.

Chief Executive Officer, National Council of Canadian Muslims

Mustafa Farooq

Other than to this committee, we have put forward nothing formal, apart from concerns around online harms regulation and the role that AI plays in that conversation.

4:20 p.m.

Liberal

Parm Bains Liberal Steveston—Richmond East, BC

Is there a reason you have not been able to engage with these agencies? Have you reached out?

4:20 p.m.

Chief Executive Officer, National Council of Canadian Muslims

Mustafa Farooq

Quite simply, it's very hard to engage in a conversation when basic facts aren't being acknowledged.

When CSIS tells us that they're not going to answer a basic question—which is the same question they haven't answered for you right now—about whether facial recognition technology is being used, it becomes very hard to get any sense of accountability. It becomes very hard to have a conversation. When the RCMP tells us one thing, tells Canadians one thing and tells the Privacy Commissioner one thing, it becomes very hard to have a good-faith, honest conversation about what the future could actually look like.

I think all of us are interested in a world in which law enforcement uses facial recognition technology responsibly. Folks are right when they say that there are potential good-use cases, especially in child pornography and cases like that. The reality is that our agencies here are simply not meeting the standard that Canadians expect them to, for all of the reasons that you all know about, vis-à-vis systemic racism and so many of these other challenges.

4:20 p.m.

Liberal

Parm Bains Liberal Steveston—Richmond East, BC

Thank you.

If I have time, I have a quick question for Mr. Larter.

Several witnesses have raised concerns that facial recognition technology has been shown to misidentify racialized individuals more often than white individuals. We've heard that on numerous occasions here in this committee. How does your organization address that risk?

4:20 p.m.

Director, Responsible Artificial Intelligence Public Policy, Microsoft

Owen Larter

That's a really important question, and it's one of the major risks that we think needs to be addressed around facial recognition use.

I'll come back to what I said before in terms of the internal safeguards we've mentioned. One of the most important is the testing piece. We make sure that we are opening up our facial recognition system to independent third party testing, and that we are training and testing it ourselves in such a way that we are confident it is performing accurately and in a way that minimizes gaps across different demographic groups.

I would really like to emphasize as part of my contribution today that the testing piece is a really important part of making sure that technology is performing in an accurate manner, which it can do. It's made incredible strides in the best-performing algorithms in recent years, but there are many algorithms out there that aren't as accurate. You need to be able to test them to make sure that when, for example, police are using them, they are using the most accurate systems.

4:20 p.m.

Conservative

The Chair Conservative Pat Kelly

Thank you.

4:20 p.m.

Liberal

Parm Bains Liberal Steveston—Richmond East, BC

Thank you. Do I have any more time?

4:20 p.m.

Conservative

The Chair Conservative Pat Kelly

I'm afraid, Mr. Bains, that you are out of time.

We'll go now to Monsieur Villemure for two and a half minutes.

4:20 p.m.

Bloc

René Villemure Bloc Trois-Rivières, QC

Mr. Larter, I will turn to you again. Since we only have two and a half minutes, let's be brief.

For Microsoft, what constitutes surveillance?

4:20 p.m.

Director, Responsible Artificial Intelligence Public Policy, Microsoft

Owen Larter

I guess surveillance would be observing individuals or groups.

4:20 p.m.

Bloc

René Villemure Bloc Trois-Rivières, QC

Is that with or without their consent?

4:20 p.m.

Director, Responsible Artificial Intelligence Public Policy, Microsoft

Owen Larter

It could be done without their consent, and that would be problematic. I think it could be done with their consent. In large part, though, you would think of surveillance as being done without an individual's consent.

4:20 p.m.

Bloc

René Villemure Bloc Trois-Rivières, QC

When we know that facial recognition can sometimes indicate a person's political or sexual preferences (not all studies support this), we can say, in a way, that no freedom remains possible. We are monitored at all times.

4:25 p.m.

Director, Responsible Artificial Intelligence Public Policy, Microsoft

Owen Larter

I think these are real concerns that need to be addressed. Again, that's why we're advocating regulation.

I think we're skeptical about some of the claims around what facial recognition can do—like intuit an individual's political beliefs just by looking at that person—so I think a conversation around regulation that identifies those uses that are permitted, and also, importantly, those uses that are not permitted, is a really important thing to have.