Evidence of meeting #19 for Access to Information, Privacy and Ethics in the 44th Parliament, 1st Session. (The original version is on Parliament’s site, as are the minutes.)

A recording is available from Parliament.

Also speaking

Owen Larter  Director, Responsible Artificial Intelligence Public Policy, Microsoft
Mustafa Farooq  Chief Executive Officer, National Council of Canadian Muslims
Rizwan Mohammad  Advocacy Officer, National Council of Canadian Muslims

4:30 p.m.

Chief Executive Officer, National Council of Canadian Muslims

Mustafa Farooq

Is the question whether more human interaction would ameliorate the problems of FRT? Am I understanding that correctly?

4:30 p.m.

Conservative

James Bezan Conservative Selkirk—Interlake—Eastman, MB

That's what I'm asking. There would always have to be a human check on anything that FRT and AI would suggest as a person of interest.

4:30 p.m.

Chief Executive Officer, National Council of Canadian Muslims

Mustafa Farooq

Respectfully, I actually don't think that would be entirely sufficient.

The reality is that while of course human checks are important, we also know that there is a problem of systemic racism and bias within our police agencies. I don't think that human checks would be fully sufficient. We think that the courts are the place to get those checks and balances, with clear information. How much FRT is being used by a given agency? How is that data being stored? Timelines of destruction of data should be provided to you as parliamentarians. We think that's really important.

4:30 p.m.

Conservative

The Chair Conservative Pat Kelly

Thank you.

Now we will go to Ms. Khalid for five minutes.

4:35 p.m.

Liberal

Iqra Khalid Liberal Mississauga—Erin Mills, ON

Thanks, Chair. I'll go to Mr. Larter first.

Mr. Larter, NCCM today proposed a moratorium on facial recognition technologies. What does your organization think about a moratorium for non-commercial uses of FRT?

4:35 p.m.

Director, Responsible Artificial Intelligence Public Policy, Microsoft

Owen Larter

We think there's definitely a need for regulation, as we've been advocating today, and we would suggest investing time and resources in creating it. Any initiative takes considerable time and investment to advance, so we would suggest focusing that effort on creating regulation, starting with law enforcement uses.

I think we would also suggest taking an incremental approach to regulation in this area. It is the case that the technology is developing rapidly. It has improved markedly in recent years. Starting with regulation of law enforcement use, which is the most acute need as we see it, would be what we'd recommend, and investing time and effort into that rather than on advancing a moratorium. That would be our suggestion.

4:35 p.m.

Liberal

Iqra Khalid Liberal Mississauga—Erin Mills, ON

Thank you.

Mr. Farooq, do you agree with what Mr. Larter is saying?

4:35 p.m.

Chief Executive Officer, National Council of Canadian Muslims

Mustafa Farooq

Respectfully, I think we may have a difference of opinion on this question.

Given the potential risks posed to Canadians through FRT, and given the fact that unfortunately our law enforcement agencies have not been appropriately forthcoming, we think a moratorium is appropriate in the non-commercial context. That's the same position that other witnesses who have appeared before this committee have taken. That would be until the full privacy and context regulations can be developed.

4:35 p.m.

Liberal

Iqra Khalid Liberal Mississauga—Erin Mills, ON

Thank you.

You've outlined how difficult it is to get an open and transparent answer from law enforcement agencies across all levels. How would you propose that a moratorium would specifically be implemented, and how would it be enforced?

4:35 p.m.

Chief Executive Officer, National Council of Canadian Muslims

Mustafa Farooq

I think that there are a number of measures. Of course, it would potentially require regulatory or statutory change.

We would be happy to provide a more extensive answer in terms of precise suggested legislative language in our written brief as well.

4:35 p.m.

Liberal

Iqra Khalid Liberal Mississauga—Erin Mills, ON

Thank you. I would appreciate that.

Mr. Larter, your company does business all across the world. Are you aware of any states that are using FRT in surveilling their populations?

4:35 p.m.

Director, Responsible Artificial Intelligence Public Policy, Microsoft

Owen Larter

My apologies; the lighting in my room appears to have gone out. I hope people can still see and hear me properly.

I think there are definitely a number of countries across the world, particularly non-democratic countries, that are using facial recognition to surveil their populations in ways that I don't think any of us would consider positive. We don't engage in that type of activity or help with that type of surveillance approach.

4:35 p.m.

Liberal

Iqra Khalid Liberal Mississauga—Erin Mills, ON

If you are able to, we'd love to have you identify what some of these countries are and how exactly they are surveilling.

The second thing is that you're a big proponent of regulation of FRT. Is there any regulation you see across the world that you think Canada should adopt, in terms of ensuring that facial recognition technologies are used appropriately, not only in the non-commercial sector but also in the commercial sector?

4:35 p.m.

Director, Responsible Artificial Intelligence Public Policy, Microsoft

Owen Larter

Yes. I think there have been some positive developments at the state level in the U.S.

Washington state is one to which I would draw the committee's attention. There is a law that went into effect in July of last year that lays out some important transparency and accountability measures. It provides for the testing that I mentioned, and it provides, very importantly, for the human oversight piece as well, ensuring that any system output is reviewed by an individual before a decision is made and that the individual is appropriately trained to do so.

Washington state is certainly one that I think is worth looking at.

4:35 p.m.

Liberal

Iqra Khalid Liberal Mississauga—Erin Mills, ON

Chair, how much time do I have?

4:35 p.m.

Conservative

The Chair Conservative Pat Kelly

You have 25 seconds.

4:35 p.m.

Liberal

Iqra Khalid Liberal Mississauga—Erin Mills, ON

In that sense, then, I will give a verbal notice of motion:

That, notwithstanding the motions adopted by the committee on December 13, 2021, and on January 31, 2022, concerning the regular scheduled meetings of the committee regarding the production of reports this spring, given the substantial matters brought forward in the course of our deliberations on facial recognition technologies, the committee extend its hearings on the study of facial recognition by three meetings, and that the committee commence consideration of a report in September 2022.

4:40 p.m.

Conservative

The Chair Conservative Pat Kelly

Thank you, Ms. Khalid.

Again, I think we got that on notice, and there are other motions that were placed on notice today as well, but thank you for that.

With that, we will go next to Mr. Villemure for two and a half minutes.

4:40 p.m.

Bloc

René Villemure Bloc Trois-Rivières, QC

Thank you, Mr. Chair.

I will once again address Mr. Larter from Microsoft.

Mr. Larter, you will excuse my pugnacity, but I am very interested in what you are doing.

Is it possible that criminal entities, a foreign power, or some third party could infiltrate and falsify data obtained using artificial intelligence?

4:40 p.m.

Director, Responsible Artificial Intelligence Public Policy, Microsoft

Owen Larter

It's a good question.

I think there's definitely a need for robust cybersecurity around technology in general. Microsoft is making significant investments on that front to ensure that the variety of technologies we provide are secure and that our customers remain secure. There are certainly threats that we all need to be mindful of in ensuring that technology is developed and used—

4:40 p.m.

Bloc

René Villemure Bloc Trois-Rivières, QC

To your knowledge, have there ever been such security breaches in Microsoft technologies anywhere in the world?

4:40 p.m.

Director, Responsible Artificial Intelligence Public Policy, Microsoft

Owen Larter

My focus is more on the AI systems piece and using them responsibly, so this is a bit outside my area of expertise. However, I think there's always the threat of malign actors, so I think responding robustly and with significant investment, which is what we are doing, is the right thing to be doing.

I'm afraid I can't give much more of a specific answer than that because this is sort of outside the area that I have a focus in.

4:40 p.m.

Bloc

René Villemure Bloc Trois-Rivières, QC

Thank you very much.

If you could provide this information by consulting with your colleagues, we would appreciate it.

What do you think should be the boundaries of facial recognition?

4:40 p.m.

Director, Responsible Artificial Intelligence Public Policy, Microsoft

Owen Larter

I think this is a fundamental question as part of the regulatory discussion. I think deciding what is a permissible use and what is not a permissible use is very important.

We have some suggestions on this front. We think that indiscriminate mass surveillance is not something that should be permitted. We also think that discriminating against an individual on the basis of race, gender, sexual orientation or other protected characteristics should be prohibited.

Also, the democratic freedoms piece, which we discussed today, is really important, and I'm pleased to hear that it's part of the discussion. That is one to address as well in making sure that the technology is not used in a way that undermines fundamental freedoms like freedom of assembly. I think those are some core uses that we would suggest.

One that is maybe more specific to the law enforcement context as well is that we think it's important that the output of facial recognition is not used as the only reason or the only piece of evidence to take a material decision—for example, to arrest someone.

4:40 p.m.

Conservative

The Chair Conservative Pat Kelly

I'm going to have to move on. We're significantly over time for Mr. Villemure.

We will now go to Mr. Green for the final questions.