Evidence of meeting #19 for Access to Information, Privacy and Ethics in the 44th Parliament, 1st Session. (The original version is on Parliament's site, as are the minutes.)

A recording is available from Parliament.

Also speaking

Owen Larter  Director, Responsible Artificial Intelligence Public Policy, Microsoft
Mustafa Farooq  Chief Executive Officer, National Council of Canadian Muslims
Rizwan Mohammad  Advocacy Officer, National Council of Canadian Muslims

4:05 p.m.

Director, Responsible Artificial Intelligence Public Policy, Microsoft

Owen Larter

Yes, I think there are a couple of bits to it. There are the internal safeguards that I mentioned, including the sensitive use review process. Any deployment of technology with the kinds of clients you're talking about would have gone through that sensitive use review process to make sure that we were—

4:05 p.m.

NDP

Matthew Green NDP Hamilton Centre, ON

Internal to Microsoft?

4:05 p.m.

Director, Responsible Artificial Intelligence Public Policy, Microsoft

Owen Larter

[Inaudible—Editor]

4:05 p.m.

NDP

Matthew Green NDP Hamilton Centre, ON

Yes, but that would have been the same in the States, wouldn't it?

4:05 p.m.

Director, Responsible Artificial Intelligence Public Policy, Microsoft

Owen Larter

Yes, exactly. That's—

4:05 p.m.

NDP

Matthew Green NDP Hamilton Centre, ON

So why the difference in policy, sir?

4:05 p.m.

Director, Responsible Artificial Intelligence Public Policy, Microsoft

Owen Larter

Because it's on a case-by-case basis. In Canada we would be looking at any deployment to make sure that it was being done robustly. I would say—

4:05 p.m.

NDP

Matthew Green NDP Hamilton Centre, ON

Yet in a market the size of the United States of America, Mr. Larter, you have banned it. You're waiting for a regulatory framework. This whole committee was in fact set up because we don't have, arguably, a regulatory framework, and we've heard that in previous testimony.

I am asking you, as the director of public policy for responsible AI at Microsoft, why there is the double standard between this market and the market in the States.

4:10 p.m.

Director, Responsible Artificial Intelligence Public Policy, Microsoft

Owen Larter

Yes, it's an important question, and we don't see it as a double standard.

The reason we're here today is that we want to play a participatory role in creating a regulatory framework for facial recognition in general. We think there is a real opportunity in Canada. We do think that in the U.S. in particular there is that lack of any general privacy framework, which is a problem in terms of this framework of human rights protection that—

4:10 p.m.

NDP

Matthew Green NDP Hamilton Centre, ON

Mr. Larter, I'm going to take back my time. Thank you for that statement. I encourage you to tune in to the rest of the testimony, as you may find that our current frameworks here in Canada aren't actually adequate.

With that, I'll pivot my questions to our friends from NCCM and Mr. Mohammad, who I think had some very salient points in the opening remarks.

Sir, your website states that you've received hundreds of human rights-related complaints from members of the public who feel that they've been discriminated against. In some of my earlier lines of questioning, I likened this use to racial profiling, street checks and the like. In your view, is FRT being used as a method for racial profiling?

4:10 p.m.

Advocacy Officer, National Council of Canadian Muslims

Rizwan Mohammad

I'd like to invite our CEO to address your question.

4:10 p.m.

NDP

Matthew Green NDP Hamilton Centre, ON

Sure. We have two minutes, and I have a couple more questions.

4:10 p.m.

Chief Executive Officer, National Council of Canadian Muslims

Mustafa Farooq

Thank you very much.

I think the reality is that the answer is yes, we think there is a high possibility of this happening.

The reality is that we get calls all the time, which people don't hear about, from folks who are undergoing surveillance from CSIS, from the RCMP, and the issues that result from that. The reality is that this is across the sector. We know already that the CBSA pilot-tested a piece of technology called AVATAR at the airports, which was supposed to be a sort of lie detector and which, by the way, has now been banned in other jurisdictions. We have grave concerns about how this technology can continue to be weaponized to profile people for potential terrorism.

4:10 p.m.

NDP

Matthew Green NDP Hamilton Centre, ON

Given the nature of your advocacy and work in the community, has your organization received any human rights complaints related to or connected to artificial intelligence technology, including the use of facial recognition?

4:10 p.m.

Chief Executive Officer, National Council of Canadian Muslims

Mustafa Farooq

Not at this point, but I think in large part that may have to do with the fact that many folks don't necessarily know that they've been caught in these kinds of things. We sometimes hear concerns around people attending peaceful rallies, whether that's in Vancouver or Hamilton or other places. There are pictures being taken by law enforcement. We don't necessarily know what's always being done with those things, but in large part that has to do with the fact that there has been a lack of disclosure.

4:10 p.m.

NDP

Matthew Green NDP Hamilton Centre, ON

As I recall, quite some work has been done in the community around no-fly lists and the targeting of Muslim-sounding names and profiles. Sometimes those as young as six or eight months old are being put on a list and can't fly.

In your opinion, could this technology be used surreptitiously to provide these same types of racially profiled and targeted acts of discrimination by the government on your community?

4:10 p.m.

Chief Executive Officer, National Council of Canadian Muslims

Mustafa Farooq

Absolutely.

4:10 p.m.

NDP

Matthew Green NDP Hamilton Centre, ON

Thank you very much.

Thank you, Mr. Chair.

4:10 p.m.

Conservative

The Chair Conservative Pat Kelly

Thank you. You left him only about two or three seconds to answer, and we got it in under the wire.

With that, we move to the next round of five minutes. It is Mr. Kurek. Go ahead.

May 5th, 2022 / 4:10 p.m.

Conservative

Damien Kurek Conservative Battle River—Crowfoot, AB

Thank you very much, Mr. Chair, and thank you to our witnesses here today.

Let me start, as I often do, by inviting the witnesses to feel free to submit further documentation to this committee if they, in testimony today, are not able to have an adequate chance to expound on their answers. It is certainly welcome, and it helps us.

Mr. Larter, as an example to frame my question: in the initial design of cameras, the film chemicals were specifically calibrated around white faces. I've done some reading and seen some documentation on that being the case, so there are technical limitations to FRT.

I'm wondering if you can comment on whether Microsoft has taken that into account in the development of its FRT, and on the possible implications that would have specifically when it comes to things like different races, genders, etc.

4:10 p.m.

Director, Responsible Artificial Intelligence Public Policy, Microsoft

Owen Larter

That's a really important question, so thank you for it.

As I mentioned, I do think one of the big risks that need to be addressed through regulation is the potential risk of discriminatory performance in facial recognition technology. Something we've been very mindful of as we've been developing our technology is making sure we have representative datasets that we're training the facial recognition on so that it can perform accurately, including across different demographic groups.

We would say that this is where the testing piece is very important and that you don't just take our word for it. We think it's important that vendors make available their facial recognition for that reasonable, independent third party testing that I mentioned, so that you're able to scrutinize how companies selling facial recognition are doing in terms of the algorithms they are building. That type of scrutiny, I think, is really important in terms of raising the bar—

4:15 p.m.

Conservative

Damien Kurek Conservative Battle River—Crowfoot, AB

Thank you. I, like Mr. Green, acknowledge that we have a short amount of time. In about 30 seconds, could you share with this committee the relationship between FRT and artificial intelligence?

4:15 p.m.

Director, Responsible Artificial Intelligence Public Policy, Microsoft

Owen Larter

Yes, sure. Artificial intelligence pertains to a broad range of systems, of which facial recognition is one. Facial recognition is a type of technology that is able to perform human-like observation or human-like recognition. We would class facial recognition as a type of AI alongside a variety of other AI systems.

4:15 p.m.

Conservative

Damien Kurek Conservative Battle River—Crowfoot, AB

Thank you very much.

To our friends from the NCCM, we heard you reference needed amendments within the RCMP Act. Are there any other acts that, in your opinion, would need to be amended to ensure that we address some of the challenges that are faced when it comes to things like racial profiling?

4:15 p.m.

Chief Executive Officer, National Council of Canadian Muslims

Mustafa Farooq

I think we may want to look at the CSIS Act as well.