Evidence of meeting #15 for Access to Information, Privacy and Ethics in the 44th Parliament, 1st Session. (The original version is on Parliament's site, as are the minutes.)

A recording is available from Parliament.

Witnesses

Rob Jenkins, Professor, University of York, As an Individual
Sanjay Khanna, Strategic Advisor and Foresight Expert, As an Individual
Angelina Wang, Computer Science Graduate Researcher, Princeton University, As an Individual
Elizabeth Anne Watkins, Postdoctoral Research Associate, Princeton University, As an Individual

12:30 p.m.

Matthew Green (NDP, Hamilton Centre, ON)

Okay.

Professor Jenkins, coming out of this study, given the challenges of accurate facial recognition both by humans and by artificial intelligence, do you have any further recommendations to mitigate the negative consequences of relying on facial recognition for security purposes specifically?

12:30 p.m.

Prof. Rob Jenkins

One of the main concerns is mistaken identity and just the idea that an innocent person could be apprehended, accused and even sentenced for a crime they didn't commit. That's clearly an error that we want to avoid, and we also want to avoid the opposite error of failing to apprehend someone who could be a great danger to other people.

That's not new. We've been trying to mitigate those problems ever since we've had eyewitness testimony, but it takes on a new form at the scale that face recognition technologies are being deployed. To my mind, that's the main difference.

12:30 p.m.

Matthew Green (NDP, Hamilton Centre, ON)

Ms. Wang brought up the point of interpretability and the idea that with humans, at least, you can contest a decision. However, as it stands now, it's difficult to contest decisions that are being made.

Do you have any input on ways in which we can improve interpretability and on ways in which decisions can be contested?

12:30 p.m.

Prof. Rob Jenkins

It's possible to ask a human how they reached a particular judgment, but we don't have a great deal of insight into why we make the decisions we make sometimes. Often, we're inventing justifications post hoc that sound plausible to others, and that's news to us as much as it is to them.

I'm not sure I can make recommendations that would transfer readily from that situation to decisions made by AI.

12:30 p.m.

Matthew Green (NDP, Hamilton Centre, ON)

You would agree that—oh. Thank you.

12:30 p.m.

The Chair (Pat Kelly, Conservative)

Thank you. I gave you an extra minute, more or less.

Now we'll have Mr. Bezan for up to five minutes, or a share of that time if you're going to split it.

Go ahead.

12:30 p.m.

James Bezan (Conservative, Selkirk—Interlake—Eastman, MB)

Thank you, Mr. Chair.

I want to direct my questions toward Professor Watkins.

You talked about putting in place a moratorium on the use of FRT until we have proper guardrails through legislation and regulation. When is it appropriate for FRT to be used in the workplace, by government agencies and by individuals?

12:30 p.m.

Dr. Elizabeth Anne Watkins (Postdoctoral Research Associate, Princeton University, As an Individual)

A step toward answering that question would be to use such legislative and regulatory tools as an algorithmic impact assessment, in tandem with consultation with marginalized groups.

I can't speak for workers as to what kinds of safety and security technologies they would like to see in their workplaces. Consult with these groups to ask what kinds of technology they are comfortable with and would prefer to comply with. Provide them with alternatives, so that they can opt out of technologies they do not wish to comply with and still access their means of livelihood. Those would be good steps.

12:35 p.m.

James Bezan (Conservative, Selkirk—Interlake—Eastman, MB)

Do you believe that police agencies should be allowed to use FRT?

12:35 p.m.

Dr. Elizabeth Anne Watkins

12:35 p.m.

James Bezan (Conservative, Selkirk—Interlake—Eastman, MB)

It's not just a moratorium; you're talking about a complete ban on using FRT by police agencies, border service agencies and the government in general.

12:35 p.m.

Dr. Elizabeth Anne Watkins

In high-risk scenarios, where lives and livelihoods are on the line, not only are these technologies at present unreliable, but they also presume that social constructs, like race and gender, are machine-readable in a person's face. That is simply untrue.

12:35 p.m.

James Bezan (Conservative, Selkirk—Interlake—Eastman, MB)

When you start talking about companies like Clearview AI, which have a track record of mistakenly identifying people and of prejudice in their FRT, should those companies be banned?

12:35 p.m.

Dr. Elizabeth Anne Watkins

I think their technologies should not be used in high-risk scenarios.

12:35 p.m.

James Bezan (Conservative, Selkirk—Interlake—Eastman, MB)

They would still, in your mind, be okay for an employer to use in the workplace, even though they have a track record that clearly indicates prejudice.

12:35 p.m.

Dr. Elizabeth Anne Watkins

The workplace is a very high-risk scenario. They should not be used in the workplace. They should not be used in a public space. They should not be used by police.

Frankly, I think there ought to be a moratorium until we know more about how these tools are impacting communities.

12:35 p.m.

James Bezan (Conservative, Selkirk—Interlake—Eastman, MB)

Ms. Wang, would you like to weigh in on this? You've done extensive study on how FRT, and Clearview in particular, has been used to marginalize people.

Do you agree with what Professor Watkins has been saying here?

12:35 p.m.

Angelina Wang (Computer Science Graduate Researcher, Princeton University, As an Individual)

Yes, I do. We understand too little right now. We shouldn't deploy them yet, if ever.

12:35 p.m.

James Bezan (Conservative, Selkirk—Interlake—Eastman, MB)

Okay.

You've looked at the RCMP, I believe. Would the Canada Border Services Agency...?

We often face national security threats. It's probably best, in your opinion, then, that we not use FRT in any of our law enforcement and border control agencies here in Canada.

12:35 p.m.

Angelina Wang

12:35 p.m.

James Bezan (Conservative, Selkirk—Interlake—Eastman, MB)

Do you want to take that last minute?

12:35 p.m.

Damien Kurek (Conservative, Battle River—Crowfoot, AB)

Thank you very much.

I'm going to throw it open to all the witnesses, and I will go through you one by one.

Could you list off as quickly as possible examples of FRT, either public or private, just for the committee's reference? That would be very helpful.

We will start with Dr. Watkins.

12:35 p.m.

Dr. Elizabeth Anne Watkins

I'm sorry. Can you repeat the question? It's examples of already deployed FRT?

12:35 p.m.

Damien Kurek (Conservative, Battle River—Crowfoot, AB)

Yes. Just off the top of your head, do you have any examples for the committee to use as reference points of FRT that's in use?

12:35 p.m.

Dr. Elizabeth Anne Watkins

As far as I know, FRT is currently being used on Uber drivers, on Amazon delivery drivers and on at-home health care workers who are required to log into their workplace using electronic visit verification.

In terms of FRT as opposed to FVT, facial verification technology, as far as I know, many police departments across the U.S. are using FRT, except in those cities where there have been bans and moratoriums, of which there are a handful.