Evidence of meeting #15 for Access to Information, Privacy and Ethics in the 44th Parliament, 1st Session. (The original version is on Parliament's site, as are the minutes.)

A recording is available from Parliament.

Also speaking

Rob Jenkins  Professor, University of York, As an Individual
Sanjay Khanna  Strategic Advisor and Foresight Expert, As an Individual
Angelina Wang  Computer Science Graduate Researcher, Princeton University, As an Individual
Elizabeth Anne Watkins  Postdoctoral Research Associate, Princeton University, As an Individual

11:25 a.m.

Conservative

Ryan Williams Conservative Bay of Quinte, ON

Okay.

Thank you, Mr. Chair. I'll let my 14 seconds go.

11:25 a.m.

Conservative

The Chair Conservative Pat Kelly

All right. Thank you.

With that, I'll go to Mr. Fergus for six minutes.

11:25 a.m.

Liberal

Greg Fergus Liberal Hull—Aylmer, QC

Thank you very much, Mr. Chair.

I'd like to thank all the witnesses for being present here today. I appreciate it.

I have questions for several witnesses, so I'd appreciate it if the witnesses could be brief yet pithy in their comments.

Mr. Jenkins, in a question from my colleague, Mr. Williams, you were asked about fingerprinting versus facial verification. From what I heard from Dr. Wang, you and other witnesses, they're not quite the same thing.

Can you compare the two in terms of their accuracy and how facial recognition technology is used, as opposed to fingerprinting? I'm assuming it is really just a process of trying to match up a dataset to another dataset. Is that correct?

11:30 a.m.

Prof. Rob Jenkins

There are some general similarities.

In both cases, the idea is to take a sample from the world—be it somebody's fingerprint or their facial image—and compare it with some stored representation that you have and that you're expecting will provide a match.

The difficulty arises from the variability in the live capture: the appearance of the person you're trying to identify can vary over time. You always have to account for that variability when attempting the match against the gallery of stored information.

Now—

11:30 a.m.

Liberal

Greg Fergus Liberal Hull—Aylmer, QC

In other words, the situation changes remarkably for the presentation of one's face, as opposed to the presentation of one's fingerprints. It might not be quite an apples-to-apples comparison.

11:30 a.m.

Prof. Rob Jenkins

I think that's fair to say. We know for sure that different pictures of one person's face can be more varied than pictures of different people's faces. That's the nub of the problem.

11:30 a.m.

Liberal

Greg Fergus Liberal Hull—Aylmer, QC

Thank you very much for that.

Dr. Wang, thank you very much for your presentation. If I may suggest, I know that you only brought to our committee a couple of the problems that your research has identified. If there are others that you would like to share with this committee.... We have a common saying here that if we don't hear it or if we don't read it, we can't report on it. We would certainly appreciate it if you felt you had the time and could send us more examples of what you consider some of the limitations of facial verification.

I'd like to go back to the two big problems that you identified, which are brittleness and interpretability.

I was wondering if you could talk a bit more about the brittleness of it. Bad actors could circumvent the system, but there's also the vulnerability of people who have no intention of circumventing it, but who are nonetheless victims of the biases. I think you talked about machine learning and that all it does is accentuate the biases that would exist in society in general.

Am I correct?

11:30 a.m.

Computer Science Graduate Researcher, Princeton University, As an Individual

Angelina Wang

Yes, you are.

On brittleness, because we don't really know what the model is picking up on in order to make certain identifications, we don't know what patterns it's relying on. Humans know that people are likely to wear makeup or put on glasses, so they can control for those kinds of changes. If someone were to inadvertently do something a bit different with their face and how they present themselves, this might not have been tested for, and the model might misidentify them.

11:30 a.m.

Liberal

Greg Fergus Liberal Hull—Aylmer, QC

I know this goes beyond what you testified to today, but some of the readings we've had discuss the limitations of the technology itself, such as camera technology. There are clear biases in the faces the technology will favour. The technology was created, and has evolved, over the whole history of photography, and it favours white males in particular. For every other category or group, there are varying and greater levels of inaccuracy.

Could you talk a bit more about that? Even if we were to try to correct for machine learning, we would still have a problem with the technology itself, and the biases that might be introduced by that technology.

11:30 a.m.

Computer Science Graduate Researcher, Princeton University, As an Individual

Angelina Wang

Ever since cameras were invented, they have always worked a lot worse on people with darker skin tones. They haven't accounted for differences in lighting. Cameras have always been developed primarily on people with lighter skin tones, so in many lighting conditions they just will not work as well on people with darker skin tones. People's faces may blend into the background more, depending on what they look like.

11:35 a.m.

Liberal

Greg Fergus Liberal Hull—Aylmer, QC

As a result, it perpetuates the bias that's already built into the system.

11:35 a.m.

Computer Science Graduate Researcher, Princeton University, As an Individual

Angelina Wang

Exactly. The image quality will be different for different people.

11:35 a.m.

Liberal

Greg Fergus Liberal Hull—Aylmer, QC

Mr. Khanna and Dr. Watkins, I'm coming up close to the end of my time, but I'm going to see if I can get in a really quick question.

Mr. Khanna, you mentioned that politicians have to get ahead of the game.

Can you tell us, very briefly, how we should get ahead of the game to try to put the right type of framework around FRT?

11:35 a.m.

Strategic Advisor and Foresight Expert, As an Individual

Sanjay Khanna

Yes. I think you should use a technique called scenario planning. For the purposes you're using it for, the scenario planning approach out of Oxford University is quite useful, because it involves multi-stakeholder engagements and—

11:35 a.m.

Conservative

The Chair Conservative Pat Kelly

It was good that you got a clear answer.

If you have additional information that you'd like to provide to the committee, I welcome you to do so.

Mr. Fergus actually took his clock down to zero before he was finished asking his question.

11:35 a.m.

Liberal

Greg Fergus Liberal Hull—Aylmer, QC

I'm always pushing the envelope.

11:35 a.m.

Conservative

The Chair Conservative Pat Kelly

Yes. Indeed, you are.

Mr. Villemure, you have the floor for six minutes.

11:35 a.m.

Bloc

René Villemure Bloc Trois-Rivières, QC

I want to say hello to all the witnesses. Thank you for making yourselves so remarkably available.

In this first round, my questions will be for Mr. Khanna and Mr. Jenkins.

Mr. Khanna and Mr. Jenkins, I have a very general question for you. I would ask that you respond in a few seconds and then we can dig deeper.

Does facial recognition mean the end of personal freedom?

I will turn the floor over to you, Mr. Khanna.

11:35 a.m.

Strategic Advisor and Foresight Expert, As an Individual

Sanjay Khanna

I'll take that.

Very briefly, it depends on the contextual environment of governance of the technologies. I also think that the nature of the government within which these technologies are being employed is very important. Legislative governance and other oversight mechanisms can change. In certain contexts and under certain kinds of government, it could very well potentially mean that—

11:35 a.m.

Bloc

René Villemure Bloc Trois-Rivières, QC

Thank you very much, Mr. Khanna.

Mr. Jenkins, yes or no.

Does it mean the end of personal freedom?

11:35 a.m.

Prof. Rob Jenkins

Do you really want a yes or no?

11:35 a.m.

Bloc

René Villemure Bloc Trois-Rivières, QC

If at all possible.

11:35 a.m.

Prof. Rob Jenkins

Not on its own. No.

11:35 a.m.

Bloc

René Villemure Bloc Trois-Rivières, QC

Thank you very much.

Mr. Jenkins, in your research, you talk about intrapersonal variability.

Could you elaborate on that?

11:35 a.m.

Prof. Rob Jenkins

Yes. Each of us has one face, which has its own appearance. That appearance changes a great deal, not only over the long term as we grow and age, but also from moment to moment, as the viewpoint changes, as the lighting around us changes, or as we change our facial expression or talk.

There's an awful lot of variation, and this is a problem. What you're trying to do, of course, in the context of facial recognition, is to establish which of the people you know, or have stored in some database, you are looking at right now. That variability is difficult to overcome. You're always in the position of not knowing whether the image before you counts as one of the people you know or whether it is somebody new.

I think the variability is fundamental to the problem that we're discussing. Different people vary in their appearance, but each person also varies in their appearance. Separating those two sources of variability to understand what you're looking at is computationally difficult.