Evidence of meeting #15 for Access to Information, Privacy and Ethics in the 44th Parliament, 1st Session. (The original version is on Parliament’s site, as are the minutes.)

A recording is available from Parliament.

Also speaking

Rob Jenkins  Professor, University of York, As an Individual
Sanjay Khanna  Strategic Advisor and Foresight Expert, As an Individual
Angelina Wang  Computer Science Graduate Researcher, Princeton University, As an Individual
Elizabeth Anne Watkins  Postdoctoral Research Associate, Princeton University, As an Individual

12:20 p.m.

Computer Science Graduate Researcher, Princeton University, As an Individual

Angelina Wang

I think that is for me.

What the REVISE tool mostly does is try to find different patterns and correlations present in a dataset that are likely to propagate into models trained on that dataset. It is not guaranteed by any means to find all the possible correlations that could arise. It just surfaces potential ones to users so that they can be more aware of those dataset correlations when they are using a model that has been trained on such a dataset.
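[The kind of analysis being described, surfacing label correlations in a dataset that differ from the base rate, can be sketched as follows. The records, labels, and threshold here are hypothetical and for illustration only; they are not taken from the witness's actual tool.]

```python
from collections import Counter

# Hypothetical toy annotations: each record pairs a scene label with
# whether some sensitive attribute co-occurs in the image.
records = [
    {"scene": "kitchen", "attribute": True},
    {"scene": "kitchen", "attribute": True},
    {"scene": "kitchen", "attribute": True},
    {"scene": "office", "attribute": False},
    {"scene": "office", "attribute": False},
    {"scene": "office", "attribute": True},
]

def surface_correlations(records, margin=0.25):
    """Flag scene labels whose rate of co-occurrence with the attribute
    deviates from the dataset-wide base rate by more than `margin`."""
    base = sum(r["attribute"] for r in records) / len(records)
    totals, hits = Counter(), Counter()
    for r in records:
        totals[r["scene"]] += 1
        hits[r["scene"]] += r["attribute"]
    return {scene: hits[scene] / totals[scene]
            for scene in totals
            if abs(hits[scene] / totals[scene] - base) > margin}

flagged = surface_correlations(records)
# "kitchen" co-occurs with the attribute far above the base rate, so a
# model trained on this data could learn that spurious shortcut.
```

[As the testimony notes, such a check only surfaces candidate correlations for a human to review; it cannot guarantee that every problematic pattern has been found.]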

12:20 p.m.

Liberal

Parm Bains Liberal Steveston—Richmond East, BC

Thank you.

I would like to share the rest of my time with my colleague, Mr. Fergus.

12:20 p.m.

Liberal

Greg Fergus Liberal Hull—Aylmer, QC

Thank you very much, Mr. Bains. I appreciate it.

Moving on a little bit, Dr. Wang, you mentioned earlier in your testimony, and I want to make sure I got this right, that even if we were to solve for bias and discrimination, there are still some concerns with the use of machine learning in terms of identifying folks. Can you talk a little bit about that?

12:20 p.m.

Computer Science Graduate Researcher, Princeton University, As an Individual

Angelina Wang

Sure, yes.

Two of the points that I brought up are interpretability and brittleness. For brittleness, bad actors are able to trick the model in different ways. In the specific study I'm referring to, they print a particular pattern on a pair of glasses, and through this they can actually trick a model into thinking they're somebody completely different.
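[The brittleness being described can be illustrated with a toy sketch: a nearest-template matcher is flipped by a small, crafted tweak to its input. The embeddings and numbers below are hypothetical; the study the witness references perturbs image pixels via a printed pattern on glasses, not embeddings directly.]

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def sign(x):
    return (x > 0) - (x < 0)

# Hypothetical enrolled face templates for two identities
alice = [1.0, 0.0, 1.0, 0.0]
bob   = [0.0, 1.0, 0.0, 1.0]

def identify(x):
    # Match the probe to whichever template scores higher
    return "alice" if dot(x, alice) >= dot(x, bob) else "bob"

probe = [0.9, 0.1, 0.8, 0.2]   # a clean photo of Alice

# Crafted perturbation: nudge every coordinate slightly toward Bob's
# template, in the spirit of wearing an adversarially printed pattern
delta = [0.4 * sign(b - a) for a, b in zip(alice, bob)]
adv = [p + d for p, d in zip(probe, delta)]
# The clean probe matches "alice"; the perturbed one matches "bob".
```

[The point of the sketch is that the perturbation is small relative to the input, yet it is aimed precisely at the decision boundary, which is why such attacks can succeed against far larger models.]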

The other part is transparency. Models right now are very uninterpretable, because they pick up on whatever patterns best help them with their task. We don't necessarily, as people, know what patterns the models are relying on. They could be relying on—

12:20 p.m.

Liberal

Greg Fergus Liberal Hull—Aylmer, QC

I'm sorry to interrupt. It seems, in other words, the machines are not able to tell us what it is they're using to make that kind of evaluation.

12:20 p.m.

Computer Science Graduate Researcher, Princeton University, As an Individual

Angelina Wang

Yes, exactly.

12:25 p.m.

Liberal

Greg Fergus Liberal Hull—Aylmer, QC

Mr. Khanna, you raised the possibility of a digital charter of rights for Canadians. This is a very intriguing idea. If you were to blue-sky a little bit, what would you expect would be some of the elements inside that kind of charter?

12:25 p.m.

Strategic Advisor and Foresight Expert, As an Individual

Sanjay Khanna

One would be sanctity of personal data, so protection of certain data, like facial data, that is very intimate to the individual. I think that's part of it, but I think it would also aim to ensure alignment with the Canadian Charter of Rights and Freedoms, and it could also potentially be a secure repository of data that can be exchanged and verified and that is much more cybersecure than what might be out there.

12:25 p.m.

Liberal

Greg Fergus Liberal Hull—Aylmer, QC

Could I very quickly ask you, Mr. Khanna, to talk about the sanctity of personal data? Does that mean we would have the right, for example, to our images, that our facial images would be ours? It's our property. Has the horse left the barn on that? Can we pull that back in?

12:25 p.m.

Strategic Advisor and Foresight Expert, As an Individual

Sanjay Khanna

I think the horse has left the barn to a great extent, in that you can't draw out of Clearview AI what has already been taken. When starting to think about this, though, particularly as children age, we aren't the only ones who have been exposed to facial recognition technology. There are multiple current and coming generations that are going to be affected by this. Thinking about those who haven't yet been exposed, for whom the horse hasn't left the barn, is super important.

12:25 p.m.

Conservative

The Chair Conservative Pat Kelly

Thanks. We let you go quite a bit over time there, but the testimony was good and important, and for once we're not quite jammed up against a hard stop here.

Next is Monsieur Villemure.

Go ahead, please.

12:25 p.m.

Bloc

René Villemure Bloc Trois-Rivières, QC

Mr. Jenkins, how could we inject a little ethics into all this facial recognition technology? Could radical transparency or the right to be forgotten be the way to go?

I will turn the floor over to you for two and a half minutes to talk to us about this.

12:25 p.m.

Prof. Rob Jenkins

Yes, I certainly think transparency is important. We should aim for a situation where members of the public can understand how these technologies are being used, how effective they are, how they could be affected by them, and how they may have been affected by them.

We know from studies of the use of these technologies in the U.S., for example, that there's very little in terms of an audit trail, and I think auditing the use of face recognition technologies is going to be an important part of using them more widely.

12:25 p.m.

Bloc

René Villemure Bloc Trois-Rivières, QC

Is the concept of radical transparency that people usually refer to enough or not enough?

12:25 p.m.

Prof. Rob Jenkins

I think it's probably not enough on its own. I think it's an important component of an ethical system.

12:25 p.m.

Bloc

René Villemure Bloc Trois-Rivières, QC

My last question has to do with the notion of consent.

When our image is captured as we're walking in the street, it's pretty much impossible for us to give consent.

Because that would be next to impossible, what can we expect in terms of consent or protection?

12:25 p.m.

Prof. Rob Jenkins

Yes, it's a very difficult question. I don't have a straight answer, but I'm slightly cautious about comparing face recognition technologies against a perfectly accurate and bias-free system, because that's not an option that's on the table.

We certainly know what we get from the kinds of decision systems that have been in use for decades. Current systems involve errors and involve bias, and we don't consent to being captured on CCTV in my country or by the eyes of other people. I think it's a complicated matter.

12:30 p.m.

Bloc

René Villemure Bloc Trois-Rivières, QC

Thank you very much.

12:30 p.m.

Conservative

The Chair Conservative Pat Kelly

Thank you.

Now we have Mr. Green.

12:30 p.m.

NDP

Matthew Green NDP Hamilton Centre, ON

Just to clarify, it's five minutes?

12:30 p.m.

Conservative

The Chair Conservative Pat Kelly

No. It's two and a half, with a little generosity.

12:30 p.m.

NDP

Matthew Green NDP Hamilton Centre, ON

Okay, there we go. I appreciate the generosity.

We will go to Professor Khanna.

Professor Khanna, I'm hoping to explore even more deeply the relationship between corporate use of this technology and the state. As I understand it, companies and organizations you have advised utilize FRT. What regulations and measures must these companies currently adhere to in order to protect the data and privacy of Canadians?

12:30 p.m.

Strategic Advisor and Foresight Expert, As an Individual

Sanjay Khanna

Just to clarify, I haven't advised companies on their use of FRT. It would be a bit of an accident if they happened to be using it. The projects I've worked on haven't been FRT specific. A project I've worked on recently with the World Congress on Justice for Children on the future of child justice—

12:30 p.m.

NDP

Matthew Green NDP Hamilton Centre, ON

If I could, with two minutes and 30 seconds, you referenced guardrails. Based on your experience, does Canada have an appropriate framework to regulate the use of facial recognition technology by private and state agencies?

12:30 p.m.

Strategic Advisor and Foresight Expert, As an Individual

Sanjay Khanna

Not yet, and that's what I'm hoping you and other legislators will get your fine minds around.