Evidence of meeting #15 of the Standing Committee on Access to Information, Privacy and Ethics in the 44th Parliament, 1st Session, held on April 4, 2022. (The original version is on Parliament's site, as are the minutes.)

A recording is available from Parliament.

Witnesses

Rob Jenkins, Professor, University of York, As an Individual
Sanjay Khanna, Strategic Advisor and Foresight Expert, As an Individual
Angelina Wang, Computer Science Graduate Researcher, Princeton University, As an Individual
Elizabeth Anne Watkins, Postdoctoral Research Associate, Princeton University, As an Individual

11:50 a.m.

Strategic Advisor and Foresight Expert, As an Individual

Sanjay Khanna

I do on a personal level, absolutely.

11:50 a.m.

Conservative

Damien Kurek Conservative Battle River—Crowfoot, AB

Okay, I appreciate that.

I'll put the same question to Dr. Jenkins.

Would you support a moratorium until there's a framework in place?

11:50 a.m.

Prof. Rob Jenkins

I'm not sure I have a strong view on the moratorium. I'm certainly attuned to the errors that can arise in these systems, and I tend to focus on those more than on the benefits. It may not be my place to speak for the good people of Canada.

11:50 a.m.

Conservative

Damien Kurek Conservative Battle River—Crowfoot, AB

Okay, I was just asking for your perspective on that, but thank you.

I think this committee, both in this study and others, has heard a lot about the concept of consent. Certainly, when you use facial recognition on an iPhone, an Android or a computer, you're consenting to your picture being used to log in and whatnot. That is very different from the widespread scraping of the Internet for images, with law enforcement then making determinations from them. That's an important distinction.

To Mr. Khanna, it was reported that in 2016 the federal government tested facial recognition technology on millions of travellers at Toronto Pearson International Airport. What negative ramifications could there be for the several million travellers who passed through border control at terminal 3 at Pearson between July and December 2016, while this pilot project was running? Could you outline, in very real-world terms, what some of those concerns might be?

11:50 a.m.

Strategic Advisor and Foresight Expert, As an Individual

Sanjay Khanna

I think part of the concern is that we don't know. There hasn't been transparency about what the implications and knock-on impacts may have been, and even where there were harms, they may not be clear to the people who suffered them.

It's a very tricky and challenging space to get into, which is part of the reason transparency is such a threat to those who sometimes circumvent the law in order to gather data and test what can be done through that sort of surveillance.

I'll stop there before speculating further on that question.

I just want to add very briefly that there is this question of.... There's that song that goes, "I always feel like somebody's watching me", and Canadians no longer need to feel paranoid for feeling that way.

11:50 a.m.

Conservative

Damien Kurek Conservative Battle River—Crowfoot, AB

Sure, I think that's certainly one of the big challenges.

I'll go to Dr. Jenkins, if I could, in a similar vein of questioning. I have about 45 seconds left, I think.

On ethical concerns relating to a pilot project like the one I described at Pearson International Airport, do you have any comments you could share with the committee?

11:50 a.m.

Prof. Rob Jenkins

One of my concerns would be the possibility of misidentification that is then difficult to detect or undo. Around 100,000 passengers per day travel through Heathrow Airport, so even with an accuracy of 99.9% in that context, we'd be talking about 100 misidentifications per day, which soon adds up. It just doesn't seem sustainable to me.

11:50 a.m.

Conservative

Damien Kurek Conservative Battle River—Crowfoot, AB

With that, I'll simply use the last few seconds of my time to say thank you and again extend the offer. Please feel free to send further information to the committee as you think further about these very important issues.

Thank you very much.

11:50 a.m.

Conservative

The Chair Conservative Pat Kelly

Thank you, Mr. Kurek, for keeping us on schedule.

Now we have Ms. Saks for five minutes.

11:50 a.m.

Liberal

Ya'ara Saks Liberal York Centre, ON

Thank you, Mr. Chair.

Thank you to our witnesses today.

I'm going to start with a pretty open-ended question, but I feel that there's reason to ask it.

We've heard a lot about what's wrong with this technology and why it's bad. Is there anything good about it?

Is anyone willing to take a stab at it to start?

11:50 a.m.

Prof. Rob Jenkins

We use "automatic face recognition" as a blanket term, but it covers many different applications. Someone mentioned the convenience of unlocking a phone or accessing account details quickly, using it privately in a way similar to a password. I think that is a very different situation from using it for ambient surveillance at the scale of an entire nation.

11:55 a.m.

Liberal

Ya'ara Saks Liberal York Centre, ON

Okay.

Going on with that, Dr. Watkins mentioned the benefits of one-to-one facial verification versus general facial recognition, so there are some advantageous uses of these technologies. As Mr. Khanna mentioned, as legislators we have to recognize that we're behind the ball here; the curve keeps trending further ahead of us. At the same time, is there a way we can set up basic, fundamental legislative guardrails at this point, whether anchored in privacy or in preventing scraping from open-source platforms, that could create a safety net to start with? We're constantly going to be dealing with novel and emerging technologies, but are there key principles we should be considering for guardrail legislation?

I'm wondering if Mr. Khanna or Dr. Watkins would have any suggestions here.

11:55 a.m.

Conservative

The Chair Conservative Pat Kelly

Ms. Saks, I'm just pausing for a brief moment. I think there may have been other witnesses who wanted to answer your first open question.

11:55 a.m.

Liberal

Ya'ara Saks Liberal York Centre, ON

Oh. I apologize. Thank you.

11:55 a.m.

Conservative

The Chair Conservative Pat Kelly

You sort of addressed your second question to Mr. Khanna, so I'll let him answer that now. If Dr. Watkins wants to go afterwards and answer either question, then let's do that.

Go ahead, Mr. Khanna.

11:55 a.m.

Strategic Advisor and Foresight Expert, As an Individual

Sanjay Khanna

Mr. Chair, my response is that I think there could be something akin to—this is not the right phrasing—a digital charter of rights for Canadians that allows them to own and have a portable and secure form of biometric data that is considered to be sacrosanct.

I know that's a bit ambitious as a thought, but it's something that comes to mind as we have this conversation.

11:55 a.m.

Liberal

Ya'ara Saks Liberal York Centre, ON

Go ahead, Dr. Watkins.

11:55 a.m.

Postdoctoral Research Associate, Princeton University, As an Individual

Dr. Elizabeth Anne Watkins

Thank you so much for asking this question. It's such an important question, and one I've been discussing recently with colleagues. When I beat the drum about needing to get rid of facial verification, a lot of people will then say, "Well, then, what next? What instead?" That's because these systems are often in place to guarantee worker privacy, to prevent fraud and to protect security. Workers deserve to be safe and secure and to be protected from bad actors. But there need to be alternatives in place so that facial recognition and verification are not the only way, and workers have other options: they can opt out of the verification process and opt in with, perhaps, a password or a fingerprint.

Again, I think algorithmic impact assessments would be a great first step toward shedding light on some of these areas where we simply don't know the types of effects and impacts these technologies are having on communities across contexts. Information-gathering missions in the form of impact assessments, done in partnership between the private and public sectors to start assessing what these impacts and effects are, would go a long way.

11:55 a.m.

Liberal

Ya'ara Saks Liberal York Centre, ON

Thank you.

Through you, Mr. Chair, I have one more open-ended question.

We hear a lot of talk about a moratorium. For me, the key question is how to implement one. My key concern is actually the relationship between private and public enforcement: contracts are set up through third-party structures, and currently there is a loophole.

To Dr. Watkins, Mr. Khanna or Dr. Jenkins, what would be key guardrails in a moratorium?

11:55 a.m.

Postdoctoral Research Associate, Princeton University, As an Individual

Dr. Elizabeth Anne Watkins

One thing that struck me in reading all the language around the bans that have emerged in the past few years is that they're a great start. However, these bans typically address only the ways these technologies are used by state-backed agencies, such as police departments. They don't curtail the way surveillance tools are used in retail stores, for example, or the ways these types of data can then be sold to law enforcement, or the back doors, exactly as you're saying, through which data collected without consent and without knowledge is shared between the public and private sectors.

So some kind of regulation or guardrails around how data is transferred between the public and private sectors would be a good step.

11:55 a.m.

Conservative

The Chair Conservative Pat Kelly

Thank you.

With that, we will move to Monsieur Villemure for two and a half minutes.

Noon

Bloc

René Villemure Bloc Trois-Rivières, QC

Thank you very much, Mr. Chair.

Ms. Watkins, I listened to your testimony and I feel your overall message could be summed up in two words: “Be careful”.

Would you agree with that?

Noon

Postdoctoral Research Associate, Princeton University, As an Individual

Dr. Elizabeth Anne Watkins

It depends on whom you're asking to take such care. If the definition of care includes, for example, consultation with workers or consultation with labour interests or workers' advocates to.... I have not spoken to all workers in Canada and the U.S., and I can't speak for all of them. I know there are some workers who do advocate for facial recognition, because they say they want their accounts to be secure, they want to be safe, and they want riders to be safe, which are all good goals. But as for to whom "taking care" should be addressed, I don't know.

Noon

Bloc

René Villemure Bloc Trois-Rivières, QC

Thank you very much, Ms. Watkins.

Mr. Khanna, of the scenarios you spoke of earlier, which one would you choose for developing facial recognition technology as you know it right now?

Noon

Strategic Advisor and Foresight Expert, As an Individual

Sanjay Khanna

Picking up on the comments of my fellow panellists, who I think have covered very good ground: you have three geographies here, the U.K., the U.S. and Canada, where these technologies have been employed and where a large number of lessons have been learned, particularly in the academic community, which is independently teaching us a great deal about what we need to safeguard. Drawing on those lessons is critical. In terms of scenarios, again, it's a matter of looking at technologies like FRT in the broader context of AI, machine learning and the other technologies that feed into, are part of and are embedded in FRT. Governments need to look at this holistic context. We're in a digital society, and things are getting ahead of us. How do we create safeguards as our society faces greater social and economic inequities in the years ahead, in part because of COVID and the things leading up to it?

Thank you.