Evidence of meeting #15 for Access to Information, Privacy and Ethics in the 44th Parliament, 1st Session. (The original version is on Parliament’s site, as are the minutes.)

A recording is available from Parliament.

Also speaking

Rob Jenkins  Professor, University of York, As an Individual
Sanjay Khanna  Strategic Advisor and Foresight Expert, As an Individual
Angelina Wang  Computer Science Graduate Researcher, Princeton University, As an Individual
Elizabeth Anne Watkins  Postdoctoral Research Associate, Princeton University, As an Individual

12:35 p.m.

Conservative

Damien Kurek Conservative Battle River—Crowfoot, AB

Thank you.

Dr. Wang.

12:35 p.m.

Computer Science Graduate Researcher, Princeton University, As an Individual

Angelina Wang

The ones I can think of are HireVue and some of these interviewing platforms.

12:35 p.m.

Conservative

Damien Kurek Conservative Battle River—Crowfoot, AB

Thank you.

Dr. Jenkins.

12:35 p.m.

Prof. Rob Jenkins

It's often used in border control in a number of countries and in processes related to border control, such as passport renewal, to verify that the person submitting the document is who they claim to be.

It's also used in retrospective review of crowd footage to try to identify suspects who may have been captured in CCTV footage, for example.
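To make the distinction in this answer concrete, the following is a minimal sketch, not drawn from the testimony, of the two modes Prof. Jenkins describes: one-to-one verification (as in passport renewal) and one-to-many identification against a gallery (as in reviewing crowd footage). The embeddings, names and the 0.6 similarity threshold are illustrative assumptions.

```python
import numpy as np

def cosine_similarity(a, b):
    """Similarity between two face embeddings (normalized dot product)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def verify(probe, enrolled, threshold=0.6):
    """1:1 use: is the person presenting the document who they claim to be?"""
    return cosine_similarity(probe, enrolled) >= threshold

def identify(probe, gallery, threshold=0.6):
    """1:N use: does a face from crowd footage match anyone in a watch-list gallery?"""
    best_name, best_score = None, threshold
    for name, template in gallery.items():
        score = cosine_similarity(probe, template)
        if score >= best_score:
            best_name, best_score = name, score
    return best_name  # None means no match above the threshold

# Illustrative usage with random stand-in embeddings; a real system would use a
# trained face-embedding model, and the threshold would be tuned empirically.
rng = np.random.default_rng(0)
passport_photo, live_capture = rng.normal(size=128), rng.normal(size=128)
print(verify(live_capture, passport_photo))
print(identify(live_capture, {"person_a": rng.normal(size=128)}))
```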

12:40 p.m.

Conservative

The Chair Conservative Pat Kelly

Mr. Kurek asked a question that invited a long answer from four members. I am going to ask Mr. Khanna to very quickly respond to your question if he would like. Then we're going to go directly to Ms. Khalid.

12:40 p.m.

Strategic Advisor and Foresight Expert, As an Individual

Sanjay Khanna

Okay. This would need to be confirmed, but there are stories of it being used in children's toys, children's applications and things like that. That needs to be verified, but I recall seeing that in a UN report.

Thanks.

12:40 p.m.

Conservative

The Chair Conservative Pat Kelly

Thank you.

As what I believe will be our last questioner, Ms. Khalid, go right ahead.

12:40 p.m.

Liberal

Iqra Khalid Liberal Mississauga—Erin Mills, ON

Thank you very much, Mr. Chair and, through you, thank you to the witnesses for your very compelling testimony today.

Just to add to the list that you have provided, I will say that in 2018 Taylor Swift used facial recognition technology to identify some of her stalkers. That was a very fascinating, interesting and, I think, complex use of technology.

I know we have been talking about moratoriums. Perhaps I will start by asking our witnesses this: What would a moratorium achieve in an environment in which technology and innovation move at such a fast pace?

Perhaps I will start with Dr. Khanna.

12:40 p.m.

Strategic Advisor and Foresight Expert, As an Individual

Sanjay Khanna

A moratorium, as you know, is a pause. It's a chance to gather information and insight both from within organizations and from outside them, to assess it, and then to determine what kinds of guardrails might be imposed should that moratorium be lifted.

If this is a question of “math destruction,” as Dr. Watkins has described it, then it does make sense to employ a moratorium and create that pause for better decision-making.

12:40 p.m.

Liberal

Iqra Khalid Liberal Mississauga—Erin Mills, ON

How long do you think that pause should last? Is it a perfect solution that we're looking for, or a workable balance between public safety, privacy and the convenience of the general public?

12:40 p.m.

Strategic Advisor and Foresight Expert, As an Individual

Sanjay Khanna

I would defer to my colleagues who have studied the developments in artificial intelligence and facial recognition technology more closely.

12:40 p.m.

Liberal

Iqra Khalid Liberal Mississauga—Erin Mills, ON

If any one of you wants to take that on, please go ahead.

12:40 p.m.

Prof. Rob Jenkins

Twenty years ago I used to go around telling everyone that the trouble with these facial recognition systems was that they didn't work. These days I find myself spending more time saying the trouble with these facial recognition systems is that they do work.

Over the past five years there has been impressive progress in how well these systems can identify faces. That's not to say that errors are not made. Errors are made, and sometimes they are surprising and difficult to predict, but it's absolutely right that the landscape is changing very quickly and it would change through the duration of a moratorium.

12:40 p.m.

Liberal

Iqra Khalid Liberal Mississauga—Erin Mills, ON

Thank you.

I'll change tracks a little bit, although I'm not sure who to address this question to. Do any of you know if there is current technology or a system that allows Canadians to take themselves off all facial recognition databases, or all artificial intelligence databases, to be completely anonymized?

12:40 p.m.

Prof. Rob Jenkins

I suspect that there are people on the panel who know more about the technology than I do, but if algorithms are trained on a huge set of images, and one of those images, or more than one, is of you, then the cake is already baked. It's difficult to unbake the cake and remove the influence of any one individual in the database from the algorithm that emerges from it.
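As a rough illustration of the “baked cake” point, here is a minimal sketch assuming a toy model whose parameters are simply an aggregate over every training image; real face-recognition models are far more complex, and all values below are illustrative assumptions, not drawn from the testimony.

```python
import numpy as np

# Toy "model": its parameters are an aggregate over every training image,
# so every image in the dataset contributes to the final weights.
rng = np.random.default_rng(0)
training_images = rng.normal(size=(1000, 128))   # 1,000 face embeddings, one per image
model_weights = training_images.mean(axis=0)     # "baking the cake"

# Deleting your image from the stored dataset afterwards leaves the already-trained
# model untouched; only retraining from scratch would shift its parameters.
dataset_without_you = np.delete(training_images, 42, axis=0)
print(np.allclose(model_weights, training_images.mean(axis=0)))      # True: model unchanged
print(np.allclose(model_weights, dataset_without_you.mean(axis=0)))  # False: retrained weights differ
```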

12:40 p.m.

Liberal

Iqra Khalid Liberal Mississauga—Erin Mills, ON

Thanks for that. Basically, privacy laws and the protection of privacy in this instance are about finding that balance; it's not black or white, where you opt in or opt out. You're kind of there, baked into that cake, as you said, Dr. Jenkins.

In that case, then, we see that social media companies, for example, or other platforms build in these algorithms, the artificial intelligence that creates convenience in shopping. Companies purchase these datasets, basically buying access to their customers so that they can advertise to them. Are there any regulations that you think could be part of a potential bill of rights that would protect Canadians with respect to the way their data is sold to these companies?

Does anybody want to take that on? I'm sorry. I just don't know who to address this to. It's a complex one.

12:45 p.m.

Conservative

The Chair Conservative Pat Kelly

Do you want to rephrase it really quickly, or direct it to a specific witness?

12:45 p.m.

Liberal

Iqra Khalid Liberal Mississauga—Erin Mills, ON

Dr. Khanna, if you want to take it, go ahead.

12:45 p.m.

Strategic Advisor and Foresight Expert, As an Individual

Sanjay Khanna

This is where we don't really know how to protect Canadians in that way on the commercial side. That's why there has been discussion of data portability, where Canadians would have the right to their own data but also to earn money from it should they consent to it being used transactionally.

There has been a lot of push-back against that. In Australia, News Corp was finally pushed by...or Google had to pay publications for the data they were using online. That could be done, at least conceptually, for citizens as well.

12:45 p.m.

Conservative

The Chair Conservative Pat Kelly

Thank you very much to all the witnesses.

Actually, there are a couple of things I want to address from the chair here.

First of all, just at the very end, Mr. Khanna, in your response to Mr. Kurek's question, you referenced a report that talked about the use of FRT with respect to children's toys. I wonder if you could supply that report or give the information to our clerk so that the report might be available to the committee for its report to Parliament.

12:45 p.m.

Strategic Advisor and Foresight Expert, As an Individual

Sanjay Khanna

I'd be happy to do so.

12:45 p.m.

Conservative

The Chair Conservative Pat Kelly

That would be very much appreciated.

I want to ask a question. Among the examples that have come up today, I guess the most “benign” use of FRT, if that's the word for it, or one of the more benign uses spoken of, is the one that many of us are familiar with: using facial recognition to unlock an iPhone or another mobile device. An individual has consented to this use and has supplied a photo of themselves for their convenience and for the biometric security of their own phone. On a personal level, if the device allows it, I find a fingerprint much more convenient, easier and more reliable than a photo.

If this seems to be, on this panel or around the table, one of the more easily supported uses of the technology, are there problems even at that level, where a consumer is readily, or at least relatively readily, consenting to this type of use?

I'll maybe ask each of our panellists to weigh in on this for a quick moment. Would this be an acceptable use of FRT? Would this be included in the moratoriums that some are asking for?

Let me start with you, Dr. Watkins, just for a quick answer.

12:45 p.m.

Postdoctoral Research Associate, Princeton University, As an Individual

Dr. Elizabeth Anne Watkins

Thank you so much. This is a great question.

I urge the committee to think about the context in which consent takes place. Consent can often be much more complex than it looks from the outside. It's not always a yes or a no, or “no, I don't want to do this, so I'm going to go to the next alternative”. Often there are no alternatives. Often there are financial pressures facing people that force them to comply with these kinds of protocols.

For example, with the facial verification that's in place at many gig companies, there is no alternative. If workers don't comply with facial verification, they're simply off the app.

12:45 p.m.

Conservative

The Chair Conservative Pat Kelly

Go ahead, Dr. Jenkins.

12:45 p.m.

Prof. Rob Jenkins

I agree with all of that. Informed consent goes a long way, but it has to be informed.