Evidence of meeting #12 for Access to Information, Privacy and Ethics in the 44th Parliament, 1st Session. (The original version is on Parliament's site, as are the minutes.)

A recording is available from Parliament.

Also speaking

Alex LaPlante  Senior Director, Product and Business Engagement, Borealis AI
Brenda McPhail  Director, Privacy, Technology and Surveillance Program, Canadian Civil Liberties Association
Françoys Labonté  Chief Executive Officer, Computer Research Institute of Montréal
Tim McSorley  National Coordinator, International Civil Liberties Monitoring Group
Clerk of the Committee  Ms. Nancy Vohl

4:05 p.m.

Liberal

Greg Fergus Liberal Hull—Aylmer, QC

We can no longer hear you, Mr. Labonté.

I'll take advantage of this pause to ask Ms. McPhail a final question.

Ms. McPhail, would it be preferable to start from scratch and ban the use of facial recognition for the time being, until a legal framework is developed to specify how it can be used, and under what circumstances?

4:05 p.m.

Conservative

The Chair Conservative Pat Kelly

To be clear, Monsieur Labonté, we did lose your audio, and Mr. Fergus had posed another question.

4:05 p.m.

Liberal

Greg Fergus Liberal Hull—Aylmer, QC

I think we've also lost Ms. McPhail.

4:05 p.m.

Conservative

The Chair Conservative Pat Kelly

I've also lost your interpretation right now.

We're losing people all over on this call.

We'll suspend the meeting due to technical difficulties.

4:10 p.m.

Conservative

The Chair Conservative Pat Kelly

The meeting is resumed. I hope the system-wide Zoom glitch has been resolved.

I'm going to ask Mr. Fergus to repeat his question, and we'll restart with that.

4:10 p.m.

Liberal

Greg Fergus Liberal Hull—Aylmer, QC

Thank you, Mr. Chair.

My question is for Ms. McPhail.

Ms. McPhail, would it be preferable for the time being to ban any use of facial recognition, whether in the private or public sector, until we can come up with a framework that identifies appropriate uses of the technology? Do you think that would be the best way of proceeding?

4:10 p.m.

Director, Privacy, Technology and Surveillance Program, Canadian Civil Liberties Association

Brenda McPhail

I do. The CCLA has called for a moratorium, which is similar to a ban, until we sort this out, and until we have exactly this kind of conversation with our democratically elected representatives, and people across Canada, to think this through. Are there uses of this technology that are going to benefit us, or are there not? For those that may benefit us, what are the appropriate safeguards to put in place?

That's going to be a long and difficult conversation, but it's an absolutely fundamentally necessary one. A moratorium on the use of this technology would give us the space and time to engage this in a thoughtful, careful, and considered way.

4:10 p.m.

Conservative

The Chair Conservative Pat Kelly

Thank you.

With that, Mr. Fergus is out of time.

I'll now give the floor to Mr. Garon.

Welcome to the committee, Mr. Garon.

You have six minutes.

4:10 p.m.

Bloc

Jean-Denis Garon Bloc Mirabel, QC

Thank you very much, Mr. Chair.

I'm glad the connection was restored, because I wanted to ask Mr. Labonté most of my questions.

Mr. Labonté, we know that having more information can often lead to better decisions. Nevertheless, more than once in our history, we have decided to place limits on our ability to obtain information. Take searches without a warrant, for example: we prevented the police from conducting them.

I'm wondering whether we are once again pondering a serious social issue, in this instance whether facial recognition technology has the potential to virtually put an end to our freedom and privacy.

What are your thoughts on this matter?

4:10 p.m.

Conservative

The Chair Conservative Pat Kelly

Monsieur Labonté.

4:10 p.m.

The Clerk of the Committee Ms. Nancy Vohl

Mr. Labonté, can you hear us?

4:10 p.m.

National Coordinator, International Civil Liberties Monitoring Group

Tim McSorley

Excuse me. I have been having trouble hearing the questions in the English interpretation. I'm not sure whether others are having the same audio problem as well.

4:10 p.m.

Conservative

The Chair Conservative Pat Kelly

Thank you.

I'll ask the clerk to quickly see if we can establish whether or not we have adequate contact.

The meeting is suspended.

4:15 p.m.

Conservative

The Chair Conservative Pat Kelly

We will resume the meeting.

I would ask the members who are participating virtually to indicate if at any point they lose audio so that I know if there's a problem.

I will restart Monsieur Garon's round, because I don't believe anybody heard his question.

Go ahead. You have six minutes.

4:15 p.m.

Bloc

Jean-Denis Garon Bloc Mirabel, QC

Thank you, Mr. Chair.

Mr. Labonté, I'm going to repeat the question I just asked.

Gathering more data can lead to better decisions. Nevertheless, more than once in our history, out of concerns pertaining to privacy and individual rights, we decided to restrict information gathering. For example, searches without a warrant are now prohibited.

I am wondering how likely it is that one day, if facial recognition is used inappropriately on a wide scale, it could considerably reduce or even do away with our freedom and privacy. I know that it's a rather philosophical question, but I'd like your opinion on it.

4:15 p.m.

Chief Executive Officer, Computer Research Institute of Montréal

Françoys Labonté

The answer is yes, I do believe that's possible, if data collection is done without people's consent and without them properly understanding the purposes for which the information is being used. That, in fact, is what explains recent personal information protection legislation. Questions like these have been on the radar of people working in technology for a long time; the regulations are only now being implemented. Clear guidelines are definitely required.

On the other hand, there is an important factor to consider from the CRIM's perspective. The CRIM is no longer working on these technologies. The most competitive players at the moment are the ones that collected enormous amounts of data for use in training artificial intelligence models. Now, ordinary mortals no longer have access to the amounts of data required to achieve high performance levels.

It's true, though, that the risk you mentioned is real. That's why it's essential to regulate data harvesting to make people aware of how it is going to be used and to require informed consent.

4:15 p.m.

Bloc

Jean-Denis Garon Bloc Mirabel, QC

There are companies, like Palantir, that use military technology to do what they call social observation.

What do you think of these companies and practices?

4:20 p.m.

Chief Executive Officer, Computer Research Institute of Montréal

Françoys Labonté

It always comes back to the same question. To develop technologies like these, companies collected an enormous amount of data, presumably without the informed consent of the people providing it. It happened. It's a reality. That's what I was saying in a very pragmatic manner in my presentation. Now, some of these players have a significant competitive advantage that needs to be regulated in the future.

What can we do about it? It may be a wide-ranging question, but it's very pragmatic. If we were to ask someone today to return all the images they used to create their models, it would be a challenge for them, because you can't go back in time. That's really the challenge here. We are trying to modulate the future... [Technical difficulty]

4:20 p.m.

Bloc

Jean-Denis Garon Bloc Mirabel, QC

Mr. Labonté, you spoke about people who had not consented to supplying their data. We're talking about very complex technologies, the details of which we don't know much about. We don't know what the algorithms are.

Would ordinary citizens be prepared to give their informed consent to allow these companies to use their data?

4:20 p.m.

Chief Executive Officer, Computer Research Institute of Montréal

Françoys Labonté

Generally speaking, people don't consent to allow a company to use their information any way they want. For example, if someone feels that it's important to allow people to follow them on social media, they consent to make a picture of their face available solely for that purpose, but they would not consent to allow third parties to use the image of their face for profiling or for developing commercial products.

This aspect is dealt with in regulations that are being drafted or that have recently come into force, but it's still very difficult to give informed consent. At CRIM, because it's a research centre, when we work on projects with an ethics committee and ask subjects for consent, such consent is very specific, clear, for a particular purpose, and often for a limited period of time.

In the world today, the speed at which things are happening makes it difficult to give informed consent. For example, when people download an application, they don't even read the consent form that accompanies it, or they don't understand what it really means.

In fact, giving informed consent…

4:20 p.m.

Conservative

The Chair Conservative Pat Kelly

Thank you, Mr. Labonté.

Mr. Green, it's over to you now for six minutes.

March 24th, 2022 / 4:20 p.m.

NDP

Matthew Green NDP Hamilton Centre, ON

Thank you very much.

Welcome to all the guests.

Mr. McSorley, in some of the preliminary research that I have conducted on the brittleness and inconsistencies of facial recognition technology, I've heard it called the modern-day phrenology. Luke Stark equates facial recognition to the plutonium of AI. He states that:

...facial recognition technologies, by virtue of the way they work at a technical level, have insurmountable flaws connected to the way they schematize human faces. These flaws both create and reinforce discredited categorizations around gender and race, with socially toxic effects. The second [point] is [that] in light of these core flaws, the risks of these technologies vastly outweigh the benefits, in a way that's reminiscent of hazardous nuclear technologies.

He uses that metaphor to say that facial recognition, "simply [by] being designed and built, is intrinsically socially toxic, regardless of the intentions of its makers".

In July 2020, the International Civil Liberties Monitoring Group co-signed a letter with OpenMedia asking the federal government to enact a ban on the use of facial recognition surveillance by federal law enforcement and intelligence agencies.

Through you, Mr. Chair, to Mr. McSorley, given the inconsistencies, the brittleness and the surveillance capitalism of third parties—

4:25 p.m.

Conservative

The Chair Conservative Pat Kelly

I'm just going to interrupt for a moment.

4:25 p.m.

NDP

Matthew Green NDP Hamilton Centre, ON

I was on a roll.

4:25 p.m.

Conservative

The Chair Conservative Pat Kelly

Yes.

You will have four minutes and 19 seconds left when we restart the clock, but did I hear a point of order or a question or concern about audio?