Evidence of meeting #11 for Access to Information, Privacy and Ethics in the 44th Parliament, 1st Session. (The original version is on Parliament’s site, as are the minutes.)

A recording is available from Parliament.

Also speaking

Cynthia Khoo  Research Fellow, Citizen Lab, Munk School of Global Affairs and Public Policy, University of Toronto, As an Individual
Carole Piovesan  Managing Partner, INQ Law
Ana Brandusescu  Artificial Intelligence Governance Expert, As an Individual
Kristen Thomasen  Professor, Peter A. Allard School of Law, University of British Columbia, As an Individual
Petra Molnar  Lawyer, Refugee Law Lab, York University

11:40 a.m.

Conservative

Damien Kurek Conservative Battle River—Crowfoot, AB

Thank you very much.

Ms. Piovesan, do you have anything to add?

11:40 a.m.

Managing Partner, INQ Law

Carole Piovesan

I very much agree with Ms. Khoo's comments.

I would add that there is an element of stakeholder engagement as well. We need to be able to reach out to the community, particularly those affected, and have a meaningful discussion about how these technologies are used and what the implications of these technologies are before they're rolled out.

We've often heard about the concept of radical transparency. When we're talking about something as profound as facial recognition technology, applying and adopting a concept of radical transparency can be extremely helpful.

Also, underscoring the point of explainability, will we be able to understand how these algorithms operate? Will we be able to understand the output they provide, and can we have independent verification to ensure that they are accurate and reliable?

11:40 a.m.

Conservative

Damien Kurek Conservative Battle River—Crowfoot, AB

Thank you for that.

It's interesting. I have an article in front of me where the headline is “Toronto police used Clearview AI facial recognition software in 84 investigations”, and the byline has in part “At least 2 cases went to court”. This is something that is not just theoretical; it is actually happening.

Especially as we are faced with a war in Europe and some of the discussions around what facial recognition and artificial intelligence look like in a military context, when you think about the Geneva Convention, it has to do with bombs that are dropped from airplanes. However, this is a whole new space with massive consequences.

I've had a whole host of constituents who have reached out with concerns about digital ID, the social credit system and some of the challenges that are associated with the state tying your information to aspects of your interaction with the government. I'm wondering if both of you, hopefully—

I'm out of time.

11:40 a.m.

Conservative

The Chair Conservative Pat Kelly

You're going to have to allow the witness about 15 seconds for a reply. Then we'll have to pick it up in another round.

I'm not sure which witness wants to weigh in there for a quick moment to address his questions.

11:40 a.m.

Research Fellow, Citizen Lab, Munk School of Global Affairs and Public Policy, University of Toronto, As an Individual

Cynthia Khoo

I think we thought he was done. Maybe we'll try again in the next round.

11:40 a.m.

Conservative

The Chair Conservative Pat Kelly

All right. We'll hold that thought.

I will go now to Ms. Hepfner.

11:40 a.m.

Liberal

Lisa Hepfner Liberal Hamilton Mountain, ON

Thank you very much, Mr. Chair.

Thank you, witnesses, for your testimony this morning. It's been really interesting and I think really technical. I might sound a little bit repetitive, but I want to make sure I understand your points of view.

I want to start with PIPEDA. We know that the government is looking at this digital policy framework right now and trying to adapt it. It was developed before facial recognition software existed. I'm wondering if both of you can comment on exactly what sorts of improvements you'd like to see in that legislation. How can we make that legislation more adaptive to these technologies? It won't be the last new technology that we have to deal with. These things keep coming up, and the legislation doesn't keep pace. I'm wondering if you have ideas about how to make it more flexible so that when new technologies like this come into our society, we're more able to deal with the privacy concerns.

Maybe we'll start with Ms. Khoo.

11:45 a.m.

Research Fellow, Citizen Lab, Munk School of Global Affairs and Public Policy, University of Toronto, As an Individual

Cynthia Khoo

I think when it comes to improving PIPEDA, the number one thing, the most important thing that I think a lot of privacy advocates have been calling for since the creation of PIPEDA, is to fix the enforcement piece. There are a lot of cases where PIPEDA, in terms of its legal content and what it does and doesn't allow, would not in fact allow the activity that occurs.

When it came to Facebook and Cambridge Analytica, for example, that was found illegal under PIPEDA. When it came to Clearview AI, they successfully.... PIPEDA captured that activity, but it was the fact that the OPC didn't have the power to then issue orders. They would have had to drag the company into court. They don't have the power to issue fines, let alone fines at the level of the GDPR.

I think the single most impactful change that could be made would be to give the Office of the Privacy Commissioner of Canada some teeth to actually enforce the orders against companies that are already found to be engaging in illegal activity under PIPEDA, or what comes after PIPEDA.

11:45 a.m.

Managing Partner, INQ Law

Carole Piovesan

I would agree on the enforcement point. I think what was interesting under Bill C-11 was that it contemplated a tribunal that would oversee, and potentially have more serious consequences over, specific violations of the act. It's something that I'm hoping we'll continue to see in the next round.

Another point that we saw to an extent in Bill C-11 was a broadening of various elements of consent as a basis for collecting, using and disclosing personal information. Again, we have to be mindful that PIPEDA is a private sector privacy law. We have to be mindful of some of the positive uses of facial recognition technology, which is why I say it has to be regulated using a scalpel, not an axe. There are some very beneficial uses, but we need appropriate safeguards to ensure that those uses are properly controlled and contained and that they feed the public interest and don't subvert our values. It's very important that we see that in whatever new reform to PIPEDA we ultimately get.

11:45 a.m.

Liberal

Lisa Hepfner Liberal Hamilton Mountain, ON

Thank you very much.

That fits perfectly into my next question—namely, what are the societal benefits that we can see from this technology? We've heard a lot about the privacy and discrimination concerns. Other than obviously the commercial benefits to the companies that have this software, what are the societal benefits to this sort of software?

11:45 a.m.

Managing Partner, INQ Law

Carole Piovesan

I'm happy to start. I referenced the use of facial recognition in health care, where we have seen some examples of FRTs being used to monitor patients and make sure their well-being isn't changing, particularly if they're on bed rest and may not be vocal. We've seen some very positive uses of FRTs in the health care sector. Of course, we would want to be very cautious about both the collection and the use of that technology. The retention of that data is very important. The limited disclosure of that data is extremely important. But we can see that there are some very notable positive benefits when you look at it in the health context.

I personally use facial recognition to unlock and verify my identity for my bank and my phone. Again, we want strict controls in place. We see it as a convenience. Is it a necessary convenience? No, not necessarily, but it can be a very secure way to go through a payment process or an airport, or to conduct a financial transaction.

There can be positive societal benefits. The issue becomes whether or not there is appropriate disclosure and notice on the collection of that data and how it will be used. Then, is there an appropriate retention period that is ultimately in the control of the individual? That is exactly what PIPEDA is intended to do, to wrest some of that control over informational privacy back into the hands of users, with appropriate—

11:45 a.m.

Conservative

The Chair Conservative Pat Kelly

I'm going to have to move on. We're getting very close on time here. Thank you for that response.

I will go now to Monsieur Villemure.

You have two and a half minutes.

11:45 a.m.

Bloc

René Villemure Bloc Trois-Rivières, QC

Thank you, Mr. Chair. Two and a half minutes go by very quickly.

Ms. Piovesan, I will address you again.

You referred to the general data protection regulation, or GDPR. I would like to know what GDPR best practices we could draw on.

At the same time, there was talk of consent being difficult to obtain, but at the end of the day, is it impossible to obtain it?

11:50 a.m.

Managing Partner, INQ Law

Carole Piovesan

You know, consent can be very difficult, depending on the use case, particularly the scalability of facial recognition technology, but it should not be thrown out as a requirement just in and of itself. We need to include consent as a key requirement. We are talking about an immutable biometric data point on an individual. Having appropriate notice, with some ability for that individual to make decisions about how they share that information or how it's collected, is really critical. I don't want to suggest that consent should never be a consideration when you're talking about facial recognition technology. That's absolutely not true.

When we look at GDPR, we can certainly draw inspiration from the profiling requirements that I know Quebec has done in terms of looking at a right to recourse and a right to objection on automatic profiling solely by automatic means. That's one element that we should consider, but again, I very much encourage the committee to look at the EU's artificial intelligence act. It's not final, but there is some real inspiration that we can draw from there. The draft algorithmic accountability act out of the U.S. is worth looking at as well.

11:50 a.m.

Bloc

René Villemure Bloc Trois-Rivières, QC

Okay.

Tell me a little more about radical transparency.

11:50 a.m.

Managing Partner, INQ Law

Carole Piovesan

Radical transparency really speaks to the entire disclosure process—to allowing people, putting it out there, letting people know who your vendors are, what your uses are, where you are collecting that data and why you are doing so. It's very much about engaging the public as opposed to, as you heard Ms. Khoo mention a number of times, this concept of secrecy that undermines the trust we already have. It also starts to subvert some of those really important Canadian values.

Radical transparency is starting with the principle that we are going to get out there and let our constituents know what we're doing with facial recognition, or any type of really advanced technology that can impact on their rights, bring them into the discussion in a meaningful way, and then report on some of those outputs, including reporting on vendor relationships.

11:50 a.m.

Bloc

René Villemure Bloc Trois-Rivières, QC

Thank you very much.

11:50 a.m.

Conservative

The Chair Conservative Pat Kelly

Thank you.

There are two and a half minutes for Mr. Green.

11:50 a.m.

NDP

Matthew Green NDP Hamilton Centre, ON

Thank you.

I want to pick up on a topic that my colleague from the Bloc has raised on occasion and that I share an interest in, and that's surveillance capitalism. My questions are for Ms. Khoo on the relationship between private companies. We referenced Clearview. There are others that we know of, including Amazon. We even know that Ring technology for doorbells provides an opportunity to privatize and capitalize on public space surreptitiously, without the knowledge of people.

I wonder if you could comment on that, and after that talk about the susceptibilities for abuse by both the private sector and government. I think about the ways in which it's used voyeuristically. You brought up the gender-based analysis there.

I'm wondering if you could just touch on those two topics.

11:50 a.m.

Research Fellow, Citizen Lab, Munk School of Global Affairs and Public Policy, University of Toronto, As an Individual

Cynthia Khoo

Absolutely. Thank you.

In terms of capitalizing on public space, this is something we are definitely concerned about. Amazon Ring is actually the poster child for that. To my knowledge, it has not come up here yet. Again, Professor Thomasen can speak more to this. I think Amazon Ring was looking at Windsor at one point.

We know that there are open partnerships—well, now there are open partnerships—between Amazon and police. Police were essentially conscripted as the marketing department of Amazon Ring doorbells, which raises numerous concerns from the perspective of both the private sector and the public sector, but mostly the public sector.

Surveillance capitalism is an aspect of this public-private surveillance ecosystem, because it has to do with incentive structures. You have the private companies with their own incentives to collect as much data as possible to capitalize on it. A lot of their funding then comes from government through government grants. Whether it's through the guise of innovation or whether it's because they have lobbied government behind the scenes to give them these particular grants, the government funds them. It's partly because they buy into an innovation story or they think, hey, if the company collects all this data, then maybe eventually we'll have access to that data too. It's essentially government and private companies working hand in hand to build out this network of surveillance.

The second thing you mentioned was abuse. I think we have so many examples of that. Actually, in responding to the earlier question about the potentially beneficial uses of facial recognition technology, my mind went to—

11:55 a.m.

Conservative

The Chair Conservative Pat Kelly

I'm really sorry, but I'm going to have to cut you off just to get the last two rounds in. Perhaps hold your thoughts.

I have to shorten the last two rounds. We'll go to three minutes each for Mr. Bezan and Ms. Khalid.

11:55 a.m.

Conservative

James Bezan Conservative Selkirk—Interlake—Eastman, MB

Thank you, Mr. Chair. That's unfortunate, because I have a lot of questions.

I thank both of our witnesses for their great testimony.

Let's start with Ms. Khoo. You and Kate Robertson sent a letter off to the Privacy Commissioner of Canada back on October 22, 2021. Did you get a response to that letter, and if you did, can you share it with the committee?

11:55 a.m.

Research Fellow, Citizen Lab, Munk School of Global Affairs and Public Policy, University of Toronto, As an Individual

Cynthia Khoo

To my knowledge, I don't think we got a response, but I'll double-check with my colleague Kate Robertson and we can follow up with you.

11:55 a.m.

Conservative

James Bezan Conservative Selkirk—Interlake—Eastman, MB

Okay, please do, because I really believe the stuff that you have in there, talking about three parts as to algorithmic policing technologies. You talk about the moratoriums, ask that the federal government have a judicial inquiry, and that governments must make reliability, necessity and proportionality prerequisite conditions, as well as transparency, more directives on algorithmic policing technologies, or predictive policing—which is even scarier—and so on.

You talk in the letter about “the worst human rights and charter violations that could occur as a result of Canadian government and law enforcement agencies using algorithmic police technologies.” You want to mitigate that.

How do we, as parliamentarians, do that as we go forward in wanting to write the proper regulations that respect our charter rights and ultimately bring that balance of transparency, of people's ability to opt in and opt out, and maximize on the technology that's coming down the pipe at us?

11:55 a.m.

Research Fellow, Citizen Lab, Munk School of Global Affairs and Public Policy, University of Toronto, As an Individual

Cynthia Khoo

As parliamentarians, the first thing you could do and our first recommendation is to launch an inquiry, essentially, or a national commission into these technologies.

For the purpose of this committee, my recommendation was a moratorium on facial recognition, but in that report we actually called for a moratorium on all algorithmic policing technologies pending the results of that inquiry, whether it's a national commission or a judicial inquiry, to do a much more in-depth constitutional and human rights analysis than we were able to do within our reports, so that you actually are able to lay out the contents of what's appropriate and what's not and what safeguards are required, and then actually implement them.

Without doing that, this train is moving ahead in the meantime. We need a national pause to buy ourselves time to figure out what to do with this technology so that we don't realize it way after the fact.