Evidence of meeting #12 for Access to Information, Privacy and Ethics in the 44th Parliament, 1st Session. (The original version is on Parliament's site, as are the minutes.)

A recording is available from Parliament.

Speaking

Alex LaPlante  Senior Director, Product and Business Engagement, Borealis AI
Brenda McPhail  Director, Privacy, Technology and Surveillance Program, Canadian Civil Liberties Association
Françoys Labonté  Chief Executive Officer, Computer Research Institute of Montréal
Tim McSorley  National Coordinator, International Civil Liberties Monitoring Group
Clerk of the Committee  Ms. Nancy Vohl

4:40 p.m.

Liberal

Parm Bains Liberal Steveston—Richmond East, BC

Thank you.

If I have time, I'd like to switch to Dr. LaPlante.

In January 2021 you co-authored an article in RBC Capital Markets, “Ensuring AI Remains a Force for Good”. You talk about the Respect AI program as a way to build public trust. One of the ways you indicate this can be done is by using technology to expose bias.

Back to my colleague Mr. Fergus's question about the technology that's capturing the images, can you provide the committee with some ideas or examples on how technology can be used to root out these inherent biases?

4:40 p.m.

Conservative

The Chair Conservative Pat Kelly

Please give a very brief answer.

4:40 p.m.

Senior Director, Product and Business Engagement, Borealis AI

Dr. Alex LaPlante

I think that's going to be a very difficult one to answer quickly, but one thing I will suggest you look into is the concept of ethics by design, which essentially means carrying ethical considerations through your entire development cycle, from initial data collection through algorithmic development to the questions you should ask yourself around productionization and the monitoring of those systems. There's a lot of detail you can pull together on that. There are a number of organizations that practise it, as does Borealis.

4:40 p.m.

Conservative

The Chair Conservative Pat Kelly

Mr. Bains and Ms. Saks, I called you in the wrong order from what I was provided. I wrote you down out of order, so I apologize. If I should ever, at the committee, call speakers who aren't expecting to be called, just give me a quick correction, and we'll get the person who should be called.

On that note, I'll give the floor to Mr. Garon for two and a half minutes.

4:40 p.m.

Bloc

Jean-Denis Garon Bloc Mirabel, QC

Thank you, Mr. Chair.

I'll continue with Mr. Labonté.

I'd like to get back to the question from my colleague Mr. Fergus. Earlier, he asked if it was necessary to start over from scratch and take the time required to come up with appropriate regulations. But companies like Clearview AI have already gathered and stored a staggering number of photographs.

Have we already waited too long to establish a regulatory framework?

4:40 p.m.

Chief Executive Officer, Computer Research Institute of Montréal

Françoys Labonté

It's too late to regulate data harvesting, because we can't go back in time. However, we can regulate the use of technology.

What people don't always understand very clearly is that the idea driving the technologies we are talking about is acquiring lots of data and using it to train the systems. The desired outcome is facial recognition, meaning the ability to identify whether someone is such and such a person. In technology and engineering, there is usually an explicit model: a data entry phase during which information is processed with a view to results. That's not the model here. Now, implicit models are created which, on the basis of observation, processing and the analysis of many data sets, can provide the expected results. The players who succeeded in collecting all kinds of data over the past 10 years now have a competitive advantage, because these implicit models are very difficult to reproduce. They are true black boxes.

My view is that this would be extremely difficult, because you can't travel back in time.

4:40 p.m.

Bloc

Jean-Denis Garon Bloc Mirabel, QC

I have only 30 seconds left, so I'll ask you a brief question.

Given the quantity of data out there, is regulating its use like regulating tax evasion, in the sense that countries would have to coordinate with one another to provide a proper framework?

4:40 p.m.

Chief Executive Officer, Computer Research Institute of Montréal

Françoys Labonté

That would very likely be necessary.

The crux of the matter is the stockpile of data. A glance at the numbers showing how the situation has evolved reveals that the quantity of data being stored on cloud platforms is growing exponentially. Once the wheel starts to turn, it's difficult to turn it back.

4:45 p.m.

Conservative

The Chair Conservative Pat Kelly

Thank you, Monsieur Labonté.

We have Mr. Green for two and a half minutes.

4:45 p.m.

NDP

Matthew Green NDP Hamilton Centre, ON

Thank you.

I'll start my line of questioning with Dr. LaPlante. In her testimony, I believe she spoke about the need for private sector accountability. I wonder if she would contemplate and share any legislative frameworks that would provide true accountability should third party corporations use this in bad faith or in ways that are egregious violations of privacy.

4:45 p.m.

Senior Director, Product and Business Engagement, Borealis AI

Dr. Alex LaPlante

I'll focus my comments on AI regulation broadly. Right now in Canada, we lack an end-to-end regulation, and there are several changes that need to be made. I'll point you in the direction of a recent framework published by the EU Commission—this is a draft framework, but it's very likely to go into practice in 2022—that tackles issues of artificial intelligence and the risks associated, anything from privacy and human rights to very technical concepts of robustness and stability.

Ultimately, every time we develop one of these systems, we should be doing an impact assessment. As I noted in my remarks, the oversight of these systems should be based on risk materiality, meaning that for very high-risk systems there should be some level of scrutiny in the requirements around their usage and testing. Testing covers a very broad range of technical concepts, like robustness, stability, bias and fairness, and thresholds have to be put in place. Granted, these are context-dependent, so they would have to be put in place by the developers in order for us to ensure that there is accountability.

I will also note—and this is something I think is often forgotten—that these systems are stochastic. This means that when we put them in production, we may have a really good sense of how they'll behave today, but as our data changes in the future, we need to make sure we're continually monitoring the systems to ensure that they are working in the way we had initially intended. If they're not working in that way anymore, they need to be pulled from production and reassessed before they are put back out. This is particularly true in high-risk use cases like criminal identification.

4:45 p.m.

NDP

Matthew Green NDP Hamilton Centre, ON

On that, my last question is to Ms. McPhail. From your perspective, where law enforcement has used this technology, do you know of any instances where there have been false positives that have caused material harm to the innocent people who were identified?

4:45 p.m.

Conservative

The Chair Conservative Pat Kelly

Answer very briefly. Thank you.

4:45 p.m.

Director, Privacy, Technology and Surveillance Program, Canadian Civil Liberties Association

Brenda McPhail

In Canada, in part because police forces have been cautious and measured in adopting this technology and are using it in relatively limited ways, I do not know of such examples.

In the United States, where the uptake has been faster and less cautious, our sister organization, the American Civil Liberties Union, currently has litigation in several states fighting for men—all of them Black—who were misidentified by this technology. One in particular, Mr. Williams, had police come to his home, handcuff him and drag him out of his home in front of his minor children.

4:45 p.m.

Conservative

The Chair Conservative Pat Kelly

Thank you, Ms. McPhail.

We go now to Mr. Bezan.

4:45 p.m.

Conservative

James Bezan Conservative Selkirk—Interlake—Eastman, MB

Thank you, Mr. Chair.

I want to thank the witnesses. It has been very informative and eye-opening for all of us here, knowing what is at stake.

I'll just follow up on the questioning Mr. Green started off on.

With my background in national defence and security, I hadn't even thought about how facial recognition technology is being used to violate the charter rights, and even the Criminal Code and the National Defence Act, which say you can't spy on someone directly or indirectly unless warrants have been issued or, in case of an imminent threat, ministerial authorization was given. There are checks and balances through that whole process.

When we start looking at the mass collection and mass surveillance using FRT, how do we even say it's possible when we know that there are supposed to be all these checks and balances under the Criminal Code, the charter and the National Defence Act as it applies to CSE? You think about CSIS and the Canada Border Services Agency, never mind the RCMP, OPP and all the other policing organizations that are out there.

I would be interested on a quick take from Mr. McSorley and Ms. McPhail on that.

4:45 p.m.

National Coordinator, International Civil Liberties Monitoring Group

Tim McSorley

While it's true that there are rules in place to minimize mass surveillance by those agencies, as was mentioned, in recent draft guidance to law enforcement agencies the Privacy Commissioner raised the concern that, because the laws around this are currently a patchwork, there are loopholes that could allow federal agencies and law enforcement agencies to engage in mass surveillance that would otherwise be considered unlawful.

There's a lack of clarity around that right now. The unwillingness of federal agencies to discuss their use of facial recognition technology is what raises the deep concern that they could be engaging in forms of surveillance that are unlawful, or that would otherwise be considered unlawful, and are able to do so because of this patchwork of legislation.

There are also debates around what's considered mass surveillance. For example, the RCMP scrape information about individuals online and keep it in databases. We know they have been doing that. This goes beyond facial recognition, but they would argue they have a right to collect that information, whereas others, like us, have been challenging it, saying that it's a form of mass surveillance that needs to be regulated.

4:50 p.m.

Conservative

James Bezan Conservative Selkirk—Interlake—Eastman, MB

You're saying then, Mr. McSorley—I'll let Ms. McPhail jump in on this as well—that the scraping of images off social media of people who participate in mass protests like we recently had here in Canada, as well as mass surveillance and FRT, would be violations of their civil liberties, in both your opinions?

Ms. McPhail.

4:50 p.m.

Director, Privacy, Technology and Surveillance Program, Canadian Civil Liberties Association

Brenda McPhail

Mr. Chair, yes, I believe so.

In our current legislative regime, there are wide gaps that seem to have been exploited at this time to allow some uses of this technology in ways that have yet to be critiqued or examined in front of a judge. I think that's going to happen probably in the near future here in Canada, but it can be pre-empted if we sit down and think very carefully through whether there are ways this can be done safely.

In some cases, the answer is going to be no. CCLA supports a complete ban on mass surveillance uses of this technology.

In some cases, such as the current police use of facial recognition technology in conjunction with mug shot databases, for example, even those uses are not necessarily uncontroversial. We simply haven't thought about them. Police use of FRT for mug shot databases is being conducted on legacy databases that have their own issues of bias and discrimination that we have known about for a really long time.

I think it's not just the mass surveillance aspects of this, but also the more targeted ones that we haven't grappled with.

4:50 p.m.

Conservative

James Bezan Conservative Selkirk—Interlake—Eastman, MB

If we get talking about targeted ones, we have with us Ms. LaPlante from Borealis AI, which is working with RBC. We know that the RCMP and the government wanted to freeze the bank accounts of people who participated in the recent protest.

How do we start...?

Would some of the technology that Borealis AI has be used in allowing the government to freeze the bank accounts of certain individuals whose faces were scraped from social media or mass surveillance through other means, such as drones and cameras?

4:50 p.m.

Conservative

The Chair Conservative Pat Kelly

I would ask for a brief response. Mr. Bezan used all his time asking, so give a very brief response.

4:50 p.m.

Senior Director, Product and Business Engagement, Borealis AI

Dr. Alex LaPlante

It's a resounding no.

As I mentioned, we take ethics very seriously in the design of any of our algorithmic systems. This was definitely not a use case that would have come across our desk at RBC.

4:50 p.m.

Conservative

The Chair Conservative Pat Kelly

Thank you for that.

Ms. Saks, go ahead for five minutes.

4:50 p.m.

Liberal

Ya'ara Saks Liberal York Centre, ON

Thank you, Mr. Chair.

Thank you to all of our witnesses today.

Through you, Mr. Chair, I'd like to start off my questions with Ms. McPhail.

Obviously, we're dealing with massive amounts of data and a massive proliferation of the use of FRT and AI. As Mr. Labonté mentioned earlier, there are grey zones in its use in the retail sector. Other witnesses talked about health care and other beneficial uses, and we know there is that debate back and forth.

In the request for a moratorium, my question to you is where we start.

There are gaps in the legislation right now; it doesn't target the private sector, and they're the ones manufacturing this technology, so who exactly are we putting a moratorium on?

4:55 p.m.

Director, Privacy, Technology and Surveillance Program, Canadian Civil Liberties Association

Brenda McPhail

CCLA particularly supports a moratorium for police and national security uses of this technology, because those are situations where the consequences, if we get them wrong, are literally life-altering for individuals.

That said, it would be beneficial to have a general moratorium, because what we know is that private sector vendors are selling technologies to public sector actors, including law enforcement and national security bodies. The way that our current privacy law regime works is that those two sides, public and private, are governed in some ways under different sets of regulations, which only exacerbates the difficulty of effectively regulating this area.

We really need a coherent approach to thinking through how to develop protections in this regard.