Evidence of meeting #12 for Access to Information, Privacy and Ethics in the 44th Parliament, 1st Session. (The original version is on Parliament’s site, as are the minutes.)

A recording is available from Parliament.

Also speaking

Alex LaPlante  Senior Director, Product and Business Engagement, Borealis AI
Brenda McPhail  Director, Privacy, Technology and Surveillance Program, Canadian Civil Liberties Association
Françoys Labonté  Chief Executive Officer, Computer Research Institute of Montréal
Tim McSorley  National Coordinator, International Civil Liberties Monitoring Group
Clerk of the Committee  Ms. Nancy Vohl

4:25 p.m.

Liberal

Iqra Khalid Liberal Mississauga—Erin Mills, ON

Yes, thank you, Mr. Chair. We did lose it in between there, so I missed about 30 seconds of what Mr. Green had to say.

My apologies, Mr. Green, for interrupting you.

4:25 p.m.

NDP

Matthew Green NDP Hamilton Centre, ON

I won't start again, but I'll simply ask if Mr. McSorley can give me a thumbs-up that he can hear me at this moment. Perfect.

I will ask through you, Mr. Chair, if he could elaborate on the dangers related to the use of AI technologies like facial recognition by national intelligence agencies such as CSIS and the RCMP for the purpose of mass surveillance. I'll take a specific point of reference that in May 2021 our own Department of National Defence—our military—used technologies to surveil Black Lives Matter in a surreptitious way.

Perhaps Mr. McSorley would like to just comment on its use and on the dangers that I've outlined in my preceding comments.

4:25 p.m.

National Coordinator, International Civil Liberties Monitoring Group

Tim McSorley

We would agree completely with your characterization of the dangers posed by facial recognition technology. We see just layers upon layers of concerns.

As has been pointed out by other witnesses today, especially Dr. McPhail, there are layers of problems regarding the accuracy of this technology. There are concerns about whether, without proper regulation, and with so many companies pitching their technology to law enforcement agencies, they will even be using the most accurate tools, or whether they will be using the most accessible ones, the ones targeted and marketed more towards law enforcement. There's also the whole question of the use by law enforcement and intelligence agencies of third party contractors, how that's carried out, the lack of transparency there, and problems with accuracy and bias in the technology that may be promoted to them.

Even if those concerns were addressed, as has been mentioned, the targeting of communities of colour is already well known. It cannot be solved simply by improving the technology; rather, as Dr. McPhail said, it can be exacerbated, because then all of a sudden we have this great tool for better surveilling populations that are already over-policed and over-surveilled. We need to be incredibly—

4:25 p.m.

NDP

Matthew Green NDP Hamilton Centre, ON

If I may, through you, Mr. Chair, to Mr. McSorley, given the fact that there's been an ongoing theme in this committee and in this study that there are tendencies for the government and for intelligence and security forces to do indirectly what it can't do directly, I'd like to extend the question, because in the same letter that you co-signed, there was a call for reforms to the Personal Information Protection and Electronic Documents Act, or PIPEDA.

Based on your work, what types of reforms are needed to safeguard human rights and privacy in Canada to ensure that third party vendors don't do indirectly what the government can't do directly?

4:25 p.m.

National Coordinator, International Civil Liberties Monitoring Group

Tim McSorley

First of all, we need private sector privacy laws that are based on a human rights approach; that are based clearly on proportionality and necessity; that have clear rules around consent; that bring in oversight and regulation of artificial intelligence used by the private sector; and that also bring in stringent regulations, if not bans—it needs to be further studied—on the use by law enforcement and national security agencies of third party and private contractors to carry out those activities that they cannot do themselves.

For example, as I mentioned earlier, the RCMP has disputed that they need to verify the lawfulness of services provided by third party contractors. If the leading federal law enforcement agency in the country says, in so many words, that they can use technology found to be unlawful and that it's not their problem, we have a serious problem. That needs to be addressed in the private sector laws just as in the public sector laws, because current private sector laws allow the private sector to share information with law enforcement under national security exceptions.

That needs to be a primary focus in reforming Canada's private sector privacy laws.

4:30 p.m.

NDP

Matthew Green NDP Hamilton Centre, ON

Thank you.

4:30 p.m.

Conservative

The Chair Conservative Pat Kelly

Mr. Williams, you have five minutes.

4:30 p.m.

Conservative

Ryan Williams Conservative Bay of Quinte, ON

Thank you very much.

Thank you to all the witnesses.

I will continue on with Mr. McSorley.

Sir, in June of 2021 you called on the public safety minister to develop a clear proposal for independent oversight of FRT and AI-based policing tools. What is your vision for what the independent oversight would look like?

4:30 p.m.

National Coordinator, International Civil Liberties Monitoring Group

Tim McSorley

First of all, we think we need a broader consultation to decide what the no-go zones are. As we've said, we believe a clear part of that would be the use of facial recognition for mass surveillance. Beyond that, there needs to be oversight to ensure that as law enforcement and intelligence agencies adopt new technologies, those technologies are reviewed before they are implemented and meet the standards set by Canada's privacy legislation.

Right now, as we've seen with the adoption of Clearview AI, it's essentially up to the law enforcement agencies themselves to make those decisions. It wasn't clear that the minister knew to what degree the RCMP was using Clearview AI's facial recognition technology. The concern is that it's being adopted without any kind of political or other oversight.

The National Security and Intelligence Review Agency is currently undertaking a review of the use of biometric surveillance by Canada's national security agencies, but that could take, again, a couple of years before it becomes public. We need action by the minister now in order to ensure that we don't have law enforcement adopting these technologies in secret, and that they publicly share what they believe the privacy impact will be through the privacy impact assessments and allow for a full and clear debate.

4:30 p.m.

Conservative

Ryan Williams Conservative Bay of Quinte, ON

Your organization wrote an open letter to the minister in 2020. Did you ever receive a response from the minister?

4:30 p.m.

National Coordinator, International Civil Liberties Monitoring Group

Tim McSorley

We had a follow-up conversation with the director of policy in the minister's office, but it was more of a listening session rather than clearly stating what the minister's actions would be. The only new information we obtained was clarification that CBSA was not using real-time facial recognition at that moment. They could not share anything about CSIS's use of facial recognition technology, but there was no clear commitment from the minister's office to take further action.

4:30 p.m.

Conservative

Ryan Williams Conservative Bay of Quinte, ON

Is it true that, in response to some of the findings of the Privacy Commissioner, the RCMP agreed to conduct privacy assessments of third party tools and to establish a new oversight function for new technology? If so, has that function actually been set up in a way that can protect the rights of Canadians?

4:30 p.m.

National Coordinator, International Civil Liberties Monitoring Group

Tim McSorley

That's a good question.

We know that the RCMP committed to making improvements to its policies, even though they did reject the overall finding that they're responsible for the lawfulness of third party technology. We haven't seen anything released publicly about that yet. In fact, it speaks to one of the problems we see right now: in theory, federal agencies need to undertake privacy impact assessments before new technology or new privacy-impactful projects are undertaken, but those assessments are often not done at all. If they are done, they may be kept secret. There's supposed to be an executive summary shared, but often, especially from law enforcement and intelligence agencies, those aren't shared, based on the idea that it would have an impact on their operations. We feel that there needs to be pressure for a greater degree of transparency and accountability there.

4:30 p.m.

Conservative

Ryan Williams Conservative Bay of Quinte, ON

Okay.

You've answered quite a bit of this already, but I just want to give you a chance to expand further if you'd like. Your third recommendation from the letter was for the establishment of clear and transparent policies and laws regarding the use of facial recognition.

What do you see these policies and laws looking like, and what reforms do you think the Privacy Act and PIPEDA require?

4:30 p.m.

National Coordinator, International Civil Liberties Monitoring Group

Tim McSorley

Our expertise is more on the public sector side, so I'll speak more to that.

There needs to be clear establishment of no-go zones, again, for example, in terms of mass surveillance of public places. There need to be clear rules around the issuance of privacy impact assessments.

We believe it would be powerful to have mandatory third party and independent review of algorithmic and biometric surveillance tools used by law enforcement so that they would be assessed for their human rights impact as well as for their accuracy and concerns around bias.

We believe one thing that could also help would be a government agency specifically tasked with following, studying and creating a repository and directory of the use by federal agencies of algorithmic and biometric tools in general, but especially in regard to surveillance.

4:35 p.m.

Conservative

Ryan Williams Conservative Bay of Quinte, ON

Thank you, sir.

4:35 p.m.

Conservative

The Chair Conservative Pat Kelly

Now we'll go to Mr. Bains for five minutes.

4:35 p.m.

Liberal

Parm Bains Liberal Steveston—Richmond East, BC

Thank you, Mr. Chair; and thank you to all our guests for joining us today.

My questions are coming from Richmond, British Columbia. I'm concerned about this and the use of AI. As you know, in British Columbia we have a strong BIPOC community, predominantly Asian and South Asian. We also heard a flag from a witness the other day that the VPD is using AI.

My question is directed to Dr. McPhail. Vancouver Police Chief Adam Palmer assured the police board in April 2021 that his officers would not use facial recognition technology for investigations until a policy is in place.

Do you know if any FRT policy has been put forward to the police services board?

4:35 p.m.

Director, Privacy, Technology and Surveillance Program, Canadian Civil Liberties Association

Brenda McPhail

I do not know, in the context of Vancouver, whether such a policy has been put forward.

I do know that in Toronto what we believe to be the first such policy was recently put through, and the grapevine has suggested that many other police forces across Canada were waiting on that to happen in order to take a look at it and to construct their own policies accordingly. However, I apologize; I don't know specifically about the state of that policy in Vancouver.

4:35 p.m.

Liberal

Parm Bains Liberal Steveston—Richmond East, BC

Have you been apprised of that flag the previous witness raised, which is that AI is already being used?

4:35 p.m.

Director, Privacy, Technology and Surveillance Program, Canadian Civil Liberties Association

Brenda McPhail

Yes, I believe that came from the extensive research conducted in the Citizen Lab report on algorithmic policing across Canada.

There are a number of forces across Canada, including Vancouver's, that are currently engaged in using these kinds of tools. It's happening quietly, under the radar, generally without any public revelations at the point of procurement, at the point of policy development or at the point of implementation. We have a real crisis of accountability when it comes to police use of these technologies.

4:35 p.m.

Liberal

Parm Bains Liberal Steveston—Richmond East, BC

It's without a policy in place. Is that correct?

4:35 p.m.

Director, Privacy, Technology and Surveillance Program, Canadian Civil Liberties Association

Brenda McPhail

Either there is no policy in place or there's not a policy that's available for public view. I've done extensive access to information requests on similar topics, most specifically focused on facial recognition technology, and it's like pulling teeth to get access to this information in any sort of reasonable way.

4:35 p.m.

Liberal

Parm Bains Liberal Steveston—Richmond East, BC

In December of 2021, the CCLA supported the decisions of the B.C., Alberta and Quebec commissioners, which included binding orders to Clearview AI to cease collecting personal information in those provinces and to delete all personal information already collected without consent. Are you aware of any action that Clearview AI has taken on those orders?

4:35 p.m.

Director, Privacy, Technology and Surveillance Program, Canadian Civil Liberties Association

Brenda McPhail

Indeed, Clearview AI has filed legal applications, or lawsuits, against the commissioners in B.C., Alberta and Quebec, as well as federally, disputing those orders and challenging them on a series of grounds that range from the difficulty or impossibility of complying with those orders to challenging the constitutionality of Canada's privacy laws and arguing that the company has a free expression right to data scraped from the Internet.

This is going to be ongoing litigation, and it's very worth the committee's attention.