Evidence of meeting #20 for Access to Information, Privacy and Ethics in the 44th Parliament, 1st Session. (The original version is on Parliament’s site, as are the minutes.)

A recording is available from Parliament.

Also speaking

Chief Superintendent Gordon Sage  Director General, Sensitive and Specialized Investigative Services, Royal Canadian Mounted Police
Colin Stairs  Chief Information Officer, Toronto Police Service
Roch Séguin  Director, Strategic Services Branch, Technical Operations, Royal Canadian Mounted Police
André Boileau  Officer in Charge, National Child Exploitation Crime Centre, Royal Canadian Mounted Police

11:05 a.m.

Conservative

The Chair Conservative Pat Kelly

I call this meeting to order.

Welcome to meeting number 20 of the House of Commons Standing Committee on Access to Information, Privacy and Ethics.

Pursuant to Standing Order 108(3)(h) and the motion adopted by the committee on Monday, December 13, 2021, the committee is resuming its study of the use and impact of facial recognition technology.

Today's meeting is taking place in a hybrid format, pursuant to the House order of November 25, 2021. Members are attending in person in the room and remotely by using the Zoom application.

I think everyone here is probably fairly familiar with how this works, so I won't go into more detail. If you're on Zoom, please be sure to unmute yourself when you begin to speak, and select the official language you wish to receive, or simply the floor feed, if that is what you prefer.

This is a resumption of the testimony we were receiving from the RCMP and the Toronto police that was cut very short due to votes both before and after our committee meeting began a week ago last Thursday.

With that, I'm going to dispense with opening remarks and go straight to our questioning. We are also monitoring what is going on in the House. There is a notice of time allocation. If it is moved and we end up having a vote this morning, then we will deal with that when it happens. I think we'll have quite a bit of time for questions to resume with these witnesses.

With that, Mr. Kurek will be going first.

Mr. Kurek, you have six minutes.

11:05 a.m.

Conservative

Damien Kurek Conservative Battle River—Crowfoot, AB

Thank you very much, Mr. Chair.

Thank you to the witnesses for agreeing to appear again before this committee.

To the RCMP, is the RCMP, via contractors or itself or any peripheral organization that's involved with the RCMP and its law enforcement duties, currently using FRT?

11:05 a.m.

Chief Superintendent Gordon Sage Director General, Sensitive and Specialized Investigative Services, Royal Canadian Mounted Police

Good morning, Mr. Chair.

No. There is no FRT being used by the RCMP at this time that I'm aware of.

11:05 a.m.

Conservative

Damien Kurek Conservative Battle River—Crowfoot, AB

Thank you very much.

Again to the RCMP, do you have any numbers on how many individuals were tried and/or convicted because of the RCMP's use of Clearview AI specifically or any other facial recognition technology?

11:05 a.m.

C/Supt Gordon Sage

Yes, I can comment on that.

Facial recognition technology has been used on only three occasions. On two occasions it was with the child exploitation centre that I'm in charge of, where we were able to identify victims of this horrible crime and put safeguard measures in place to protect the victims, who were located in Canada. On a third occasion it was used to track an offender, a fugitive, who was abroad.

There have been no prosecutions using this technology. It's simply been used for identification on two different files with our child exploitation centre. One was when a person from outside the country was trying to exploit two children in Canada to perform sexual acts. We were able to identify the victims and provide safeguards to protect the victims from the person who was trying to offend.

Another situation in which it was used was on an international case. There was a file from 2011 on a victim who could not be identified through traditional means. The entire international community had been trying to find this victim for some nine to 10 years and was unsuccessful. We were able to use facial recognition technology, within our scope, to identify this victim, who was situated in the States. We reached out to the Americans, and they were able to confirm that this person had in fact been charged and convicted in the States on their charges.

I guess the importance of the facial recognition is that the international community had continued to look for this victim for some nine to 10 years and was unable to do so. We were able to use facial recognition to identify this victim. In fact, a court process had been completed in the United States of America, and he was convicted on that American charge. It had nothing to do with what we did in Canada.

11:05 a.m.

Conservative

Damien Kurek Conservative Battle River—Crowfoot, AB

Do you mind going through the process that the RCMP has used in the past to allow for the use of FRT during the course of an investigation?

11:05 a.m.

C/Supt Gordon Sage

When it was initially rolled out, our members started to utilize it on those three cases only.

A lot of members were testing the technology to see if it worked. They were using a lot of searches on their own pictures, on their own profiles, to see if this technology worked. They also used media searches. They took photographs of celebrities and ran them through Clearview to see if it worked.

In fact, by testing this technology, we realized that it wasn't always effective. There were certainly some identification problems, and that's why we use it only as a tool in the tool box and do not rely on it, because you do need that human intervention to identify who the victim is. It is not always correct. It was absolutely critical that we did have that human intervention when we utilized it.

Many of the queries were testing the program. The only three cases were the three cases that I just spoke of.

11:05 a.m.

Conservative

Damien Kurek Conservative Battle River—Crowfoot, AB

Thank you very much.

For the Toronto Police Service, during testimony on April 28, you acknowledged that the Toronto Police Service uses FRT in limited circumstances. Is the use of facial recognition technology in an investigation disclosed to either the court or the individual over the course of an investigation after an arrest?

11:10 a.m.

Colin Stairs Chief Information Officer, Toronto Police Service

I believe it is. I'm not an expert on the procedural aspects, but I believe it is shared.

11:10 a.m.

Conservative

Damien Kurek Conservative Battle River—Crowfoot, AB

Okay. Thank you very much.

I know I'm starting to run out of time, but generally, how does the Toronto Police Service assess new technologies to determine whether or not they would be an effective tool for use by the service?

11:10 a.m.

Chief Information Officer, Toronto Police Service

Colin Stairs

Part of the Clearview AI issue was that we didn't have a proper assessment process, so we're in the process of putting one in place. We've had consultations on the board policy that looks at AI/ML, and we're now drafting the procedure that will sit underneath it.

Essentially, it starts with a determination of what the benefit of the technology might be, which would drive us to even look at it. Then there's a set of flags for the various risk factors we determined through the consultation we ran on the public policy. Those flags would route the technology into a separate process, ultimately going through public consultation on that specific technology and a risk assessment to determine whether it should go forward.

11:10 a.m.

Conservative

The Chair Conservative Pat Kelly

Thank you, Mr. Stairs.

We ended up going a fair bit over the time with Mr. Kurek's round.

Go ahead now, Ms. Saks.

May 9th, 2022 / 11:10 a.m.

Liberal

Ya'ara Saks Liberal York Centre, ON

Thank you, Mr. Chair.

Thanks to my colleague, Mr. Kurek.

I'm actually going to continue with that line of questioning about risk and levels through you, Mr. Chair, to Mr. Stairs and the Toronto Police Service.

What are the levels of risk when making that evaluation? Can you outline them?

11:10 a.m.

Chief Information Officer, Toronto Police Service

Colin Stairs

Sure. There is extreme risk, which is something we would not do; it would be banned. Then there's high risk, and medium, low and very low. The reason we needed more strata was to account for AI/ML features that are baked into existing, very simple and non-controversial types of applications.

11:10 a.m.

Liberal

Ya'ara Saks Liberal York Centre, ON

Thank you.

Just to follow on that, in a real-time scenario, what would qualify as “risk” in justifying the use of that technology?

11:10 a.m.

Chief Information Officer, Toronto Police Service

Colin Stairs

Risk might be a risk to human rights. It might be a risk to the procedural integrity of the investigation. It might be that the information would be incorrect or that the results would be unpredictable.

11:10 a.m.

Liberal

Ya'ara Saks Liberal York Centre, ON

Okay. I have two questions just to follow up.

Would there be human intervention in that kind of level of risk assessment with the use of the technology? Also, are there transparency measures in place?

11:10 a.m.

Chief Information Officer, Toronto Police Service

Colin Stairs

Do you mean in terms of determining the risk level or in terms of actually using a system that had a higher risk level?

11:10 a.m.

Liberal

Ya'ara Saks Liberal York Centre, ON

I mean in using a system with a higher risk level.

11:10 a.m.

Chief Information Officer, Toronto Police Service

Colin Stairs

One of the determinants is that there has to be a human in the loop in order to.... That's a significant risk element: anything that doesn't have a human in the loop is considered high or extreme risk.

11:10 a.m.

Liberal

Ya'ara Saks Liberal York Centre, ON

Okay. Thank you, and in terms of transparency...?

11:10 a.m.

Chief Information Officer, Toronto Police Service

Colin Stairs

The board policy calls for all of our technology to be posted and evaluated under this frame. We are not going to be transparent about the very low and low risk categories, because we expect there will be a great number of them and the load on our service would be very high.

11:10 a.m.

Liberal

Ya'ara Saks Liberal York Centre, ON

Thank you very much.

I'm going to switch over to the RCMP now.

On the OPC report in relation to Clearview AI, it was outlined in previous testimony that there were things the RCMP did agree with and things that they did not agree with.

What came out of that was a national tech onboarding strategy in March 2021. Where are we with that, and what is it?

11:10 a.m.

C/Supt Gordon Sage

Roch Séguin would be the best one to answer that question.

11:10 a.m.

Roch Séguin Director, Strategic Services Branch, Technical Operations, Royal Canadian Mounted Police

Good morning, Mr. Chair and honourable members of the committee. Thank you for this opportunity to speak to you today.

We've made significant progress in the implementation of the national technology onboarding program, which is central to meeting all of the recommendations from the OPC. Every technology will be assessed not only from a privacy aspect but also from a bias, ethics and legal perspective before being used in any operation or investigation going forward.

As per the recommendation, we have until June 2022 to implement the program, so we still have a bit of time. We're working very hard right now to complete it. There's a slight risk that not all of the training will be delivered within that time frame, and we may have a capacity issue, because we're having challenges recruiting additional resources for the program. However, the key foundation pieces for the program will be in place by June 2022.