Evidence of meeting #20 for Access to Information, Privacy and Ethics in the 44th Parliament, 1st Session. (The original version is on Parliament’s site, as are the minutes.)

A recording is available from Parliament.

Also speaking

Superintendent Gordon Sage, Director General, Sensitive and Specialized Investigative Services, Royal Canadian Mounted Police
Colin Stairs, Chief Information Officer, Toronto Police Service
Roch Séguin, Director, Strategic Services Branch, Technical Operations, Royal Canadian Mounted Police
André Boileau, Officer in Charge, National Child Exploitation Crime Centre, Royal Canadian Mounted Police

11:30 a.m.

Conservative

Ryan Williams Conservative Bay of Quinte, ON

You'd run it through, find databases....

11:30 a.m.

Chief Information Officer, Toronto Police Service

Colin Stairs

We would take the fingerprint from the scene and run it against our fingerprint database, and if we got a match, we would follow up on that investigative lead.

11:30 a.m.

Conservative

Ryan Williams Conservative Bay of Quinte, ON

In terms of—

11:30 a.m.

Chief Information Officer, Toronto Police Service

Colin Stairs

It's very similar in that sense.

11:30 a.m.

Conservative

Ryan Williams Conservative Bay of Quinte, ON

I guess the difference with the one we're looking at—and I'm going to go back to some earlier testimony—is that with this FRT system, error rates of up to 35% have been identified in matching, for instance, Black females versus white females.

When it comes to that identification, you stated in past testimony that you have a human who reviews that data, but are we still seeing those errors? Your testimony—I'm just going to get you to confirm this—was that the technology you're using was the least biased. Is that correct?

11:30 a.m.

Chief Information Officer, Toronto Police Service

Colin Stairs

It was selected on the basis of minimizing that bias, but that bias still exists, both in the training data and also, more importantly, in the photography technology that we use sort of broadly.

11:30 a.m.

Conservative

Ryan Williams Conservative Bay of Quinte, ON

Okay. I guess the difference between a fingerprint, as you were saying, in a crime scene and this technology is that this one has proven to be inherently biased, or to have some bias, whereas a fingerprint would not have a bias, correct?

11:35 a.m.

Chief Information Officer, Toronto Police Service

Colin Stairs

All systems have some bias, but yes, this has a different type of bias. There—

11:35 a.m.

Conservative

Ryan Williams Conservative Bay of Quinte, ON

A fingerprint would not have a racial bias, correct?

11:35 a.m.

Chief Information Officer, Toronto Police Service

Colin Stairs

A fingerprint would not have a racial bias, as far as I know.

11:35 a.m.

Conservative

The Chair Conservative Pat Kelly

Mr. Williams, your time is up.

11:35 a.m.

Conservative

Ryan Williams Conservative Bay of Quinte, ON

Thank you, sir.

11:35 a.m.

Conservative

The Chair Conservative Pat Kelly

I will move on now to Ms. Hepfner for five minutes.

May 9th, 2022 / 11:35 a.m.

Liberal

Lisa Hepfner Liberal Hamilton Mountain, ON

Thank you very much.

Mr. Stairs, I'm going to go on in the same vein as my colleague Mr. Williams.

Just to clarify, in the system used by Toronto police and I think other police services across Canada, the source of the images you're using is your own database of mug shots.

11:35 a.m.

Chief Information Officer, Toronto Police Service

Colin Stairs

Yes. That's correct.

11:35 a.m.

Liberal

Lisa Hepfner Liberal Hamilton Mountain, ON

What about the body cameras that I know are used by some police services, including by Toronto police, I believe? How are those images used with the police service? Do those images ever get into your database?

11:35 a.m.

Chief Information Officer, Toronto Police Service

Colin Stairs

They don't go into the mug shot database. That's a separate digital evidence management system that holds all the video from body-worn cameras. Body-worn camera data would generally not be used; the circumstances wouldn't arise. There is no connection between the body-worn cameras and the Intellibook system—no automated connection.

The only way to do a facial recognition off a body-worn camera image would be to lift the still, export it and then bring it into the Intellibook system through the process I described. That would be highly unusual, because if you're interacting face to face with someone, you don't usually need to then determine their identity through that kind of means.

11:35 a.m.

Liberal

Lisa Hepfner Liberal Hamilton Mountain, ON

Would the police service at this point ever go into a crowd or a protest, for example, for images and try to identify people that way?

11:35 a.m.

Chief Information Officer, Toronto Police Service

Colin Stairs

That's explicitly written out of our body-worn camera policy and procedure.

11:35 a.m.

Liberal

Lisa Hepfner Liberal Hamilton Mountain, ON

Okay.

You were talking about how the force is currently trying to develop policies and procedures surrounding facial recognition technology. Can you talk to us about that process? Who's involved in that process? Is it just sworn officers, or do you have advisers from outside the police force, maybe people with ethics backgrounds, who can help develop these frameworks and these ethical questions that should be included?

11:35 a.m.

Chief Information Officer, Toronto Police Service

Colin Stairs

The process was initiated by our board in response to Clearview. The scope of it is slightly larger. It's looking at all AI and ML technologies, not just facial recognition. There are other technologies that have different but similar types of problems. We're looking at all of those.

We had a consultation with specific groups—law societies, privacy groups, ethics groups and technology specialists—and then an open consultation available to any member of the public. We went through a round of that on the policy. Now we're expecting to do a similar round on the procedure, which sits underneath the policy and directs the service members.

11:35 a.m.

Liberal

Lisa Hepfner Liberal Hamilton Mountain, ON

What sort of outcome are you looking for? Are you looking for an ethical framework whereby you have a certain number of questions you have to ask before using any new technology? Can you describe a little bit about the outcome that you're hoping to get out of the process?

11:35 a.m.

Chief Information Officer, Toronto Police Service

Colin Stairs

Sure. I think part of the problem that triggered this conversation is that we give frontline officers insufficient visibility and guidance on how they should approach new technologies. What we're looking to do is create a framework that allows us to filter and surface to our board and to the public the types of technologies that we intend to use and why we intend to use them, and then have a discussion in the full light of day on those technologies.

11:35 a.m.

Liberal

Lisa Hepfner Liberal Hamilton Mountain, ON

With the Intellibook program—I think you've already covered this, but just so that we're extra clear—an officer will pull up a list of potential suspects, and then it's really just a clue. If a photo comes up in the Intellibook system, it's not a piece of evidence that would be used in a court of law.

11:35 a.m.

Chief Information Officer, Toronto Police Service

Colin Stairs

It by itself is not considered an identification—