Information & Ethics committee: Sure. There is extreme risk, which is something that we would not do. It would be banned. There's high risk, and medium, low and very low. The reason we needed more strata was to account for AI/ML applications we're getting that are baked into existing and sort of very simple and…
May 9th, 2022, Committee meeting
Colin Stairs
Information & Ethics committee: The risk might be to human rights. It might be a risk to the procedural integrity of the investigation. It might be that the information would be incorrect or that results would be unpredictable.
May 9th, 2022, Committee meeting
Colin Stairs
Information & Ethics committee: Do you mean in terms of determining the risk level or in terms of actually using a system that had a higher risk level?
May 9th, 2022, Committee meeting
Colin Stairs
Information & Ethics committee: One of the determinants is that there has to be a human in the loop in order to.... That's a significant risk element: anything that doesn't have a human in the loop is considered high or extreme.
May 9th, 2022, Committee meeting
Colin Stairs
Information & Ethics committee: The board policy calls for all of our technology to be posted and to be evaluated under this frame. We are not going to be transparent about the very low risk and low risk, because we expect there are going to be a great number of them and the load on our service was going to be…
May 9th, 2022, Committee meeting
Colin Stairs
Information & Ethics committee: Mr. Chair, the street-check practice is discontinued, but that is a practice—
April 28th, 2022, Committee meeting
Colin Stairs
Information & Ethics committee: We do not.
April 28th, 2022, Committee meeting
Colin Stairs
Information & Ethics committee: I agree that what we're looking at is mostly an after-the-fact investigative tool, and we are not looking at surveillance or upstream-of-event types of facial recognition, which would be very intrusive. And in that state, I don't think we're having a significant impact as it stands…
April 28th, 2022, Committee meeting
Colin Stairs
Information & Ethics committee: I'm just going to respond to the first question. We can supply the Forensic Identification Services policy on facial recognition and what qualifies for that. There's a fairly stringent set of criteria, and we can supply those separately. In terms of rights, we're operating under…
April 28th, 2022, Committee meeting
Colin Stairs
Information & Ethics committee: It would be through the investigative processes.
April 28th, 2022, Committee meeting
Colin Stairs
Information & Ethics committee: Yes, definitely.
April 28th, 2022, Committee meeting
Colin Stairs
Information & Ethics committee: There's never no human intervention. There will always be human intervention.
April 28th, 2022, Committee meeting
Colin Stairs
Information & Ethics committee: I am aware, yes.
April 28th, 2022, Committee meeting
Colin Stairs
Information & Ethics committee: I don't believe so. This is how we've approached this with our AI/ML policies. There's a balance of goods around this. There's a social good around public security and safety against privacy and human rights challenges with the technology. The question is, when do we deploy this…
April 28th, 2022, Committee meeting
Colin Stairs
Information & Ethics committee: We are. We're using facial recognition to compare probe photos that would have been uncovered in an investigation against our Intellibook, which is our mug shot database. There is a known set of issues around faces in different training sets. We selected the facial recognition technology…
April 28th, 2022, Committee meeting
Colin Stairs