Thank you, Mr. Chair and members of the committee, for the opportunity to offer our thoughts on this study.
My name is Rizwan Mohammad, and I'm an advocacy officer with the National Council of Canadian Muslims, the NCCM. I'm joined today by NCCM CEO Mustafa Farooq. I'd also like to thank NCCM intern Hisham Fazail for his work on our submission.
Today we want to look at the heart of the problem with facial recognition technology, or FRT. A number of national security and policing agencies, as well as other government agencies, have come before you to tell you how FRT is an important tool that has great potential use across government. You've been told that FRT can help escape problems of human cognition and bias.
Here are some other names that you all know, names affiliated with times when these same agencies told you that surveillance would be done in ways that were constitutionally sound and proportionate. They are Maher Arar, Abdullah Almalki and Mohamedou Ould Slahi.
The same agencies that lied to the Canadian people about surveilling Muslim communities are coming before you now to argue that while mass surveillance will not be happening, FRT can and should be used responsibly. Some of those agencies, like the RCMP, have already been found by the Privacy Commissioner to have broken the law when it comes to FRT.
We are thus making the following two recommendations, and we want to be clear that our submissions are limited to exploring FRT in the non-consumer context.
First, we recommend that the government put forth clear and unequivocal privacy legislation that severely curtails how FRT can be utilized in the non-consumer context, allowing only for judicially approved exceptions in the context of surveillance.
Second, we recommend that the government set out clear penalties for agencies caught violating rules around privacy and FRT.
Let us begin with the first recommendation, calling for a blanket ban on the use of FRT without judicial authorization across the government, including any and all national security agencies, including but not limited to the RCMP, CSIS, and the CBSA. You know the reasons for this already. A 2018 report in the U.K. found new figures showing that facial recognition software used by the U.K. Metropolitan Police returned incorrect matches in 98% of cases. Another study from 2019, which drew on a different methodology, reported that the Metropolitan Police returned incorrect matches, or a false positive rate, in 38% of cases.
We are well aware that FRT works differently, and with different accuracy results, depending on the technology, but we all acknowledge as a matter of fact that there are algorithmic biases when it comes to FRT. Given what we know, given the privacy risks that FRT poses, and given that Canadians, including members on other committees in this House, have raised concerns around systemic racism in policing, we agree with other witnesses who have appeared before this committee in calling for an immediate moratorium on all uses of FRT in the national security context and for the RCMP until legislative guidelines are developed.
Simultaneously, we recommend that in developing legislative guidelines, a very high threshold be utilized, including judicial authorization, oversight and timeline limitations.
Secondly, we are shocked by the blasé attitude that the RCMP has taken in approaching the issue of its use of Clearview AI. First, the RCMP denied using Clearview AI, but it then confirmed it had been using the software after news broke that the company's client list had been hacked. The excuse given was that the use of FRT wasn't widely known within the RCMP. The false answer the RCMP gave to the Privacy Commissioner, which was about as credible as the “dog ate my homework” excuse, was completely unacceptable.
The RCMP then had the audacity, after the Privacy Commissioner's findings in the report, to state that it did not necessarily agree with the findings. While the RCMP has taken certain steps to ameliorate the concerns raised, a failure of accountability, when it comes to clear errors and misleading statements, must require clear penalties. Otherwise, how can we trust any such process or commitment to avoid mass surveillance?
We encourage this committee to recommend that strong penalties be assessed against agencies and officers who breach the rules created around FRT, potentially through an amendment to the RCMP Act. We will provide the committee with a broader written brief in due course.
Subject to any questions, these are our submissions.
Thank you.