Thank you, Mr. Chair, and thank you to the committee.
I am joining you from the unceded territory of the Squamish, Tsleil-Waututh and Musqueam nations.
As you heard, I'm a law professor, and my research focuses on the domestic regulation of artificial intelligence and robotics, especially as this relates to public spaces and privacy. I'm representing my own views here today.
I'm very grateful to the committee for the invitation to contribute to this important study. I urge this committee to apply a substantive equality lens to your report and all recommendations made to the government.
Much research has already shown how inequitable various forms of facial surveillance can be, particularly with respect to the misidentification of individuals on the basis of race, gender and age and the quality and source of data used to train such systems. However, even perfectly accurate facial surveillance systems built on data reported to be legally sourced can reflect and deepen social inequality for a range of reasons. I'll focus on some key points and welcome further questions later, including questions related to apparently narrow beneficial use cases.
First, facial surveillance systems are socio-technical systems, meaning that these technologies cannot be understood just by looking at how a system is built. One must also look at how it will interact with the people who use it, the people affected by it and the social environments in which it is deployed.
Facial surveillance consolidates and perfects surveillance, and it is introduced into a society where, for example, the Supreme Court of Canada, among others, has already recognized that communities are over-policed on the basis of protected identity grounds. Equity-seeking groups face greater quantities of interpersonal, state and commercial surveillance, and can experience qualitatively greater harm from that surveillance. More perfect surveillance means greater privacy harm and inequity.
I urge the committee to explicitly consider social context in your report and recommendations. This includes that biometric surveillance is not new. I encourage you to place facial surveillance within its historical trajectory, which emerged from eugenic and white supremacist sciences.
Part of the socio-technical context in which facial surveillance is introduced includes gaps in the application and underlying theories of laws of general application. In other words, our laws do not adequately protect against misuses of this technology. In particular, from my own research, I would flag that interpersonal uses of facial surveillance will be under-regulated.
I'm very encouraged to see that the committee is considering interpersonal use within the scope of this study, and I urge the committee to examine the interrelations between interpersonal surveillance and commercial and state entities. For example, while not specific to facial surveillance, the emergence of Amazon Ring-police partnerships in the United States highlights the potential interweaving of personal surveillance, commercial surveillance infrastructure and state policing, which will at least present challenges to current tort and constitutional laws as interrelations like this emerge in Canada.
Personal use facial surveillance has already been shown to be highly damaging in various cases, particularly with respect to technology-facilitated harassment, doxing and other forms of violence. These uses remain under-regulated because interpersonal surveillance in public spaces and public information is under-regulated. While governance of interpersonal privacy may not fall exhaustively within federal jurisdiction, I do think this is a crucial part of understanding facial surveillance as a socio-technical system and must be considered within the governance of such a technology. I also do not think the solution is to criminalize the personal use of facial surveillance systems, but rather to bolster normative and legal recognition of interpersonal rights and to regulate the design and availability of facial surveillance technologies.
Laws and policies governing technology can have at least three foci: regulating the uses of the technology, regulating the user, and/or regulating the design and availability of the technology. Regulation of design and availability may fall more directly within federal government jurisdiction and better focuses on those responsible for the creation of the possibility of such harm rather than only reactively focusing on punishing wrongdoing and/or compensating for harm that has already occurred.
Also, in terms of regulating the use of facial surveillance, I urge the committee to look to examples around the world where governments have adopted a moratorium on the use of facial surveillance, as has been mentioned by other witnesses, and I recommend the same in Canada. More is of course needed in the long term, including expanding the governance focus to include all forms of automated biometric surveillance, not exclusively facial surveillance. The committee may also consider recommending the creation of a national independent expert group to consult on further refinement of laws of general application and of design, use and user restrictions going forward, perhaps for both federal and provincial guidelines.
Expertise must include those—