Evidence of meeting #18 for Access to Information, Privacy and Ethics in the 44th Parliament, 1st Session. (The original version is on Parliament's site, as are the minutes.)


Witnesses

Daniel Therrien  Privacy Commissioner of Canada, Office of the Privacy Commissioner of Canada
Patricia Kosseim  Commissioner, Office of the Information and Privacy Commissioner of Ontario
Diane Poitras  President, Commission d'accès à l'information du Québec
Vance Lockton  Senior Technology and Policy Advisor, Office of the Information and Privacy Commissioner of Ontario

11:05 a.m.

Conservative

The Chair Conservative Pat Kelly

I call this meeting to order.

Welcome to meeting number 18 of the House of Commons Standing Committee on Access to Information, Privacy and Ethics. Pursuant to Standing Order 108(3)(h) and the motion adopted by the committee on Monday, December 13, 2021, the committee is resuming its study of the use and impact of facial recognition technology.

Today's meeting is taking place in a hybrid format pursuant to the House order of November 25, 2021. The members are attending both in person in the room and remotely by using the Zoom application. Per the directive of the Board of Internal Economy on March 10, 2022, those attending the meeting in person must wear a mask, except for members who are at their place during proceedings.

I would like to make a few comments for the benefit of witnesses and members. First, wait until I recognize you by name before speaking. For those participating by video conference, click on the microphone icon to activate your mike, and please mute yourself when you are not speaking.

For those on Zoom, interpretation is available at the bottom of your screen: you have the choice of the floor audio, English or French. For those in the room, use your earpiece and select the desired channel as you normally would.

Now I would like to welcome our witnesses. From the Office of the Privacy Commissioner of Canada, we have Daniel Therrien, Privacy Commissioner of Canada, and David Weinkauf, senior information technology research analyst.

From the Office of the Information and Privacy Commissioner of Ontario, we have Patricia Kosseim, commissioner, and Vance Lockton, senior technology and policy adviser.

From the Commission d’accès à l’information du Québec, we have Diane Poitras, president.

Now we'll go to our first witness. Each witness may deliver an opening statement of up to five minutes.

Go ahead, Commissioner Therrien. You have the floor.

11:05 a.m.

Daniel Therrien Privacy Commissioner of Canada, Office of the Privacy Commissioner of Canada

Good morning, Mr. Chair.

Thank you for inviting me here today and for undertaking this important work on facial recognition.

Like all technologies, FRT can, if used responsibly, offer significant benefits to society. However, it can also be extremely intrusive, enable widespread surveillance, provide biased results and erode human rights, including the right to participate freely, without surveillance, in democratic life. It is different from other technologies in that it relies on biometrics, permanent characteristics that, unlike a password, cannot be changed. It greatly reduces personal autonomy, including the control individuals should have over their personal information. Its use encompasses the public and private sectors, sometimes for compelling purposes like the investigation of serious crimes or proving one's identity, sometimes for convenience.

The scope of your study is vast. In the time I have available, I will focus on the use of FRT in a law enforcement context. When we last spoke, my office had completed its investigation into Clearview AI, a private sector platform that we and our colleagues in Quebec, B.C. and Alberta found was involved in mass surveillance.

Since then, my office has examined the RCMP's use of Clearview AI's technology. We found that the RCMP did not take measures to verify the legality of Clearview AI's collection of personal information, and lacked any system to ensure that new technologies were deployed lawfully. Ultimately, we determined the RCMP's use of Clearview AI to be unlawful, since it relied on the illegal collection and use of facial images by its business partner.

Building on these findings, we worked with fellow privacy commissioners across Canada to develop joint guidance for police use of facial recognition. This guidance is meant to assist police in ensuring that any use of the technology complies with the law, minimizes risks and respects privacy rights. We are releasing the final version of the guidance today.

As part of this work, we launched a national public consultation on police use of facial recognition technology. During this consultation, we heard consistently that the current laws regulating the use of facial recognition did not offer sufficient protection against the risks associated with the technology. While all stakeholders we consulted agreed that the law must be clarified, there was no consensus on the content of a new law. Legislators will therefore have to decide how to reconcile various interests.

Following this consultation, fellow provincial and territorial privacy commissioners and I believe that the preferred approach should be to adopt a legislative framework based on four key elements, which we have outlined in a joint statement we're issuing today.

First, we recommend that the law clearly and explicitly define the purposes for which police would be authorized to use facial recognition technology and that it prohibit other uses. Authorized purposes should be compelling and proportionate to the very high risks of the technology.

Second, since it is not realistic for the law to anticipate all circumstances, it is important, in addition to limitations on authorized purposes, that the law also require police use of facial recognition to be both necessary and proportionate for any given deployment of the technology.

Third, we recommend that police use of facial recognition should be subject to strong, independent oversight. Oversight should include proactive engagement measures such as privacy impact assessments, or PIAs; program level authorization or advance notification before use; and powers to audit and make orders.

Finally, we recommend that appropriate privacy protections be put in place to mitigate risks to individuals, including measures addressing accuracy, retention and transparency in facial recognition initiatives.

I encourage you to consider our recommendations as you complete your study of this important issue.

Thank you for the opportunity to appear before you today; I look forward to your questions.

I will be pleased to answer your questions following my colleagues' statements.

11:10 a.m.

Conservative

The Chair Conservative Pat Kelly

Thank you.

Now, for up to five minutes, we have Commissioner Kosseim.

May 2nd, 2022 / 11:10 a.m.

Patricia Kosseim Commissioner, Office of the Information and Privacy Commissioner of Ontario

Good morning.

Thank you for inviting me to speak today.

Joining me is Vance Lockton, senior policy and technology analyst with my office.

I would like to build on the remarks you've just heard from Commissioner Therrien. While all of Canada's privacy commissioners recommend the adoption of a comprehensive statutory framework to address the use of facial recognition technology in the criminal law context, we also recognize that some police agencies are already using, or considering using, facial recognition technologies. As such, we have issued guidelines to help law enforcement agencies mitigate potential harms until a new statutory framework is put in place, as my colleague Mr. Therrien described.

I would like to emphasize five key elements of the guidelines.

First, before using facial recognition for any purpose, police agencies must establish that they are lawfully authorized to do so. This is not a given and cannot be assumed. Facial recognition relies on the use of sensitive biometric information. Police should seek legal advice to confirm they have lawful authority either at common law or under statutes specific to their jurisdiction. They must also ensure they are Charter-compliant and that their proposed use is necessary and proportionate in the circumstances of a given case.

Second, police agencies must establish strong accountability measures. This includes designing for privacy at every stage of a facial recognition initiative and conducting a privacy impact assessment, or PIA, to assess and mitigate risks in advance of implementation.

It also involves putting in place a robust privacy management program, with clearly documented policies and procedures for limiting the purposes of facial recognition, robust systems for logging all related uses and disclosures, and clearly designated roles and responsibilities for monitoring and overseeing compliance.

Such a program must be annually reviewed for its continued effectiveness. It must be supported by appropriate training and education, and ensure that any third party service providers also comply with all related privacy obligations.

Third, police agencies must ensure the quality and accuracy of personal information used as part of a facial recognition system to avoid false positives, reduce potential bias and prevent harms to individuals and groups. Ensuring accuracy involves conducting internal and external testing of the FR system for any potentially discriminatory impacts, as well as building in human review to mitigate risks associated with automated decisions that may have a significant impact on individuals.

Fourth, police agencies should not retain personal information for longer than necessary. This means destroying probe images that don't register a match and removing face prints from the database as soon as they no longer meet the proper criteria for inclusion.

Fifth, police agencies must address transparency and public engagement. Direct notice about the use of facial recognition may not always be possible in the context of a specific police investigation. However, transparency at the program level is certainly possible, and could include publishing the agency's formal policies on the use of facial recognition, a plain-language explanation of their program and a summary of their PIA.

Any communication with the public should be two-way. Key stakeholders, particularly representatives of over-policed groups, should be consulted in the very design of the facial recognition program. Given the special importance of reconciliation in Canada, this must include input from local Indigenous groups and communities.

These are a few of the measures set out in the guidance.

To reiterate, although we believe these guidelines represent important risk mitigation measures, ultimately we recommend the establishment of a comprehensive statutory regime governing the use of facial recognition by police in Canada. Clear guardrails with force of law are necessary to ensure that police agencies can confidently make appropriate use of this technology, grounded in a transparent framework, accountable to the people they serve and capable of earning the public's enduring trust.

Thank you.

11:10 a.m.

Conservative

The Chair Conservative Pat Kelly

Thank you.

The president of the Commission d'accès à l'information du Québec, Ms. Diane Poitras, now has the floor for five minutes.

11:15 a.m.

Diane Poitras President, Commission d'accès à l'information du Québec

Thank you, Mr. Chair.

Good morning and thank you for this invitation to discuss facial recognition.

Building on my colleagues' remarks, I would briefly like to address the problems raised by other uses of this technology and to outline what is provided under Quebec legislation. As several speakers have mentioned, the increasingly widespread use of facial recognition in various contexts raises significant problems, particularly with respect to privacy.

This technology, which combines biometrics with artificial intelligence, among other things, is particularly invasive, partly because it scans unique body characteristics and transforms them into data. Those characteristics, such as certain facial traits, are central to our identity. The fact that this technology can be used without our knowledge means we have less control over our information and are at greater risk of undue surveillance. Some proposed uses of facial recognition and derivative technologies infer personal characteristics from our face or facial expressions, such as age, sex, ethnic origin, emotions, degree of attention, fatigue or stress, health information and certain personality traits. These characteristics may be used to categorize, detect or profile individuals for commercial purposes, to conduct some form of surveillance or to make decisions concerning them.

The creation of biometric databases also raises significant privacy risks. It is difficult for a person whose biometric data have been compromised to challenge an inadvertent action or transaction or identity fraud given the high degree of reliability that unique and permanent information is assumed to have. Since it is virtually impossible to replace compromised biometrics, it can be just as complicated to re‑establish one's identity.

There is also considerable risk that biometric databases created for one specific purpose may be used for other purposes without our knowledge or an adequate assessment of the problems and risks associated with those other purposes. This is why the creation of these databases and the use of biometrics for identification purposes are governed in Quebec by the Act to establish a legal framework for information technology, as well as privacy statutes applicable to public and private organizations. The creation of every biometric database must thus be reported to the commission. Starting next September, reporting will also be required for every instance in which biometrics are used for identification purposes.

In Quebec, biometrics may not be used for identification purposes without the express consent of the person concerned. No biometric characteristic may be recorded without that person's knowledge. Only a minimum number of biometric characteristics may be recorded and used. Any other information that may be discovered based on those characteristics may not be used or preserved. Lastly, biometric information and any note concerning that information must be destroyed when the purpose of the verification or confirmation of identity has been achieved. The commission has broad authority and may make any order respecting biometric databases, including the authority to suspend or prohibit their coming into service or to order their destruction. General privacy protection rules also apply in addition to these specific provisions. That means, for example, that the use of facial recognition must be necessary and proportionate to the objective pursued.

We have observed that organizations unfortunately do not attach all the importance they should to this compliance evaluation or the problems associated with the use of facial recognition. The popularity of biometrics has led to a kind of trivialization of its impact on citizens, which is why the commission recommends that a preliminary analysis be conducted of privacy-related factors. That evaluation will in fact be mandatory as of September 2023. Biometric information will also be expressly designated as sensitive personal information. Although the current regulation of biometrics in Quebec has given the commission an idea of the extent of facial recognition use and grants it enforcement powers, we have requested that regulation be enhanced to reflect developments in the technology and the various contexts in which it is used.

Thank you for your attention. I will be pleased to discuss these matters with you over the next few minutes.

11:20 a.m.

Conservative

The Chair Conservative Pat Kelly

With that, we'll go straight to questions.

Mr. Kurek, you have up to six minutes.

11:20 a.m.

Conservative

Damien Kurek Conservative Battle River—Crowfoot, AB

Thank you very much.

I appreciate the presence of all of the commissioners today, and their expertise.

To all the witnesses, I'm hoping we would be able to get a copy of that joint statement to enter it into testimony. Could you just confirm that it can be done? Thank you very much.

Commissioner Therrien, over the last number of meetings in this study, we've learned and heard a lot about some of the challenges associated with facial recognition technology. You've referenced the consultations that were done. Could you outline for the committee what that consultation looked like in terms of facial recognition technology and its use, some of the stakeholders who were involved in that consultation and some of the trends that you might have noticed during that process?

11:20 a.m.

Privacy Commissioner of Canada, Office of the Privacy Commissioner of Canada

Daniel Therrien

Sure.

When we issued our investigative report on the RCMP's use of Clearview last June, in a special report to Parliament, we also launched a consultation with stakeholders who were interested in commenting on the draft guidance we had published at the same time. That led to about 30 groups or individuals writing to us, and we also had meetings with a number of stakeholders.

The stakeholders represented civil society, minority groups and the police themselves. I met a number of times with the RCMP and with the Canadian Association of Chiefs of Police, and my colleagues also met with their provincial equivalents. A broad range of people were consulted. Views varied, obviously, because the interests were different, but all agreed that the law is insufficient as it stands. Depending on their interests, stakeholders did not necessarily agree on the content of that law.

11:20 a.m.

Conservative

Damien Kurek Conservative Battle River—Crowfoot, AB

Sure, and in your opening statement you said that there was no clear consensus among stakeholders, and certainly that's the sentiment I've found as we've heard from different witnesses. We did hear the RCMP very clearly say that they disagreed with your office's findings regarding their use of Clearview AI.

I'm curious if you could share with the committee some of your observations about the trends that you found when consulting with the wide variety of groups that you've engaged with in this process.

11:20 a.m.

Privacy Commissioner of Canada, Office of the Privacy Commissioner of Canada

Daniel Therrien

I would start first with where there was agreement beyond the need for the law to be changed.

Many people felt that the guidance was drafted at a level of generality such that the advice is helpful, but they would like it at the very least to be supplemented by advice on what were called “use cases”. Our reaction is that there is indeed a need for advice on particular uses in different contexts, because context matters a great deal, but we still think it's important and relevant to have general guidance that can be augmented as use cases are developed.

Some stakeholders from civil society or minority groups called for a moratorium on the use of facial recognition. The RCMP obviously did not agree with that. Our position as commissioners is that there should be clear laws prescribing when facial recognition can be used, because it can be used for legitimate, helpful purposes and social good in some circumstances—for instance, in serious crime situations or to find missing children—but these uses should be defined quite narrowly. The law should also prescribe prohibited uses, which would be, I guess, a partial ban or a partial moratorium on the use of facial recognition.

If I may, on the question of a moratorium, we as data protection authorities cannot impose a moratorium that has the force of law. For a moratorium to be binding on police agencies, it would have to take the form of legislation.

I was struck by the testimony that you heard last week from an RCMP representative, to the effect that “The RCMP believes that the use of facial recognition must be targeted, time-limited and subject to verification by trained experts.”

11:25 a.m.

Conservative

Damien Kurek Conservative Battle River—Crowfoot, AB

I'll ask one question now because of time limitations. Would you be able to provide the committee with a list of best practices from other jurisdictions around the world that already have some of these frameworks, for the committee to be able to reference and point to?

11:25 a.m.

Privacy Commissioner of Canada, Office of the Privacy Commissioner of Canada

11:25 a.m.

Conservative

Damien Kurek Conservative Battle River—Crowfoot, AB

I apologize; I think I'm basically out of time.

Thank you very much to all of the witnesses for coming today and for your expertise. Thank you.

11:25 a.m.

Conservative

The Chair Conservative Pat Kelly

Mr. Fergus, go ahead for six minutes.

11:25 a.m.

Liberal

Greg Fergus Liberal Hull—Aylmer, QC

Thank you very much, Mr. Chair.

Thanks as well to Mr. Therrien, Ms. Kosseim and Ms. Poitras for their testimony today.

I will go first to Mr. Therrien, then to the other two witnesses.

Mr. Therrien, I know you've submitted a report on the use of facial recognition by the RCMP, and I thank you for that. I found it very interesting and useful. However, I'd like to take a step back so I can apply that to everyone, both governments and the private sector, as Quebec's legislation attempts to do.

Do you think the advice you gave the RCMP on the use of facial recognition would generally apply to the private sector?

11:25 a.m.

Privacy Commissioner of Canada, Office of the Privacy Commissioner of Canada

Daniel Therrien

I think the common factor that applies horizontally to all stakeholders who would like to use facial recognition is the principle of necessity and proportionality that my two colleagues mentioned. That applies to all stakeholders: police services, businesses and other departments and governments.

In police services, however, the use of facial recognition can have extremely serious consequences, even resulting in the loss of freedom. I would say that many common principles should be considered. All stakeholders, including legislators, have to consider the context and consequences of the use of this technology. For example, a total prohibition of its use by police services in certain circumstances might not necessarily apply to all stakeholders.

11:25 a.m.

Liberal

Greg Fergus Liberal Hull—Aylmer, QC

I agree with you that the use of facial recognition by police services may raise serious issues.

We heard from witnesses from Princeton University in the United States who said that, while governments play a leading role in the use of this technology, private businesses also have a role. For example, there can be serious consequences if you use it to determine what kind of credit risk a citizen presents. Its use is based on a theory that's built on evidence that's insufficient to justify that use.

Ms. Kosseim, thank you very much for citing the five key elements in the guidelines. Do you think they may also apply to the private sector?

11:25 a.m.

Commissioner, Office of the Information and Privacy Commissioner of Ontario

Patricia Kosseim

Thank you for your question.

As my colleague said, the principles should definitely apply, regardless of the sector concerned, obviously taking into account the context and the range of risks at play. I would note that Ontario doesn't have a privacy act that applies to the private sector. However, my office very much supports the idea, which the government has proposed, of one day passing one.

In privacy matters, most businesses are subject to federal legislation. However, that leaves a vacuum in many areas in Ontario. In many sectors, there is no legislation protecting the privacy of employees in the vast majority of businesses. So that's a major deficiency. I think it's important that the basic principles we advance in our guidelines apply and that we proceed with the necessary adjustments for other contexts. Our guidelines are specifically designed for the law enforcement sector and police services.

11:30 a.m.

Liberal

Greg Fergus Liberal Hull—Aylmer, QC

Thank you.

Ms. Poitras, I applaud your bill, which would require businesses to comply with the directives provided under the act by 2023.

I know I'm putting you in an uncomfortable position by asking you this question, but can we do more in Quebec or in the federal government to protect citizens from the issues associated with facial recognition technology?

Should the federal government pass legislation similar to what you have in Quebec?

11:30 a.m.

President, Commission d'accès à l'information du Québec

Diane Poitras

Thank you for your question.

The Quebec act is definitely a start, but we've previously submitted recommendations for improving it to Quebec parliamentarians. For example, the framework currently establishes obligations only where biometrics and facial recognition are used to verify identity. However, based on the reports we receive from biometric databases, the technology is also being used for other purposes. I mentioned that in my presentation. Consequently, the first recommendation would be to ensure—

11:30 a.m.

Conservative

The Chair Conservative Pat Kelly

I'm sorry.

11:30 a.m.

Liberal

Greg Fergus Liberal Hull—Aylmer, QC

Mr. Chair, would you please ask the witnesses to forward in writing any further information they may have for the committee?

11:30 a.m.

Conservative

The Chair Conservative Pat Kelly

All right.

Mr. Fergus, you did not leave very much time for this witness to answer that question. I'm sorry, but we will have to move on.

Mr. Villemure, you have the floor for six minutes.

11:30 a.m.

Bloc

René Villemure Bloc Trois-Rivières, QC

Thank you, Mr. Chair.

Thanks to all the commissioners for being here today.

I congratulate them for publishing the guidelines, a document that we've been waiting for.

Mr. Therrien, in a few words, how would you define what surveillance is?