Evidence of meeting #34 for Access to Information, Privacy and Ethics in the 43rd Parliament, 2nd Session. (The original version is on Parliament's site, as are the minutes.)

A video is available from Parliament.

Also speaking

Daniel Therrien  Privacy Commissioner of Canada, Office of the Privacy Commissioner of Canada

11:35 a.m.

Conservative

Jacques Gourde Conservative Lévis—Lotbinière, QC

Thank you, Mr. Chair.

Mr. Commissioner, thank you for being with us again today. It's always a pleasure to have you here.

You've told us that, under your mandate, you must respond to all complaints, but that you don't have the discretion to refuse some of them.

What kinds of complaints wouldn't warrant a review by your office?

11:35 a.m.

Privacy Commissioner of Canada, Office of the Privacy Commissioner of Canada

Daniel Therrien

Before answering the question directly, I would like to remind committee members that, in other jurisdictions, this discretion is granted to agencies that are equivalent in nature to my office. The purpose is to ensure that these offices aren't flooded with complaints, which would prevent them from playing a proactive role. The goal is not to reject complaints out of hand, but to be able to do all of our work.

Getting back to your question, it comes down to risk assessment. It's not necessarily that complaints have no merit, but there are different levels of risk among the complaints that are sent to us. For example, some complaints involve only one person. For that person, the complaint is very important, and I fully agree that we must give everyone access to a justice system. However, if we have to choose between investigating a complaint whose outcome will affect only one person and dealing with another complaint whose resolution may establish a principle that will affect a large part of the population, then, unfortunately, we have to go with the second one.

11:40 a.m.

Conservative

Jacques Gourde Conservative Lévis—Lotbinière, QC

That is absolutely laudable.

The growing use of technology was also discussed. I'm trying to get a handle on that.

We know that technology can expedite some cases, but it's still going to take a human being to give advice and write some reports.

Can you shed some light on how technology can expedite these complaints?

11:40 a.m.

Privacy Commissioner of Canada, Office of the Privacy Commissioner of Canada

Daniel Therrien

You're right. Ultimately, it takes a human being to analyze the file and respond to the complainant.

The technology is used primarily in the triage process. Currently, we have an obligation to deal with every complaint, but we don't process them all in the same way. Some we investigate thoroughly and some we investigate in a more expedited manner. One of the things we have is an early resolution process.

Technology can help us make an initial triage between complaints that should be resolved quickly and those that require further investigation. At the end of the day, however, in both cases, even the early resolution case, there is someone who needs to look at the file and respond to the complainant.

11:40 a.m.

Conservative

Jacques Gourde Conservative Lévis—Lotbinière, QC

Certainly, with the rise of all the new technology platforms, we're really on the fast track, if you will. Our work in Parliament has changed a lot in the space of eight or nine months, and it must be much the same at the commissioner's office.

This method of operation exposes us to the risk of inadvertent or accidental disclosure of confidential information that concerns the public.

You talked a lot about public awareness. Does that entail telling Canadians to be more careful about certain things so that their identities are not disclosed, or do you have a broader objective?

11:40 a.m.

Privacy Commissioner of Canada, Office of the Privacy Commissioner of Canada

Daniel Therrien

The outreach component includes this aspect, but it encompasses many others.

Technology is a complex area for a lot of people. It's difficult for them to understand the privacy risks they have to manage unless they have a minimal understanding of the basics of technology and personal information handling.

That being said, we are aware of the terms of use of websites, and we try to do what we can about them, but these terms are very complex, and there's a limit to what we can do. Still, we try to educate people about how the technology works and how it affects their personal information so they can make the most informed decisions possible.

11:40 a.m.

Conservative

The Chair Conservative Chris Warkentin

Thank you, Mr. Gourde.

We're going to turn to Mr. Sorbara for the next questions.

May 10th, 2021 / 11:40 a.m.

Liberal

Francesco Sorbara Liberal Vaughan—Woodbridge, ON

Thank you, Chair.

Good morning, everyone, and welcome, Privacy Commissioner.

In my time today.... I know we're focusing on the main estimates, but what's crucial for me is that your office has the pertinent resources for you to effectively undertake your job and your mandate. That's what's important to me, so on that level those are my thoughts.

I want to move on to something in terms of.... I've read about and followed your office very closely since joining this committee late last year. We just listened to the study that Mr. Angus referred to. When it comes to meaningful consent, this document from May 2018 says:

Meaningful consent is an essential element of Canadian private sector privacy legislation. Under privacy laws, organizations are generally required to obtain meaningful consent for the collection, use and disclosure of personal information. However, advances in technology and the use of lengthy, legalistic privacy policies have too often served to make the control—and personal autonomy—that should be enabled by consent nothing more than illusory. Consent should remain central, but it is necessary to breathe life into the ways in which it is obtained.

Can you comment on that introductory paragraph?

I read your March 25, 2021 speech, and I read the Clearview AI information put forward. I still can't believe it stated that “Canadian privacy laws do not apply to its activities because the company does not have a 'real and substantial connection' to Canada”, even though it collected three billion images of Canadians and came up with that data.

Can you elaborate on meaningful consent, and how we need to balance that between consumer objectives, business objectives and individual objectives?

11:45 a.m.

Privacy Commissioner of Canada, Office of the Privacy Commissioner of Canada

Daniel Therrien

Obviously, it's a very broad question. I will try to do justice to it in a few seconds or minutes.

Consent is a fundamental aspect of the current law, PIPEDA, and it will continue to have a central role under the CPPA proposed in Bill C-11, so there is a place for consent in privacy in 2021. There need to be some rules to make sure that when consent does work, it is obtained in a meaningful way. In my view, that means, in part, ensuring that the consumers who provide consent have a good idea of what they are consenting to, which is not obvious. That's where consent does work.

As I was saying in the documents you were referring to, given where we are with digital developments, there are many situations, a growing list of situations, where consent does not really work, particularly when you think of artificial intelligence, for instance, where the purpose of the technology is to use information for purposes other than that for which it was obtained. That's not really conducive to consent being an adequate means to protect privacy.

Given where we are in 2021, and the following years, there is a role for consent, but we also need to have laws that acknowledge that consent will not always work. Then we need to find an adequate means of protecting privacy absent consent. That's where the real difficulty, I think, lies in the discussion of these issues, particularly with Bill C-11.

Bill C-11 has many more exceptions to consent, some appropriate, others too broad in our view. How do you protect privacy if consent is not the preferred means of protecting it? We propose a human rights approach to privacy protection. Other models are proposed, such as the fiduciary model that Mr. Angus was referring to.

The extremely difficult challenge ahead of Parliament in the next few months is to determine where consent does not work—and it does not always work—and what would be a good model to continue to protect privacy adequately absent consent.

11:45 a.m.

Liberal

Francesco Sorbara Liberal Vaughan—Woodbridge, ON

Mr. Chair, if I can just finish off, because I know—

11:45 a.m.

Conservative

The Chair Conservative Chris Warkentin

You have but seconds, so if it's a short question....

11:45 a.m.

Liberal

Francesco Sorbara Liberal Vaughan—Woodbridge, ON

This discussion is important to 38 million individuals in this country because this is individuals' data. This is not anyone else's data. This is individuals' data. That's the way I view this issue. We need to make sure that we get the balance right, but we also need to make sure that consumers, Canadian citizens, are protected. That is my fundamental belief.

Thank you for that answer, Commissioner.

11:50 a.m.

Conservative

The Chair Conservative Chris Warkentin

We're going to turn to Monsieur Fortin now for the next two and a half minutes.

Monsieur Fortin.

11:50 a.m.

Bloc

Rhéal Fortin Bloc Rivière-du-Nord, QC

Thank you, Mr. Chair.

Mr. Therrien, we've obviously only scratched the surface, but given everything that's just been said, how would you rate Canada, relative to other countries, in terms of privacy and protection of personal information?

I'd like you to give me your thoughts on this and to do some kind of comparative analysis in two minutes.

11:50 a.m.

Privacy Commissioner of Canada, Office of the Privacy Commissioner of Canada

Daniel Therrien

Canada was once a leader in privacy protection, but unfortunately, that is no longer the case. Many countries, not only in Europe, but also in South America and Asia, such as South Korea and Singapore, are very innovative. They have laws that protect privacy better than Canada's. Again, I think it's important that the bill, which could be passed by the House in the coming months, allow Canada to catch up with other countries, which have also managed to innovate economically.

It is often said that overly stringent privacy protections inhibit innovation. Germany, South Korea, Singapore, and several other countries demonstrate very clearly that it is possible to have laws that protect privacy very well and also enable innovative economies. In fact, I would argue that better privacy laws increase consumer confidence, which is a factor that helps to stimulate a country's economy. I definitely see a connection between privacy, confidence and economic growth.

11:50 a.m.

Bloc

Rhéal Fortin Bloc Rivière-du-Nord, QC

Why do you think Canada lost its leadership role in this area?

11:50 a.m.

Privacy Commissioner of Canada, Office of the Privacy Commissioner of Canada

Daniel Therrien

I don't think I'm the best person to answer your question.

11:50 a.m.

Bloc

Rhéal Fortin Bloc Rivière-du-Nord, QC

All right.

In this case, what do Germany and South Korea have that we need to look at?

11:50 a.m.

Conservative

The Chair Conservative Chris Warkentin

Mr. Fortin, your time is up. You'll have a chance to speak to the commissioner again when we start the next round in the next hour of the meeting.

Mr. Angus, we're going to turn now to you, for what I think will be the final questions of this hour. Then, we'll suspend and hear from the commissioner again.

11:50 a.m.

NDP

Charlie Angus NDP Timmins—James Bay, ON

Thank you so much.

I want to begin by saying that I am in complete agreement with my colleague, Mr. Sorbara, on the importance of getting Bill C-11 right, because it is about the rights of 38 million Canadians, and we all have that obligation.

Our committee previously brought forward a number of recommendations about the order-making powers of the Privacy Commissioner as well as the need to be able to levy huge fines. The vast majority of infringements on privacy we believe are accidental or without malice, but we do have some bad operators. We had Facebook say they didn't feel they had to follow Canadian law. We certainly see the same instance with Clearview AI, so the need to give you more tools was clear.

What concerns me, when I look at Bill C-11, is this idea of creating a regulatory tribunal that these companies could then appeal a decision to.

I'd like to ask you, number one, whether we have any example of this kind of regulatory tribunal that can override a privacy commissioner's decision in any other jurisdiction, and how you feel about it. You've stated that you believe this tribunal would encourage companies to choose a route of appeal rather than finding common ground with the Privacy Commissioner's decisions, and that it would actually delay and obstruct justice for consumers and privacy rights.

Could you give your thoughts on this regulatory tribunal balloon that has been floated by the government?

11:50 a.m.

Privacy Commissioner of Canada, Office of the Privacy Commissioner of Canada

Daniel Therrien

I am concerned with the creation of this additional layer in the process. My concern, obviously, is not with the issue of fairness towards companies that would be the subject of order-making. I totally get the point that it is important that the system as a whole provides fairness to both complainants and companies. However, to our knowledge, there is no other jurisdiction that has this additional layer between the privacy commissioner and the courts. We think that the courts are perfectly capable of reviewing our processes to ensure that companies are dealt with fairly.

The end result of the creation of this tribunal, as I said and as you noted, is that rather than having a conversation between us and a company where, at the first opportunity, we try to make things right, companies would be encouraged to use these avenues of redress, which would considerably lengthen the process and which would be a huge issue for citizens.

11:55 a.m.

NDP

Charlie Angus NDP Timmins—James Bay, ON

Thank you very much.

11:55 a.m.

Conservative

The Chair Conservative Chris Warkentin

Thank you, Mr. Angus.

Thank you, Commissioner Therrien. We appreciate the testimony you have provided in this first hour as we review the estimates.

We are going to suspend now, colleagues, before the second hour. In the second hour, of course, we're going to have the commissioner again, but we have to change out some of the additional witnesses.

We will now suspend for five minutes.

Noon

Conservative

The Chair Conservative Chris Warkentin

I'm calling this meeting back to order.

For the second hour of this meeting, we're launching our study on facial recognition software and concerns related to it. Today we have the commissioner, who has agreed to remain here for an additional hour so that he can answer some questions as we launch into the investigation of this matter.

Thank you, Commissioner, for remaining with us.

We also have Mr. Homan, who is remaining with us, and Lara Ives, who is the executive director of the policy, research and parliamentary affairs directorate. Thank you so much for being here. Finally, we have Regan Morris, who is joining us as legal counsel.

Thank you as well for being here.

Commissioner, I'll turn it back to you for an opening statement to allow you to begin the discussion. Then we'll have questions for you.

Privacy Commissioner of Canada, Office of the Privacy Commissioner of Canada

Daniel Therrien

Thank you again, Mr. Chair.

Facial recognition technology has become an extremely powerful tool that, as we saw in the case involving Clearview AI, can identify a person in a bank of billions of photos or even among thousands of protesters. If used responsibly and in appropriate situations, it can provide significant benefits to society.

In law enforcement, for example, it can enable police to solve crimes or find missing persons. However, it requires the collection of sensitive personal information that is unique to each individual and permanent in nature. Facial recognition technology can be extremely privacy invasive. In addition to promoting widespread surveillance, it can produce biased results and undermine other human rights.

The recent Clearview AI investigation, conducted jointly with my counterparts in three provinces, demonstrated how facial recognition technology can lead to mass surveillance and help treat billions of innocent people as potential suspects. Despite our findings that Clearview AI's activities violated Canadian privacy laws, the company refused to follow our recommendations, such as destroying the photos of Canadians.

In addition, our office is currently investigating the use of Clearview AI technology by the Royal Canadian Mounted Police, or RCMP. This investigation is nearing completion. We are also working with our colleagues in all provinces and territories to develop a guidance document on the use of facial recognition by police forces. We expect to release a draft of this document for consultation in the coming weeks.

The Clearview case demonstrates how citizens are vulnerable to mass surveillance facilitated by the use of facial recognition technology. This is not the kind of society we want to live in. The freedom to live and develop free from surveillance is a fundamental human right. Individuals do not forgo their rights merely by participating in the world in ways that may reveal their face to others or enable their image to be captured on camera.

The right to privacy is a prior condition to the exercise of other rights in our society. Poorly regulated uses of facial recognition technology, therefore, not only pose serious risks to privacy rights but also impact the ability to exercise such other rights as freedom of expression and association, equality and democracy. We must ensure that our laws are up to par and that they impose limits to ensure respect for fundamental rights when this technology is used.

To effectively regulate facial recognition technologies, we need stronger protections in our privacy laws, including, among other things, a rights-based approach to privacy, meaningful accountability measures and stronger enforcement powers. The federal government recently introduced two proposals to modernize our privacy laws. These are important opportunities to better regulate the use of facial recognition and other new technologies.

Last November, the Department of Justice released a comprehensive and promising consultation paper that outlined numerous proposals to improve privacy legislation in the federal public sector. It proposes enhanced accountability requirements and measures aimed at providing meaningful oversight and quick and effective remedies. It also proposes a stronger collection threshold, which would require institutions to consider a number of factors to determine whether the collection of personal information is “reasonably required” to achieve a specific purpose, such as ensuring that the expected benefits are balanced against the privacy intrusiveness, so that collection is fair, not arbitrary, and proportionate in scope.

In the private sector, Bill C-11 would introduce the consumer privacy protection act. In my view, as I stated in the last hearing, that bill requires significant amendments to reduce the risks of facial recognition technology. The significant risks posed by facial recognition technology make it abundantly clear that the rights and values of citizens must be protected by a strong, rights-based legislative framework. The Department of Justice proposes adding a purpose clause to the Privacy Act that specifies that one of the key objectives of the legislation is “protecting individuals' human dignity, personal autonomy, and self-determination”, recognizing the broad scope of the right to privacy as a human right.

Conversely, Bill C-11 maintains that privacy and commercial interests are competing interests that must be balanced. In fact, compared with the current law in the private sector, PIPEDA, the bill gives more weight to commercial interests by adding new commercial factors to be considered in the balance without adding any reference to the lessons of the past 20 years on technology's disruption of rights.

Clearview was able to rely on the language of the current federal act, PIPEDA, to argue that its purposes were appropriate and that the balance should favour the company's interests rather than privacy. Although we rejected these arguments in our decision, some legal commentators have suggested that our findings circumvent PIPEDA's purpose clause by not giving sufficient weight to commercial interests. Even though we found that Clearview breached PIPEDA, a number of commentators, including the company but not limited to the company, are saying that we actually misapplied the current purpose clause.

If Bill C-11 were passed in its current form, Clearview and these commentators could still make these arguments, buttressed by a purpose clause that gives more weight to commercial factors. I urge you to make clear in Bill C-11 that where there is a conflict between commercial objectives and privacy protection, Canadians' privacy rights should prevail. Our submission analyzing this bill makes specific recommendations on the text that would achieve this goal.

Demonstrable accountability measures are another fundamental mechanism to protect Canadians from the risks posed by facial recognition. Obligations to protect privacy by design, conduct privacy impact assessments, and ensure traceability with respect to automated decision-making are key elements of a meaningful accountability framework. While most of these accountability measures are part of the Department of Justice's proposals for modernizing public sector law, they are all absent from Bill C-11.

Efforts to regulate facial recognition technologies must also include robust compliance mechanisms that provide quick and effective remedies for individuals.

Our investigation into Clearview AI revealed that the organization had contravened two obligations under Canadian privacy law: it collected, used and disclosed biometric information without consent, and it did so for an inappropriate purpose.

Remarkably—and shockingly—the new administrative penalty regime created by Bill C-11 would not apply to these and other important violations of the legislation. Such a penalty regime renders meaningless laws that are supposed to protect citizens.

I therefore urge you to amend the bill to remedy this fundamental flaw.

In conclusion, I would say that the nature of the risks posed by facial recognition technology calls for collective reflection on the limits of acceptable use of this technology. These limits should not be defined only by the risks associated with specific facial recognition initiatives, but by taking into account the aggregate social effects of all such initiatives over time.

In the face of ever-increasing technological capabilities to intrude on our private lives, we need to ask ourselves what are the expectations we should be setting now for the future of privacy protection.

I thank you again for your attention.

I welcome any questions you may have.