Evidence of meeting #25 for Access to Information, Privacy and Ethics in the 44th Parliament, 1st Session. (The original version is on Parliament’s site, as are the minutes.)

A recording is available from Parliament.

Also speaking

Nestor Maslej  Research Associate, Institute for Human-Centered Artificial Intelligence, Stanford University, As an Individual
Sharon Polsky  President, Privacy and Access Council of Canada

4:25 p.m.

Conservative

The Chair Conservative Pat Kelly

I call this meeting to order.

Welcome to meeting number 25 of the House of Commons Standing Committee on Access to Information, Privacy and Ethics.

Pursuant to Standing Order 108(3)(h) and the motion adopted by the committee on Monday, December 13, 2021, the committee is resuming its study on the use and impact of facial recognition technology.

I'd like to welcome our witnesses. We have today, as an individual, Nestor Maslej, research associate at the Institute for Human-Centered Artificial Intelligence, Stanford University; and from the Privacy and Access Council of Canada, Sharon Polsky, president.

Mr. Maslej, you have up to five minutes for your opening statement.

4:25 p.m.

Nestor Maslej Research Associate, Institute for Human-Centered Artificial Intelligence, Stanford University, As an Individual

Good afternoon. I'd like to begin by thanking the chair and members of the committee for the invitation to speak today.

I'm Nestor Maslej, and currently I serve as a research associate for the Stanford Institute for Human-Centered AI. I am also a co-author and the lead researcher for the AI Index. Although my testimony today makes use of data from the AI Index, I am speaking as a private citizen, and my views are not representative of those of the Stanford Institute for Human-Centered AI.

The AI Index is an annual report, currently in its fifth edition, that aims to track, distill and visualize key trends in artificial intelligence. Our goal at the index is to be the best and most authoritative single source of information on trends in AI. The index aims to give policy-makers like you not only a deeper understanding of AI but also, crucially, an understanding that is grounded in empirical data.

It is this latter aim especially that informs my testimony today. I am here to answer the following question: What does data tell us about facial recognition technology? I will answer this question by tackling two sub-questions. First I will comment on capability. As of today, what can FRT do? Second I will examine usage. Who uses FRT—public and private actors—and how?

In terms of capability, there has been tremendous progress in the performance of facial recognition algorithms in the last five years. The index looked at data from the National Institute of Standards and Technology's face recognition vendor test (FRVT), run under the U.S. Department of Commerce, which measures how well FRT performs on a variety of homeland security and law enforcement tasks, such as facial recognition across photojournalism images, identification of child trafficking victims, deduplication of passports and cross-verification of visa images.

In 2017, some of the top-performing facial recognition algorithms had error rates anywhere from roughly 20% to 50% on certain FRVT datasets. As of 2021, none has posted an error rate greater than 3%, with the top-performing models registering an error rate of 0.1%, meaning that for every one thousand faces, these models correctly identify 999.

The index also shows that the performance of FRT deteriorates on masked faces, but not by an overly significant degree. More specifically, performance is 5 to 16 percentage points worse, depending on the FRT algorithm and dataset.

In terms of usage, FRTs are increasingly being deployed in both public and private settings. In 2021, 18 of 24 U.S. government agencies used these technologies: 16 for digital access or cybersecurity, six for generating leads in criminal investigations, and five for physical security. Moreover, 10 agencies noted that they hoped to broaden their use. These figures are admittedly U.S.-centric, but they paint a picture of how widely governments use these tools and towards what end.

Since 2017, a total of $7.5 billion U.S. has been invested globally in funding start-ups dedicated to facial recognition. However, only $1.6 million of that investment has gone towards Canadian FRT start-ups. In the same period, the amount invested in FRT has increased 105%, which suggests that business interest in FRT is also growing. Our estimates also show that FRT is the 12th-most funded of 25 AI focus areas.

Lastly, a McKinsey survey of leading business executives, which we include in the index, shows that across all surveyed industries, only 11% of businesses had embedded facial recognition technology in their standard business processes, trailing robotic process automation at 26% and natural speech understanding at 14%, the two most embedded technologies.

In conclusion, I've presented some of the AI Index's key findings on the current capabilities and usage of FRT. It is my hope that the data I have shared usefully informs the committee's deliberation on the future regulation of facial recognition technologies in Canada. I'd be more than happy to answer any questions on the data I've presented and the implications that it may have.

Thank you.

4:30 p.m.

Conservative

The Chair Conservative Pat Kelly

Thank you.

Now we'll go to Ms. Polsky for up to five minutes.

4:30 p.m.

Sharon Polsky President, Privacy and Access Council of Canada

Thank you so much, Chair, and good afternoon, members of the committee as well. Thank you for inviting me to appear before you today on behalf of the Privacy and Access Council of Canada.

My remarks today reflect round tables held by the council with members from across the public and private sectors, and with members of law enforcement, who agree that facial recognition is one of many digital tools that have great potential.

Like any technology, facial recognition is neither good nor bad, but it's easy to justify, especially when considered on its own. What people do with technology makes all the difference in reasonableness, proportionality and impact on lives.

Thirty-four years ago, our Supreme Court said that “privacy is at the heart of liberty in a modern state”, that “privacy is essential for the well-being of the individual” and that privacy “is worthy of constitutional protection”, and I dare say it still is, except that now we struggle to have any privacy, at home or away.

It's difficult now, if not impossible, to prevent our facial images being captured and analyzed and our movements and our associations being calculated and evaluated in real time. We are in view every time we go outside, and often inside as well, and our images are posted to the Internet, often without our knowledge. We haven't consented to those images being used, or to our gait, our keystrokes or other biometrics being analyzed and correlated with databases that have been amassed with information about each of us.

We haven't asked for the voice-activated devices or the messaging platforms that our children use at school and we use at work to analyze our conversations or our emotions, or for our TVs to watch us watching them. Yet that is now commonplace, thanks to governments and companies creating an unregulated global biometrics industry that's predicted to reach $59 billion U.S. by 2025, while the tech companies embedded in the public sector urge us to use our faces to pay for groceries and to get government services.

In the 40 years that computers have been part of our daily lives, though, there hasn't been any substantive education in Canada about privacy or access laws, or rights or responsibilities. It's no surprise, then, that Canadians trust that the laws themselves are enough to protect privacy, or that just 14% rate their own knowledge of their privacy rights as “very good”. In the meantime, there's been an onslaught of automated, privacy-invasive technologies, and multi-million-dollar investments in surveillance technologies to create “safe communities” across Canada, paid for by the other 86% of people as well.

Certainly, facial recognition-enabled cameras in cars, body cams, doorbells and cellphones might help police identify a suspect or solve a crime, but even police admit that cameras and facial recognition do not prevent crime, and there's little correlation between the number of public CCTV cameras and crime or safety. Yet their unregulated sale and use are a self-fulfilling prophecy, because familiarity breeds consent.

Facebook, Cambridge Analytica, Cadillac Fairview and Tim Hortons are just the tip of the iceberg. Companies and governments can and do create or use technologies that violate our privacy laws because they can, because the current consent model is a fantasy, and because Mark Zuckerberg and others know that the risk of penalty is far less than the reward of collecting, manipulating and monetizing information about us.

We are at a moment, though, where three important changes are needed to help safeguard our democratic freedoms without impeding innovation and so that Canadians can regain trust in government, police and the public sector.

First, enshrine privacy as a fundamental human right for all Canadians, in person, online and at our borders.

Second, enact laws that require everyone who creates, purchases or uses technology to demonstrate that they actually have a clear and correct grasp of our privacy laws, rights and responsibilities.

Third, in the same way that vehicles and food must meet stringent government regulations before being allowed for sale or use in Canada, craft laws that put the onus on creators, requiring that technologies undergo comprehensive independent examination of their privacy, access and algorithmic integrity, bias and impact before the product or platform may be acquired or used, directly or indirectly, and make sure the standards are set and the laws are written without the direct or indirect influence or input of industry.

Those are just a few highlights of a very complex issue that I am looking forward to discussing with you.

4:35 p.m.

Conservative

The Chair Conservative Pat Kelly

Thank you.

With that, we'll proceed to questions.

For the first six minutes, we have Mr. Williams.

4:35 p.m.

Conservative

Ryan Williams Conservative Bay of Quinte, ON

Thank you to all our witnesses.

Through you, Mr. Chair, I'm going to start with Ms. Polsky.

I think you mentioned this in your recommendations. Do we need proper education campaigns for Canadians on digital consent and privacy in the digital age?

4:35 p.m.

President, Privacy and Access Council of Canada

Sharon Polsky

We need substantive education that explains what privacy is. It doesn't exist yet. Really, it's no different from me tossing my car keys to the kid across the street and saying go have a good time but stay safe out there, without explaining what a stop sign is or what to do when they see one.

We need proper education, and we need the people who will be delivering the education to be educated first. We're lacking that right now.

4:35 p.m.

Conservative

Ryan Williams Conservative Bay of Quinte, ON

We've talked before about consent fatigue: people who aren't reading the consent form that's about six pages long. People scroll through it, and with any kind of app that you download, it's the same kind of thing.

Do you see consent fatigue in your work, and what do we do about it?

4:35 p.m.

President, Privacy and Access Council of Canada

Sharon Polsky

Well, consent fatigue is an interesting term. I think it's more a matter of people being resigned to the fact that no matter what they do or don't encounter in a so-called privacy policy, it's irrelevant, because the language that has been allowed—and, frankly, embraced—by Canadian, European and other data protection regulators is so vague as to be meaningless.

A perfect example is that you'll typically see the introductory fluff, “We respect your privacy,” and then it's, “We will collect your personal information only for business purposes,” and a list of other vague terms. Every for-profit organization's business purpose is to improve their bottom line and their net profit. Anything they can do to fulfill that obligation is a legitimate business purpose, as far as they're concerned. It's meaningless when it comes to protecting individuals. When we say yes to any of these, the companies essentially have carte blanche to share our information with their business partners, whoever they might be, wherever in the world. When it's outside of Canada, those companies do what they wish with it, for as long as they wish.

4:40 p.m.

Conservative

Ryan Williams Conservative Bay of Quinte, ON

We know that the Privacy Act is outdated and needs to be updated, so how would you update the Privacy Act?

4:40 p.m.

President, Privacy and Access Council of Canada

Sharon Polsky

On the Privacy Act, I'd say it's important to stop having so many fractured puzzle pieces of privacy legislation—federally, provincially and territorially. Although they're very much alike in most respects, they all have different exemptions, and it's almost impossible for anybody to know what law to comply with. If it's provincial, does it comply with this...or if it's health legislation, is it public sector? Then, when it crosses the line out of the country or to a different jurisdiction, it's a nightmare for compliance.

Have one overarching piece of legislation that covers the public sector, the private sector, the non-profit sector and political parties as well, please.

4:40 p.m.

Conservative

Ryan Williams Conservative Bay of Quinte, ON

Thank you.

Almost every witness who has appeared before the committee—academics, lawyers and civil liberty experts—has called for a moratorium on the use of FRT by police forces.

I know that you've conducted round tables with law enforcement officers. What was their honest opinion on FRT use?

4:40 p.m.

President, Privacy and Access Council of Canada

Sharon Polsky

Their view was this: the facial recognition we use right now pulls a selection of mug shots from our database, and then a person actually has to look at the suspect picture and the database and compare them, and that's fine. Not one of them could wrap their heads around the idea that there is such a thing as real-time, live facial recognition already in use in some jurisdictions.

They insisted that this is what we have today. They couldn't see beyond what they use today, or the implications for privacy and security of the new technology that they're not yet using.

4:40 p.m.

Conservative

Ryan Williams Conservative Bay of Quinte, ON

In your opinion, would you say that the rank-and-file members understand the FRT that they're using?

4:40 p.m.

President, Privacy and Access Council of Canada

Sharon Polsky

No. Very simply, no, because they're no different from most people across Canada, and I dare say elsewhere. Without education about the correct compliance requirements, what the legislation actually means, what the technology can actually do—not the sales pitch—all they can rely on is the sales pitch from a vendor whose interest is in their commission and their company's bottom line. They are not interested in our protection or our privacy, or, frankly, the police's problems.

4:40 p.m.

Conservative

Ryan Williams Conservative Bay of Quinte, ON

Would the actual rank and file that you went through those round tables with support a moratorium on the use of FRT for police?

4:40 p.m.

President, Privacy and Access Council of Canada

Sharon Polsky

When they can talk about themselves in their own lives, yes. I've spoken with many members of law enforcement from across Canada in different agencies, municipal, federal and military, and they basically say this: I'm not interested in being assumed to be a criminal. It's just a matter of time until I'm identified. I want to be able to go about my business anonymously. Just because I walk outside my door, I shouldn't always be under surveillance, with somebody—I don't know who or where—trying to figure out who I am, who I'm with and what I'm doing.

When the officers are in uniform, though, they have to toe the party line.

4:40 p.m.

Conservative

The Chair Conservative Pat Kelly

Thank you for that.

Now we will go to Mr. Bains for up to six minutes.

4:40 p.m.

Liberal

Parm Bains Liberal Steveston—Richmond East, BC

Thank you, Mr. Chair, and thank you to our witnesses for joining us today.

Mr. Maslej, the Institute for Human-Centered Artificial Intelligence index report for 2022 discusses diagnostic metrics that evaluate the model's impact or performance on, for example, population subgroups or minorities compared with the entire population. Can you comment on the research and investments being made to improve diagnostic metrics so that models do not misidentify persons from subgroups and minorities?

4:45 p.m.

Research Associate, Institute for Human-Centered Artificial Intelligence, Stanford University, As an Individual

Nestor Maslej

Yes. That's an excellent question.

The index doesn't look in too much detail at how much investment is being put into that area at the moment, but interest is growing. One thing I will note on the data I cited from the NIST FRVT is that the data I looked at—

4:45 p.m.

Conservative

The Chair Conservative Pat Kelly

I'm going to interrupt you, Mr. Maslej. We're having a little trouble with your audio. We had tested it earlier, and I understand that it was all right, but it's not good right now. Can you ensure that you've selected the correct headset on your Zoom application?

4:45 p.m.

Research Associate, Institute for Human-Centered Artificial Intelligence, Stanford University, As an Individual

Nestor Maslej

Let me try once again.

Is it better now?

4:45 p.m.

Conservative

The Chair Conservative Pat Kelly

I will ask the translators....

No. I will suspend the meeting for a moment while we test this and get it straightened out.

4:50 p.m.

Conservative

The Chair Conservative Pat Kelly

Thank you.

Mr. Bains, perhaps you can briefly repeat your question, and we'll get to Mr. Maslej's response.

4:50 p.m.

Liberal

Parm Bains Liberal Steveston—Richmond East, BC

Yes.

We were talking about improving diagnostic metrics so that models do not misidentify persons from subgroups and minorities. I was getting your thoughts on that.