Evidence of meeting #15 for Access to Information, Privacy and Ethics in the 44th Parliament, 1st Session. (The original version is on Parliament's site, as are the minutes.)

A recording is available from Parliament.

Witnesses

Rob Jenkins  Professor, University of York, As an Individual
Sanjay Khanna  Strategic Advisor and Foresight Expert, As an Individual
Angelina Wang  Computer Science Graduate Researcher, Princeton University, As an Individual
Elizabeth Anne Watkins  Postdoctoral Research Associate, Princeton University, As an Individual

11:35 a.m.

Bloc

René Villemure Bloc Trois-Rivières, QC

Thank you very much.

Mr. Khanna, in past discussions, you have alluded to biometric terrorism.

Could you tell us more about that?

11:40 a.m.

Strategic Advisor and Foresight Expert, As an Individual

Sanjay Khanna

I'm trying to recall the particular conversation you're referring to, but certainly there are scenarios in which those kinds of questions are being explored, such as the extent to which someone's identity could be stolen and used to identify them as a terrorist actor.

There are many plausible scenarios. I'm not sure how facial recognition technology might play into that specifically, but this is where scenario planning and those sorts of techniques can be very useful in drawing out the kinds of lines of inquiry that you are concerned about.

11:40 a.m.

Bloc

René Villemure Bloc Trois-Rivières, QC

Could you tell us a little more about the type of government framework we should be thinking about?

11:40 a.m.

Strategic Advisor and Foresight Expert, As an Individual

Sanjay Khanna

I think the frameworks can emerge only from the kind of study that this committee is doing already. There may be studies that are occurring in parallel that you need to draw upon to look at these challenges more holistically. I think that's what I would ask.

Facial recognition technology is embedded in a whole range of other technologies, and accelerating its development requires machine learning, computer vision and a number of other fields. It needs to be looked at quite holistically in order for Parliament to develop the kind of comprehensive framework that's needed, I believe.

11:40 a.m.

Bloc

René Villemure Bloc Trois-Rivières, QC

Thank you very much, Mr. Khanna.

Mr. Jenkins, considering the speed at which technology is evolving, is it too late to act?

11:40 a.m.

Prof. Rob Jenkins

No, I don't think it's too late to act. I think it's important that we act now. We should proceed on the basis of evidence—what we know—and use that evidence to try to accomplish what we want.

11:40 a.m.

Bloc

René Villemure Bloc Trois-Rivières, QC

Thank you very much, Mr. Jenkins.

I will leave my remaining 30 seconds to my colleagues.

11:40 a.m.

Conservative

The Chair Conservative Pat Kelly

All right. Thank you. It's appreciated.

We'll move now to Mr. Green for six minutes.

11:40 a.m.

NDP

Matthew Green NDP Hamilton Centre, ON

Mr. Chair, I'll happily take those 30 seconds as offered.

Mr. Chair, I think we can all agree that the technical aspects of this study run deep. I'm not sure we're going to get as deep as we need to go in order to produce the kind of report that's going to be required in the time we have allotted, so I'm going to put some very concise questions to all of the witnesses, starting with Dr. Watkins.

Dr. Watkins, based on your subject matter expertise, what would be your top legislative recommendations to this committee? We're going to be putting together a report and hope to have some of these recommendations reflected back to the House for the government's consideration.

11:40 a.m.

Postdoctoral Research Associate, Princeton University, As an Individual

Dr. Elizabeth Anne Watkins

Thank you so much. I would say that I have three top recommendations.

The top one would be to establish a moratorium on facial recognition technology. It's simply too unreliable for the futures and livelihoods over which it's being given responsibility.

My other two recommendations involve accountability and transparency.

We need better insight into how these tools are being used: where the data is being stored, how decisions are being made with them, whether or not humans are involved, and how these decisions are embedded within larger bureaucratic and organizational structures. Some kind of documentation to give us insight into these processes, such as algorithmic impact assessments, would be very useful.

Further, we need regulatory interventions that produce accountability and build relationships among government, private actors and the public interest, so that the needs of the most vulnerable are addressed.

11:40 a.m.

NDP

Matthew Green NDP Hamilton Centre, ON

Ms. Wang, what would be your top legislative recommendations to this committee for its consideration?

11:40 a.m.

Computer Science Graduate Researcher, Princeton University, As an Individual

Angelina Wang

I don't think I have anything else to add to what Dr. Watkins has said.

11:40 a.m.

NDP

Matthew Green NDP Hamilton Centre, ON

Okay.

Professor Khanna, what would be your recommendations to this committee?

11:40 a.m.

Strategic Advisor and Foresight Expert, As an Individual

Sanjay Khanna

I think the safeguards need to be increased, certainly for children, marginalized groups and First Nations in particular. The COVID-19 pandemic has made things worse for all of those populations, and it's important to consider the trajectory in order to figure out what kinds of harms could plausibly occur in the years to come, given the shocks we've already experienced.

11:40 a.m.

NDP

Matthew Green NDP Hamilton Centre, ON

What kind of safeguards would you recommend? Do you have any specificity around that?

11:40 a.m.

Strategic Advisor and Foresight Expert, As an Individual

Sanjay Khanna

No. I would need to take some time to think about where, specifically, the strengthening could occur, but there are some reports—for instance, the UNICEF “Policy guidance on AI for children” of November 2021—that could be very valuable in this context.

11:45 a.m.

NDP

Matthew Green NDP Hamilton Centre, ON

I would put to all witnesses that if, after this, you come up with thoughts you weren't able to articulate in our rapid-fire rounds, please consider providing them to this committee in writing, and hopefully they will also be included in our report.

Professor Jenkins, what are your top legislative recommendations for this committee's consideration?

11:45 a.m.

Prof. Rob Jenkins

I would say: attention to human operators in the design and implementation of facial recognition systems; transparency; and the development of an expert workforce in facial recognition.

11:45 a.m.

NDP

Matthew Green NDP Hamilton Centre, ON

Thank you very much.

Professor Watkins, I noted that a report called “Now You See Me: Advancing Data Protection and Privacy for Police Use of Facial Recognition in Canada” records that “Danish liberal deputy Karen Melchior said during parliamentary debates that 'predictive profiling, AI risk assessment and automated decision-making systems are weapons of “math destruction”', because they are 'as dangerous to our democracy as nuclear bombs are for living creatures and life.'”

Given that kind of framing of “weapons of 'math destruction'”, you noted that accountability in the private sector is going to be important. I note that Amazon has just had its first successful unionization vote. Hopefully, there will be some discussions around this.

What safeguards should we be putting on the private sector to ensure that these “weapons of 'math destruction'” are not unleashed on the working class?

11:45 a.m.

Postdoctoral Research Associate, Princeton University, As an Individual

Dr. Elizabeth Anne Watkins

That's a fantastic question. The private sector often goes under-regulated when it comes to these sorts of technologies.

There's a really fascinating model available in the state of Illinois under its Biometric Information Privacy Act. It established that, rather than a notice-and-consent model whereby users have to opt out of having their information used, it's actually the reverse: users have to opt in. Users have to be consulted before any kind of biometric information is used.

Biometric information is defined quite broadly in that legislation. As far as I can recall, it includes faceprints as well as voiceprints. This legislation has been used to bring lawsuits against companies in the private sector—for example, Facebook—for using facial recognition in their photo-identification processes.

Looking at that kind of legislation, which places control over biometric information back into the hands of users from the get-go, would be a very advantageous step toward putting guardrails around the private sector.

11:45 a.m.

NDP

Matthew Green NDP Hamilton Centre, ON

I will close by saying that in one of your papers, you and your colleagues wrote: “Despite many promises that algorithmic systems can remove the old bigotries of biased human judgement, there is now ample evidence that algorithmic systems exert power precisely along those familiar vectors.” Can you comment on that statement?

11:45 a.m.

Postdoctoral Research Associate, Princeton University, As an Individual

Dr. Elizabeth Anne Watkins

Thank you.

While AI, machine learning and algorithmic technologies appear to be very futuristic, very innovative and brand new, they're based on data that has been gathered over years and decades, reflecting things like institutional biases, racism and sexism.

This data doesn't come from nowhere. It comes from institutions that have engaged, for example, in over-policing certain communities. Processes like over-policing then produce datasets that make a criminal look a certain way, when we know that doesn't actually reflect reality. These datasets encode the institutional ways in which those organizations see populations.

Those are then the very datasets from which AI and machine learning systems learn what the world is. So rather than being innovative and futuristic, AI, machine learning and algorithmic processes are actually very conservative and very old-fashioned, and they perpetuate the biases that we, as a society, ought to figure out how to get past.

11:45 a.m.

NDP

Matthew Green NDP Hamilton Centre, ON

Thank you very much.

11:45 a.m.

Conservative

The Chair Conservative Pat Kelly

We now go to Mr. Kurek for five minutes.

11:45 a.m.

Conservative

Damien Kurek Conservative Battle River—Crowfoot, AB

Thank you very much.

Thank you to the witnesses for providing your expertise to the committee. Let me first make a quick comment. As a number of my colleagues have said, the way these committee reports work is that only evidence presented can end up in the report. So if there is any further documentation, thoughts or evidence that you believe would be valuable for this committee to see, including your recommendations, please feel free to send it our way. It becomes incredibly helpful as we compile reports. Let me make that offer to all of you beyond simply answering the questions that are asked here today.

To follow up on a question Mr. Green asked, Dr. Khanna, do you support a moratorium on FRT until there is a framework in place?