Evidence of meeting #107 for Industry, Science and Technology in the 44th Parliament, 1st Session. (The original version is on Parliament’s site, as are the minutes.)

A recording is available from Parliament.

Also speaking

Vass Bednar  Executive Director, Master of Public Policy in Digital Society Program, McMaster University, As an Individual
Andrew Clement  Professor Emeritus, Faculty of Information, University of Toronto, As an Individual
Nicolas Papernot  Assistant Professor and Canada CIFAR AI Chair, University of Toronto and Vector Institute, As an Individual
Leah Lawrence  Former President and Chief Executive Officer, Sustainable Development Technology Canada, As an Individual

January 31st, 2024 / 6:20 p.m.

Executive Director, Master of Public Policy in Digital Society Program, McMaster University, As an Individual

Vass Bednar

I see that Andrew has his hand up.

Sorry, I don't know who you are directing it to because we're not on the floor.

6:20 p.m.

Prof. Andrew Clement

I was just going to respond to the idea of algorithmic collusion.

The bigger issue, of which I think this is an example, is that we do not know how these algorithms are being used. That's all hidden and proprietary.

There are enormous financial market incentives for companies to take whatever advantage they can of their data. There might be indirect collusion, in the sense of creating an environment where algorithmic amplification, reinforcement or whatever you want to call it becomes quite powerful, yet remains very opaque.

6:20 p.m.

Liberal

Iqwinder Gaheer Liberal Mississauga—Malton, ON

Thank you.

My next question is for Mr. Papernot.

You mentioned in your opening testimony that there are new developments in AI every single day and that it's an ever-changing field.

Can you talk to the committee more about the minister's suggested amendment on high-impact systems?

For high-impact classes, there will be a schedule: an initial list of high-impact classes that can be modified through regulation to keep the framework flexible as the technology advances.

Do you want to speak on that a little bit?

6:20 p.m.

Prof. Nicolas Papernot

I think one issue with identifying specific systems as high impact is that AI systems can affect other forms of algorithmic data analysis: the outputs these systems produce can influence other systems downstream. That's worth keeping in mind, so that we don't completely deregulate systems that are not considered high impact.

6:20 p.m.

Liberal

Iqwinder Gaheer Liberal Mississauga—Malton, ON

What system would you propose? How would you change the law as it's currently worded?

6:20 p.m.

Prof. Nicolas Papernot

That's not my expertise.

6:20 p.m.

Liberal

Iqwinder Gaheer Liberal Mississauga—Malton, ON

This is generally open for the committee, but we've heard a lot of comparative pieces about the EU regulation and the regulation that we're proposing here. How does AIDA align with what's happening in the EU?

6:20 p.m.

Prof. Nicolas Papernot

I will say that the European Union legislation has a similar issue, in that it relies too heavily on de-identification. As I mentioned in my statement, putting so much emphasis on modifying personal information puts the privacy legislation at odds with what the technology currently being developed can do.

We're no longer in a world where we protect data by modifying it. We protect data by carefully analyzing it, with guarantees that come from the way the data is analyzed, and I think that underlies the entire.... If you look at AIDA and the CPPA, both rely too heavily on de-identification to be implementable in a world where people are going to analyze data with AI, and the same is true of other legislation, including the European Union's.
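
Professor Papernot does not name a specific technique at this point in the testimony, but differential privacy is the best-known example of a guarantee that attaches to the analysis rather than to the data. A minimal illustrative sketch, not drawn from the hearing, in which the raw records are never modified and noise is instead calibrated to the query:

```python
import numpy as np

def dp_mean(values, lower, upper, epsilon, rng=None):
    """Differentially private mean via the Laplace mechanism.

    The raw records are never modified; calibrated noise is added to
    the *output* of the analysis, so the guarantee comes from how the
    data is analyzed rather than from altering the data itself.
    """
    rng = rng or np.random.default_rng()
    clipped = np.clip(values, lower, upper)      # bound each record's influence
    sensitivity = (upper - lower) / len(values)  # max effect of changing one record
    return clipped.mean() + rng.laplace(scale=sensitivity / epsilon)

# Hypothetical example: average commute time in minutes, privacy budget epsilon = 1.
commutes = np.array([22, 35, 41, 18, 55, 30, 27, 44])
print(dp_mean(commutes, lower=0, upper=120, epsilon=1.0))
```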

6:25 p.m.

Liberal

Iqwinder Gaheer Liberal Mississauga—Malton, ON

Thank you.

6:25 p.m.

Liberal

The Chair Liberal Joël Lightbound

Thank you very much, Mr. Gaheer.

Mr. Garon, you have the floor.

6:25 p.m.

Bloc

Jean-Denis Garon Bloc Mirabel, QC

Thank you, Mr. Chair.

I'd like to thank the witnesses for being here.

Professor Papernot, you talked about the fact that nothing in the bill currently guarantees that anonymity will be preserved. You also talked about de‑identification or anonymization of data, for example. It seems to me that these methods existed before the advent of artificial intelligence, and that economists and statisticians have used them. You add noise to the data, you run a regression, and it produces the same result.

As you said, the algorithms are now able to handle the data in such a way that anonymity is no longer guaranteed. However, Part 1 of the bill is very specific about what we consider to be methods that guarantee anonymity, and technology is changing very rapidly. Shouldn't there be broader regulations, so that the framework of what we consider anonymity-guaranteeing technologies can evolve faster than the legislative pace allows?
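
The classical input-noise approach Mr. Garon is describing can be sketched in a few lines; the data and noise levels below are invented for illustration. Noise added to the released records barely moves an aggregate estimate such as a regression slope:

```python
import numpy as np

rng = np.random.default_rng(0)

# Invented data with a known relationship: y = 2.0 * x + 1.0 plus sampling noise.
n = 10_000
x = rng.normal(size=n)
y = 2.0 * x + 1.0 + rng.normal(scale=0.5, size=n)

# "Protect" the released records by perturbing the sensitive outcome.
y_released = y + rng.normal(scale=2.0, size=n)

# A least-squares fit recovers roughly the same estimates either way.
slope, intercept = np.polyfit(x, y, 1)
slope_r, intercept_r = np.polyfit(x, y_released, 1)
print(slope, slope_r)          # both close to 2.0
print(intercept, intercept_r)  # both close to 1.0
```

This is the tension the testimony turns to next: perturbation gentle enough to preserve the aggregate analysis does little against an attacker with auxiliary knowledge about individuals.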

6:25 p.m.

Prof. Nicolas Papernot

I think the main problem is that the legislation only talks about anonymization or de‑identification of data, which are not the only ways to protect personal information. In fact, a number of other techniques can be used that provide better guarantees, better protect individuals and make it possible to have more useful analyses for society. So—

6:25 p.m.

Bloc

Jean-Denis Garon Bloc Mirabel, QC

Let me interrupt you, Professor Papernot.

The problem I raised is that the bill lists methods that can supposedly guarantee anonymity. You mentioned others that could be included in the bill. If we were to talk again in five years, do you think you could tell us about new methods, methods that don't exist today?

6:25 p.m.

Prof. Nicolas Papernot

I'm not recommending other methods for achieving anonymity. Instead, I'm proposing other ways of protecting personal information that don't require arriving at an anonymized dataset.

In fact, the reasoning is wrong: it's impossible to anonymize data, since it can always be re-identified, which has been scientifically proven. The problem is that we cannot model what is already known about the individuals we are trying to protect. I gave the example of transportation data: people already had information about their colleagues and knew at which times of the week they had taken the bus. That's what they were using to de‑anonymize the data.
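
The transportation example can be made concrete with a toy linkage attack; every name and record below is invented for illustration:

```python
# Pseudonymized ridership log: (pseudonym, weekday, hour of boarding).
# Removing names does not remove the pattern of trips.
log = [
    ("u1", "Mon", 8), ("u1", "Wed", 8), ("u1", "Fri", 18),
    ("u2", "Tue", 9), ("u2", "Thu", 9),
    ("u3", "Mon", 8), ("u3", "Tue", 7), ("u3", "Fri", 18),
]

# What a colleague already knows: Alice rides Monday at 8 a.m.
# and Friday at 6 p.m.
known_trips = {("Mon", 8), ("Fri", 18)}

# Any pseudonym whose trips cover the known pattern is a candidate.
trips_by_user = {}
for user, day, hour in log:
    trips_by_user.setdefault(user, set()).add((day, hour))

candidates = [u for u, trips in trips_by_user.items()
              if known_trips <= trips]
print(candidates)  # ['u1', 'u3'] -- two known trips already narrow
                   # 3 riders to 2; one more would single Alice out.
```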

6:25 p.m.

Bloc

Jean-Denis Garon Bloc Mirabel, QC

I understand.

6:25 p.m.

Prof. Nicolas Papernot

What I'm asking is that we look at all the other approaches that don't need to change the data, but rather try to change the analysis of the data. We can continue to use the data, but analyze it differently.
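
One widely studied way of "analyzing the data differently" is differentially private training, in which each example's influence on a model update is bounded and noise is added to the aggregated gradient; the records themselves are never altered. A simplified sketch under those assumptions, not drawn from the hearing:

```python
import numpy as np

def private_gradient_step(w, X, y, lr=0.1, clip=1.0, sigma=1.0, rng=None):
    """One DP-SGD-style update for least-squares regression.

    X and y are used exactly as collected; privacy comes from bounding
    each example's influence (clipping) and adding noise to the
    aggregated gradient -- that is, from the analysis itself.
    """
    rng = rng or np.random.default_rng()
    grads = []
    for xi, yi in zip(X, y):
        g = 2 * (xi @ w - yi) * xi                           # per-example gradient
        norm = np.linalg.norm(g)
        grads.append(g * min(1.0, clip / max(norm, 1e-12)))  # clip its influence
    noisy = np.sum(grads, axis=0) + rng.normal(scale=sigma * clip, size=w.shape)
    return w - lr * noisy / len(X)

rng = np.random.default_rng(1)
X = rng.normal(size=(256, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + rng.normal(scale=0.1, size=256)

w = np.zeros(3)
for _ in range(200):
    w = private_gradient_step(w, X, y, rng=rng)
print(w)  # roughly recovers (1.0, -2.0, 0.5) despite the added noise
```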

6:25 p.m.

Bloc

Jean-Denis Garon Bloc Mirabel, QC

In your opinion, that's not in the bill as it stands.

6:25 p.m.

Prof. Nicolas Papernot

This isn't in the current bill at all. The most advanced techniques being researched today are not compatible with a bill—

6:25 p.m.

Bloc

Jean-Denis Garon Bloc Mirabel, QC

Since my time is running out, I'd like to ask you a question. Would you be prepared to submit to the committee an explanatory note on your suggestions in this regard?

6:25 p.m.

Prof. Nicolas Papernot

Yes, of course. I have a written example of what I mean, which I can share with the committee.

6:25 p.m.

Bloc

Jean-Denis Garon Bloc Mirabel, QC

Thank you.

I would now like to come back to something that may seem innocuous. My colleague Ms. Rempel Garner asked you some questions and talked about copyright when our faces are used.

I understand that you may not have the necessary expertise, but you said that it wasn't always possible, based on the results of the model, to identify the faces used. Nevertheless, there are still problems related to the fact that you don't know what is being done with your face.

I'm asking you a very naive question. If a child is born tomorrow morning—so the child has never been identified on the Internet—would you suggest that I put the child's face on the Internet, given the current regulations?

6:30 p.m.

Prof. Nicolas Papernot

No.

6:30 p.m.

Bloc

Jean-Denis Garon Bloc Mirabel, QC

Thank you very much. That sends a clear message about the confidence an expert like you may have in the current regulations. There are even people who say that this bill is inadequate and that we should tear it up and rewrite it.

Canadian regulations already exist. Indeed, other legislation directly or indirectly regulates artificial intelligence and data protection. Do you think that, if Bill C‑27 were amended to reflect these advances, there would be a way to improve what we already have, or would that be a waste of time?

6:30 p.m.

Prof. Nicolas Papernot

The bill can be improved, of course.

I'll go back to what I was saying earlier. We have to take into account the new technologies that have been developed over the past 10 or 15 years, which allow us to think jointly about the protection of personal information and the way we analyze that data. That is what will make it possible to innovate and to continue developing new algorithms and new artificial intelligence systems while protecting the individuals whose data is used to create those systems.

We have to change the part of the act that requires modifying the data to obtain that protection. It is that reasoning that is not valid.

6:30 p.m.

Bloc

Jean-Denis Garon Bloc Mirabel, QC

We need to look at data analytics.