Evidence of meeting #100 for Industry, Science and Technology in the 44th Parliament, 1st Session. (The original version is on Parliament’s site, as are the minutes.)

A recording is available from Parliament.

Also speaking

Sébastien Gambs  Canada Research Chair, Privacy-Preserving and Ethical Analysis of Big Data, Université du Québec à Montréal, As an Individual
Philippe Letarte  Head of Policy and Public Affairs, Flinks
Alexandre Plourde  Lawyer and Analyst, Option consommateurs
Sara Eve Levac  Lawyer, Option consommateurs
Sehl Mellouli  Deputy Vice-Rector, Education and Lifelong Learning, Université de Montréal

5:20 p.m.

Liberal

Tony Van Bynen Liberal Newmarket—Aurora, ON

Thank you, Mr. Chair.

In previous testimony, and again in some today, we've heard two different approaches to managing artificial intelligence and making its use safe.

The approach we're currently taking is to regulate artificial intelligence in its various capacities: privacy, competitiveness and the use of technology.

Then we've heard in the past—and I think this was the reference in the previous meeting—about using a distributed model for regulation, which is to have the Privacy Commissioner look at the use of artificial intelligence in that capacity, and similarly to have the commissioners for competition and for technology do the same in theirs.

My question is for Mr. Gambs.

What's your thought on those two different approaches? Which would you prefer, or which would you recommend as being more effective?

5:25 p.m.

Canada Research Chair, Privacy-Preserving and Ethical Analysis of Big Data, Université du Québec à Montréal, As an Individual

Sébastien Gambs

I think using the Privacy Commissioner's expertise on privacy and other issues in artificial intelligence is a good way to leverage the expertise that is already there. I think a centralized entity that is able to audit companies for privacy, and also for fairness and explainability, would be the more efficient way forward, rather than splitting this among different entities that would have to coordinate anyway, because these issues are intricate. If you are a machine learning engineer and you have to implement privacy, fairness and explainability in your AI model, there are tensions and synergies between these issues, and you cannot address them separately. I think the auditing, likewise, should be done by one entity with the expertise to do it.

5:25 p.m.

Liberal

Tony Van Bynen Liberal Newmarket—Aurora, ON

Thank you.

My next question is for Mr. Mellouli.

You keep making references to the black box. First of all, you mentioned that it's critically important that we authenticate the data and ensure the data is accurate. One part of my question is, how do we go about making sure of that? Should we regulate a methodology for authenticating data?

Second, with respect to the black box that all of this goes into, the artificial intelligence and data act will impose obligations on those responsible for the AI system. Does this bill provide sufficient algorithmic transparency, and does it carry enough authority, from what you have seen? I'm concerned about the authenticity of the data and whether there is a way to regulate it.

The second issue is transparency. Does the bill go far enough to satisfy the need for transparency of the algorithm?

Am I frozen? Can you hear me?

5:25 p.m.

Liberal

The Chair Liberal Joël Lightbound

Yes, Tony. You need to leave some time for translation.

5:25 p.m.

Deputy Vice-Rector, Education and Lifelong Learning, Université de Montréal

Sehl Mellouli

I think that the bill, as it stands today, doesn't go far enough to regulate the black box. That's really the issue.

You're asking whether data use should be regulated. I think that it should. As you said a number of times, I think that the Privacy Commissioner can play a major role in raising awareness.

In all honesty, the data can be used for any purpose. Take any application: you click the accept button and you're told that your request has been sent. As a consumer, you have no idea whether it has actually been sent. In the age of big data, managing hundreds of millions of data items is a complex business.

Will the bill make it possible to control everything? Personally, I'm not sure. Is it possible to set out ways to train and educate people on data definitions, data selection and the use of data in artificial intelligence systems? I think so.

This can go beyond the data. It can even affect the teams that choose the data. This choice can have a major impact on discrimination. We've seen this in applications where certain categories of people weren't included in the data selection process. As a result, certain groups received positive treatment. However, one segment of the population received negative treatment. There are some examples of this issue.

In my opinion, the bill can be improved to better regulate data use; ensure greater accountability on the part of companies; and give the Privacy Commissioner a bigger role and more powers, by boosting the commissioner's ability to raise awareness and educate people about data use.

5:30 p.m.

Liberal

Tony Van Bynen Liberal Newmarket—Aurora, ON

I have one quick question. Do you feel that there's enough value in the penalties?

I've read that there are monetary penalties. Is there any provision that should be considered in terms of requiring the offending party to disgorge the data that was created and/or to stop processing it? Do you feel that it would be a critical authority for the Privacy Commissioner or the tribunal to have?

If it's only a monetary penalty, then it simply becomes a cost of doing business. How can we have a more meaningful regime in terms of penalties?

5:30 p.m.

Deputy Vice-Rector, Education and Lifelong Learning, Université de Montréal

Sehl Mellouli

There can always be a tougher penalty system. However, as I said earlier, these systems aren't foolproof. The flawed nature of these artificial intelligence systems must be taken into account. A company may comply with all the processes, but in the end, the results may not be consistent with what was expected. Also, when a company uses artificial intelligence data and sees hundreds of millions of data items, there's no guarantee that all the data is clean or compliant.

In my opinion, if restrictions on data use become much tighter, it could also hamper economic development. This ecosystem is developing at breakneck speed, and our companies must remain competitive. To that end, they need to use data. If data control is too restricted, it could slow down the development of systems. A smart system isn't developed overnight. It takes time.

As a result, data use must be controlled, but this control can't be exhaustive. There could be a form of supplementary control. This matters given that data lies at the heart of artificial intelligence. The more restrictions and reporting requirements are imposed on companies, the more it will adversely affect the economy. I don't know the extent of that impact. That said, global competition in this area is enormous.

5:30 p.m.

Liberal

The Chair Liberal Joël Lightbound

Thank you, Mr. Mellouli.

I think that you have hit the nail on the head when it comes to our concern. We need to strike a balance between these two interests, which don't always see eye to eye.

Mr. Lemire, the floor is yours.

5:35 p.m.

Bloc

Sébastien Lemire Bloc Abitibi—Témiscamingue, QC

Thank you, Mr. Chair.

Mr. Letarte, we heard that the bill doesn't clearly identify what qualifies as an adverse effect, particularly when it comes to exempting an organization with a legitimate interest from the need to obtain an individual's consent to collect, use and share their data.

In your expert opinion, how should the bill clarify this provision on adverse effects, and how could this ambiguity affect privacy?

5:35 p.m.

Head of Policy and Public Affairs, Flinks

Philippe Letarte

It's necessary to look at the reason for an adverse effect. A violation of privacy may be good for a company, but is it good for the consumer? There's always some sort of dilemma.

For example, a person's consumption habits can reveal very private information. For instance, these habits can show whether a person has started a diet, bought a house, cut back on spending or changed jobs. A company could get hold of this person's data to create a profile. The company could then notice that the person has changed their consumption habits and that they could benefit from new discounts. Technically, this would be a monetary benefit for the consumer. However, I personally don't think that it's good for a company to have that much information on a person.

The idea is to identify what qualifies as a positive or adverse effect using case studies. The consumer must always come first. When a company collects too much information on a person, it can become a major issue for them.

5:35 p.m.

Bloc

Sébastien Lemire Bloc Abitibi—Témiscamingue, QC

One adverse effect could be to conclude by association that the person is suffering from depression or has mental health issues.

I know that you like to look for innovative best practices. Could any best practices or models used in other places help clarify the provision on adverse effects connected to the legitimate interest exception?

5:35 p.m.

Head of Policy and Public Affairs, Flinks

Philippe Letarte

Yes. In Europe, the General Data Protection Regulation covers the entire continent and is extremely specific when it comes to legitimate interest. It also provides for various exemptions. It even establishes what qualifies as direct marketing and the circumstances that prohibit it. A number of bills determine whether highly targeted and relevant advertising can be deemed positive or adverse. Once again, Australia does more or less the opposite. It prohibits direct marketing, except in certain cases, and it clarifies these exceptions.

There are a number of good practices. The advantage of lagging behind the rest of the world in this area is that we can choose the approach that suits us best.

5:35 p.m.

Bloc

Sébastien Lemire Bloc Abitibi—Témiscamingue, QC

Thank you.

5:35 p.m.

Liberal

The Chair Liberal Joël Lightbound

Thank you.

Mr. Masse, the floor is yours.

5:35 p.m.

NDP

Brian Masse NDP Windsor West, ON

Thank you.

I'd like to continue with your opinions on the United States and its process right now. If we take a different approach, how will that potentially affect investment and trade, given that we have many companies that are matching up across the border?

Thank you.

5:35 p.m.

Head of Policy and Public Affairs, Flinks

Philippe Letarte

The good news is that the CFPB released its first set of rules a couple of weeks ago, which closely outline what it wants to do. Of course, the CFPB doesn't cover everything, and it takes more of a laissez-faire approach on some things, but for the first time, it clearly states that it wants to create a universal data portability right, and it's going to be imposed on financial institutions. Once that's done and we have the same approach—and I know people at Finance Canada are talking to people from the CFPB as well—I don't think it's going to be that difficult to do trade across the border, because the big themes and the base principles are quite similar.

5:35 p.m.

NDP

Brian Masse NDP Windsor West, ON

Right. With that comes more discretion for the consumer to choose the level of exposure they want.

5:35 p.m.

Head of Policy and Public Affairs, Flinks

Philippe Letarte

Absolutely, and they can choose the time of exposure. For example, if you want to try two different companies for the same product and you prefer one of them, you can drop the other one immediately. Your data is then no longer used by that company.

It's really about the power of the consumer and the time in which you can revoke your consent.

5:35 p.m.

NDP

Brian Masse NDP Windsor West, ON

Thank you.

Thank you, Mr. Chair.

5:35 p.m.

Liberal

The Chair Liberal Joël Lightbound

Thank you, Mr. Masse.

This concludes the 100th meeting of the House of Commons Standing Committee on Industry and Technology.

I want to thank the panel. I'll take this opportunity to point out that meetings held in French to this extent in Ottawa are more the exception than the rule. Personally, I'm delighted that this was the case for the 100th meeting.

On that note, thank you. I also want to thank Mr. Mellouli and Mr. Gambs, who joined us virtually. I particularly want to acknowledge Mr. Mellouli, who is from Université Laval, in my constituency. I would also like to thank the interpreters, the analysts and the clerk.

We'll briefly suspend the meeting before continuing in camera for committee business.

The meeting is suspended.

[Proceedings continue in camera]