Evidence of meeting #99 for Industry, Science and Technology in the 44th Parliament, 1st Session. (The original version is on Parliament’s site, as are the minutes.)

A recording is available from Parliament.

Also speaking

Barry Sookman  Senior Counsel, McCarthy Tétrault, As an Individual
Elizabeth Denham  Chief Strategy Officer, Information Accountability Foundation
Kristen Thomasen  Assistant Professor, Peter A. Allard School of Law, University of British Columbia, Women's Legal Education and Action Fund
Geoffrey Cape  Chief Executive Officer, R-Hauz, As an Individual
Andrée-Lise Méthot  Founder and Managing Partner, Cycle Capital, As an Individual

4:45 p.m.

Bloc

Sébastien Lemire Bloc Abitibi—Témiscamingue, QC

Thank you, Mr. Chair.

Mr. Sookman, at the ALL IN summit on AI, you hosted a panel entitled “Creating tomorrow today: AI, copyright, and the wisdom of experience”. I'd be very curious to hear what you have to say about it.

What are the concerns regarding copyright protection, particularly in the cultural sector, but also in a research context?

What are the weak points of Bill C‑27 when it comes to protecting artificial intelligence?

We know that Canada's Copyright Act is now out of date and that it generally provides little copyright protection.

Will Bill C‑27 push us down into an even deeper hole?

4:50 p.m.

Senior Counsel, McCarthy Tétrault, As an Individual

Barry Sookman

Mr. Lemire, thank you very much for that very important question.

The draft law that's in front of you doesn't deal with intellectual property rights at all, and that's unlike the EU legislation, which has at least a provision requiring transparency in data that's disclosed. It's also unlike the draft U.K. bill that requires compliance with copyright laws, and it is also not consistent with draft French legislation, which would also require compliance with copyright laws for training models.

As you know, there is an ongoing consultation by ISED and the Department of Canadian Heritage that asks a number of questions, including whether the act needs to be changed and, in particular, whether there should be a new text and data mining exemption. That is a very important consultation. It raises the balance between the ability of creators, including many creators from Quebec, to control the uses of their works and to get compensation when models are trained on those works, and the interest that AI entrepreneurs and larger businesses have in using works to train their models. It's a question with policy considerations in it.

In my view, the existing law adequately sets the standard. While training models would involve the reproduction right, which would be prima facie infringement, such uses are always subject to the fair dealing exception, and fair dealing is the best way to calibrate, using the current law and the current principles, the use of works without consent.

4:50 p.m.

Bloc

Sébastien Lemire Bloc Abitibi—Témiscamingue, QC

Mr. Chair, how much time do I have left?

4:50 p.m.

Liberal

The Chair Liberal Joël Lightbound

You may ask another question if you wish, Mr. Lemire.

4:50 p.m.

Bloc

Sébastien Lemire Bloc Abitibi—Témiscamingue, QC

All right. I'll save my question for the next round.

4:50 p.m.

Liberal

The Chair Liberal Joël Lightbound

Thank you.

Go ahead, Mr. Masse.

4:50 p.m.

NDP

Brian Masse NDP Windsor West, ON

Thank you, Mr. Chair.

For Ms. Thomasen, you mentioned in your original submission to us that special attention needs to be taken...and also to separate the policy. This summer I had the chance to attend, in the United States, a couple of different conferences put on by Amazon, Google, Meta and a few others who were lobbying at the national and state legislature levels. One of the more interesting aspects was that we seemed to be in their hands as they're developing their own AI and trying to come to grips with the imbalances and the biases that they have, because the people who are contributing to the AI are not reflective of society.

Maybe you can highlight that a little bit with regard to some of the concerns you have in terms of how we're currently allowing some models of AI to go forward with very little balance in many respects for societal issues related to gender, race, ethnicity and so forth.

4:50 p.m.

Assistant Professor, Peter A. Allard School of Law, University of British Columbia, Women's Legal Education and Action Fund

Dr. Kristen Thomasen

Thank you. That's an excellent question.

My first reaction would be that this goes back to an important need for this bill to allow for regulations that will address discriminatory, biased outputs from AI systems. If the bill is structured in a way that the obligation is clear and actually captures the range of ways in which algorithmic bias can arise, then companies will have an impetus to hire teams that are better enabled to anticipate and mitigate some of those harms.

One of the comments I included in my introduction was that one of the limits of this bill as it's structured right now is that many instances of discriminatory bias have been identified after the fact, usually through investigative reporting by experts or by people who experience these harms themselves and so understand the kind of harm that might arise. Then, when that becomes publicly known and there's a public backlash, the explanation at that point is that it was unforeseeable at the time the system was being developed, or that the initial idea was simply to automate a decision-making process previously done by people, for example.

This bill, in the structure it is in right now, needs to be enabled to capture not just discrimination on recognized grounds but also discrimination by proxy for a recognized ground. A postal code, employment status, previous experience of imprisonment or social networks might stand in for a protected ground and influence algorithmic decision-making. This bill needs to capture all the complexity of how these harms can arise so that companies are then motivated to ensure, to the best of their abilities, that this kind of discrimination doesn't happen.

This is also why we recommended an equity audit, which obviously would need more structure probably in regulation to really signal and advance the importance of equity and anti-discrimination in the development of these systems.

It's a crucially important point. I honestly do think that the companion document reflects the importance of that, but we don't see it fleshed out in the bill right now.

4:55 p.m.

Senior Counsel, McCarthy Tétrault, As an Individual

Barry Sookman

Mr. Chair, could I add just a very quick supplemental to that?

Mr. Masse, the question you ask is very important, but we need to recognize that one of the reasons there is discrimination in algorithms is that they're being trained on data that is currently being used in other contexts.

I think we need to recognize that this problem is not just an algorithmic problem. It's a problem with discrimination that we have in this country that's not being addressed. What's the best entity to deal with that? It's the Canadian Human Rights Commission.

We don't need to solve this problem with AIDA. There's ample authority under the Canadian Human Rights Act to make extensive regulations. The commission needs to be given the power to deal with all discrimination that's currently not covered, including algorithmic discrimination. Put the expertise there, and do not split the mandate so that the Human Rights Commission is dealing only with older forms of discrimination but not algorithmic discrimination. Let them deal with this so they can take a holistic view and address the problem.

4:55 p.m.

NDP

Brian Masse NDP Windsor West, ON

I'll just wrap up real quickly.

I appreciate that. What I saw was extremely disturbing because right now all the AI that's being launched is being launched under the so-called generosity of the companies' own interpretations of what they believe is fair and what they're trying to do for the models they're launching.

The human rights thing I take note of as well.

Thank you to the online witnesses too.

Thank you, Mr. Chair.

4:55 p.m.

Liberal

The Chair Liberal Joël Lightbound

Thank you.

Mr. Vis, the floor is yours.

4:55 p.m.

Conservative

Brad Vis Conservative Mission—Matsqui—Fraser Canyon, BC

Thank you, Mr. Chair.

Thank you to all of the witnesses today.

The chief information officer of Canada, Catherine Luelo, recently resigned from her position. She commented recently that the public service needs to build “credibility” with respect to a fractured IT system, with up to two-thirds of government departments managing information technology systems poorly.

Mr. Sookman, in that context, do you think it is responsible for us as legislators to allow the AI bill to go forward?

4:55 p.m.

Senior Counsel, McCarthy Tétrault, As an Individual

Barry Sookman

Mr. Vis, thank you for the question.

I'm not familiar with that, and I don't know that it provides any insight into whether this bill would be adequate because there isn't proper management of IT.

However, to the extent that there's a requirement for expertise related to the regulation of AI, that expertise doesn't currently exist, in my view, within ISED. This is very complex. If you can't manage an IT system, how are you going to manage an AI ecosystem? If that's your point, I agree with you.

4:55 p.m.

Conservative

Brad Vis Conservative Mission—Matsqui—Fraser Canyon, BC

That came directly from the recent Auditor General's report, which basically outlined that the government is failing Canadians as it relates to the management of data that the Government of Canada collects on behalf of all of us.

Ms. Denham, thank you for your testimony as well. I was very interested in some of the work you've done in the U.K., and I really believe that your testimony provides a lot of weight with respect to the amendments we need to make in respect of children.

Prior to this meeting, I looked at “Age appropriate design: a code of practice for online services”, which came from the Information Commissioner's Office in the U.K. In that document, it is outlined that there are 15 principles related to the protection of children's data. Those include the best interests of the child, data protection impact assessments, age-appropriate application, transparency—that's in respect to how companies are using data, I presume—detrimental use of data, policies and community standards, default settings, data minimization, data sharing, geolocation, parental controls, profiling, nudge techniques, connected toys and devices, and other online tools.

I presume, given your expertise, that you're somewhat familiar with the code of practice outlined in the United Kingdom, which is provided for under section 123 of the U.K. Data Protection Act 2018.

Given your expertise, would it be your recommendation that this committee adopt a similar code of practice to ensure that children across Canada are not subject to online harms?

5 p.m.

Chief Strategy Officer, Information Accountability Foundation

Elizabeth Denham

Thank you for that question.

In all the work I've done as a regulator over 15 years, the work that I think is most impactful and the work that I'm most proud of is developing the age-appropriate design code.

5 p.m.

Conservative

Brad Vis Conservative Mission—Matsqui—Fraser Canyon, BC

Did you design this code?

5 p.m.

Chief Strategy Officer, Information Accountability Foundation

Elizabeth Denham

The code is a statutory code, which means it's enforceable by the commissioner and by the court. It's not guidelines.

It follows directly from the U.K. GDPR. It has some provisions in it that I would recommend need to be in the CPPA.

One of them needs to be a statement in the preamble or in the purpose statement that recognizes that companies need to provide services in the best interests of the child. That language comes out of the UN convention that I mentioned earlier. Canada is a signatory to that.

The best interests of the child—

5 p.m.

Conservative

Brad Vis Conservative Mission—Matsqui—Fraser Canyon, BC

Thank you for that.

On that point, please be assured that the Conservative Party will be putting forward an amendment. Hopefully the other parties will work with us to see a clause about “in the best interests of children” put into this legislation.

I have a point of clarification. My mic was off earlier.

Did you write the code standards for children in the U.K.?

5 p.m.

Chief Strategy Officer, Information Accountability Foundation

Elizabeth Denham

Yes, I did—with my team.

We were directed by Parliament to write a code. It was a requirement in the act. It was a one-line requirement to have an age-appropriate design code.

My team worked for two years. We had 57 round tables across all industry sectors. We worked with child psychologists, parents, children themselves, the gaming industry, and the social media and video-sharing companies to come up with a code.

The code has been deeply inspiring to many countries around the world.

5 p.m.

Conservative

Brad Vis Conservative Mission—Matsqui—Fraser Canyon, BC

Thank you. It's very deeply inspiring to me, as a parent of three young children, and to many other parliamentarians across the political spectrum.

With respect to the relationship between Parliament, the privacy commissioner in the U.K. and the design code, does the design code and the legislation in the U.K. provide for flexibility to account for future technologies and future harms that children may face?

5 p.m.

Chief Strategy Officer, Information Accountability Foundation

Elizabeth Denham

It does.

Obviously, the commissioner's office developed the code, but it was laid before Parliament. Parliament approved the code, which gives it status before the courts: the code must be taken into consideration.

5 p.m.

Conservative

Brad Vis Conservative Mission—Matsqui—Fraser Canyon, BC

I have one final, quick question. I'm so sorry to interrupt you.

Do you believe that, if we took a similar measure here for Canadian children and gave a code the same status it has in the U.K., we would be doing the right thing to protect children in Canada?

5 p.m.

Chief Strategy Officer, Information Accountability Foundation

Elizabeth Denham

Yes. Because the CPPA gives the commissioner a new power to create codes and certifications, there would be a vehicle in the CPPA for the commissioner to develop such a code, as is happening across northern Europe and in Argentina, Turkey, Ireland and many other places around the world. The state of California has such a code.

5 p.m.

Liberal

The Chair Liberal Joël Lightbound

Thank you, Mr. Vis. That's all the time you had.

I yield the floor to Ms. Lapointe.

November 28th, 2023 / 5 p.m.

Liberal

Viviane Lapointe Liberal Sudbury, ON

Thank you.

My question is for Ms. Thomasen.

The LEAF organization submitted a brief to this committee in September.

Can you expand on your recommendation to amend proposed section 5 by adding the language of “identifiable group”, “collectively owned” and “group or collective”?

As you explain that amendment, can you also tell us why it is important?