Evidence of meeting #95 for Access to Information, Privacy and Ethics in the 44th Parliament, 1st Session. (The original version is on Parliament’s site, as are the minutes.)

A recording is available from Parliament.

Also speaking

Brett Caraway, Associate Professor of Media Economics, University of Toronto, As an Individual
Emily Laidlaw, Associate Professor and Canada Research Chair in Cybersecurity Law, University of Calgary, As an Individual
Matt Malone, Assistant Professor, Thompson Rivers University, As an Individual
Sam Andrey, Managing Director, The Dais
Joe Masoodi, Senior Policy Analyst, The Dais

4:25 p.m.

Liberal

Iqra Khalid Liberal Mississauga—Erin Mills, ON

Thank you very much.

My apologies, Dr. Laidlaw. I didn't address you as I should have. That's my bad.

4:25 p.m.

Associate Professor and Canada Research Chair in Cybersecurity Law, University of Calgary, As an Individual

Dr. Emily Laidlaw

I didn't even notice, but thank you.

4:25 p.m.

Conservative

The Chair Conservative John Brassard

Thank you, Ms. Khalid.

We now go to Mr. Villemure.

Before we continue with Mr. Villemure, I want to make sure that our guests have their interpretation on, if they need it.

Go ahead, Mr. Villemure. You have six minutes.

4:25 p.m.

Bloc

René Villemure Bloc Trois-Rivières, QC

Thank you, Mr. Chair.

Thank you to our two witnesses for being here today. Their reputations precede them.

I'm going to start with Ms. Laidlaw.

You aren't a fan of self-regulation, are you?

4:25 p.m.

Associate Professor and Canada Research Chair in Cybersecurity Law, University of Calgary, As an Individual

Dr. Emily Laidlaw

No, I am not, but I am a fan of giving room to the companies to come up with the solutions themselves.

4:25 p.m.

Bloc

René Villemure Bloc Trois-Rivières, QC

If I asked you today to set up a regulator—hopefully, not the CRTC—what would you recommend?

4:25 p.m.

Associate Professor and Canada Research Chair in Cybersecurity Law, University of Calgary, As an Individual

Dr. Emily Laidlaw

I recommend that we create an online safety regulator and that they have an obligation to investigate companies and to audit companies for their compliance with specific duties. I think the duty should be a duty to act responsibly with, perhaps, a special duty of care to children.

I think the regulator should also have a very important education role with the public. We have realized that so much of this is about lifting up the capacity and understanding of the public, and also holding companies accountable.

4:25 p.m.

Bloc

René Villemure Bloc Trois-Rivières, QC

Would that regulator be similar to the Conflict of Interest and Ethics Commissioner or the Commissioner of Lobbying? In other words, would the regulator be someone appointed by Parliament, or would it be a public servant in a department?

4:25 p.m.

Associate Professor and Canada Research Chair in Cybersecurity Law, University of Calgary, As an Individual

Dr. Emily Laidlaw

I think it's absolutely crucial that this regulator is independent from government. It would be more akin to the Privacy Commissioner because you would be creating a digital human rights regulator. They need to be independent from any pressure when it comes to how to balance rights. It needs to be through a legal lens and a corporate accountability lens. Also, there needs to be the power to impose quite hefty monetary penalties, as Dr. Caraway mentioned.

4:30 p.m.

Bloc

René Villemure Bloc Trois-Rivières, QC

Precisely. Earlier, someone said the penalties should be a percentage of the company's revenues, as opposed to a $25,000 fine, which is trivial under the circumstances.

You talked a lot about education, as have all the witnesses we've heard from, including police officials. Ultimately, though, no one has said who should educate who. Perhaps the new regulator should have a mandate to provide that education, even in schools, since we are talking about young people.

What does that education look like to you? Everyone is in favour of education, but no one has put forward a solution as of yet.

4:30 p.m.

Associate Professor and Canada Research Chair in Cybersecurity Law, University of Calgary, As an Individual

Dr. Emily Laidlaw

I think it's crucial that it's through the regulator, and we've seen this in Australia with their eSafety Commissioner. I think that would be the model here for a regulator and educator.

I think partly it's that the education across the country and in different schools and communities varies greatly, and it depends on people reaching out for the information. It depends on schools bringing in the right people. At the moment, there is a lot of just scaring children or parents, and most of the studies show that's ineffective. I've tried to say that to my children's school, and they've been really receptive.

I think education is so core to this that the regulator needs that as part of their mandate.

4:30 p.m.

Bloc

René Villemure Bloc Trois-Rivières, QC

The regulator should have the authority to go into schools to deliver that education. Is that right?

4:30 p.m.

Associate Professor and Canada Research Chair in Cybersecurity Law, University of Calgary, As an Individual

Dr. Emily Laidlaw

That's an interesting question about federal-provincial powers. The regulator certainly could set the curriculum and provide the resources that would hopefully influence what different schools and even municipalities are implementing, and so on. I guess the hope is that this would trickle down. Ultimately, there is a provincial aspect to this, so if we start seeing provincial regulators appear, then maybe they could work together, much as we have seen with the privacy commissioners.

4:30 p.m.

Bloc

René Villemure Bloc Trois-Rivières, QC

I am a staunch advocate of respecting provincial jurisdiction.

You said Canada was a laggard in digital legislation.

Is it too late?

I'm quite familiar with the European law. We can try to catch up, but is it too late?

4:30 p.m.

Associate Professor and Canada Research Chair in Cybersecurity Law, University of Calgary, As an Individual

Dr. Emily Laidlaw

We're not too late now, but we will be soon if we don't introduce laws. Europe and the U.K. just passed their online safety legislation—the EU's Digital Services Act and the U.K.'s Online Safety Act—within the last year, and they're in the midst of implementing it.

If you fast-forward five years, what I think we're going to see is more coordinated global investigations of companies, which takes care of some of the cross-border issues. If Canada doesn't move on this in the next year or so, I think they will fall woefully behind. However, right now we do have a late-mover advantage.

4:30 p.m.

Bloc

René Villemure Bloc Trois-Rivières, QC

In 30 seconds, can you tell me whether you support Bill C‑27 as it currently stands?

4:30 p.m.

Associate Professor and Canada Research Chair in Cybersecurity Law, University of Calgary, As an Individual

Dr. Emily Laidlaw

I fully support the recommendations for amendments by Commissioner Dufresne regarding Bill C-27. I think it needs to be amended. I think it only solves part of the problem, because it's still a consent paradigm. Also, as long as it relies on consent, it doesn't dive into some of the more problematic aspects of social media and their influence, which, really, nobody can consent to.

Therefore, unless we wholly change Bill C-27, which I don't think we'll do, we need online harms legislation. I do think the AI act is problematic and needs to be pulled out of Bill C-27 and reworked. It absolutely should not be set up under ISED as a commissioner within that body.

4:30 p.m.

Bloc

René Villemure Bloc Trois-Rivières, QC

Thank you very much.

4:30 p.m.

Conservative

The Chair Conservative John Brassard

Thank you, Mr. Villemure.

Being aware of the time and the votes, what I am thinking—and I want you to think about this as well—is that we can go six minutes with Mr. Green. We're going to need some time to switch over to the next panel. We could have the opening statements. I expect we're going to have two opening statements in the next panel.

That would take us roughly up to the time of the votes, but it would end this round after Mr. Green. I would encourage our witnesses to submit any additional thoughts they may have.

Mr. Green, go ahead for six minutes, please.

4:30 p.m.

NDP

Matthew Green NDP Hamilton Centre, ON

Thank you very much, Mr. Chair.

I want to pick up on some of this, particularly around Bill C-27. I myself think that this portion of the bill would have been better dealt with here under an ethical framework rather than an industry one.

Dr. Laidlaw, can you maybe talk about the ethics of AI and why, from a legal standpoint, those considerations around the legitimacy of democracy and the ways in which AI is undermining society would probably be best situated as a carve-out, as you just suggested?

4:35 p.m.

Associate Professor and Canada Research Chair in Cybersecurity Law, University of Calgary, As an Individual

Dr. Emily Laidlaw

That's a great question. Thank you.

I think we have seen, just in the last year, the way AI has transformed our society, and we're just at the beginning of that journey. The problem with the AI act, as it stands right now, is that it's not sufficiently developed to be able to actually cope with the different problems we're going to face. It needs to be carved out so that we can actually sit down and have a proper discussion about the ways in which AI can be used that fundamentally will disrupt democracy, interfere with our ability to make decisions and create physical risks to us individually or collectively.

We need to break down those various risks and opportunities and draft legislation that reflects that. I think we do have a model, as well, in Europe that can help us. However, as it stands, the AI act must be amended.

4:35 p.m.

NDP

Matthew Green NDP Hamilton Centre, ON

I want to get more specific.

You referenced the undermining of democracy. I'll reference the case of Cambridge Analytica, where we know that Facebook did not undertake sufficient oversight to ensure that the use of data was done according to its own terms of service.

I think I heard in your testimony that having the industry regulate itself is a problem, although it might be able to present some solutions.

How confident should we be that social media companies have a full grasp of how their data is being used and whether that data is being properly protected? Further to that, do you think they know and just perhaps allow it to happen anyway?

4:35 p.m.

Associate Professor and Canada Research Chair in Cybersecurity Law, University of Calgary, As an Individual

Dr. Emily Laidlaw

I think it's a bit of both. I think they are not providing the full picture. I do not think they fully know what is happening.

For example, a colleague of mine, Joel Reardon, has done some reverse engineering of various apps that say they put in place all the child protection measures. What has been revealed through this is that many apps have not.

Essentially, we're relying on people finding this out and then having a scandal. That's just woefully insufficient here. Transparency on its own is meaningless. We actually need some sort of avenue to investigate, audit and lift the lid on these companies. Otherwise, we end up with a crisis like Cambridge Analytica.

4:35 p.m.

NDP

Matthew Green NDP Hamilton Centre, ON

Let's go back to that.

In your opinion, are there ways, through legislation and regulation, that the federal government could do a better job of protecting the personal information collected and used by these platforms?

4:35 p.m.

Associate Professor and Canada Research Chair in Cybersecurity Law, University of Calgary, As an Individual

Dr. Emily Laidlaw

I think one thing we need to look more closely at is what the no-go zones are. There are actually certain forms of collecting data that should be seen as wholly inappropriate. I think we still rely so much on consent that it has—