Evidence of meeting #126 of the Standing Committee on Justice and Human Rights in the 44th Parliament, 1st Session. (The original version is on Parliament’s site, as are the minutes.)

A recording is available from Parliament.

Also speaking

Anaïs Bussières McNicoll  Director, Fundamental Freedoms Program, Canadian Civil Liberties Association
Catherine Claveau  Bâtonnière du Québec, Barreau du Québec
Witness-Témoin 1  As an Individual
Nicolas Le Grand Alary  Lawyer, Secretariat of the Order and Legal Affairs, Barreau du Québec
Michel Marchand  Member, Criminal Law Expert Group, Barreau du Québec
Emily Laidlaw  Associate Professor and Canada Research Chair in Cybersecurity Law, University of Calgary, As an Individual
Étienne-Alexis Boucher  President, Droits collectifs Québec
Matthew Hatfield  Executive Director, OpenMedia

4:55 p.m.

As an Individual

Witness-Témoin 1

I'm sorry. Were you directing your question towards me?

Chris Bittle Liberal St. Catharines, ON

I was. Thank you.

4:55 p.m.

As an Individual

Witness-Témoin 1

Yes, you're correct. There really wasn't much help in the very beginning.

I didn't really hear your full question. I didn't realize you were directing it to me. I'm sorry.

Chris Bittle Liberal St. Catharines, ON

That's okay. I can rephrase it, because the alternative being suggested is that victims be required to take the matter to a court and a judge themselves, which is, I think, well meaning but can also take time.

I was wondering if you could speak to how you would see having someone like a digital safety commissioner act on your behalf, versus being required to take it to a judge or a court on your own.

4:55 p.m.

As an Individual

Witness-Témoin 1

Yes, I'm totally all for that. I like to focus on the platforms themselves. Those people should be responsible for what content they're sharing. If they had some responsibility, then they wouldn't be allowed to continue to exploit my child.

Chris Bittle Liberal St. Catharines, ON

You're absolutely right.

Again, thank you for bringing this forward and speaking up, because I can tell you—and I think Mr. Julian could agree with me—that dealing with large tech companies has not been easy. They've fought regulation along the way, but there are consequences and victims, and there's a requirement for government to act. I think there's some disagreement around this table on what that looks like. I think there is unanimity in acting to protect our kids.

I'll turn to Madam McNicoll.

You spoke about protection of privacy versus protection of freedom of speech. I was wondering if you could comment on part 1 of the legislation and its protection of privacy for the victims of these images, and on the challenge of balancing freedom of expression against the protection of privacy and the rights of the individuals who are exploited online.

4:55 p.m.

Director, Fundamental Freedoms Program, Canadian Civil Liberties Association

Anaïs Bussières McNicoll

Thank you for the question.

First, there are indeed specific legal obligations, as suggested in part 1 of the bill. These obligations would make it possible to quickly ensure the removal of particularly harmful content. I'm thinking here of content that sexualizes children or perpetuates the victimization of survivors, as well as intimate content shared without consent. In this sense, I consider it a significant step forward.

That said, there are still privacy issues in part 1 of this bill. As a result, one of our recommendations seeks to clarify that the obligations of operators and the obligations of the Digital Safety Commission of Canada and other regulators must respect the privacy of users and operators.

Let me explain.

Of course, we know that operators have access to users' personal information as part of their activities. We also know that certain federal legislation already regulates the collection, retention, protection and sharing of confidential and private information. The failure to specifically refer to these obligations can lead to confusion for operators.

4:55 p.m.

The Vice-Chair Conservative Larry Brock

Thank you. That's your time, Mr. Bittle.

We'll move on to Monsieur Fortin.

You have the floor for two and a half minutes.

Rhéal Fortin Bloc Rivière-du-Nord, QC

Thank you, Mr. Chair.

I would like to ask Ms. Claveau or the other Barreau du Québec representatives about life imprisonment.

I gather that the Barreau du Québec considers the provision somewhat broad when it sets out this penalty for a wide range of offences. I also share this view and find it worrying.

However, if we want to convey the seriousness and gravity of the type of offence involved, is there any way to increase the penalty?

I understand that you're proposing to review sentences one by one. Couldn't we include a provision whereby, in certain set cases, the maximum or minimum penalty would be double the prescribed penalty?

Could this be a good option to look into, or do we really need to proceed offence by offence and set out specific penalties?

5 p.m.

Bâtonnière du Québec, Barreau du Québec

Catherine Claveau

Thank you for the question.

I'll refer you to page 8 of our brief. You'll see that we're proposing this exact solution. We're proposing to increase the current penalties, in some cases by even more than the suggested doubling. This obviously depends on the offence.

My colleague, Mr. Le Grand Alary, will elaborate on this.

5 p.m.

Lawyer, Secretariat of the Order and Legal Affairs, Barreau du Québec

Nicolas Le Grand Alary

We drew from a current provision concerning intimate partner violence. Where the offence is motivated by hate, the maximum sentence would be increased as follows: two‑year sentences would become five‑year sentences, five‑year sentences would become 10‑year sentences, 10‑year sentences would become 14‑year sentences and 14‑year sentences would become life sentences.

You must also understand that the calculations may not always amount to exactly double. There may be nuances, but this is in line with the logic of the Criminal Code—

Rhéal Fortin Bloc Rivière-du-Nord, QC

Mr. Le Grand Alary, sorry to interrupt. I don't mean to be rude, but I have only a few seconds left.

I gather that the one‑size‑fits‑all solution of simply doubling the penalties isn't a good idea.

Is that right?

5 p.m.

Lawyer, Secretariat of the Order and Legal Affairs, Barreau du Québec

Nicolas Le Grand Alary

That's right. When an offence carries a sentence of 14 years or more, the defendant is entitled to a preliminary inquiry. When it carries a sentence of five years or more, the defendant is entitled to a jury trial.

The Criminal Code already contains a variety of sentencing scales. Merely doubling the sentences may not be the solution. It might be a matter of reviewing them in light of the scales already established for various types of offences.

Rhéal Fortin Bloc Rivière-du-Nord, QC

Thank you, Mr. Le Grand Alary.

I think that my time is up, Mr. Chair.

5 p.m.

The Vice-Chair Conservative Larry Brock

That's the time. Thank you.

Mr. Julian, you have two and a half minutes.

Peter Julian NDP New Westminster—Burnaby, BC

Thank you, Mr. Chair.

I would like to turn again to Ms. Claveau and Ms. Bussières McNicoll.

Ms. Claveau, you spoke about section 13 of the Canadian Human Rights Act and the Canadian Human Rights Commission.

The minister has already expressed an interest in removing this clause from the bill. However, reinstating section 13 in the Canadian Human Rights Act may hamper the Canadian Human Rights Commission's ability to implement the major process required to handle complaints, given that it already lacks resources to do its job.

Ms. Claveau, are you concerned about this situation?

5 p.m.

Bâtonnière du Québec, Barreau du Québec

Catherine Claveau

Yes, we're concerned about the situation.

Peter Julian NDP New Westminster—Burnaby, BC

Along the same lines, if we look at the bill and the current situation, the Digital Safety Commission of Canada set out in the bill won't necessarily have all the resources needed to do its job either.

Are you also concerned that a situation similar to the one experienced by Witness 1 could happen again? In that case, the issue wasn't addressed promptly. Swift action should have been taken and tough measures put in place to address the victimization of children.

5 p.m.

Bâtonnière du Québec, Barreau du Québec

Catherine Claveau

The Barreau du Québec believes that, generally, you must make sure you have all the resources needed to implement the legislation. Otherwise, it won't work. Ensuring that is really important.

Peter Julian NDP New Westminster—Burnaby, BC

Thank you.

I would like to ask Ms. Bussières McNicoll the same question regarding the need to provide the necessary resources for the Canadian Human Rights Commission and the newly created Digital Safety Commission of Canada.

5 p.m.

Director, Fundamental Freedoms Program, Canadian Civil Liberties Association

Anaïs Bussières McNicoll

Thank you for the question.

You're quite right about the current human rights tribunals. We elaborate on this topic in the brief that we sent you. Their lack of resources and case backlogs are well documented. It's hard to see how adding hate speech cases to their workload without allocating significant resources would help them. From a strictly pragmatic perspective, this raises an issue.

We also have other concerns about asking these tribunals, which have highly specific and significant expertise in equality rights, to regulate hate speech and freedom of expression in Canada.

I have no particular comments regarding the second part of your question. If a new entity is set up to do this work or if new regulators are created, they must receive proper funding.

5:05 p.m.

The Vice-Chair Conservative Larry Brock

Thank you. That is your time, Mr. Julian.

Thank you to all of the witnesses in the first round. We appreciate your time and attention.

We're now going to suspend for a few minutes.

5:10 p.m.

The Vice-Chair Conservative Larry Brock

I call the committee back to order.

We have, for the second panel, Emily Laidlaw from the University of Calgary; Étienne-Alexis Boucher from Droits collectifs Québec; and Matthew Hatfield from OpenMedia.

Members, all these witnesses appearing by video conference have been tested. They all qualify.

That being said, I'd like to turn matters over to the witnesses for their opening statements.

We'll start with you, Ms. Laidlaw, for five minutes.

Dr. Emily Laidlaw Associate Professor and Canada Research Chair in Cybersecurity Law, University of Calgary, As an Individual

Thank you for the invitation to appear before you.

My name is Emily Laidlaw. I'm a Canada research chair and associate professor of law at the University of Calgary.

At the last committee meeting, and earlier today, you heard horrific stories, bringing home the harms this legislation aims to address. With my time, I'd like to focus on the legal structure for achieving these goals, why law is needed, why part 1 of Bill C-63 is structured the way it is and what amendments are needed.

My area of expertise is technology law and human rights: specifically, platform regulation, freedom of expression and privacy. I have spent my career examining how best to write these kinds of laws. I will make three points with my time.

First, why do we need a law in the first place? When the Internet was commercialized in the 1990s, tech companies became powerful arbiters of expression. They set the rules and decide how to enforce them. Their power has only grown over time.

Social media are essentially data and advertising businesses and, now, AI businesses. How they deliver their services to consumers and how they design their products can directly cause harm. For example, the way they design their algorithms makes decisions that affect our mental health, pushing content that encourages self-harm and hate. They use persuasive techniques to nudge addictive behaviour, such as endless scrolling, rewards and constant notifications.

Thus far in Canada, we have largely relied on corporate self-governance. The EU, U.K. and U.S. passed legislation decades ago. Many are on their second-generation versions of these laws, and a network of regulators is working together to create global coherence.

Meanwhile, Canada has never passed a comprehensive law in this space. The law that does apply is piecemeal, mainly a bit of defamation, privacy and competition law, circling important dimensions of the problem, but not dealing with it directly.

Where does that leave us in Canada? Part 1 of Bill C-63 is the product of years of consultation, to which I contributed. In my view, with amendments, it is the best legal structure to address online harms.

That brings me to my second point. This legislation impacts the right to freedom of expression.

Our expert panel spent considerable time on how best to protect freedom of expression, and the graduated approach we recommended is reflected in this bill.

There are three levels to this graduated approach.

First, the greatest interference with freedom of expression is content removal, and the bill requires that for only two types of content, the worst of the worst, the stuff that we all agree should be taken down: child sexual abuse material and the non-consensual disclosure of intimate images, both of which are crimes.

At the next level is a special duty to protect children, recognizing their unique vulnerability. The duty requires that social media integrate safety by design into their products and services.

The third, the foundation, is that social media have a duty to act responsibly. This does not require content removal. It requires that social media mitigate the risks of exposure to harmful content.

In my view, the bill aligns with global standards because it's focused on systemic risks of harm and takes a risk mitigation approach, coupled with transparency obligations.

Third, I am not here to advocate that the bill be passed as is. The bill is not perfect. It should be carefully studied and amended.

There are also other parts of the bill that don't necessarily need to be amended but entail hard choices that should be debated: the scope of the bill; which harms are included and which are not; which social media are included, based on size or type; the regulatory structure; a new versus an existing body, and what powers it should have; and what should be included in the legislation versus left to be developed later in codes of practice or regulations.

There are, however, amendments that I do think are crucial. I'll close with this list. I have three.

One, the duty to act responsibly should also include a duty to have due regard for fundamental rights in how companies mitigate risk. Otherwise, social media might implement sloppy solutions in the name of safety that disproportionately impact rights. This type of provision is in the EU and U.K. legislation.

Two, the duty to act responsibly and duty to protect children should clearly cover algorithmic accountability and transparency. I think it's loosely covered in the current bill, but it should be fleshed out and made explicit.

Three, the child protection section should be reframed around the best interests of the child. In addition, the definitions of harmful content for children should be amended in two main ways. First, content that induces a child to harm themselves should be narrowly scoped so that children exploring their identity are not accidentally captured. Second, addictive design features should be added to the list.

Thank you for your time. I look forward to our discussion.

5:15 p.m.

The Vice-Chair Conservative Larry Brock

Thank you, Ms. Laidlaw.

We'll turn now to Mr. Boucher.

You have the floor for five minutes.