Evidence of meeting #126 for Justice and Human Rights in the 44th Parliament, 1st Session. (The original version is on Parliament’s site, as are the minutes.)

A recording is available from Parliament.

Also speaking

Anaïs Bussières McNicoll  Director, Fundamental Freedoms Program, Canadian Civil Liberties Association
Catherine Claveau  Bâtonnière du Québec, Barreau du Québec
Witness-Témoin 1  As an Individual
Nicolas Le Grand Alary  Lawyer, Secretariat of the Order and Legal Affairs, Barreau du Québec
Michel Marchand  Member, Criminal Law Expert Group, Barreau du Québec
Emily Laidlaw  Associate Professor and Canada Research Chair in Cybersecurity Law, University of Calgary, As an Individual
Étienne-Alexis Boucher  President, Droits collectifs Québec
Matthew Hatfield  Executive Director, OpenMedia

Peter Julian NDP New Westminster—Burnaby, BC

Thank you, Mr. Chair.

I want to say to you, Jane, that I've been in Parliament for many years, and this is one of the most moving presentations I've heard by a witness. I know it would have been extremely difficult for you to come forward to this committee. We can't thank you enough for your brutal honesty on what your daughter has been through. It is something that I think will remain in our minds for some time to come. Thank you for sharing that. All of us hope that your daughter is getting the care and supports she needs.

The fact that these images are continuing to circulate obviously shows the importance of moving forward as quickly as possible with the provisions of the bill in part 1 that deal with criminal sexual exploitation of children.

At this point, is it individuals, companies...? Who is continuing to perpetuate these terrible images of crime?

4:40 p.m.

As an Individual

Witness-Témoin 1

It's anyone who is accessing the dark walls of the web—child predators, people who are interested in that kind of stuff. They are the ones who are trading these images and uploading them on a regular basis—pretty much daily.

Peter Julian NDP New Westminster—Burnaby, BC

They are doing it with impunity.

4:40 p.m.

As an Individual

Witness-Témoin 1

Unfortunately, yes. It's not only the images. There's regular talk about my child.

Peter Julian NDP New Westminster—Burnaby, BC

I can't imagine, as a parent, what you're going through and what she's going through.

4:40 p.m.

As an Individual

Witness-Témoin 1

It's a very scary situation.

Peter Julian NDP New Westminster—Burnaby, BC

The message you're sending us is very clear: that we need to take action. I think all members of the committee understand that. I can't thank you enough for coming forward today to share that with us.

I have questions for the other witnesses.

Now I'm going to turn to Ms. Bussières McNicoll and Ms. Claveau.

Part 1 of Bill C‑63 establishes fines. Operators are liable to “a fine of not more than 3% of the person’s gross global revenue or $10 million, whichever is greater”.

It says that, on summary conviction, an operator is liable to “a fine of not more than 2% of the person’s gross global revenue or $5 million, whichever is greater”.

Individuals are liable to “a fine of not more than $50,000”. That seems pretty low given the repercussions of the offence in question, such as the impact on Witness 1, her daughter and family.

It's one thing to put a legislative framework in place, but it's another to establish penalties in order to end the scourge. It's clear that the case involving Witness 1's daughter calls for significant penalties.

What do you think of the penalties I just mentioned and the approach outlined in the bill?

I would like Ms. Bussières McNicoll to answer first.

4:45 p.m.

Director, Fundamental Freedoms Program, Canadian Civil Liberties Association

Anaïs Bussières McNicoll

Thank you for your question.

I would say, at the outset, that it's important to put into context the fact that the bill establishes seven types of harmful content. When considering penalties for individuals, lawmakers mustn't go too far by unduly punishing individuals in connection with certain types of content.

As far as the penalties for operators are concerned, I will let those who wish to do so comment on the size of the fines. However, I will say that it's important to keep something in mind: the higher the penalty is, the clearer the duty needs to be. Otherwise, operators will want to fulfill the vague duties imposed on them at all costs, possibly at the expense of users' freedom of expression.

It comes back to the situation I described earlier. Taking an excessively cautious approach in relation to flagged content and responding in a very swift and disproportionate way to assess that content could be harmful to online free speech.

Peter Julian NDP New Westminster—Burnaby, BC

Your concern has to do with the definitions proposed in the bill and the direction taken. Thank you for clarifying that.

Ms. Claveau, I have the same question for you about the structure of the bill and part 1 as it relates to penalties.

I think everyone agrees that it's necessary. What do you think of the approach taken in part 1 of the bill?

4:45 p.m.

Bâtonnière du Québec, Barreau du Québec

Catherine Claveau

I will let Mr. Le Grand Alary answer that.

4:45 p.m.

Lawyer, Secretariat of the Order and Legal Affairs, Barreau du Québec

Nicolas Le Grand Alary

Thank you for your question.

As you no doubt saw when reading our brief, we didn't comment specifically on part 1 of the bill.

Generally speaking, though, when it comes to these types of penalties and fines, especially an administrative monetary penalty regime, a whole process goes into determining the amounts of the fines, whether they apply to individuals or businesses. In many cases, it's a percentage of the business's revenues. A lot of factors are taken into account.

I won't comment on whether the approach is consistent or appropriate, but I will say that a lot of work has to go into establishing an administrative monetary penalty regime.

I encourage you to compare this regime with others that have already been adopted to see whether there are any similarities. You can also look to the teachings of the Supreme Court in decisions relating to the validity of such regimes.

4:45 p.m.

Conservative

The Vice-Chair Conservative Larry Brock

Thank you. That is your time, Mr. Julian.

That completes our first round.

We are now moving into the second round. This will be the final round for the first panel. It will be for 15 minutes with five minutes, five minutes, two and a half minutes and two and a half minutes.

We're starting with you, Ms. Ferreri.

You have five minutes.

4:45 p.m.

Conservative

Michelle Ferreri Conservative Peterborough—Kawartha, ON

Thank you, Mr. Chair.

Thank you to the witnesses for your testimony today.

Witness 1, do I have permission to call you by the first name that you used? Is that okay?

4:50 p.m.

As an Individual

Witness-Témoin 1

4:50 p.m.

Conservative

Michelle Ferreri Conservative Peterborough—Kawartha, ON

Thank you.

Jane, what you've done today is very courageous. People don't know this is happening. They have no idea. I believe that halfway to beating this is.... Obviously, we have to do legislation and implement change, but people don't believe that parents traffic their children. People don't believe that children are used as sexual tools online daily, as you've testified here today. They don't know because they don't want to believe that humanity is that horrific.

I want to tell you thank you. We can't fix anything if we don't acknowledge what has actually happened. Thank you for that.

There are a couple of things I want to point out. The big thing we're trying to sort out here is the best recommendation so that we have implementation as soon as possible to protect children online. We've had witness testimony on sextortion. Children are taking their lives.

Jane, you're traumatized for the rest of your life. Your child is traumatized for the rest of her life. The impact on the community is significant.

Right now, the way that Bill C-63 is written, it is calling on—and I'll use the language from it—a digital safety commission of Canada, the digital safety office of Canada, the position of a digital safety ombudsperson, and a mandate for the commission and ombudsperson to follow. This is another aspect of not having action instantly.

To my Liberal colleague's point of an immediate takedown of the image, you're not going to have that with Bill C-63. You need a regulated body to be put in place, which could take years.

What we're saying in Bill C-412 is that we would implement this instantly through the actual social media platform. A judge would have the capacity instantly to name the person who has the image, release their name and charge them. The duty of care then falls on the social media platforms to be implementing age verification—which we know they can do through algorithms.

The issue we're having with Bill C-63 is the same issue we've seen in other regulating bodies. The action doesn't come with the intention.

The example I will give you is the ombudsperson we have in this country for victims. They've seen an increase of 477%. Nothing happens after the victims go to the ombudsman, right? There's no action tied to it.

My question for you, Jane, is this. Would you like to see a bill like Bill C-412 that implements instant action on the social media platforms and enables judges to ensure that those names are released so that there is actually a takedown and not just an intention of takedown?

4:50 p.m.

As an Individual

Witness-Témoin 1

First, what I'm going to say is that I'm not familiar with the bill you just spoke about.

My concern is what's in Bill C-63. I see that adding protection and moving forward for my child and the other children.

4:50 p.m.

Conservative

Michelle Ferreri Conservative Peterborough—Kawartha, ON

I appreciate that you don't know what that bill is, so that's totally fair and I'm happy to share it with you.

I can tell you that with Bill C-63 there still is this concern of its being years down the road. What I'm saying is that we all want the same thing. We want protection of children today, but if you implement a regulating body, and you don't have duty of care to the social media platforms, then it's not instant, because the regulating body then has to have a meeting, with a meeting, and so on.

Do you see what I'm saying? It's not direct to the person. Does that make sense?

4:50 p.m.

As an Individual

Witness-Témoin 1

I am trusting those who are in charge of Bill C-63 with what they're doing for the protection of all children in Canada.

4:50 p.m.

Conservative

Michelle Ferreri Conservative Peterborough—Kawartha, ON

Okay. I appreciate that. I think, obviously, I can't stress enough that we absolutely want the protection of children.

Again, I would put this forward to you. If there were an option between going directly to the social media platform...? I guess I will use you as an example. Right now, why are the images of your child not removed?

4:50 p.m.

As an Individual

Witness-Témoin 1

It's because it's not mandated. Nobody has to remove them; they're not told that they have to. There are no consequences.

Michelle Ferreri Conservative Peterborough—Kawartha, ON

Exactly. With Bill C-63, you would still have to go through a person, a regulating body, so let's say it's an ombudsman. They would then have to have a meeting with the regulating body. Then they would have to go to the social media platform.

What we're saying is that instead of having to go in-between, you would get to go right to a judge; and the judge would say, okay, this is the person—because there's a duty of care for the social media platform to remove that image instantly.

4:50 p.m.

Conservative

The Vice-Chair Conservative Larry Brock

That is your time, Ms. Ferreri.

Thank you.

4:50 p.m.

As an Individual

Witness-Témoin 1

I hear what you're saying, but I'm trusting the process. As a parent, I'm trusting the process.

4:50 p.m.

Conservative

The Vice-Chair Conservative Larry Brock

Thank you, Witness 1.

Moving on to Mr. Bittle, you have five minutes.

Chris Bittle Liberal St. Catharines, ON

Thank you very much, Mr. Chair.

Jane, I'd like to echo what my colleagues have said. The great courage you've shown to come forward is absolutely incredible, and your determination to protect not only your own daughter but also other children is commendable. Thank you.

We heard from Carol Todd at the last meeting, who expressed concern that victims were being asked these technical legal questions, and I don't want to get into these.

However, because you talked about your experience with police and the current process, and there was no help available, I was wondering if you could talk about what it would mean to have a digital safety commission that could act for victims. I know some people will dismiss it as a bureaucracy, but I was wondering if you could speak to that, to have a voice, if that would be beneficial.