Evidence of meeting #127 for Justice and Human Rights in the 44th Parliament, 1st Session.
Liberal
The Chair Liberal Lena Metlege Diab
Thank you very much.
Mr. Van Popta, I'm going to let you continue with your time for the questions.
Liberal
The Chair Liberal Lena Metlege Diab
I will give you two minutes. How's that? That's probably very nice, to give you two minutes.
Conservative
Tako Van Popta Conservative Langley—Aldergrove, BC
Okay, that's good. Thank you.
Thank you to all the witnesses.
Ms. Baron, I have a question for you. I'm reading from an article written by you that was published in The Hub on February 28 of this year. You said, “The internet is an ugly place.” I agree with you. There's a lot of good, and there's a lot of ugliness. You said that the online harms act is “a profoundly anti-free expression bill that threatens draconian penalties for online speech, chilling legitimate expression by the mere spectre of a complaint to the Canadian Human Rights Commission or the new Digital Safety Commission of Canada.”
Now, you heard that the minister has parsed parts 2 and 3 out of this bill, so I'm assuming it's a less offensive bill. Here's my question for you. In your opinion, if we were to also remove part 1, so all that's left is part 4, would that be a good stand-alone bill? Could that work together with Bill C-412 as well?
December 12th, 2024 / 12:55 p.m.
Executive Director, Canadian Constitution Foundation
I think part 4 should be moved forward immediately. It's pressing and urgent, and I think it's really unfortunate that this government has lumped it together with parts 1, 2 and 3, which are entirely different.
As for Bill C-412, it does have language that is disconcertingly vague to us—content that can lead to loneliness, content that constitutes bullying, content that is harmful to dignity. This is also vague and could lead to takedowns of protected content. I think that bill needs to be debated and studied further.
Conservative
Tako Van Popta Conservative Langley—Aldergrove, BC
That's fair enough. Hopefully we will debate it, and hopefully you will be back to give evidence on that.
My question really is, could the two be debated at the same time?
Liberal
The Chair Liberal Lena Metlege Diab
Thank you very much for that.
We will now go for four minutes, two minutes and two minutes, and that will wrap it up.
Please go ahead, Mr. Maloney.
Liberal
James Maloney Liberal Etobicoke—Lakeshore, ON
Thank you, Madam Chair.
Thank you to the witnesses.
Ms. Baron, I want to pick up on something you said in your opening remarks. You've repeated several times that you're not in favour of a digital safety commission and the process laid out in part 1. You just made it very clear that you don't support Bill C-412 because it's too “vague”, which is a word that's been used by virtually every witness who's been asked about it. We've had two categories of witnesses on Bill C-412: They either didn't know about it or didn't like it, so I'll leave that there.
But what don't you like about the idea...? You said in your remarks, “there are other ways of enforcing that”, and then you went on to criticize the court process. Where does that leave us?
Executive Director, Canadian Constitution Foundation
Well, first of all, in the court process, we can look at why we see shockingly lenient penalties being meted out to those individuals who are convicted—
Liberal
James Maloney Liberal Etobicoke—Lakeshore, ON
With all due respect, that's an entirely different issue. We're talking about taking measures to remove stuff from the Internet. It has nothing to do with penalties for people who have been charged and convicted. Let's separate the two, if we can, please.
Executive Director, Canadian Constitution Foundation
I think there are just much more nimble and focused approaches that could go after child sexual exploitation materials and child predation, as well as revenge porn. You do not need a $200-million regulator and commissioner. Also, the majority of Canadians who are going to be affected by the provisions in part 1 are adults who are perhaps communicating spicy opinions online. That is, luckily, the majority of individuals.
Liberal
James Maloney Liberal Etobicoke—Lakeshore, ON
Okay. I'm asking you, then, what are these nimble and useful approaches? So far, all I've heard you say is that this one doesn't work and the courts are no good either, so what is it?
Executive Director, Canadian Constitution Foundation
It's perhaps a pared-down office that focuses just on revenge porn and child sexual exploitation materials.
Liberal
James Maloney Liberal Etobicoke—Lakeshore, ON
So it's some sort of bureaucratic mechanism, some sort of structure in place, just not the one that's being proposed.
Is that what you're saying?
Executive Director, Canadian Constitution Foundation
It's not my job to present the precise mechanism. It's my job to point out a mechanism that would be less offensive to constitutional rights.
Liberal
James Maloney Liberal Etobicoke—Lakeshore, ON
Okay, well, it's our job to come up with a mechanism and come up with a solution. When witnesses like you, who come here with a certain level of expertise, criticize what's being proposed, I would really like to hear your thoughts on alternatives. That's why I'm asking if you have any. If you don't, then that's fine too.
Executive Director, Canadian Constitution Foundation
I think I've said all that I have to say about what a future response should look like.
Liberal
James Maloney Liberal Etobicoke—Lakeshore, ON
All right.
Now, you said something else that intrigued me. I guess this is sort of the Internet or social platforms self-regulating. You said that people can migrate from one platform to another. How does migrating from Twitter or Facebook over to Bluesky, which is currently the popular social media platform, help address the concerns of the mothers, the families and the victims we've heard from in this study on previous occasions? That does nothing other than move them to a nicer platform. How does that address issues like the dark web? How does that create a solution for these families that are victims?
Executive Director, Canadian Constitution Foundation
To be clear, I was answering that question in the context of a witness who spoke about feeling unsafe while communicating on social platforms. I was not answering it in the context of child predators. I think that, clearly, all of the platforms are aware that this content gets distributed. They have algorithms. They have ways to flag it and remove it much faster. No doubt, tragedies still happen. To the extent possible, I think that should be addressed. My comment about migrating to Bluesky was in an entirely different context.
Liberal
James Maloney Liberal Etobicoke—Lakeshore, ON
If I could sum up your testimony, you're here saying that we need to take some drastic steps to protect children and create a safer online environment, but you're not offering any solutions or a way to do that.
Executive Director, Canadian Constitution Foundation
I'm saying that existing penalties ought to be enforced. Do you think this content is not already criminalized? Of course it is.
Liberal
James Maloney Liberal Etobicoke—Lakeshore, ON
But it's not being removed from the online world, is it? That's what this bill is about.
Thank you.