Evidence of meeting #139 for Canadian Heritage in the 44th Parliament, 1st Session. (The original version is on Parliament's site, as are the minutes.)

A recording is available from Parliament.

Also speaking

Marion Ménard  Analyst
Stéphane Sérafin  Assistant Professor, University of Ottawa, As an Individual
Kathryn Hill  Executive Director, MediaSmarts
Matthew Johnson  Director of Education, MediaSmarts

Matthew Green NDP Hamilton Centre, ON

Would it fit in the liar's dividend?

5:25 p.m.

Executive Director, MediaSmarts

Kathryn Hill

I think what needs to be paid attention to is the intention, the deliberate intention to knowingly deceive. Is that—

Matthew Green NDP Hamilton Centre, ON

Yes, as argued by the lawyers in court. I appreciate that.

I'll go on to another example, because we have many.

5:25 p.m.

Conservative

The Vice-Chair Conservative Kevin Waugh

You have 30 seconds left.

Matthew Green NDP Hamilton Centre, ON

Well, that doesn't feel like enough runway.

Do you believe in the concept of hate speech?

5:25 p.m.

Executive Director, MediaSmarts

Matthew Green NDP Hamilton Centre, ON

Do you believe in the concept of section 1 of the charter, which is a reasonable limit to freedom of expression?

5:25 p.m.

Executive Director, MediaSmarts

Matthew Green NDP Hamilton Centre, ON

Would you agree that denial of the Holocaust is a form of hate speech?

5:25 p.m.

Executive Director, MediaSmarts

Kathryn Hill

It's not an area that I would....

Matthew Green NDP Hamilton Centre, ON

Mr. Sérafin, do you believe that denial of the Holocaust is a form of hate speech?

5:25 p.m.

Assistant Professor, University of Ottawa, As an Individual

Stéphane Sérafin

I'm not sure.

Matthew Green NDP Hamilton Centre, ON

Please expand on that.

5:25 p.m.

Conservative

The Vice-Chair Conservative Kevin Waugh

Well, he can in the next round. We're already at six minutes and 15 seconds. Thank you.

We'll do the second round: five minutes from the Conservatives, five from the Liberals, then two and a half minutes each for the Bloc and the NDP. We're going to conclude the second round with five minutes Conservative and five Liberal, and then we'll call it a day, okay? After that, we will be on our way.

Mr. Jivani, you have five minutes for the Conservative Party. Away you go.

5:25 p.m.

Conservative

Jamil Jivani Conservative Durham, ON

Thank you, Mr. Chair.

I have some questions for you, Professor Sérafin.

Many Canadians express great concern over Justin Trudeau's censorship agenda, and Bill C-63 is a piece of legislation that is part of that agenda. You've written about that bill for the Macdonald-Laurier Institute. In particular, I'd like to ask you about one of your comments and have you just elaborate on it.

You wrote, in reference to Bill C-63:

...it is not inconceivable that remedies might be sought against other kinds of online content distributors in an effort to have them engage in proactive censorship or otherwise set general policy with little or no democratic oversight. This possibility is certainly heightened by the way in which the existing directed remedies for anti-discrimination have been used to date.

Could you elaborate on that point?

5:30 p.m.

Assistant Professor, University of Ottawa, As an Individual

Stéphane Sérafin

Yes. The main example of this is.... I talked about the EDI stuff and higher education earlier. There's also the Canada research chairs program, which is subject currently to a rather strict quota system that I think was a subject of controversy a year or two ago.

Actually, that quota system is the result of a Canadian Human Rights Tribunal settlement. The settlement essentially consecrated an agreement between the government and the plaintiffs in that human rights complaint, which had the effect of completely overturning the way in which the Canada research chairs are awarded. Now there's a strict quota system in place because of that, so it's not inconceivable.

My suggestion was that some provisions in the wording of Bill C-63 would suggest that orders against content distributors in and of themselves are off the table, but that's a question of interpretation. It's not inconceivable in that context that an order could be made against someone found to be doing more than just distributing content, requiring them to proactively adopt certain measures to, for example, prevent marginalized voices, as they are conceived, from being censored, which might mean censoring other voices instead.

Those are the kinds of things I had in mind when I was writing that.

5:30 p.m.

Conservative

Jamil Jivani Conservative Durham, ON

Do you empathize with Canadians who have concerns over the centralization of power and control in federal bureaucracies over what people can see, hear and say online?

5:30 p.m.

Assistant Professor, University of Ottawa, As an Individual

Stéphane Sérafin

Yes.

I mean, ultimately, I think it's a question of what Parliament intended. Parliament can intend a broad delegation of authority to, say, the Human Rights Tribunal. It's perfectly legitimate. I'm not one of the people who would deny the legitimacy of administrative law writ large.

That said, there are trade-offs involved. If you're going to delegate to regulation-making bodies or administrative decision-makers, then you are, necessarily, undercutting the sort of representative nature of the legislative process. For example, going back to the Canada research chairs program, this is a decision that was made with no public consultation and turned out to be quite controversial.

The public wasn't even aware of this. In fact, I don't think most people know how this settlement came about at all. They're not even aware of the case. That's the issue with these kinds of measures.

5:30 p.m.

Conservative

Jamil Jivani Conservative Durham, ON

Yes. I'll put forward an example to you of where giving someone like Justin Trudeau so much power to decide what would be objectionable to say online could be problematic.

You may recall that in 2018 a video went viral of a lady in Quebec expressing concerns over immigration policies, and Justin Trudeau called her a racist. Fast-forward to 2024, and Justin Trudeau is now admitting that those policies were bad for our country. He has not called himself a racist for admitting that. It's a very clear example of how the definition of these kinds of things can be easily politicized, and of why centralizing that kind of power and control in the hands of a federal government could be a problem.

What would you say to that?

5:30 p.m.

Assistant Professor, University of Ottawa, As an Individual

Stéphane Sérafin

It's not just limited to government oversight. This is a broader issue with the way that these words—like “racist” and all the other epithets you can think of—are used in the social media context. It's completely arbitrary in a lot of cases. One day, it's okay to say something. The next day, it's not okay to say something.

I would suggest that it's not even just a function of centralization and government bureaucracies, although removing the public oversight perhaps creates additional risks. Just the way that these things change for no apparent reason presents, I think, significant risks when it comes to, say, banning hate speech. What counts as hate speech can change from one day to the next.

5:30 p.m.

Conservative

Jamil Jivani Conservative Durham, ON

I think we're low on time, but I would say that your testimony and your writing affirm why so many Canadians are concerned about what we're seeing from Justin Trudeau and the Liberal government.

Certainly, with Bill C-63 you raise a lot of important considerations that need to be made and that speak to why Canadians are so unhappy with what's happening right now.

Thank you.

5:30 p.m.

Conservative

Kevin Waugh Conservative Saskatoon—Grasswood, SK

Thank you.

We go to the Liberals for five minutes.

Go ahead, Mr. Coteau.

Michael Coteau Liberal Don Valley East, ON

Thank you very much, Chair.

Thank you to our witnesses for being here today.

MediaSmarts, you're doing great work. This is much-needed work. I think the advancement of digital literacy and digital media literacy is so necessary today, and I appreciate the work you're doing.

I was the executive director of a national literacy organization called AlphaPlus. We did a lot of work on digital literacy with the essential skills. Back then, the main conversation was around how you give people the tools to really move within the digital world. Now, the complexity has grown so much that it's about not only moving within it, but finding out what actually is real in the maze that presents itself to you.

Ms. Hill, you spoke about algorithms. On the algorithms that are out there today.... As a committee, we actually did a lot of study with the big tech companies on algorithms and how they're used. They're hidden as code. Really, as a society, we still don't understand how they work, but we can figure out some things.

Does the algorithm itself create censorship? Is that an argument that's out there? It doesn't have to be your opinion, but is there a discussion happening about whether the algorithm itself creates censorship and limits our ability to express ourselves?

5:35 p.m.

Director of Education, MediaSmarts

Matthew Johnson

We do have evidence that recommendation algorithms lead to self-censorship, because one of the things they do is to favour certain content and down-rank, or shadow ban, other content. This is not transparent to users, to people who are participating on these platforms, so in many cases, people will be particularly cautious to avoid using terms that they think might have them down-ranked.

Sometimes they will use so-called “algospeak”, which is a code word that people in the community know stands for a particular word that they expect will get them down-ranked. Of course, this means that people who are not yet members of the community don't have access to that conversation.

We also know that people creating content, particularly commercial content creators, feel a very strong pressure to create not necessarily the content that they want to express, but the content that will be favoured by the algorithm.

Michael Coteau Liberal Don Valley East, ON

Do you have an example of that? Have there been big examples of where that algorithm has led to specific individuals being pushed down and to having their voices quieted online?