Evidence of meeting #19 for Public Safety and National Security in the 44th Parliament, 1st Session. (The original version is on Parliament’s site, as are the minutes.)

A recording is available from Parliament.

Also speaking

Evan Balgord  Executive Director, Canadian Anti-Hate Network
Barbara Perry  Director, Centre on Hate, Bias and Extremism, Ontario Tech University
Wendy Via  Co-Founder, Global Project Against Hate and Extremism
Ilan Kogan  Data Scientist, Klackle, As an Individual
Rachel Curran  Public Policy Manager, Meta Canada, Meta Platforms
David Tessler  Public Policy Manager, Meta Platforms
Michele Austin  Director, Public Policy (US & Canada), Twitter Inc.

11:20 a.m.

Conservative

Dane Lloyd Conservative Sturgeon River—Parkland, AB

Can you make a submission to the committee and provide us with this evidence? You've said it and I guess I'll take you at your word for now.

Can you actually provide us with written evidence to back up these claims?

11:20 a.m.

Executive Director, Canadian Anti-Hate Network

Evan Balgord

Yes, we have.

I'd be happy to share some of the articles we've already written on the subject with the committee.

11:20 a.m.

Conservative

Dane Lloyd Conservative Sturgeon River—Parkland, AB

Do these articles contain primary sources that back up the evidence or are these opinion articles written by your admittedly not objective organization?

11:20 a.m.

Liberal

The Chair Liberal Jim Carr

Give a 10-second answer, please.

11:20 a.m.

Executive Director, Canadian Anti-Hate Network

Evan Balgord

Yes. Everything is demonstrated in the articles.

Thank you.

11:20 a.m.

Liberal

The Chair Liberal Jim Carr

I would now invite Mr. Chiang to begin his six-minute line of questioning.

The floor is yours, sir.

April 26th, 2022 / 11:20 a.m.

Liberal

Paul Chiang Liberal Markham—Unionville, ON

Thank you, Mr. Chair.

I'd like to thank all the witnesses for their time and sharing their expertise with us.

My question is directed to Mr. Balgord.

In your opinion, are Canada's national security agencies adequately focused on the far-right threats? If not, what recommendations do you have for these agencies?

11:20 a.m.

Executive Director, Canadian Anti-Hate Network

Evan Balgord

I'm not privy to how they make their decisions, of course.

From what we can observe from the outside, there certainly seems to be much more of a focus on right-wing extremism and the ideologically motivated violent extremism that comes from it.

I can't answer that question in depth. You'd have to ask our national security agencies themselves.

11:25 a.m.

Liberal

Paul Chiang Liberal Markham—Unionville, ON

Thank you so much.

Does your organization have any sort of tracking for hate-based extremism incidents?

What are some ways the federal government might improve data collection related to extremism in order to better understand and combat this issue?

11:25 a.m.

Executive Director, Canadian Anti-Hate Network

Evan Balgord

We do not collect that kind of data ourselves. There are two sources of that data in Canada.

The first is police-reported hate crime statistics. These are flawed because they fail to capture a great deal of what actually happens.

The second, and the best way we can measure hate crime and hate incidents in Canada, is simply to ask Canadians whether they have been victims of it. That is done through the General Social Survey. Every five years there is a portion on victimization, where we simply ask people whether they have been the victim of a hate crime and collect some surrounding information on it. That gives us our best snapshot of where Canada stands on hate crime.

I would respectfully submit that every five years is too infrequent for collecting that data. We have long been advocating that Statistics Canada collect it on an annual basis.

11:25 a.m.

Liberal

Paul Chiang Liberal Markham—Unionville, ON

Thank you, Mr. Balgord.

Dr. Perry, your bio describes you as a primary national authority on far-right extremism in Canada.

Could you elaborate on the work you have done in this field and some of your findings related to the risk of right-wing extremism in Canada?

11:25 a.m.

Director, Centre on Hate, Bias and Extremism, Ontario Tech University

Dr. Barbara Perry

I have been studying far-right extremism in the Canadian context since about 2012-13. I had done a little work previously in this space in the U.S. in the mid-nineties or so, but I have been working more broadly in the area of hate studies for about 30 years now.

In 2015, we published a report coming from a study that was funded by Public Safety Canada, which was really the first comprehensive academic approach to understanding right-wing extremism in Canada. We have just finished another three-year study, which is an update of that.

What we have found in that report in 2015—and I can share it or the subsequent book that came out of that—was a very conservative estimate of about 100 active groups across Canada. We could document through open-source data that there were over 100 incidents of violence of some sort associated with the far-right in Canada. Just to put that in context, during the same period of time there were about eight incidents of Islamist-inspired extremism, which is what the focus was at the time.

What else did we find there? In the update, we have found in the last couple of years in particular over 300 active groups associated with the far-right and, of course, just in the last seven years or so we have seen now 26 murders, 24 of those mass murders, motivated by some variant of right-wing extremism.

What else are we finding? One of the things alluded to earlier was the idea of the shifting demographics within the movement. As we saw with the convoy, it is a much older demographic than we were seeing previously, when it was a predominantly, though not wholly, youth movement of skinheads, neo-Nazis and those traditional sorts of groups. We are now seeing an older, better-educated demographic being drawn into the movement as well. Certainly, it is a movement that is much more facile and ready to use social media, in very ironic as well as very open ways, to share its narratives.

11:25 a.m.

Liberal

Paul Chiang Liberal Markham—Unionville, ON

Thank you, Dr. Perry.

Next, do you have any recommendations for this committee regarding the deradicalization of people with extremist views? How can we get people out of extremist groups once they have joined? How can we prevent people from joining these groups in the first place?

11:25 a.m.

Director, Centre on Hate, Bias and Extremism, Ontario Tech University

Dr. Barbara Perry

These are the easy questions, I think.

With respect to deradicalization, there's a lot of controversy about that term. We can bring people out of the movement. It doesn't necessarily mean that if they come out of the movement, they put aside those narratives. Sometimes these narratives stay with them for a long time, but these people at least desist from engaging in spreading those narratives or engaging in any sort of violence or harassment.

11:30 a.m.

Liberal

The Chair Liberal Jim Carr

Wrap it up in 10 seconds, please.

11:30 a.m.

Director, Centre on Hate, Bias and Extremism, Ontario Tech University

Dr. Barbara Perry

There are a number of organizations with that task, both to counter mobilization into the movement and to help people come out: Life After Hate, exit programs and those sorts of things.

11:30 a.m.

Liberal

The Chair Liberal Jim Carr

Thank you very much.

I would now like to invite Ms. Michaud to begin her six minutes of questioning.

The floor is yours.

11:30 a.m.

Bloc

Kristina Michaud Bloc Avignon—La Mitis—Matane—Matapédia, QC

Thank you, Mr. Chair.

I thank the witnesses for joining us.

I will address Mr. Balgord.

In an article from September about protests during the election campaign, you said that protest groups were organizing their activities through online groups, including on platforms like Facebook. I assume something similar happened with the “freedom convoy”. You talked a bit about that earlier.

Do you think platforms like Facebook are doing enough with their service policies to counter those activities? Do you think they are helping hate groups get organized?

11:30 a.m.

Executive Director, Canadian Anti-Hate Network

Evan Balgord

Through all of the whistle-blower data that has come out, and from the whistle-blowers themselves who have told the story of what happens behind the scenes at Facebook, we've seen pretty conclusively that they identify problems like polarization and hate speech. When staff propose solutions, executives tell them not to implement them because it would hurt engagement, or they discover that some of the things they do to increase engagement are in fact driving polarization. They move forward with those decisions anyway, because engagement is money for them. Platforms like Facebook and Twitter have a built-in incentive to drive engagement at all costs.

No, they are not doing enough to combat things. I know that right now the government is looking at an online safety piece of legislation. That would have been very effective five years ago. It's still going to be effective and it's important because when people get involved in ideologically motivated violent extremism or far-right organizing or COVID conspiracies, they don't start doing that on the weird fringe platforms like Telegram. They start on the Facebooks and the Twitters of the world.

If we can stop people from connecting with that misinformation and disinformation, we can help a lot of families who are dealing with their grandmother, their uncle or their aunt who's been swept up into this alternate reality that's causing a lot of trouble.

There's still a lot that we can accomplish with the platforms, but we need to change the incentives. We need to make it so that they act responsibly.

They've had 10 years to figure out how to do it themselves. Unfortunately, nobody really likes the idea of government having to step in and tell an industry what to do. Everybody rankles at that here and there, but we have to because, quite frankly, the status quo is untenable.

11:30 a.m.

Bloc

Kristina Michaud Bloc Avignon—La Mitis—Matane—Matapédia, QC

Thank you.

I especially like how you concluded your comments. No one likes it when the government interferes in these kinds of things, but we cannot always rely on organizations' good faith.

What do you think the government should do? Do you think the legislation Europe recently adopted on problematic content on major platforms could be a good solution for Canada? Should we adopt that kind of a model here?

11:30 a.m.

Executive Director, Canadian Anti-Hate Network

Evan Balgord

As far as I can tell, none of the legislation that has tried to address online harms has made a difference to people who are victimized by it. I mean, platforms may point and say they did this and they did that, but I dare say that if you ask people who use these platforms, they will not perceive that there's much of a difference in their safety or how they perceive these platforms.

Of course, we run into opposition to doing anything about online harms, so I think we should be moving forward with a different model. I don't think we should have a complicated model that looks at censoring or taking down individual pieces of content. I think that we should have an ombudsperson model.

The basic idea is that you have an ombudsperson who is a well-resourced regulator with investigatory powers, so they can kick down the door of Facebook and take its hard drives. I'm being a little hyperbolic here, but we know that these platforms hide data from us and lie to journalists, so we do need broad investigatory powers to investigate them.

I believe this ombudsperson should be able to issue recommendations to the platforms about their algorithms and things like that. That would be very similar to what their own employees want to do behind the scenes: if they learn that something drives polarization and negative engagement and is leading to hate speech, they could recommend doing something else instead, or putting in a stopgap measure.

If we had an ombudsperson who could look at what is happening under the hood and make recommendations to the platforms, that's the direction we want to go. Where the platforms do not take those recommendations, we feel the ombudsperson should be able to apply to a court. The court can weigh what the ombudsperson is recommending against all the charter implications. If the court decides it's a good measure and it's charter-consistent, the court can make it an order. Then, if the platforms don't follow it, they could face a big fine.

This is a much more flexible way to move forward, because it means that any particular arguments we might have about free speech versus hate speech, and so on, are taken out of the hands of government and instead happen in front of a court and a judge, with a range of intervenors. We can put the model in place now and defer some of those arguments, having them in front of a court where they belong.

11:35 a.m.

Bloc

Kristina Michaud Bloc Avignon—La Mitis—Matane—Matapédia, QC

I can't help but take the time I have left to ask you a question about Elon Musk's recent purchase of Twitter.

We know that algorithms play an important role on those types of platforms to spread disinformation and hateful content. This morning, I read in the media that the richest troll on earth has taken over that social media site and wants to make the algorithm public. What do you think about that? Should we be concerned about it?

11:35 a.m.

Executive Director, Canadian Anti-Hate Network

Evan Balgord

It's just a great example of how a lot of people who do not actually believe in free speech and free expression hide behind those arguments.

We've seen Elon Musk, on a personal level, try to censor or sue people who say things he doesn't like. It's very concerning when somebody like that has so much power over a social platform that we all use every day and have to use for work reasons.

11:35 a.m.

Liberal

The Chair Liberal Jim Carr

You have 10 seconds, please.

11:35 a.m.

Executive Director, Canadian Anti-Hate Network

Evan Balgord

So no, I think it's an incredibly terrible development, but I don't know what we do about it.

Thank you.