Evidence of meeting #23 for Public Safety and National Security in the 44th Parliament, 1st Session. (The original version is on Parliament’s site, as are the minutes.)

A recording is available from Parliament.

MPs speaking

Taleeb Noormohamed  Vancouver Granville, Lib.

Also speaking

Adam Hadley  Executive Director, Tech Against Terrorism
Vidhya Ramalingam  Co-Founder, Moonshot
Navaid Aziz  Imam, As an Individual
Mohammed Hashim  Executive Director, Canadian Race Relations Foundation
Kara Brisson-Boivin  Director of Research, MediaSmarts

11:35 a.m.

Co-Founder, Moonshot

Vidhya Ramalingam

Sure. What we mean there is individuals who are searching for terrorist content, in some cases. They're searching for terrorist manifestos or propaganda put out by white supremacist groups. They're searching for information about how to join the Base or how to join Atomwaffen. These are people who are indicating intent in some form.

This does not include people who might have just read about something in the news and are searching generally for information on Atomwaffen or the Base. They need to be indicating through their search behaviours that they're taking an active interest and are possibly interested in consuming this content because they would like to join or get involved. Those are the sorts of searches that would be included here.

I hope that answers your question, sir.

11:40 a.m.

Conservative

Doug Shipley Conservative Barrie—Springwater—Oro-Medonte, ON

It does a little bit. Quite frankly, it really raised a flag with me, because over the last few months I've been searching a little bit myself, obviously, to do research on this. How exactly do you take into account that someone is just doing research and not actively wanting to join or pursue this?

11:40 a.m.

Co-Founder, Moonshot

Vidhya Ramalingam

In part because we are not accessing or engaging with any personally identifiable data when we run these kinds of campaigns, we can't say for certain whether the person we're offering a safer alternative to is a researcher or someone who is at risk. However, because we are not actually moderating their searches and we're not seeking to remove anything from the Internet, we are simply ensuring that any time someone searches for this content, there is a safer alternative available to them. They're given the option to consume non-terrorist content.

We're willing to take the risk that some of the individuals we engage with may actually just be researchers. We'll offer them the safer alternative as well.

11:40 a.m.

Conservative

Doug Shipley Conservative Barrie—Springwater—Oro-Medonte, ON

Again—I'm sorry I have to belabour this a little bit—someone is doing these searches, you're monitoring those searches, and then you're reaching out to them to try to offer them help. Is that correct?

11:40 a.m.

Co-Founder, Moonshot

Vidhya Ramalingam

When we're running advertising campaigns on search, we're using the same commercial methods that any big brand uses to ensure that their content comes up first—for example, when you're looking for information on how to buy a pair of shoes. If you're looking in Canada for information on how to join Atomwaffen, we would ensure, through advertising, that the very first option you see, which is labelled as an advertisement, is a piece of safer content, rather than the Atomwaffen content that might otherwise surface through the search algorithms.

11:40 a.m.

Conservative

Doug Shipley Conservative Barrie—Springwater—Oro-Medonte, ON

Okay. Thank you.

Mr. Hadley, you too had something interesting in your opening remarks. You mentioned that you monitor over 100 platforms. I have to be honest. I use a couple. I didn't know there were anywhere near that many platforms.

You mentioned that you are monitoring over 200 terror websites. Why are these websites just not being shut down?

11:40 a.m.

Executive Director, Tech Against Terrorism

Adam Hadley

That is an excellent question, one that we ask ourselves on a daily basis.

In terms of these 100 platforms, the point to stress is that many of them are very small indeed, the sorts of services that can be created by someone in their own room. Terrorist-operated websites are a significant issue. They remain online for many years, in many cases.

11:40 a.m.

Liberal

The Chair Liberal Jim Carr

Thank you very much.

Ms. Damoff, I will now turn the microphone over to you for five minutes, whenever you're ready.

May 10th, 2022 / 11:40 a.m.

Liberal

Pam Damoff Liberal Oakville North—Burlington, ON

Thank you so much.

Thank you to both of our witnesses for your testimony today.

My first question is for Ms. Ramalingam from Moonshot.

We've been trying to get you here for quite some time. I want to thank you for the work you're doing and for being here today.

Since 2014, CSIS has identified 10 plots—seven attacks and three disrupted plots—that killed 26 people and wounded 40 on Canadian soil. Four of these were incel-related, and all of them involved far-right or incel attacks.

When NSICOP tabled their report, they mentioned that in the last two years, “CSIS has uncovered extensive ideologically motivated violent extremism...(notably right-wing extremist groups)...through online activity and physical attacks. The sizable increase in this activity throughout 2020 suggests [that] the terrorist threat landscape is shifting. The primary physical threat to Canada remains low-sophistication attacks on unsecured public spaces.”

Given what independent agencies like CSIS are reporting, does it not make sense that the Government of Canada would be funding your research on those threats?

11:40 a.m.

Co-Founder, Moonshot

Vidhya Ramalingam

Thank you very much for the question. I'm very happy to be here.

We believe the Government of Canada should be not only funding research on these threats but also working to build practitioners' capabilities across Canada to intervene across these ideological spectrums. While, as I mentioned, the opportunity to intervene is no different with incel communities from what it would be with someone on the far right, or with al Qaeda-inspired or Daesh-inspired terrorism, there are some unique requirements for mental health practitioners and counsellors who are going to be having conversations with someone coming from a violent misogynistic background.

There is quite a lot of work to be done to equip practitioners in Canada with skills and to build their confidence to deliver interventions across this threat landscape. We would welcome the Canadian government's investment in both research and prevention on these emerging ideologies of concern.

11:45 a.m.

Liberal

Pam Damoff Liberal Oakville North—Burlington, ON

Thank you for that response.

In spite of what Mr. Lloyd is tweeting about Government of Canada investments, it seems that we're investing where the threat actually exists for Canadians.

In the work that you did with Norway—you mentioned it, and Madame Larouche also asked you about it—were there any recommendations you made, given that investigation, that you think the Government of Canada should be implementing?

11:45 a.m.

Co-Founder, Moonshot

Vidhya Ramalingam

At the time, some of my main recommendations were based on the reality that far-right extremism so often falls into a policy gap between community safety initiatives and counterterrorism. Counterterrorism practitioners and the counterterrorism community across Canada needed to be equipped at the time with the skills to engage with far-right terrorism. I think that has dramatically improved in the last 10 years, to the credit of both Canada and the international government community.

That said, I think where the threat has evolved since 2011 is in the online space. There is this worrying risk that members of the wider public are coming into contact with this content that was once relegated to very niche spaces online, or even to niche communities off-line.

My major concern is that the content being pushed by violent far-right groups and also violent incel groups is suddenly emerging into mainstream communities online. This is where we need to invest not only in prevention but in broader programs to build, as I mentioned, critical media consumption skills amongst the wider public, to prepare them for the possibility that they will encounter this content.

11:45 a.m.

Liberal

Pam Damoff Liberal Oakville North—Burlington, ON

I don't have a lot of time left, but you mentioned in your testimony safer content and directing people to safer content. Is there anything the government can do to assist with that, or is that solely within the purview of the companies themselves?

11:45 a.m.

Co-Founder, Moonshot

Vidhya Ramalingam

The government can invest in Canadian practitioners taking their skills from an off-line context and creating digital content that will be those safer alternatives. When Moonshot delivers this work, we are not creating those safer alternatives. We actually want to be directing at-risk audience members towards Canadian practitioner content.

That's where I would encourage the Canadian government to invest. Help Canadian practitioners create better content that can serve as that compelling counter-narrative and compelling counter-offer to terrorist content online.

11:45 a.m.

Liberal

The Chair Liberal Jim Carr

Thank you very much.

I now move to Ms. Larouche for two and a half minutes.

Go ahead whenever you're ready.

11:45 a.m.

Bloc

Andréanne Larouche Bloc Shefford, QC

Thank you, Mr. Chair.

For my second round of questions, I would like to return to some of what Mr. Hadley said.

Mr. Hadley, it has been shown that small and medium-sized companies face a bigger challenge in protecting themselves against online risks and threats. You've explained why very well, but I'd like to know a little more about how the exploitation of their sites by terrorists affects small technology companies.

Can you give any other examples to help us to better understand this reality?

11:45 a.m.

Executive Director, Tech Against Terrorism

Adam Hadley

Many thanks.

Recognizing the short time available, there's one particular Canadian messaging app, which I won't name, that became totally inundated by terrorist activity. We estimate that at one point, a number of years ago, 80% of its user base was associated with ISIS. As a result, that platform was simply unable to operate in any functional way, because it had been taken over by terrorist activity.

Increasingly, we see that terrorist-operated websites are a big issue. We're talking about hundreds of terrorist-operated websites, the majority of which are owned or operated, based on our assessment, by extreme far-right actors. The reason these websites stay online for so long is that the legal framework guiding governments on how to go about taking them down is very unclear.

The private sector does co-operate to some extent on terrorist-operated websites. I believe that only recently, a website that was highly likely owned or operated by American Futurist, an organization closely linked to the designated entity NSO and to James Mason, was removed. There are some successful efforts to have terrorist-operated websites removed. However, a lot more needs to be done. It's not just about smaller platforms but also about terrorist-operated websites.

Thank you.

11:50 a.m.

Bloc

Andréanne Larouche Bloc Shefford, QC

In response to a question, you talked about algorithms, which worsen the problem for small and medium-sized companies. What impact can algorithms have?

11:50 a.m.

Executive Director, Tech Against Terrorism

Adam Hadley

Algorithms are typically not a big part of the terrorist use of smaller platforms. The use case for smaller platforms is typically really simple and straightforward. It tends to be copying links or copying material or, where the extreme far right is concerned, having an alternative site on which to upload video or audio.

Algorithms certainly are of concern. However, where small platforms are concerned, they are a relatively insignificant factor.

11:50 a.m.

Liberal

The Chair Liberal Jim Carr

Mr. MacGregor, you have two and a half minutes, sir, whenever you're ready.

11:50 a.m.

NDP

Alistair MacGregor NDP Cowichan—Malahat—Langford, BC

Thank you, Mr. Chair.

Ms. Ramalingam, I'd like to continue with you.

I really appreciated your recommendations for our committee about strengthening mental health and community intervention and making sure that we adapt those services for online use. Our committee recently completed a study into gun smuggling and gang warfare. We heard a lot of testimony about the effectiveness of community-based programs in helping vulnerable populations avoid a life with gangs. I think we can apply the same model here.

I want to ask you specifically about the subject of deplatforming.

We had Mr. Imran Ahmed before our committee last week. He is with the Center for Countering Digital Hate. I'll read a quote from his testimony. He said, “Deplatforming these people and putting them into their own little hole, a little hole of anti-Semites, anti-vaxxers and general lunatics, is a good thing, because [actually] you limit their capacity to infect other people. Also, for trends such as the convergence and hybridization of ideologies”.

You're proposing a set of recommendations where it's a positive intervention. Do you have any comments on the concept of deplatforming to try to, I guess, cauterize the wound and prevent some of these crazy ideologies and violent extremism from spreading to vulnerable groups?

11:50 a.m.

Co-Founder, Moonshot

Vidhya Ramalingam

Thank you for your question, sir.

Deplatforming works. There's plenty of evidence to suggest that deplatforming does work in limiting the spread of terrorist content on platforms, but it's not enough on its own. In order to effectively prevent terrorist abuse of online platforms, we need to accept two things. First, there will always be some content that falls in the grey zone and will not be liable for removal and these groups walk the line very carefully.

Second, there will always be some spaces on tech platforms that are not liable for moderation. I've mentioned “search” a few times now—that's a great example here. Search engines don't prevent you from entering anything you'd like into the search box, and that search box is a great moment to intervene with someone who is actively searching for terrorist content.

For these kinds of cases, in addition to moderation efforts, we need to be thinking about how we deliver safer alternatives to users who might be at risk of getting involved in violence. You can delete the user and you can delete the account or the video, but that person still exists in the community around us.

Thank you.

11:50 a.m.

Liberal

The Chair Liberal Jim Carr

Thank you very much.

Mr. Lloyd, I can offer you two minutes. Take full advantage of them. Go ahead.

11:50 a.m.

Conservative

Dane Lloyd Conservative Sturgeon River—Parkland, AB

Thanks, Mr. Chair.

For Moonshot, you were talking about the search engine results you track, such as how to join so-and-so far-right organization. Do you track any search engine results that you would classify as being on the left-wing side of the political spectrum, and can you give examples?

11:50 a.m.

Co-Founder, Moonshot

Vidhya Ramalingam

We do, sir.

In our international work, we do track search terms that are affiliated with anti-government left-wing extremist movements, specifically those inciting violence against the government.