Evidence of meeting #23 for Public Safety and National Security in the 44th Parliament, 1st Session. (The original version is on Parliament's site, as are the minutes.)

A recording is available from Parliament.

Speaking

Adam Hadley  Executive Director, Tech Against Terrorism
Vidhya Ramalingam  Co-Founder, Moonshot
Navaid Aziz  Imam, As an Individual
Mohammed Hashim  Executive Director, Canadian Race Relations Foundation
Kara Brisson-Boivin  Director of Research, MediaSmarts
Taleeb Noormohamed  Vancouver Granville, Lib.

11:15 a.m.

Conservative

Dane Lloyd Conservative Sturgeon River—Parkland, AB

No, you don't know.

11:15 a.m.

Co-Founder, Moonshot

Vidhya Ramalingam

I would not have awareness of the full range of funded projects, so I wouldn't feel comfortable saying one way or the other. Our group has not received funding to work beyond Daesh, al Qaeda, far-right extremism and incel terrorism, but we would certainly welcome it.

Prevention should be proportionate based on the data.

11:15 a.m.

Conservative

Dane Lloyd Conservative Sturgeon River—Parkland, AB

Would you say that, if it were proven the government was not funding research in this area, this is a blind spot for the government?

11:15 a.m.

Co-Founder, Moonshot

Vidhya Ramalingam

My belief is that research should span the entire ideological spectrum. We should use data to inform where prevention efforts are focused. I would also mention that most prevention programs should really be cross-ideological. Every prevention program should be equipped to handle any case of violence, whether it comes from violent far-left groups, violent far-right groups, or al Qaeda- and Daesh-inspired terrorism.

11:15 a.m.

Conservative

Dane Lloyd Conservative Sturgeon River—Parkland, AB

I couldn't agree more.

I'll move on to Mr. Hadley.

In terms of your work in countering terrorism.... We had a recent case in Montreal that is still under investigation. A former Conservative cabinet minister who now works for RBC, Michael Fortier, had his two vehicles torched in Montreal. An anarchist environmentalist group claimed responsibility for the attacks because RBC funds oil and gas and pipeline projects in Canada.

We've been told that attribution is a key thing we need in order to deal with this. Can you comment on the importance of unmasking who is truly behind these attacks?

11:15 a.m.

Executive Director, Tech Against Terrorism

Adam Hadley

Many thanks.

Could you clarify who you mean, in terms of importance?

11:15 a.m.

Conservative

Dane Lloyd Conservative Sturgeon River—Parkland, AB

For the attribution of who is behind the attacks, how important is it to unmask the actual people behind these terrorist attacks? How would you go about doing that?

11:15 a.m.

Executive Director, Tech Against Terrorism

Adam Hadley

I think that's probably a question for law enforcement and intelligence agencies. Certainly the work at Tech Against Terrorism isn't focused on identifying individuals, but rather on supporting tech platforms in reducing terrorist activity online. Where appropriate and where there is a realistic threat to life, we ensure that an alert is sent to the relevant authorities, including in Canada.

As to the attribution, I think that's probably a question for law enforcement in terms of the measures that they may have and the mechanisms they have available under Canadian law in order to conduct surveillance and carry out intelligence operations.

11:15 a.m.

Liberal

The Chair Liberal Jim Carr

Thank you very much.

I would like to invite Mr. Chiang to take the floor for six minutes.

Go ahead whenever you're ready, sir.

11:15 a.m.

Liberal

Paul Chiang Liberal Markham—Unionville, ON

Thank you, Mr. Chair.

Thank you to the witnesses for taking the time to be with us today.

My question is for Mr. Hadley.

In 2017, your organization launched a knowledge-sharing platform, which was a collection of tools that start-ups and small tech companies can use to better protect themselves from terrorists' exploitation of their services.

Could you provide this committee with some more in-depth information about how this platform works and some of the results you have seen?

11:15 a.m.

Executive Director, Tech Against Terrorism

Adam Hadley

Of course. Many thanks for that.

The knowledge-sharing platform is designed as a tool that's free for tech platforms to access. Its objective is to improve the understanding that those running small platforms have of terrorist use of the Internet. It spans the spectrum of terrorism and violent extremism. Within its scope are violent Islamist extremism, the extreme far right and a number of other terrorist organizations designated by international bodies.

In detail, the KSP provides information on logos associated with designated groups, the terminology associated with them and phraseology that may be typical of the content that appears. There's also detail on workflow in order to support platforms in making better content moderation decisions. There is also a significant amount of information about designation lists at the international level and a summary of global online regulatory efforts and many other elements. For more information, the website is ksp.techagainstterrorism.org.

11:20 a.m.

Liberal

Paul Chiang Liberal Markham—Unionville, ON

In essence, does anybody have access to this website of yours?

11:20 a.m.

Executive Director, Tech Against Terrorism

Adam Hadley

We are careful to vet access in everything that we do. In fact, in everything I say during this committee meeting, I will assume that terrorists and violent extremists are aware of what we're saying, so there is always a concern about not disclosing too much.

Tech Against Terrorism is distinctive in that much of our work is done confidentially and privately. In order to build trust and confidence with smaller platforms, much of this must be done in private. In particular, there are grave concerns about access to the methodology and information that small platforms have. We know that terrorists and violent extremists are extremely adept at changing their use of the Internet. The more information they have about content moderation, the easier it is to change their methodology and therefore subvert mechanisms designed to stop that activity, so we have to be careful.

In detail, for every individual who applies for access to the knowledge-sharing platform, we will ensure that they belong to a real platform. We will email them, call them and ensure that the knowledge that's being shared is appropriate for that audience.

11:20 a.m.

Liberal

Paul Chiang Liberal Markham—Unionville, ON

Excellent. Thank you, Mr. Hadley.

In 2018, your organization launched a data science network, which your website calls “the world's first network of experts working on developing and deploying automated solutions to counter terrorist use of smaller tech platforms whilst respecting human rights.”

Could you tell this committee more about automated solutions to counter terrorism?

11:20 a.m.

Executive Director, Tech Against Terrorism

Adam Hadley

Of course. Automation can cover a number of separate activities. Often we might discuss algorithms, which certainly are part of automation. However, in our experience, the biggest challenge that small platforms have isn't in the basics but in the workflow. Content moderation automation is a simple mechanism in principle. It's identifying content that may fall afoul of the law or terms and conditions. It's then assessing whether the content does cross those thresholds. It's taking action, recording that action and reporting on it. It's also providing an opportunity for a user to appeal that decision. The workflows are where the complexity lies, and with smaller platforms in particular, most of our activity in supporting platforms is with that basic infrastructure.
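The workflow Mr. Hadley outlines (identify, assess, act, record, allow appeal) can be illustrated with a minimal sketch. The names, data shapes and matching rule below are purely illustrative assumptions, not Tech Against Terrorism's tooling or any platform's real system.

```python
# Minimal sketch of the moderation workflow described above:
# identify -> assess -> act -> record -> allow appeal.
# All names here are illustrative, not any platform's real API.
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional


@dataclass
class ModerationRecord:
    content_id: str
    flagged_reason: str
    action: str                      # e.g. "removed" or "left_up"
    decided_at: datetime
    appealed: bool = False
    appeal_outcome: Optional[str] = None


def assess(content: dict, banned_terms: set) -> Optional[str]:
    """Return a reason if the content appears to breach terms, else None."""
    text = content.get("text", "").lower()
    for term in banned_terms:
        if term in text:
            return f"matched banned term: {term}"
    return None


def moderate(content: dict, banned_terms: set, log: list) -> ModerationRecord:
    """Assess a piece of content, take action, and record the decision."""
    reason = assess(content, banned_terms)
    action = "removed" if reason else "left_up"
    record = ModerationRecord(
        content_id=content["id"],
        flagged_reason=reason or "no match",
        action=action,
        decided_at=datetime.now(timezone.utc),
    )
    log.append(record)               # recording the decision enables reporting and appeals
    return record


def appeal(record: ModerationRecord, upheld: bool) -> None:
    """Record the outcome of a user's appeal against a decision."""
    record.appealed = True
    record.appeal_outcome = "upheld" if upheld else "reversed"
```

Even a sketch this simple covers the steps named in the testimony: the decision is made, logged, and reversible on appeal, which is the basic infrastructure small platforms often lack.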

You could argue that this is all about automation. It's about trying to ensure that small platforms are able to accurately identify and moderate content in a scalable way. Unlike big platforms, smaller platforms have very small teams. They often have limited or no revenue or profitability, and they tend not to have particularly sophisticated technical infrastructure. That partly explains why terrorists and violent extremists will often use smaller platforms: they know it's so much harder for those smaller platforms to remove the material.

When we're working with smaller platforms, we provide a number of recommendations about how they can best use technology and automation to make the content moderation process more accurate and more successful as a result. Automation can include various other mechanisms, such as hashing or hash-sharing. It can also include searches of keywords and terminology, and it could involve more sophisticated mechanisms to understand whether a symbol appears in an image or a video.
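As an illustration of the hash-sharing idea mentioned here, the sketch below checks an upload's digest against a shared list of known hashes. Real deployments typically rely on perceptual hashing of images and video rather than the exact SHA-256 match assumed here, and the function names and sample data are hypothetical.

```python
# Illustrative hash-matching check: compare an uploaded file's digest against a
# shared list of known hashes of terrorist content. This stand-in shows only
# the flow, not a production matching technique.
import hashlib


def sha256_digest(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()


def matches_shared_list(data: bytes, known_hashes: set) -> bool:
    return sha256_digest(data) in known_hashes


# Example: a platform checks an upload before it goes live.
known_hashes = {"..."}               # placeholder for hashes from a trusted sharing programme
upload = b"example file bytes"
if matches_shared_list(upload, known_hashes):
    print("Block upload and queue for human review")
```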

However, most small platforms rarely have the capacity or capability to build complex automation. The automation we typically support is fairly simple, and it's about helping them make the right decisions and record the decisions that they're making. An important principle in all content moderation, at least in our view, is transparency. Therefore, we recommend that platforms of all sizes invest in transparency reporting and, for that, automation is required to understand what has been removed and what has been left up.
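On the transparency-reporting point, a minimal sketch of how automation can tally what was removed versus left up from logged decisions; the field names and sample data are assumed for illustration only.

```python
# Sketch of a transparency-report tally over logged moderation decisions.
from collections import Counter

decisions = [
    {"action": "removed", "reason": "terrorist content"},
    {"action": "left_up", "reason": "no match"},
    {"action": "removed", "reason": "terrorist content"},
]

by_action = Counter(d["action"] for d in decisions)
by_reason = Counter(d["reason"] for d in decisions if d["action"] == "removed")
print(by_action)   # e.g. Counter({'removed': 2, 'left_up': 1})
print(by_reason)   # e.g. Counter({'terrorist content': 2})
```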

11:20 a.m.

Liberal

Paul Chiang Liberal Markham—Unionville, ON

Thank you, Mr. Hadley.

11:20 a.m.

Liberal

The Chair Liberal Jim Carr

I would now like to turn to Ms. Larouche, who has a six-minute block.

Whenever you're ready, please proceed.

May 10th, 2022 / 11:20 a.m.

Bloc

Andréanne Larouche Bloc Shefford, QC

Thank you very much, Mr. Chair.

I thank both witnesses for taking the time to appear before the committee today.

My first question is for Ms. Ramalingam.

Ms. Ramalingam, in your opening remarks, you mentioned the tragic event in Norway. To enlighten the committee, I would like to know what you think about the recently passed European legislation on illegal content online. What can we learn from that?

I'd also like to hear what you have to say specifically on the issue of liability for technology companies.

11:25 a.m.

Co-Founder, Moonshot

Vidhya Ramalingam

Thank you very much for your question. The tragedy I referred to, from nearly 11 years ago now, was really a wake-up call for European governments. It was the first moment that European governments realized there was a threat that had been completely overlooked.

11:25 a.m.

Bloc

Andréanne Larouche Bloc Shefford, QC

I have a point of order, Mr. Chair.

11:25 a.m.

Liberal

The Chair Liberal Jim Carr

Yes.

11:25 a.m.

Bloc

Andréanne Larouche Bloc Shefford, QC

I'm sorry, Ms. Ramalingam, I will have to ask you to repeat yourself.

Mr. Chair, there is no interpretation.

11:25 a.m.

Liberal

The Chair Liberal Jim Carr

We did not have interpretation, so please go back to the beginning of your answer, and let's ensure that we have interpretation.

Proceed, please.

11:25 a.m.

Co-Founder, Moonshot

Vidhya Ramalingam

Thank you for your question, Madame Larouche.

I was mentioning that the tragedy you referred to from 2011 was really a wake-up call for European governments. That was the first moment they realized that they had really been overlooking this threat.

I really welcome the recent legislation in Europe and the efforts to hold tech companies accountable. Tech companies are often very reactive rather than proactive. It often takes a tragedy for the tech sector to be compelled to act. We saw this after the massacre in Christchurch and after January 6. They so often wait either for tragedy or for governments to impose legal and commercial imperatives to act. Legislation works. Legislation is absolutely required to hold the tech companies accountable, and I welcome the recent EU legislation.

11:25 a.m.

Bloc

Andréanne Larouche Bloc Shefford, QC

Ms. Ramalingam, in your opening remarks, you mentioned the “incel” movement, meaning “involuntary celibate”. As the Bloc Québécois critic for the status of women, I am very concerned about the radicalization of this movement, particularly as it pertains to women.

I'd like to hear a little bit more about what studying this movement can contribute to the committee's deliberations on online radicalization.