Evidence of meeting #153 for Justice and Human Rights in the 42nd Parliament, 1st Session.


Also speaking

Michele Austin, Head, Government and Public Policy, Twitter Canada, Twitter Inc.
Clerk of the Committee: Mr. Marc-Olivier Girard

3:45 p.m.

Liberal

Colin Fraser Liberal West Nova, NS

Are you aware of any person who has successfully sued another individual who has put that type of information...?

3:45 p.m.

Head, Government and Public Policy, Twitter Canada, Twitter Inc.

Michele Austin

I would have to get back to you on that.

3:45 p.m.

Liberal

Colin Fraser Liberal West Nova, NS

All right.

I understand obviously there is a difference between your proactive investigation of the type of content we're talking about, online hatred, and individual users having to report it and wait for you to get back to them. Do you see an issue right now with how long it's taking for somebody who reports abusive, threatening or hateful conduct online to get a response?

As I understand it, it takes up to 24 hours just to get the initial response, and then it can take a long time after that. The onus is on the individual user to prove their case to you, rather than your nipping it in the bud, putting the content in abeyance and then determining whether or not it should be removed or that person should be allowed to use your platform.

3:45 p.m.

Head, Government and Public Policy, Twitter Canada, Twitter Inc.

Michele Austin

Context matters with regard to the actions that we take. We have a number of signals that we measure and look at before we take action. Let me break that into two pieces: the first would be reporting and the second would be review.

We publicly acknowledge that there's too much burden on victims to report to Twitter, and that is why we are trying to do a better job. We now have a dedicated hateful conduct workflow so that we can raise those issues for review faster. As I mentioned, we're working with proprietary technology. We realize we have to do a better job in how abuse is reported.

In reviewing the accounts or tweets flagged for action, it's extremely important for us to try to get it right. There are a number of behavioural signals we get: if I tweet something and you mute me, block me and report me, clearly something is up with the quality of the content. Further, we look at our rules and at the laws of the jurisdiction the tweet came from.

The other part of context is that there are very different conversations happening on Twitter. Often, the example we use is gaming. It is perfectly acceptable in the gaming community to say something like, “I'm coming to kill you tonight, be prepared”, so we would like to make sure we have that context as well.

These are the things we consider when we make that review.

3:50 p.m.

Liberal

Colin Fraser Liberal West Nova, NS

All right, thanks.

3:50 p.m.

Liberal

The Chair Liberal Anthony Housefather

Thank you very much.

Mr. MacGregor.

3:50 p.m.

NDP

Alistair MacGregor NDP Cowichan—Malahat—Langford, BC

Thank you, Chair.

Ms. Austin, thank you for coming here before the committee.

Facebook banned a series of white nationalist groups and individuals: Faith Goldy, the Soldiers of Odin, Kevin Goudreau and the Canadian Nationalist Front, among others. Twitter chose to ban only the Canadian Nationalist Front from its own platform. Faith Goldy used Twitter to direct her followers to her website after her ban from Facebook.

When Twitter banned the Canadian Nationalist Front, you said you did so because the account violated your rules barring violent extremist groups, but there was no further elaboration, and a multitude of other white nationalist groups still have a presence on the platform. Facebook has said that its removal of the groups came from its policy of not allowing anyone engaged in organized offline hate to have a presence on the platform.

When Facebook took that step, why did Twitter ban only the Canadian Nationalist Front? Why is your threshold for what can be allowed on your platform different from Facebook's?

3:50 p.m.

Head, Government and Public Policy, Twitter Canada, Twitter Inc.

Michele Austin

I'm not going to comment on the individual accounts.

First, with regard to white supremacism, we now have an in-house process, working with civil society and with researchers, to better understand how white nationalists and white supremacists use the platform. That approach is not exclusive to hate; it is how we go through policy review and policy development. That specific policy is currently under review.

We have three core, robust policies: the violent extremist groups policy, which is the one you mentioned, as well as the terrorism and hateful conduct policies. We will look at all three policies as those accounts are actioned to determine whether or not we should take further action.

3:50 p.m.

NDP

Alistair MacGregor NDP Cowichan—Malahat—Langford, BC

Thank you.

You have alluded to the fact that there's always room for improvement and that this is continuing work. I can appreciate that.

I'm curious. What leads to an actual ban? Can a ban happen right away, or do you prefer to impose sanctions on a person or an account progressively?

3:50 p.m.

Head, Government and Public Policy, Twitter Canada, Twitter Inc.

Michele Austin

There are a number of stages with regard to bans. Egregious violence can result in an immediate ban. An egregious breaking of our rules like posting child sexual exploitation material can result in an immediate ban.

To be honest, most users who receive an action have made an error, and when we ask them to take that tweet down, they will apologize to their followers once they have received that kind of information. We are hoping to provide avenues for corrective behaviour, for counter-speech, and for others to come to those conversations, talk about different issues, or ask people to behave differently.

There are a number of measures we use. We can action a single tweet. We can lock down an account. But banning somebody from Twitter completely is a very big deal for us, and one we take extremely seriously.

3:50 p.m.

NDP

Alistair MacGregor NDP Cowichan—Malahat—Langford, BC

I know you don't really want to comment on individual accounts, but are you studying what other platforms like Facebook are doing, looking at their reasons, and is that forming a large part of your review?

3:50 p.m.

Head, Government and Public Policy, Twitter Canada, Twitter Inc.

Michele Austin

Yes. We are constantly comparing our policies with the best practices of a number of organizations, not just online platforms or digital companies. We are also reaching out constantly to civil society. We have a trust and safety council, of which MediaSmarts in Canada is a member. They meet twice a year. They come and tell us what they are seeing on platforms.

We also rely on a number of researchers to come and tell us what they are seeing. It's that open dialogue and exchange of not just ideas but of information in terms of how violent extremist groups are behaving that is extremely important to us. We value it and depend on it as we continue to iterate and try to improve the service.

3:55 p.m.

NDP

Alistair MacGregor NDP Cowichan—Malahat—Langford, BC

Thank you.

I have a final question. In your opening statement you mentioned the help centre. You have a team of people reviewing accounts for possible violations of your rules, and it's really a global team.

Do you know how many people are staffed on the global team and how many are based in Canada?

3:55 p.m.

Head, Government and Public Policy, Twitter Canada, Twitter Inc.

Michele Austin

There is probably—

3:55 p.m.

NDP

Alistair MacGregor NDP Cowichan—Malahat—Langford, BC

Just to clarify, I'm asking this because I'm wondering whether the people reviewing accounts who are based in Canada have an understanding of the Canadian context: our history, our culture, and the fact that there is a lot of racism towards first nations and a lot of misunderstanding. A lot of those stereotypes are still being propagated on social media.

3:55 p.m.

Head, Government and Public Policy, Twitter Canada, Twitter Inc.

Michele Austin

There are two things. First of all, there are no content reviewers in Canada. I don't know if you use the search function, but when you look through it and we highlight trends, we have a really excellent Moments team in Canada that looks at content from that perspective.

Canadian voices are well represented across the company. There's bias training. There's a huge amount of training that occurs.

I have personal experience where we've changed a policy and we've been asked to provide Canadian context, which is something I have done. We also have an appeal function internally in order to provide more context.

With regard to first nations, I don't know if you have heard from indigenous groups at this committee, but if you have not, I strongly encourage you to do so. They experience hate differently from us, not just in terms of the hate itself but also the language. This is something they have brought to my attention, for which I am very grateful. A word like “savage” we would use very differently than they would, and this is something we are looking at going forward. Dr. Jeffrey Ansloos from the University of Toronto is a really excellent resource with regard to this.

These are open conversations that we're having. We look forward to hearing more from the indigenous community. They are doing a great job, for me at least, of highlighting the different kinds of issues they face, as they often do run to conversations to try to correct the speech or tell their stories.

3:55 p.m.

NDP

Alistair MacGregor NDP Cowichan—Malahat—Langford, BC

Thank you.

3:55 p.m.

Liberal

The Chair Liberal Anthony Housefather

Thank you very much.

Mr. Ehsassi.

3:55 p.m.

Liberal

Ali Ehsassi Liberal Willowdale, ON

Thank you, Mr. Chair.

Thank you, Ms. Austin, for appearing before our committee. I have to say your testimony was very clear, so thank you for that.

I may have missed this, but Mr. MacGregor asked how big the global enforcement team was. Could you tell us what the numbers are for the global enforcement team?

3:55 p.m.

Head, Government and Public Policy, Twitter Canada, Twitter Inc.

Michele Austin

I'll get back to you with specific numbers, but it's more than 2,500 people.

3:55 p.m.

Liberal

Ali Ehsassi Liberal Willowdale, ON

This global enforcement team is not primarily concerned with enforcing your own internal rules, then. Do they actually look at distinctions between legal requirements in different countries?

3:55 p.m.

Head, Government and Public Policy, Twitter Canada, Twitter Inc.

Michele Austin

There are a number of signals that we take into consideration when we are asked to action accounts or tweets, such as behaviour and local context. Certainly Asian and Southeast Asian countries have a different approach to sensitive media. I don't know if you know this, but you can currently turn the sensitive media function on and off on Twitter here in Canada. The enforcement team takes many of these signals into account as they go through their enforcement actions.

3:55 p.m.

Liberal

Ali Ehsassi Liberal Willowdale, ON

You did say that, as a company, you aspire to have healthy conversations. In your opinion, given that this is one of the big, overarching issues that we've grappled with at this committee, who do you think the onus should be on? Should it be on social media platforms or should it be on governments to ensure that we deal with hate speech in an effective manner?

3:55 p.m.

Head, Government and Public Policy, Twitter Canada, Twitter Inc.

Michele Austin

I think it's a combination of both. From a Twitter perspective, I would add users into that mix as well. We welcome suggestions from human rights groups. We welcome suggestions from government. Of course we respect the laws of the jurisdictions we operate in. It is extremely important for us to deliver a safe and healthy service. It's in our best interests for Twitter users to be safe. It really is an important combination of both.

In some cases, generally speaking, social media platforms will act faster than governments. That's not always the case. In other cases—as we talked about with regard to terrorism—we are asked to act faster.

3:55 p.m.

Liberal

Ali Ehsassi Liberal Willowdale, ON

Is the transparency report you referred to mandated by the EU?