Evidence of meeting #146 for Justice and Human Rights in the 42nd Parliament, 1st Session. (The original version is on Parliament’s site, as are the minutes.)

A recording is available from Parliament.

Also speaking

Brian Herman  Director, Government Relations, B'nai Brith Canada
David Matas  Senior Legal Counsel, B'nai Brith Canada
Daniel Cho  Moderator, Presbyterian Church in Canada
Marc-Olivier Girard  Clerk of the Committee
Emmanuel Duodu  President, Ghanaian-Canadian Association of Ontario
Queenie Choo  Chief Executive Officer, S.U.C.C.E.S.S.
Mukhbir Singh  President, World Sikh Organization of Canada

8:45 a.m.

Liberal

The Chair Liberal Anthony Housefather

Good morning, everyone. Welcome to this meeting of the Standing Committee on Justice and Human Rights, as we resume our study on online hate this morning.

We are lucky to be joined by two very prestigious groups in Canada. We are joined by B'nai Brith Canada, represented by Mr. Brian Herman, who is the Director of Government Relations, and Mr. David Matas, who is the Senior Legal Counsel; and by the Presbyterian Church in Canada, represented by Mr. Daniel Cho, who is the Moderator. Welcome.

We'll start with B'nai Brith and then go to the Presbyterian Church.

Mr. Herman and Mr. Matas, please begin.

8:45 a.m.

Brian Herman Director, Government Relations, B'nai Brith Canada

Thank you, Mr. Chairman. We thank you and the committee for allowing us to appear today.

You know my colleague David Matas, our senior legal counsel, who will speak to some of the detailed aspects of the thoughts that I'll be introducing. We won't go into some of the broad comments about the serious nature of online hate. The committee members are well aware of it, and we know from previous testimony that you've heard about the challenges in this space.

One year ago, B'nai Brith Canada called for a national action plan to deal with anti-Semitism—not a federal one but a national one—and combatting online anti-Semitism was part of that plan. This has become all the more important, given one key finding of our annual audit of anti-Semitic incidents in Canada, which we released the other day here in Ottawa. It found that of the 2,042 recorded incidents in 2018—an increase of 16.5% over 2017—80% of those anti-Semitic incidents took place via online platforms. This underscores the challenge for the Jewish community in Canada.

We started our work long ago. In October 2017, David Matas authored a paper on mobilizing Internet providers to combat anti-Semitism. In November 2017, we wrote to ministers of the government regarding the European Union's May 31, 2016, code of conduct on illegal online hate speech. We suggested at that time that Canada adopt the EU's “trusted flaggers” approach as one measure in addressing online hate. Both David and I can talk about that, and we can share both of those documents with the committee.

In December 2018, we submitted a policy paper to the government calling for Canada to develop an anti-hate strategy, a strategy that would include confronting online content that reflects anti-Semitism, Holocaust denial and Holocaust distortion.

In Canada, we know there is a need to foster public debate, and the work of this committee will contribute to that end. The public needs to understand the challenges and the role they play in countering online hate, including disinformation. We feel strongly that action cannot just be left to governments, platforms and content providers. We are not simply calling for an online hate strategy from you; we know that we have to contribute specific ideas to what the committee and the government do.

Nor is this a matter for social media companies alone. At the recent meeting of G7 interior ministers, we noted that Public Safety Minister Ralph Goodale said, “The clear message was they [social media companies] have to show us clear progress or governments will use their legislative and regulatory authorities.” We honestly feel that there is no need to reinvent the wheel if we can draw on useful work that is already under way.

Secondly, B'nai Brith Canada understands that in addressing online hate generally, the scourge of anti-Semitism will be captured, so long as anti-Semitism is marked as a particular problem.

There were some thoughts that others offered last autumn. We don't claim authorship of them, but they are worthy of examination.

The federal government needs to compel social media companies to be more transparent about their content moderation, including their responses to harmful speech.

Governments, together with civil society and affected community organizations, foundations, companies and universities, must support more research to understand and respond to harmful speech.

There is an idea about the creation of a forum similar to the Canadian Broadcast Standards Council to convene social media companies, civil society and other stakeholders, including representatives of the Jewish community, to develop and implement codes of conduct.

We need to re-examine the need for a successor to section 13 of the Canadian Human Rights Act, and David will address that.

There are active measures that we can take. For example, in November last year, UNESCO and the World Jewish Congress launched a new website called “Facts About the Holocaust”, designed as an interactive online tool to counter the messages of Holocaust denial and distortion that are circulating on the Internet and social media. This is a useful tool that we think can be considered.

The United Kingdom, just a few weeks ago, released an online harms white paper, and we were very struck by a number of proposals in that document that set out guidelines to tackle content of concern. One proposal in that white paper is the idea of an independent regulator to enforce the rules.

The U.K. also now has a code of practice for providers of online social media platforms, which was published on April 8. These are all good ideas worth considering.

Here are some recommendations, just to summarize.

First, data is the key. The government should incentivize and encourage provincial, territorial and municipal law enforcement agencies to more comprehensively collect, report and share hate crimes data, as well as details of hate incidents. The online dimension needs to be addressed. We are, in fact, in dialogue with Statistics Canada's Canadian Centre for Justice Statistics, which has a consultation exercise under way to see whether or not there is a capacity to record data, not only on hate crimes but on hate incidents, including the online dimension.

Second is to strengthen the legal framework. We feel that Parliament has an opportunity to lead the fight against cyber-hate by increasing protections for targets, as well as penalties for perpetrators.

Third is improved training for law enforcement. Elsewhere, B'nai Brith Canada has argued for more hate crimes units in major cities, or at the least, clear hate crimes strategies and better training.

Fourth is robust governance from social media platforms. Elected leaders and government officials have an important role to play in encouraging social media platforms to institute robust and verifiable industry-wide self-governance. That's already been addressed, but that needs to be the first step, followed by others.

Then, there needs to be more international co-operation. Canada should ratify the 2002 additional protocol to the Council of Europe's Convention on Cybercrime.

There are a number of ideas that we've submitted to the clerk that go beyond what I've said. One of our partner agencies, the Anti-Defamation League in the United States, has done a considerable amount of work on the challenge of online hate, and we've passed to the clerk a number of specific proposals that the ADL has put forward for consideration by industry.

Thank you.

8:50 a.m.

David Matas Senior Legal Counsel, B'nai Brith Canada

I realize that I don't have much time remaining within the 10 minutes, so I'll try to be brief.

I have some suggestions, first of all, about the Criminal Code. The consent of the Attorney General represents a problem because it's often arbitrarily denied. We do not suggest that it be removed, but guidelines should be developed so that they are either followed or, if they are not, an explanation is given.

Second, the defence of religious expression represents a problem because often religious expression is used as a form of incitement to hatred. It should not be immune from prosecution.

8:55 a.m.

Liberal

The Chair Liberal Anthony Housefather

Sir, do you mean under section 319 of the Criminal Code, the exception under subsection 319(3)?

8:55 a.m.

Senior Legal Counsel, B'nai Brith Canada

David Matas

Yes, exactly.

In terms of the Internet specifically, we need a modified safe harbour provision. Right now with respect to the U.S., they have a kind of complete immunity. What we need is in fact liability, the opposite of safe harbour, with a defence of innocent dissemination. Those are criminal law suggestions.

My colleague has mentioned the protocol that Canada signed in 2005, which addresses “criminalisation of acts of a racist and xenophobic nature committed through computer systems”. If Canada enacts a modified safe harbour provision that relates only to innocent dissemination, in my view that would allow us to ratify that convention.

The Canadian Human Rights Act's section 13 was good in substance but problematic in terms of procedure. The standard should be re-enacted but with procedural protection so it doesn't lead to harassment of the innocent. There needs to be the power to award costs, which the Human Rights Commission and tribunals don't have now.

The screening and conduct functions need to be decoupled, so that screening takes place in all cases, but the commission need not itself undertake every case it screens in and could allow private individuals to take the case.

There needs to be a requirement of election of forums so that a complainant could not proceed in many forums simultaneously—federal and provincial—which is a problem now.

There needs to be a power to remove parties. The commission and tribunal can add parties but can't remove them. That can become a problem for a party that has been improperly joined.

There needs to be a right to know the accuser, because right now these commissions and tribunals can function on the basis of rumour only, without disclosing the accuser. That needs to be put in place.

There needs to be a right of disclosure of the complaint, because right now, if the commission takes on the case, they don't actually have to disclose the complaint. There needs to be this right of disclosure.

That's a quick run-through. The brief elaborates on all these recommendations in detail. The general approach is that we are obviously concerned with the right to be free from incitement to hatred and discrimination, but we're also concerned about the right to freedom of expression. We don't want these tools to be turned around and used to frustrate legitimate expression and, indeed, to harass people who are calling out hate promoters. All our recommendations are developed with keeping this balance in mind.

8:55 a.m.

Liberal

The Chair Liberal Anthony Housefather

Thank you very much.

Now we'll go to Mr. Cho.

8:55 a.m.

Reverend Daniel Cho Moderator, Presbyterian Church in Canada

Mr. Chair and members of the committee, thank you for this special privilege to come before you today. I also want to thank my fellow witnesses here.

As moderator of the Presbyterian Church in Canada, I represent many Canadians of all cultures and backgrounds who hold deep faith and commitment to helping shape a better world for all. On their behalf, I express gratitude to this committee for the opportunity to contribute to the discussion of online hate.

As members of the Presbyterian Church in Canada, we have as our core values care, love and respect for our neighbours. We hold to an unwavering commitment to working for just causes and outcomes and to affirming the inherent dignity of all persons. In this regard, we enjoy special partnerships with many other faith-based groups in our common vision to foster compassion and understanding towards one another.

We have all been alarmed by recent events around the world: mass killings that have targeted specific groups, whether based on race, ethnicity, cultural background, religion, geographic origin or sexual identity. Tragically, as each week goes by, it seems that yet another similar event occurs. Since we received this invitation to appear before this committee there have been, at least as reported in the media, two additional mass shootings, at least one of which was allegedly carried out by a white nationalist.

Often, the perpetrators of this violence have been radicalized by online influences, or they have discovered a like-minded online community and through it find validation for their specific personal bigotry and hatred. Sadly, it is not difficult to countenance the cruel reality of religious, racial and gender prejudices, racism, sexism, anti-Semitism, xenophobia, Islamophobia and homophobia, and the online platforms designed to recruit and incite others.

If we consider for a moment the brazen van attack in Toronto last spring that resulted in 10 deaths and 16 injured, mostly women, it raises an important question. Who would have thought that there existed a fringe cybercommunity of misogynists bonded together around their collective and explicit disdain of women because of their social and sexual rejection? It is deeply troubling that online hate and the incitement to violence are so exacting in their allure and resonance.

In virtually all these hate crimes reported in the North American media we have come to learn that the perpetrators were, to some degree, influenced by online activity and affiliations. Some cases involve pre-existing mental health issues. This might lead us to the conclusion that it is those individuals who hold bigoted views, who have a propensity for violence or who suffer from a form of mental illness who are susceptible to committing such crimes.

This may very well be true, but let us consider one poignant statement by this committee regarding the statistics on the rise of hate-related conduct. I quote: “non-violent crimes, such as public incitement of hatred, played a greater role in the increase than violent hate crimes”. This shows, then, that people in general are becoming emboldened to act and speak on their bigoted views at an alarmingly higher rate than they are committing violent acts. For them, the Internet can often act as an open door to incitement to hate. The very fact that there are others online who share the same hate is what gives it a perceived legitimacy.

The Italian economist and philosopher Vilfredo Pareto, in his commentary on the problems of power and wealth in a society, introduced the concept of residue. This was at the turn of the 20th century. Residue is what lies in all people, according to Pareto, as political and social beings, and in this sense refers to our deep-seated motives. It speaks to a fundamental aspect of how we wish to behave and the way we structure meaning in our lives. A successful leader or demagogue will be able to masterfully reach and manipulate those residues and turn people, or the government, to their own ends through justifications and rationalizations.

Interestingly, Pareto observed that people are persuaded toward something, not because of the reasoning but because they already believed it. It should come as no surprise then that Pareto had a deep and lasting influence on the Italian dictator Benito Mussolini in his fascist policies.

I use this as a motif for the purposes of this hearing. I don't wish to overstate the case, but let us consider the possible implications for today. If Pareto is even remotely accurate in his assessment, then it could stand to reason that some of us, as social beings, potentially carry some deep-seated residual prejudice. The growing incidence of hate-based actions and crimes committed by people of all creeds and backgrounds, across the social demographic, demonstrates that this is not only an issue involving fringe, vulnerable or mentally ill individuals. Rhetorically speaking, what lies residually in all of us can be awakened by demagoguery, by other authorities or, in this case, by the perceived legitimacy that hatred gains online. The resonance of hate among a growing number of people should alarm us all.

Technology outpaces jurisprudence. The interaction of law and social media is a clear example of the complexity of balancing democratic liberties—with respect to the Charter rights of free speech and expression—against protection from discrimination. It is our hope, arising from our shared concern to address online hate, that through this legislative process the protections and redress available to all Canadians will be fair, equitable and robust.

The Presbyterian Church in Canada is committed to combatting online hate and prejudice in all forms and continues to promote a culture of care, compassion and mutual responsibility as a faith community, as Canadians and as global citizens.

Thank you.

9:05 a.m.

Liberal

The Chair Liberal Anthony Housefather

Thank you very much, Mr. Cho.

We'll now move to questions.

Mr. Barrett.

9:05 a.m.

Conservative

Michael Barrett Conservative Leeds—Grenville—Thousand Islands and Rideau Lakes, ON

Thanks very much, Mr. Chair.

Thank you to the witnesses for joining us today and sharing your perspective with us.

It's very troubling to be able to quickly recall a number of devastating developments around the world and here at home that speak to this issue: the recent introduction of unacceptable anti-LGBTQ laws in Brunei that carry the death penalty, to be carried out by stoning; the aforementioned misogynistic van attack that resulted in the death and injury of many people in Toronto; the Christchurch mosque shootings with a white supremacist motivation; the anti-Christian bombings in Sri Lanka, a massacre; and the anti-Semitic shooting at a synagogue in California. We can just look at a couple of weeks of newspapers—we're not talking about my lifetime—and we've had many examples of that here in Canada.

To your point, Mr. Cho, the communities that these individuals all have in common are online. That is the real thread between these events, and it's coming from every walk of life, creed, colour and origin. The real common thread is that they're found online. It's undeniable that there is a terrible issue before us. How do we deal with it?

Mr. Herman, you referenced a safe flagging process, and I'm wondering if you could quickly tell us about that.

9:05 a.m.

Director, Government Relations, B'nai Brith Canada

Brian Herman

Certainly. This is the trusted flagger.

9:10 a.m.

Senior Legal Counsel, B'nai Brith Canada

David Matas

Yes, this process was developed with the European Union and four of the service providers. They negotiated an agreement with Google, Facebook, Microsoft and Apple whereby these four providers would work with trusted flaggers—NGOs that specialize in this area—and would quickly react to complaints and take down material quickly if it were problematic.

Of course, you're dealing with a wealth of material on the Internet. The people in charge are not specialized in this area so they often don't know of the problem and often don't see it as well. The idea with the trusted flaggers is that they would be people who would know the problem and could quickly bring it to the attention of the service providers.

That system is useful, but it's not transparent. I know some of the people involved as trusted flaggers, and when I asked them what's going on, their answer was that they couldn't tell me. The Internet companies themselves provide what they call transparency reports. You can see them on the Internet, but they're not transparent; they don't tell you very much. I think it's a good system, but it can't replace legislation. I think you need both. You can't just leave it to the service providers to do, with the help of the NGOs.

9:10 a.m.

Director, Government Relations, B'nai Brith Canada

Brian Herman

For those who may not be aware, I think it's a concept of pre-clearing groups or organizations that are regarded as somewhat expert in the field, so that if B'nai Brith Canada, for example, is a trusted flagger and we go to a provider and say there is anti-Semitic content here, it's not as if they have to ask who B'nai Brith Canada is or how they know we're qualified in this field.

9:10 a.m.

Conservative

Michael Barrett Conservative Leeds—Grenville—Thousand Islands and Rideau Lakes, ON

There was reference to section 319 of the Criminal Code and to subsection 319(3) on defences to public incitement of hatred. I think the concern has been raised that, if we didn't have a transparent process—depending on the system we adopt—a good faith expression of an opinion based on a religious belief could create an environment whereby someone's legitimate religious beliefs are suppressed because they don't conform to a narrative.

What do you think the risks are to removing that exception to section 319? That's for any of you gentlemen to answer.

9:10 a.m.

Senior Legal Counsel, B'nai Brith Canada

David Matas

I was involved with the litigation about the constitutionality of that provision. I intervened for B'nai Brith in the case of Keegstra, where it was declared constitutional in the courts, by four to three.

I acknowledge that there is a constitutional risk. If you remove any of the defences, it becomes subject to constitutional challenge again, and it was borderline in terms of constitutional acceptance.

My own view is that the environment has changed so substantially, and we see so much incitement that is religiously based and leads to extremely violent acts, that in spite of the risk the provision could still withstand a constitutional challenge even with that defence removed.

9:10 a.m.

Liberal

The Chair Liberal Anthony Housefather

This will be the last question.

9:10 a.m.

Conservative

Michael Barrett Conservative Leeds—Grenville—Thousand Islands and Rideau Lakes, ON

It's probably too long a question to ask in a short period of time.

If, for example, someone were to speak of Israel's right to exist and we had a group saying that was a suppression of the rights of the Palestinian people, we have what both groups believe are fair and legitimate claims to make. Should that discourse not be able to happen in the public space, and would removing that section not put at risk the right of one group or another to make their expression?

9:15 a.m.

Senior Legal Counsel, B'nai Brith Canada

David Matas

There's no doubt that this law is problematic in the sense that many people, when they hear something with which they disagree, say it's hate speech and want it prosecuted. The way that is defended against is through the requirement of consent of the Attorney General, but what we've seen so far is that consent has been refused in too many cases rather than too few.

Our solution is to keep the consent, because otherwise the problem you pose would exist, but to have clear guidelines that could deal with issues such as that as well. All the hypotheticals you could imagine could be put into guideline form.

9:15 a.m.

Conservative

Michael Barrett Conservative Leeds—Grenville—Thousand Islands and Rideau Lakes, ON

The fear is, of course, that a political motivation or changing governments would change what is acceptable and what is not, instead of just having a common standard.

9:15 a.m.

Senior Legal Counsel, B'nai Brith Canada

David Matas

I would hope that wouldn't happen.

9:15 a.m.

Conservative

Michael Barrett Conservative Leeds—Grenville—Thousand Islands and Rideau Lakes, ON

Right.

Thank you.

9:15 a.m.

Liberal

The Chair Liberal Anthony Housefather

Thank you very much.

We're now going to Mr. McKinnon and Mr. Virani, who are sharing this time.

9:15 a.m.

Liberal

Arif Virani Liberal Parkdale—High Park, ON

Mr. McKinnon has kindly given me his time, so thank you, Mr. McKinnon.

Thank you, to all of you, for being here. It's a very important issue. I know many of you have been speaking out about this issue for some time and your expertise is very well received.

Mr. Herman, I was taken by the statistics you started out with, about a 16.5% rise in incidents in Canada and 80% of incidents being online, as well as one of your opening comments that we need to collect more data.

I think there is absolute agreement from all parliamentarians in terms of what we've done thus far. The budget a year ago provided $6.5 million to the Centre for Diversity and Inclusion so we can start collecting disaggregated data to do just that.

I have about four questions, so could you keep your responses somewhat brief?

We've also heard that people are more likely to come forward to people they trust. Are state actors the best entities to collect the data, or should we be relying more significantly on Jewish groups, Muslim groups, black groups, indigenous groups, and so on, that have the trust of their constituents who are experiencing these types of hatred?

9:15 a.m.

Director, Government Relations, B'nai Brith Canada

Brian Herman

One thing we need to do is to look at a question that has been raised at this committee before, which is that if you leave it to individual groups representing their communities to record and analyze the data, there is perhaps a chance that the data or the analysis will be somewhat skewed.

I think the important thing is that there be a way of collecting the data and sharing it both between government and affected communities and between all affected communities themselves, so that we can make sense of it. This is one reason we suggest that there could be an opportunity for a stakeholder forum or a council that involves government, parliamentarians, providers and affected communities.

In our discussions with Statistics Canada, for example, we've said that, while they collect hate crimes data, we also include hate incidents or anti-Semitic incidents. Is there a way that they can also collect data on incidents? I know that they're discussing that with the Canadian Association of Chiefs of Police, as are we. Would police forces have the capacity to do that?

We have suggested that if police encounter an incident that falls below the threshold of a crime, they should be advised to refer the person to organizations such as ours—in the case of the Jewish community—so that we can record the data.

I hope I'm not speaking out of line here. I know Statistics Canada is also looking at the option of perhaps including a self-reporting online portal on their site that would allow people who experience something that has perhaps not been reported to the police or another organization to go online and report it, so Statistics Canada can put it into their data.

9:15 a.m.

Liberal

Arif Virani Liberal Parkdale—High Park, ON

I have a few more questions. I'll let you answer both of them at once.

With respect to the standards, you talked about broadcasting standards and applying standards to the online space. Would it be sufficient to just translate the current standards that apply to TV and radio broadcasting to the online space or do we need to design new ones?

Secondly, Mr. Matas, you listed about five different flags on section 13 of the CHRA. If you have suggested language about what you would like to see in terms of a new, invigorated and redesigned provision of section 13, it would be helpful if you wanted to submit that as well.

9:20 a.m.

Senior Legal Counsel, B'nai Brith Canada

David Matas

I could produce language, but I realize Parliament has its own drafters. They may have their own views on what the language should be, but I'm happy to do that.

In terms of what's in the Broadcasting Act, of course it gets us into the CRTC, so I'm not so sure. I don't have any problems with the standards in the Broadcasting Act, but I think we need to be sensitive to the nature of the phenomenon, where there is so much going on. With broadcasting, the broadcasters know what they're broadcasting; the Internet service providers don't know what's there. You have to have a kind of notice provision and then a reaction. You have to set up that mechanism—for when they've been told and they don't do anything—which doesn't exist in the broadcasting legislation.

In terms of the previous question, my answer would be both. The trouble with state reporting right now is that very many of those doing it don't know what hate speech is, and I think you're right that the NGOs know it a lot better. But if you don't have state reporting, or reporting to the state, that problem is only going to be exacerbated. I think you need the mix of the NGO reporting—they know what it is—and the state reporting, so that the state can come to appreciate what it is.