Evidence of meeting #143 of the Standing Committee on Justice and Human Rights, 42nd Parliament, 1st Session. (The original version is on Parliament's site, as are the minutes.)

A recording is available from Parliament.

Also speaking

Shimon Koffler Fogel  President and Chief Executive Officer, Centre for Israel and Jewish Affairs
Ryan Weston  Lead Animator, Public Witness for Social and Ecological Justice, Anglican Church of Canada
Idan Scher  Canadian Rabbinic Caucus
Imam Farhan Iqbal  Ahmadiyya Muslim Jama'at
Richard Marceau  Vice-President, External Affairs and General Counsel, Centre for Israel and Jewish Affairs
Shahen Mirakian  President, Armenian National Committee of Canada
Alex Neve  Secretary General, Amnesty International Canada
André Schutten  Legal Counsel and Director of Law and Policy, Association for Reformed Political Action Canada
Geoffrey Cameron  Director, Office of Public Affairs, Bahá'í Community of Canada

8:45 a.m.

Liberal

The Chair Liberal Anthony Housefather

Good morning, everyone. Welcome to the Standing Committee on Justice and Human Rights as we launch our study on the issue of online hate.

This is a really important issue. With increasing numbers of hate crimes being reported in Canada, groups feeling vulnerable, and the growing presence of hate on the Internet, this is a subject the committee wants to tackle.

I know that a number of groups across Canada have asked us to study this issue. Today we will hear from three, and a fourth may join us later on.

Today, representing the Centre for Israel and Jewish Affairs, we have Mr. Shimon Koffler Fogel, President and Chief Executive Officer; and Mr. Richard Marceau, Vice-President, External Affairs and General Counsel. Welcome, gentlemen.

From the Anglican Church of Canada, we're joined by Mr. Ryan Weston, the Lead Animator of Public Witness for Social and Ecological Justice. Welcome.

From the Canadian Rabbinic Caucus, we're joined by Rabbi Idan Scher. Welcome.

We might be joined by the Ahmadiyya Muslim Jama'at, represented by Imam Farhan Iqbal. If he arrives, he will follow the other members of the panel in testifying.

We will start with CIJA. Mr. Fogel, the floor is yours.

8:45 a.m.

Shimon Koffler Fogel President and Chief Executive Officer, Centre for Israel and Jewish Affairs

Thank you, Mr. Chair, and thank you to the members of the committee for inviting us to join this important conversation.

I'm pleased to offer reflections on behalf of the Centre for Israel and Jewish Affairs, CIJA, the advocacy agent of the Jewish Federations of Canada, which represents more than 150,000 Jewish Canadians from coast to coast.

CIJA is encouraged by the launch of this important study. Since the horrific mass shooting at the Tree of Life synagogue in Pittsburgh last October, the deadliest act of anti-Semitic violence in North American history, CIJA has mobilized thousands of Canadians to email the justice minister to call for a national strategy to counter online hate, beginning with this very study.

We've done so with the endorsement of a diverse coalition of partners, including Muslim, Christian, Sikh and LGBTQ+ organizations. It's also worth noting that various groups we have worked with to mark genocide awareness month, including Ukrainian, Armenian, Roma, Rwandan and Yazidi representatives, have united to call for a national strategy on online hate, knowing that genocide begins with words.

Increasingly, terrorist organizations and hate groups rely on online platforms to spread their toxic ideas, recruit followers and incite violence. This is a problem that spans the ideological spectrum. For example, Canadians who have been radicalized to join ISIS have often done so after extensively consuming jihadist content online.

In the case of two recent acts of white supremacist terrorism, the mass murder of Jews in Pittsburgh and Muslims in Christchurch, the killers made extensive use of social media to promote their heinous ideology. The Pittsburgh shooter reportedly posted more than 700 anti-Semitic messages online over a span of nine months leading up to the attack; and the Christchurch shooter's decision to livestream his horrific crime was a clear attempt to provoke similar atrocities.

We cannot afford to be complacent, given the link between online hate and real world violence. My hope is that this study will culminate in a unanimous call on the Government of Canada to establish a comprehensive strategy to counter online hate and provide the government with a proposed outline for that strategy.

Today I'll share four elements that we believe are essential to include: defining online hate; tracking online hate; preventing online hate; and intervening against online hate.

First, a national strategy should clearly define online hate and not assume that online platforms have the capacity to navigate these waters on their own. The focus should not be on the insensitive, inappropriate or even offensive content for which counterspeech is generally the best remedy. The explicit goal should be to counter those who glorify violence and deliberately, often systematically, demonize entire communities. While freedom of expression is a core democratic value, authorities must act in exceptional circumstances to protect Canadians from those who wilfully promote hate propaganda and seek to radicalize vulnerable individuals.

The international community's experience in defining anti-Semitism is an important model. The International Holocaust Remembrance Alliance, or IHRA, working definition of anti-Semitism, which is the world's most widely accepted definition of Jew hatred, should be included in any strategy to tackle online hate. It's a practical tool that social media providers can use to enforce user policies prohibiting hateful content and that Canadian authorities can use to enforce relevant legal provisions.

Second, a national strategy requires enhanced tracking and reporting of online hate, via strategic partnerships between the Government of Canada and technology companies. There are models worth reviewing in developing a Canadian approach, such as TAT, the “Tech Against Terrorism” initiative, a UN-mandated program that engages online companies to ensure their platforms are not exploited by extremists.

Third, a national strategy must include prevention. In the current global environment, trust in traditional media and institutions has declined as online manipulation and misinformation have increased. A campaign to strengthen Internet literacy and critical online thinking with resources to support parents and educators would help mitigate these trends.

Fourth, a national strategy must include a robust and coordinated approach to intervention and disruption. There's a debate over whether government regulation or industry self-regulation is the best approach. Canada can benefit from the experience of other democracies, especially in Europe, that are further down the road in developing policy responses to online hate.

Their successes and setbacks should be considered in crafting a made-in-Canada approach. The solutions, whether regulatory, legislative or industry-based, will flow from open and inclusive discussion with governments, service providers, consumers and groups like CIJA, who represent frequently targeted segments of society.

The committee will likely discuss the former section 13 of the Canadian Human Rights Act. CIJA has long held the view, as we stated years ago during the debate over rescinding section 13, that removing this provision leaves a gap in the legal toolbox. There are multiple legitimate ways to remedy this.

One is enhanced training for police and crown prosecutors to ensure more robust, consistent use of Criminal Code hate speech provisions. Three recent criminal convictions in Ontario—one for advocating genocide and two for wilful promotion of hatred—demonstrate that these sections of the Criminal Code remain an effective vehicle for protecting Canadians. Similarly, section 320.1 of the Criminal Code, which enables the courts to seize computer data believed on reasonable grounds to house hate propaganda, is a pragmatic tool that should be applied more often.

Another approach is to develop a new provision in the Canadian Human Rights Act on online hate. This requires addressing the clear deficiencies of section 13, which was an effective but flawed instrument. In line with recommendations offered by the Honourable Irwin Cotler, a restored section 13 would require significant safeguards to protect legitimate freedom of expression and prevent vexatious use of the section.

I know there are strong feelings on both sides of the House towards section 13. This need not be a partisan issue. With the right balance, a reasonable consensus can be achieved. I emphasize the need for consensus because the use of legal measures to counter online hate will only be truly effective if those tools enjoy broad legitimacy among Canadians. If misused, misconstrued or poorly constructed, any new legal provisions, including a renewed section 13, would risk undermining the overarching goal to protect Canadians and prevent hate propaganda from gaining sympathizers and adherents.

While Canada has not seen the same polarization that has marked American or European politics, we are not immune to it and the growth of anti-Semitism and other forms of hate in mainstream discourse that comes with it. History has repeatedly shown that Jews and other minorities are at grave risk in times of political upheaval and popular disillusionment with public institutions.

With advances in technology, these processes now unfold with alarming speed and global reach. The tactics of terrorists and hate groups have evolved and so too must public policy. While there's no way to fully mitigate the threat of hate-motivated violence, a strong national strategy to combat online hate would make a meaningful difference.

Chairman and members of the committee, I thank you for your time and would welcome any questions during the Q and A.

8:55 a.m.

Liberal

The Chair Liberal Anthony Housefather

Thank you very much.

We will now move to the Anglican Church and Mr. Weston.

8:55 a.m.

Dr. Ryan Weston Lead Animator, Public Witness for Social and Ecological Justice, Anglican Church of Canada

Thank you, Mr. Chair and members of the committee, for the invitation this morning to speak to you about our concerns as the Anglican Church of Canada with regard to the proliferation of hate online and the very real impacts of this hatred for communities in this country and around the world. We're honoured to join with the voices of so many other diverse organizations in calling for stronger action on this issue across Canada.

Our religious tradition teaches that every person is imbued with inherent dignity and also particularly calls us to embody special care and concern for those who find themselves uniquely vulnerable to harm or attack. Additionally, as Christians, our faith community has enjoyed a historically privileged position in this country, so we recognize that we have a particular responsibility to speak out for the protection of others. If we are to take these commitments seriously, we must raise our voices to oppose hatred in all its forms.

As you all well know, recent years have seen a proliferation of extreme forms of hatred in online fora that encourage violence and dehumanize those who are the targets of this hate. Recent high-profile violent attacks in Canada and abroad have emphasized the reality that these sentiments do not remain online, but have tragic offline consequences as well, and that they are in need of immediate and sustained attention.

We join with many of the other witnesses here today in calling for the federal government to develop a national strategy to combat online hatred. This government has the ability to impose reasonable regulation on social media corporations, Internet service providers and other relevant corporate actors to control the proliferation of violent hate speech in online spaces. The government must also develop a strategy for more effective enforcement of existing laws regarding the public incitement of hatred, with particular attention given to the ways these attitudes are expressed online so that these activities will not go unchallenged.

In order to be effective, any national strategy will also work in partnership with other stakeholders, recognizing that responsibility for combatting hatred, both online and off, does not rest solely with the government. Corporations, including the large social media companies, must update their terms of use and their monitoring and reporting activities in order to better control the dissemination of hate through their networks and to remove hateful posts and users.

Faith communities and civil society organizations must also affirm our commitment to combatting hate in the communities that we serve and to use our voice and influence to challenge such expressions wherever we encounter them. Recognizing that many of the hateful words and actions directed towards the communities impacted by hate speech are carried out in the name of religion, the active participation of faith communities and interfaith coalitions is essential to effectively combatting this reality.

We commit to continuing our work with ecumenical and interfaith partners in addressing these attitudes within our own communities, and we draw strength from the leadership and witness of many Canadian faith groups who have been actively working to combat hate and oppression for many years, some of which are appearing before you today.

It's also important to remember that hatred online is never completely detached from hatred offline: hatred that is being promoted among sympathetic networks or directed at individuals and communities in the streets of our country. Although online hatred presents new challenges in terms of the ready accessibility of such extreme views, the roots of these attitudes are based on arguments and myths with long and influential histories in Canada and around the world. We must confront these attitudes at every opportunity.

A national strategy to address online hatred, then, must also equip families, community leaders and individual Canadians to challenge expressions of hatred, extremism and violence wherever they may encounter them. Education and awareness must be key parts of any strategy to address this issue, equipping people with the tools they need to dismantle these ideologies.

While I've been speaking about online hate in fairly broad terms this morning, we must also name that there are specific communities being targeted by these sentiments and that any national strategy to combat this hatred will only be effective inasmuch as it recognizes the specific realities and repercussions of particular forms of hatred. Although an overarching strategy is certainly necessary in this work, we must also develop integrated strategies that address and debunk the myths underpinning anti-indigenous racism, anti-black racism, anti-Semitism, misogyny, Islamophobia, homophobia and transphobia, xenophobia and anti-immigrant sentiment, religious intolerance and other forms of hatred that have distinct impacts on the safety and security of identifiable groups of people in this country.

The Anglican Church of Canada is increasingly attending to the importance of online presence as a positive means of communication and education, so supporting the development and implementation of a national strategy to combat hatred online is a natural step in this work for us.

We recognize that we have the ability to reach thousands of Canadians through our services and programs across the country, as well as with our online presence. We are committed to continuing to lift up a vision of this country and this world that truly welcomes and respects everyone by offering safe, supportive spaces for all Canadians and by challenging expressions of hatred directly.

If we fail to take more concerted action in this country to combat all forms of hate—online and in person—then further high profile acts of violence will embolden similar action by others. We must all work together to offer a positive, loving alternative to this hate, an alternative that affirms the inherent dignity of all persons in Canada and around the world.

Such an alternative requires strong, strategic direction from government that supports efforts by all stakeholders to challenge these attitudes. We are ready to collaborate in this work together with our prayers, our pulpits and our presence online, but only by working together can we confront this important issue and make our world a little safer for so many.

Thank you.

9 a.m.

Liberal

The Chair Liberal Anthony Housefather

Thank you very much.

We will now move to the Canadian Rabbinic Caucus, Rabbi Scher.

9 a.m.

Rabbi Idan Scher Canadian Rabbinic Caucus

Thank you for having me here today. I represent the Canadian Rabbinic Caucus, a group of around 200 rabbis from across the country and from across the Jewish denominational spectrum.

On October 27, 2018, 11 Jews were murdered at the Tree of Life synagogue in Pittsburgh, Pennsylvania. The murderer had been highly active in promoting anti-Semitism on social media. It's reported that he posted more than 700 anti-Semitic messages online in the nine months or so prior to the attack. Just two hours before the attack, the murderer foreshadowed his actions in his final disturbing online post.

On Friday, March 15, 50 Muslims were murdered by a white nationalist terrorist at two mosques in Christchurch, New Zealand. These murders played out as a dystopian reality show delivered by some of America's biggest technology companies. YouTube, Facebook, Reddit and Twitter all had roles in publicizing the violence and, by extension, the hate-filled ideology behind it.

The shooter also released a 74-page manifesto denouncing Muslims and immigrants, which spread widely online. He left behind a social media trail on Twitter and Facebook that amounted to footnotes to his manifesto, and over the two days before the shooting he posted about 60 of the same links across different platforms, nearly half of which were to YouTube videos that were still active many hours after the shooting.

As these horrific attacks demonstrate, hate can be lethal, and online hate can foreshadow mass violence. There is no question that the Internet has become the newest frontier for inciting hate that manifests itself disturbingly offline.

In 2017, the World Jewish Congress, representing Jewish communities in 100 countries, released a report indicating that 382,000 anti-Semitic posts were uploaded to social media in 2016. Stated differently, that's one anti-Semitic post every 83 seconds.

Although information on online hate in Canada is limited, between 2015 and 2016, according to Cision Canada, a Toronto-based PR software and service provider, there was a 600% rise in intolerant hate speech in social media postings by Canadians. The architect of the study explains that while some of that intolerant or hateful speech was generated by bots, as determined by analyzing the high frequency of posts over a short time, the researchers noted that the bots' language was later mimicked by human users, and therefore it was just as destructive.

These numbers are staggering.

The Canadian government rightfully prides itself as a global thought and action leader in the area of protecting the rights, the safety and the quality of life of the people both within its borders and worldwide. We personally have felt this. Canadian law enforcement agencies have been exceptionally responsive in providing support to our institutions, particularly following the Pittsburgh attack.

However, what is now needed is for federal policy-makers to prevent similar atrocities by launching a national strategy to combat online hate. The explosive growth of digital communications has coincided with rising alienation from traditional media and institutions. Extremists have taken advantage, preying on vulnerable disaffected individuals through the same digital tools and collaborative online culture that now shape so much of our world.

There is, of course, no way to fully eliminate the threat of hate-motivated violence, but a strong national strategy to combat online hate can make a meaningful difference in protecting Canadians. The Centre for Israel and Jewish Affairs, CIJA, has set out a four-step policy recommendation towards fighting online hate.

Step one of that recommendation is defining hate. One very important prong of this step is for the Canadian government to define what constitutes hate. This should begin with the adoption of the International Holocaust Remembrance Alliance—IHRA—definition of anti-Semitism. The IHRA definition is a practical tool that should be used by Canadian authorities in enforcing the law and as well by social media providers in implementing policies against hateful content.

The further steps of CIJA's recommendation include tracking hate, preventing hate, and intervening to stop hate.

On that last step, intervening to stop hate, I would like to make it very clear that we are not looking to police distasteful speech. Freedom of expression is of course a core Canadian value. We are focused on the glorification of violence and systematic propaganda affecting Jews and other communities.

We are confident that an effective balance can be struck between protecting free speech and combatting online hate that demonizes entire communities and leads to violence and murder.

This of course is a complex issue, but we are calling on the Government of Canada to take the lead in understanding it and developing the tools to counter it. We are calling on the Government of Canada to launch a national strategy to tackle online hate, working in partnership with social media platforms and Internet service providers, as well as other appropriate partners. This is a crucial step in making a difference that we so badly need.

Thank you.

9:05 a.m.

Liberal

The Chair Liberal Anthony Housefather

Thank you very much, Rabbi.

We will move to the Ahmadiyya Muslim Jama'at. We've been joined by Imam Farhan Iqbal.

Imam Iqbal, the floor is yours.

9:05 a.m.

Imam Farhan Iqbal Ahmadiyya Muslim Jama'at

Thank you for inviting me.

As-salaam alaikum.

The peace and blessings of God be on all of you.

I would like to start by recognizing this committee on justice and human rights on behalf of the Ahmadiyya Muslim Jama'at. As an imam of the Ahmadiyya Muslim Jama'at Canada, I would like to offer our heartfelt regards and thanks for giving us the opportunity to share our thoughts on this important and pertinent matter.

There is no doubt that over the last decade we have seen an exponential rise in hate crimes in our society. When we look at the statistics around terrorism, gang violence and gun violence, we see a general, albeit worrying, upward trend. Similarly—and sometimes overlooked—with the dawn of the Internet and social media, we have seen a stark rise in online hate speech and violence as well.

To start off, we can quickly discuss what hate speech really is, as was mentioned earlier as well. Where do we draw the line between freedom of speech and hate speech? Hate speech refers to abusive or threatening speech or writing that expresses prejudice against a particular group.

To put into perspective how dangerous this is, according to a Maclean's article, online hate speech rose 600% in Canada. Some of the words they monitored to demonstrate this rise are #banmuslims and #whitepower. Most recently, we are all aware of the rise of Islamophobia, which hit home in Canada with the Quebec City mosque attack. There had been a surge in mosque arson and vandalism across Canada, and eventually it led to that attack. As recently as last week, in London, U.K., a right-wing extremist, Steven Bishop, pleaded guilty to plotting a bomb attack on the Baitul Futuh mosque, one of Great Britain's largest mosques, which belongs to the Ahmadiyya Muslim Jama'at.

Ahmadi Muslims believe that free speech is a sacred freedom, insofar as it provides an indispensable flow to a human's thoughts and is a force for good. Hate speech and ideology designed to cause harm and grief must not be allowed to use the disguise of freedom of speech.

We all recognize the imminent and very real threat of hate speech. How can we solve it? Is there a way to solve this problem by putting together a 30-, 60- or 90-day plan? Most probably not, and that's because this effort is not about bending heads. It's about changing minds.

To combat the rise of hate crimes online and in person, we need to respect our differences and to continue to stay true to our notion of acceptance. Ignorance breeds suspicion, fear and anger. Familiarity breeds understanding, compassion and love. When people come together and find out how similar we are, it is only then that we can truly have feelings of sympathy, understanding, compassion and love. This is the way to combat hate. We need to open our doors and our hearts. We need to interact with one another with unconditional love. We need to recognize the rights of one another.

To truly combat the rise of hate crimes, we need to accept that this is not something that will be fixed overnight. Rather, it is something that will require a daily and regular struggle, which will help shape the way we think and interact with one another. Once we start to truly embody the profound saying of “love for all, hatred for none”, only then can we start to tackle the rise of hate crimes.

Together, we are stronger than any one individual. Love is much stronger than fear. Fear demands that we think of ourselves as somehow separate from one another. By coming together, our common bond increases our capacity for love to be dominant and allows us to live together in peace and harmony.

Thank you.

9:10 a.m.

Liberal

The Chair Liberal Anthony Housefather

Thank you very much, Imam.

I'd like to thank all of the members of the panel for their very important contributions.

We are moving to questions, and we will start with Mr. Cooper.

9:10 a.m.

Conservative

Michael Cooper Conservative St. Albert—Edmonton, AB

Thank you, Mr. Chairman.

Thank you to the witnesses.

I certainly agree that this is an important study and that all Canadians should be extremely concerned by the proliferation of hate that has increased worrisomely in the last number of years. It has hit home in communities across Canada, most significantly in the horrific mosque attack in Quebec City, and we saw it just eight hours away in Pittsburgh at the Tree of Life synagogue, which I know well, because it's a couple of blocks away from where my brother lives.

It's important that we tackle this issue and while we do so, we of course have to be cognizant—as I think all of the witnesses have indicated—of fundamental freedoms, including freedom of speech, and how we can strike that balance.

Perhaps I'll begin with Mr. Fogel. You did mention section 13, which I, with respect, believe was a flawed section for a number of reasons. You mentioned that, with the removal of what many believe to be a flawed section in the Canadian Human Rights Act, there is a gap.

When you look at section 318 or section 319 of the Criminal Code, or I think you cited section 320.1, I'd just be interested in understanding where you see the gap. Perhaps you could elaborate on what you would recommend to close what you state is a gap.

9:15 a.m.

President and Chief Executive Officer, Centre for Israel and Jewish Affairs

Shimon Koffler Fogel

From our perspective, we were pretty agnostic about whether section 13, at the time it was debated, should be tweaked to be responsive to some of the deficiencies inherent in the section, or whether, if it were abandoned, the Criminal Code provisions should be given a more robust capacity to compensate for the loss of section 13.

Essentially, the problem is the one that you referenced and that I think would be familiar to everybody on the committee. We are dealing with two competing imperatives. On the one hand is the desire to ensure that people can avail themselves of the freedom to express thoughts and ideas freely, without fear of persecution or prosecution, however odious those ideas might be. On the other hand, unlike our American cousins, we recognize that there is a limit to freedom of expression. When it begins to encroach on the safety, security and well-being of others, that really constitutes a red line.

The challenge with section 13 was that it was both a sword and a shield. It was providing some insulation for those who really did have malicious intent and it wasn't offering protection to those who were the targets of toxic or vitriolic hatred.

We had hoped that government writ large—it's not just a federal issue but applies also at the provincial and municipal levels—would come forward and demonstrate a real political will to pick up the slack from the loss of section 13. In fact, I think Richard wrote to the attorneys general across the country, calling on them to adopt a more aggressive posture in terms of considering and acting upon potential hate crime activity, which would be able to compensate for the loss of section 13. Frankly, that hasn't been our experience. There has been some, but not enough.

If we were to return to a section 13 kind of model, we would want to ensure that there were provisions with respect to evidence, and also with respect to the onus of ensuring that it was not a SLAPP or vexatious kind of lawsuit, that really would protect the ability of individuals and groups to articulate ideas even if they didn't meet with a uniformly positive response.

There were a number of other things that I think the committee could look back to in the testimony given by a variety of different stakeholders and that could offer a solution. What is clear, however, is that government writ large has to be able to employ useful tools, tools that do make a distinction between freedom of expression and freedom from hate, as they proceed along this line.

I know this is straying just a little bit from your question, but what we should note is increasingly the call from social media providers, the platforms that we are all talking about ubiquitously, who themselves are struggling to figure out what the lines are and where they should be intervening and what should be their response to things that appear online. I think there is a signal that they're looking for leadership from government to help provide them with the guidance necessary for them to put into place, whether it's human resources or algorithms or what have you.... The numbers are staggering. I have reviewed some of the stuff from Facebook alone in terms of the hundreds of millions of posts. It's hard to wrap your head around the idea of monitoring and responding to that volume of information.

On clear guidelines from government, I'll return to the issue of the IHRA definition of anti-Semitism. One of the benefits that it provides a government is that it offers a clear definition. You can then use that as a template for the algorithms you are going to put into place and the word searches you are going to use to trigger closer scrutiny and so forth.

If we were to do this across the span of looking at those specific things to which different communities...whether it's LGBTQ or the Muslim community, the Bahá'í, and so forth, I think that would go a long way in being able to entrench the kind of distinctions on freedom from hate and freedom of expression that everybody around this table I suspect is rightly concerned about.

Thank you very much.

9:20 a.m.

Liberal

The Chair Liberal Anthony Housefather

Thank you very much.

Mr. Virani.

9:20 a.m.

Liberal

Arif Virani Liberal Parkdale—High Park, ON

I'm going to put the timer on, because I know there are a lot of strong advocates at the table.

First of all, welcome. Thank you for being here. As-salaam alaikum. Shalom. It's really, really important.

This is something that I take seriously, Canadians take seriously and my constituents in Parkdale-High Park take very seriously. I know that all of you are here with the best of intentions.

We are seeing an unbelievable amount of hatred, and I'm glad you outlined many aspects of it. Whether it's anti-Semitism, Islamophobia, homophobia, transphobia, anti-indigenous sentiment, anti-black sentiment, incel movements, etc., these are a cause for huge concern right here in Canada. We've seen the attacks in Quebec, Pittsburgh and New Zealand.

There was at one point, a tool—and I want to pick up on this, but I'm going to hold you to a bit of a time limit, Mr. Fogel—that was applied in a previous iteration of the Canadian Human Rights Act, section 13. It talked about targeted discrimination based on a prohibited ground towards an identifiable group that was spread by means of a telecommunication. It emphasized that this included the Internet. That provision was removed by the previous government in or around 2012-13.

At the time, it had gone through some challenges. As early as 2006, previous iterations of the groups who are here, including B'nai Brith, the Canadian Jewish Congress, and the Friends of Simon Wiesenthal were defending that very provision. In the Whatcott decision by the Supreme Court of Canada, the analogue to that provision was upheld on the very basis that has been discussed.

I want to know whether your perspective is that was an invalid provision—so perhaps over to CIJA—and if it wasn't invalid, if you can drive at the heart of what you think needs to be added to make it more robust.

9:20 a.m.

President and Chief Executive Officer, Centre for Israel and Jewish Affairs

Shimon Koffler Fogel

I apologize for my lengthy answers, but nobody gives me time to speak at home.

Look, section 13 was critically important. It provided the protections that you just referenced, and I think it is clear that Canadians and groups within Canada need them.

The problem was that, ironically, groups or individuals we should be concerned about were using section 13 as a way of pushing back against those who were raising legitimate free expression ideas or concerns about particular topics. It was chilling, or more precisely freezing, the ability of people to offer critical comment about things of public interest without fear of being brought before some judicial process to account for what they said, because others were claiming that was triggering hate against them.

That was the vulnerability of section 13. There was a whole range of ways to deal with it. You have them in front of you. You've clearly done the research and I would invite the committee to look carefully at those, because it either has to be brought back in a better construct.... Irwin Cotler's formulation—I won't go through it now, as some of you are familiar with it and you can easily access it—was probably the most compelling way to restructure section 13, or provide direction to law enforcement, the public prosecution process and the attorneys general to become much more aggressive and active in applying the provisions of the Criminal Code.

9:25 a.m.

Liberal

Arif Virani Liberal Parkdale—High Park, ON

I am going to stop you there. I have two more areas that I want to canvass.

We talked a bit about social media companies, and perhaps I will ask the Canadian Rabbinic Caucus to address this one.

There are examples of different jurisdictions that regulate social media companies much more vigorously. Germany comes to mind because of its history with anti-Semitism and Nazism. Its regulations have compelled Facebook to be much more robust in the human resources it dedicates to this issue.

What can we learn from examples like Germany?

9:25 a.m.

Canadian Rabbinic Caucus

Rabbi Idan Scher

I would again point to the Centre for Israel and Jewish Affairs' policy recommendation statement and its four steps. They write extensively on Germany as well as on some of the projects the UN has been involved with. I believe Mr. Fogel mentioned those as well.

I think there's a lot to be learned. Some of the European countries are definitely further along than Canada is in these areas, and there are certainly ways to create a made-for-Canada or made-in-Canada type of approach that also uses the lessons of other countries and the UN as well.

April 11th, 2019 / 9:25 a.m.

Liberal

Arif Virani Liberal Parkdale—High Park, ON

The last point I want to raise is that we talked about, and a number of you picked up on, Mr. Fogel's theme, which is about preventing and, in fact, intervening.

There is a white supremacist strand to a lot of what we're seeing around the world. We have had people in this country as recently as this week questioning the presence of white supremacy.

I want to know from various people at the table whether there is white supremacy in this country. Second, if there is white supremacy, what is the responsibility of elected legislators to denounce that very fact?

Maybe Ahmadiyya Muslim Jama'at wants to address that. No?

I will open it up to everyone.

9:25 a.m.

Lead Animator, Public Witness for Social and Ecological Justice, Anglican Church of Canada

Dr. Ryan Weston

I think—and I don't think I will get in trouble for saying this—that, yes, there is white supremacy in this country. I think it's important to name that and to say that and to challenge it.

I think the responsibility of the elected representatives, as people who have been given voice by those they represent, is to stand up to that. The government has a clear role, I think, in challenging that ideology, and in countering that ideology, and in offering a counter-narrative that gives us something else to aspire to and something else to build relationships with rather than a white supremacist ideology.

9:25 a.m.

Liberal

The Chair Liberal Anthony Housefather

Thank you very much.

Ms. Ramsey.

9:25 a.m.

NDP

Tracey Ramsey NDP Essex, ON

Thank you all for being here this morning and for your presentations.

Picking up on that threat, I think we saw this week an attempt by Facebook to address some of the groups in Canada that are sharing this information online. We saw the banning of individuals and groups, which I think was a very good move. It wasn't across all social media platforms, unfortunately. I think it was Facebook and Instagram that did that.

To Mr. Fogel's point earlier, the depths that exist in the Internet, even within one platform itself.... There are just layers upon layers of social media giants trying to control this themselves. It really begs the question about how they can do this on their own without government intervention, without the Canadian government being a part of that and, I think, having some basic rules around what is acceptable and what isn't, some ground rules for platforms in our own country.

You all spoke about Pittsburgh and the Christchurch shooting, and the extensive amount of Islamophobic and anti-Semitic material that had been posted by both of these individuals. I think Canadians are asking how it is happening that this is all being posted. Why is no one going to these individuals and stopping it at that point? Is this a failure of social media? Is this a failure of policy? They are also asking how it happens that people are out there sharing these volumes of information and no one is challenging it.

I think Mr. Fogel spoke to this clearly, but I want to ask this to the other panellists: Do you think online platforms should be able to establish their own policies to address online hate, or do you believe that Canada should establish some ground rules as well?

9:25 a.m.

Lead Animator, Public Witness for Social and Ecological Justice, Anglican Church of Canada

Dr. Ryan Weston

I think social media companies should be establishing their own rules, but the government does have a role in pursuing legislation that sets a baseline of requirements for that so it's not relying solely on the goodwill of these corporations but reflects the commitment of this country in how we manage communications and telecommunications especially. I think there is a both/and there.

9:30 a.m.

Ahmadiyya Muslim Jama'at

Imam Farhan Iqbal

I would agree with that. When I look at the history of social media companies trying to deal with these kinds of things, it's not very promising. It's not that they have given us great results and they are doing this so well.

There was a CBC documentary, I think, about this as well, about how these companies try to deal with provocative videos or violent videos and those kinds of things. It deviates from what we're discussing, but it's because they are run for money. It's a business really. Those kinds of videos give them more viewers and more users and more engagement, so they tend not to take down all the controversial videos out there, all the violent videos out there. They keep some of them so they can keep on engaging their audience.

When it comes to these companies, it's a question mark really how much we want to rely on them to make the perfect rules. I think the government should have some involvement as well.

9:30 a.m.

Canadian Rabbinic Caucus

Rabbi Idan Scher

It seems that the mainstream social media platforms do have terms of use. They do have certain regulations and requirements, but it does seem very clear as well that they just cannot seem to keep up with what is going on as far as online hate is concerned. That being the case, and to echo what all of the witnesses have said, I think we need to see government take a leadership position. Of course, it will be in partnership with social media platforms, Internet service providers and other appropriate partners, but I definitely think that government will need to take the lead in this collaborative approach to actually being able to keep up with the monitoring and fighting of online hate.

9:30 a.m.

Richard Marceau Vice-President, External Affairs and General Counsel, Centre for Israel and Jewish Affairs

If I may add, Ms. Ramsey, in September 2018, a report on online hate was produced and given to the Prime Minister of France. They are due to present a bill in the next few weeks.

When I was reading through the report, some of its elements were interesting and actually quite simple and helpful. One of their recommendations is to have a universal logo on all those platforms that you can click if what is written is hateful. That helps those Internet companies. Another suggestion they have that might be interesting to look at is a time limit imposed on companies to take down or erase those comments. Another one that I looked at, and that should be looked at, is a way to make the complaints online so that you don't have to go to the police, or drive down to the police station, to make the complaints. When you look at things to put in a national policy, these are some elements that I would invite you to explore. They're simple, and I think they could be helpful.

9:30 a.m.

Liberal

The Chair Liberal Anthony Housefather

It's Tracey's time, so you have to....