Evidence of meeting #151 for Justice and Human Rights in the 42nd Parliament, 1st Session. (The original version is on Parliament’s site, as are the minutes.)

A recording is available from Parliament.

Also speaking

Lina Chaker  Spokesperson, Windsor Islamic Council
Sinan Yasarlar  Public Relations Director, Windsor Islamic Association
Elizabeth Moore  Educator and Advisory Board Member, Canadian Anti-Hate Network and Parents for Peace, As an Individual
Faisal Khan Suri  President, Alberta Muslim Public Affairs Council
Avi Benlolo  President and Chief Executive Officer, Friends of Simon Wiesenthal Center for Holocaust Studies
Mohammed Hussain  Vice-President, Outreach, Alberta Muslim Public Affairs Council
Dahabo Ahmed Omer  Board Member, Stakeholder Relations, Federation of Black Canadians
Akaash Maharaj  Chief Executive Officer, Mosaic Institute
Sukhpreet Sangha  Staff Lawyer, South Asian Legal Clinic of Ontario
Bradley Galloway  Research and Intervention Specialist, Organization for the Prevention of Violence
Shalini Konanur  Executive Director and Lawyer, South Asian Legal Clinic of Ontario

8:45 a.m.


The Chair Liberal Anthony Housefather

I call the meeting to order.

Good morning, everyone. Welcome to this meeting of the Standing Committee on Justice and Human Rights, as we continue with our study of online hate.

It's a great pleasure to welcome Mr. Rioux to this committee for the first time.

Welcome, Mr. Rioux.

8:45 a.m.


Jean Rioux Liberal Saint-Jean, QC

Thank you.

8:45 a.m.


The Chair Liberal Anthony Housefather

It's also a great pleasure to welcome all the witnesses here today.

We have a very distinguished panel, and quite a lot of witnesses in this first panel.

I'm going to start with those here by video conference. We have both the Windsor Islamic Council and the Windsor Islamic Association, represented by Ms. Lina Chaker and Mr. Sinan Yasarlar.

Are you both hearing me?

8:45 a.m.

Lina Chaker Spokesperson, Windsor Islamic Council

Yes, I can hear you.

8:45 a.m.

Sinan Yasarlar Public Relations Director, Windsor Islamic Association

Yes, we are hearing you.

8:45 a.m.


The Chair Liberal Anthony Housefather


You're going to testify first, after I introduce all the other witnesses, because we do not want to lose the video conference connection.

In the room with us, as an individual, we have Ms. Elizabeth Moore, educator and advisory board member of the Canadian Anti-Hate Network and Parents for Peace. Welcome.

From the Alberta Muslim Public Affairs Council, we have Mr. Faisal Khan Suri, president, and Mr. Mohammed Hussain, vice-president of outreach. Welcome.

From the Friends of Simon Wiesenthal Center for Holocaust Studies, we have Mr. Avi Benlolo, president and CEO. Welcome.

The rules are eight minutes per group.

We're going to start with the Windsor Islamic Council and the Windsor Islamic Association. I understand they are splitting their time.

Please go ahead. The floor is yours.

8:45 a.m.

Public Relations Director, Windsor Islamic Association

Sinan Yasarlar

Good morning, honourable MPs. We would like to thank the members of Parliament for allowing us to give our perspectives on online hate on behalf of the Windsor Islamic Council and the Windsor Islamic Association, of which I am the public relations director. Lina Chaker is from the Windsor Islamic Council.

Good morning to everyone.

The problem is that online hate creates real victims. Internet use is growing year by year and will continue to do so in the generations to come. Just as we have regulated other technologies, including television, radio, movies, magazines, and other communication platforms, we cannot ignore the Internet. The harm of online conversations transcends the digital world. We don't need to cite violent events or even the most recent attack in New Zealand to prove that online hate has real-world consequences.

Our community centres are filled with troubled youth facing negative peer pressure, social anxiety, and mental health issues. The overall international Muslim community has been shaken twice over the past couple of years by terrorism, just as other communities have been. These terrorists clearly built their Islamic knowledge from misinformed online sources that spew hate.

We have our own Canadian example from January 29, 2017, in Quebec, with evidence that motivation was driven by online hate sites.

To prevent and respond to online hate, we believe there are three important actions the Government of Canada can take.

Number one is to set strict standards and guidelines for social media companies to self-regulate their content. Number two is to more readily enforce legislation that criminalizes both online and off-line hate speech. Number three is to increase awareness about public reporting and responding to this type of behaviour online.

The first action is to impose strict self-regulation standards and penalties for social media companies. Other countries have developed strategies to impose regulations and protocols for social media companies to self-regulate the content of hate speech on their sites. For example, Australia and Germany now penalize social media sites that fail to remove hateful content with financial charges or even imprisonment.

Alternatively, some countries such as Sri Lanka...[Technical difficulty—Editor] ...social media to stop the spread of misinformation and hate. Canada should consider policies of the kind that have been adopted in Australia, Germany and even Sri Lanka to enforce the removal of hateful content and combat terrorism.

We recognize that there may be difficulties in regulating online content. However, our country currently regulates other forms of online content such as child pornography, and anti-spam legislation does exist.

Similar to this, there has to be an effort to combat online hate. For the individuals who try to bypass such regulations, we should combat that by not allowing companies to provide individuals with VPNs or other IP-blocking programs.

Number two is to introduce effective legislation to penalize those who incite hatred. In addition to penalizing social media companies for not taking down hateful content, we must penalize Canadians who spread hateful messages, whether online or off-line. Although we currently have tools to do so, such as section 319 of the Criminal Code, our community feels that they are not adequately utilized and thus cannot encompass online hate crimes.

In fact, we had an unfortunate local example here in Windsor, Ontario. An individual was spraying graffiti all over the city, on the posts and bus stop signs, inciting hatred and harm to Muslims specifically.

These acts weren't recognized as hate crimes under section 319, which makes our community pessimistic about the prospects of its encompassing online hate speech. This individual received only a minor charge, and no other charges were pressed against him.

Recognizing this, we believe that section 13 of the Human Rights Act was a vital piece of legislation that was dedicated to online speech. However, it can be amended or restructured to be more effective. We recognize that section 13 was not heavily utilized before it was repealed. However, we do not find this to be a convincing reason not to reintroduce it.

Online hate can be responsible for other types of actions in our society, including verbal attacks against women with hijabs, trying to do harm to people of a visible minority and inciting the physical confrontations that have happened in several supermarkets, shopping areas and malls in our country.

Thus, we are not limiting the discussion of section 13, but hope that any legislation introduced to combat hate will readily be enforced for the betterment of our multicultural Canadian society. The frequency with which a piece of legislation is used should not be the basis on which we decide whether it exists or not. Rather, it should highlight to us that most people still do not know what to do when faced with online hate.

We recommend that there be more education on the consequences of promoting hate. While recognizing that education tends to be a provincial mandate, it is our belief that the Government of Canada can play a vital role. This leads us into our third and final point: educating the public on how to report incidents of hate.

8:50 a.m.

Spokesperson, Windsor Islamic Council

Lina Chaker

My colleague went over the first two action points that we believe the Government of Canada can take by introducing regulations for social media companies and legislation to regulate those who are spreading online hate. I will cover the third point, which is that we believe that victims of online hate need to be more educated so that they know what to do when they are faced with it.

We grew up with teachers telling us how to respond to bullying on the playground. That's not really effective for the online world. They taught us that sticks and stones can break your bones, but words don't really hurt you. Unfortunately, in today's world, we learn that words can not only hurt you psychologically but can also lead to criminal activity and even terrorism.

I want you to think about the last time you tried to report an online hateful comment. Assuming that the process for reporting the post was user-friendly and noticeable—that is, you actually saw the button that says “report”—where did it lead? Did you have to personally follow up and check to see if it was taken down? How many times? Did you have to forward it to your friends and convince them to also try to report it? How many of us continue to experience and see online hate, despite the continued reports?

We have a couple of recommendations for the government to enforce so that social media companies will better create mechanisms for us to be able to help them regulate the content.

The first is to make it easier to report hateful content. Currently, for example, Facebook doesn't have a “report” button; it has a “give feedback” button. It's not as visible.

Second, hasten the time between the reporting of a post and its examination. As we know, time moves much faster in the virtual space than it does off-line. These processes should be receptive to that.

Third, social media companies should provide the person who reported the harm with an update and provide them with information about other resources, including law enforcement, and such resources as the human rights commission.

Fourth, social media companies should examine software and other algorithms that direct users to violent content and share that with government authorities so that the government can also help find and eliminate violent extremist material.

Finally, social media companies should produce tools that help us, and help users, differentiate between credible information and fake news.

As we have been talking about, there are two kinds of content online that can lead to a lot of violence. One is actual hate and the other is misinformation. We believe the Government of Canada can support and fund community initiatives of digital media literacy to help youth and adults alike be able to differentiate between misinformation and credible information as a method of responding to hate. There is a variety of programming that successfully teaches both generations how to differentiate between real and fake news, making them less susceptible to being influenced by hateful messages. This is essential, given the industry of hate and fake news. Moreover, teaching media literacy skills empowers youth to control their own narrative of their identity and to respond to the negative messages with positive ones.

In conclusion, as Prime Minister Jacinda Ardern said, freedom of speech is not advocating murder, and it's also not spreading false or hateful content. We thank the Government of Canada for considering the important consequences of online hate and applaud the right honourable Prime Minister Justin Trudeau for signing the Christchurch call in Paris recently, where he took the effort to tackle this issue of violent online content. However, there is more to be done.

To summarize, we urge the government to combat online hate in three ways: first, by setting strict standards and guidelines for social media companies to regulate their content; second, by more readily enforcing hate speech legislation, be it online or off-line; and last, by increasing the public's awareness about how to report and respond to online hate.

Thank you for your time and consideration.

8:55 a.m.


The Chair Liberal Anthony Housefather

Thank you very much.

8:55 a.m.

A voice

May I just—

8:55 a.m.


The Chair Liberal Anthony Housefather

You guys have gone a little bit beyond your time, so I will go to the next speaker.

We'll now follow the order that's on the agenda.

Ms. Moore, the floor is yours.

8:55 a.m.

Elizabeth Moore Educator and Advisory Board Member, Canadian Anti-Hate Network and Parents for Peace, As an Individual

Thank you.

I want to start by thanking the committee for the opportunity to speak today. It is certainly a privilege that I never thought would be afforded to me as a former extremist. I really appreciate the opportunity to be here.

I would like to provide a bit more context about who I am and how my views about online hate have been informed.

In the early nineties, I was a member of and spokesperson for the extremist group the Heritage Front, which at the time was the largest hate group in Canada. They acted as an umbrella organization for the racist right at the time. They brought in the Church of the Creator and the KKK, among other organizations. Most troubling, they were trying to do what the so-called alt-right is trying to do today, which is to make inroads into the mainstream and to try to have a veneer of legitimacy on top of the hatred.

I should add that Wolfgang Droege, who was the leader of the Heritage Front, was convicted of many offences prior to starting the organization, including air piracy, the attempt to overthrow a friendly nation and drug offences, which I believe included possession. He managed to influence people, despite this veneer of wanting to be more mainstream and trying to make connections with the Reform Party. His followers committed a wide array of offences of their own, which included hate crimes offences, assault, and targeted and unrelenting harassment of anti-racists.

I feel very fortunate that I was able to leave that terrible world of hatred behind. Since 1995, I've been working with non-profits, educators and law enforcement to raise awareness about the dangers of hate groups. I'm currently on the advisory boards for the Canadian Anti-Hate Network and Parents for Peace, which is an American organization that provides support for families of radicalized individuals.

Back in the nineties, when I was an extremist myself, I quite literally communicated hate by telephone. Also, prior to leaving, I helped prepare materials for the Internet. They were back-issue articles from the Heritage Front's magazine and they ended up posted on what would become Freedom-Site, which was one of Canada's first white supremacist websites. That website, run by Marc Lemire, was found in 2006 to contain material that violated section 13 of the Human Rights Act.

I feel fortunate that I never personally got in trouble with the law, but I do realize that it was a very real possibility. I understand that a sample size of one has limited value, but I should say that section 13 did moderate my behaviour. When I was working on the hotline, I was very aware of the fact that friends who were working on hotlines very similar to the Heritage Front hotlines were facing charges under section 13, and it made me more careful. I did not engage in or indulge the unrestrained hatred that I certainly felt inside. I do understand, with the benefit of hindsight, that what I was communicating was still hateful, but it was definitely not as hateful as it would have been in the absence of such legislation.

The methods that are used today to communicate hatred are definitely more sophisticated and exceptionally more accessible than what we had available to us in the nineties. As an analog kid, I have to say that it frightens me that young people today could have their life trajectories altered by watching one YouTube video or interacting with one Twitter account or one Reddit account.

Racist extremists have always networked with like-minded individuals across borders in other countries, but we now have an environment where the transmission of hate knows no borders or language barriers—or even time differences, frankly.

To fully understand what is at stake, I think it's imperative to consider not just the words and images that are put in front of you but the emotions that created those words. Hatred is intoxicating, it's all-consuming and, in my opinion, it's a contagion that when embraced crowds out not only other moderating emotions but also any sense of reason and connection to one's fellow human beings.

I want to read a quote from R. v. Keegstra from the Supreme Court in 1990. Hatred is:

emotion of an intense and extreme nature that is clearly associated with vilification and detestation...

Hatred...against identifiable groups...thrives on insensitivity, bigotry and destruction of both the target group and of the values of our society. Hatred...is...an emotion that, if exercised against members of an identifiable group, implies that those individuals are to be despised, scorned, denied respect and made subject to ill-treatment on the basis of group affiliation.

...hate propaganda legislation and trials are a means by which the values beneficial to a free and democratic society can be publicized.

With more people being exposed to hateful ideas and emotions than ever before through social media and online content, and with the very troubling rise of hate-motivated crime in Canada, I'm quite heartened that the government is revisiting the inclusion of incitement of hatred in either the Canadian Human Rights Act or the Criminal Code.

The introduction of Canada's digital charter shows promise in developing a thoughtful and measured template for how Canadians can expect to be treated as digital innovation continues to expand. However, I wish to challenge the committee to consider that the government's responsibility to Canadians should not end with the adoption of these measures. Unless effective and ongoing training is provided to everyone responsible for implementing these laws, including judges, Crown prosecutors and police, victims will continue to feel that they are not heard and that justice remains elusive.

As an example, just last week I heard from a member of my local community who wanted to report anti-Semitic graffiti that they found. The responding officer was not at all sympathetic, and because the swastika that was found was misshapen, he wrote it off as a collection of L's. That is not a responsible response to the community.

Speaking as a former extremist and as a woman and a mother who is raising a child in an interfaith Jewish-Christian family, I think Canadians urgently need you to respond boldly and to lead us into an era in which we can expect that our children will be treated with respect and dignity, both online and in the real world. I think we also have a responsibility to the international community to do what we can to limit hatred that may impact identifiable groups in other nations because, as I said, borders mean nothing in the digital world. It is unfortunately no accident that the Christchurch shooter had Alexandre Bissonnette's name on one of his weapons.

The endgame of hatred is always violence and death, and that game starts with incitement, words and images that we find on the Internet.

The introduction of legislation to address the early stages in the progression of hate is both right and necessary. Canada's values of peace, diversity and inclusion are being eroded by the unrelenting promotion and communication of hate online. It is time, if not past time, to send a strong message to racist extremists that their hatred and targeting of identifiable groups is not just unacceptable but unlawful.

As I stated earlier, I have experienced first-hand the moderating effects of such laws and regulations. I think it's time that we do the right thing to rein extremists in before anyone gets hurt or loses their life.

I would add, if I have a moment, very briefly in response to what the earlier speakers had mentioned, that when it comes to reporting online hate, I think platforms need to have more transparency when they respond to people. I have experienced myself being targeted as a former extremist online and receiving hatred, and when I report it, if I get any response back at all, it is, “We have found that they did not violate terms of service” or “We have found that they have violated terms of service”, but there's no additional information to say in what ways they've precisely violated terms of service. There is no mention of what measures have been taken, whether the account has been suspended or whether that suspension is temporary or permanent. I think online platforms owe it to the people who are victims to have more transparency in what they are doing and in saying whether this account is going to be monitored, going forward, for any additional infractions.

Thank you very much for your time today.

9:05 a.m.


The Chair Liberal Anthony Housefather

Thank you very much.

We'll move to the Alberta Muslim Public Affairs Council.

9:05 a.m.

Faisal Khan Suri President, Alberta Muslim Public Affairs Council

Thank you, Mr. Chair, for having us.

My name is Faisal Khan Suri. I'm the president of the Alberta Muslim Public Affairs Council, or AMPAC. I'm joined here by my colleague Mohammed Hussain, who is VP of outreach.

Today's topic of discussion is not only an important one but an absolutely necessary one. With all the events we are seeing in Canada, throughout the world, and especially within Alberta, it definitely warrants our being here, collaborating on this effort and sharing our thoughts. Thank you again to this committee for inviting us and allowing us to share our thoughts.

I'll just give you a snapshot of AMPAC.

We're dedicated to championing civic engagement and anti-racism efforts within the province of Alberta. We focus on advocacy work, implementing strategies around media relations, community bridge-building, education, policy development and cultural sensitivity training.

AMPAC envisions a province where deep equality exists for all Albertans, including Muslims, in a political and social landscape that is respectful and harmonious for people of all faiths and backgrounds.

To get to the gist of things, to state it quite mildly, online hate influences real-life hate. I could be quite blunt about this. Online hate is an enabler, a precursor and a deep contributor to not just real-life hate but also to murder.

We've seen a lot of recent tragedies happen across the world. In January 2017, the Quebec City mosque killer, Alexandre Bissonnette, gunned down six Muslim men in execution style when he came into the mosque with two guns and fired more than 800 rounds. The evidence from Bissonnette's computer showed he repetitively sought content about anti-immigrant, alt-right and conservative commentators; mass murderers; U.S. President Donald Trump; and the arrival of Muslim immigrants in Quebec.

In October 2018, white nationalist Robert Bowers murdered 11 people and injured seven more at the shooting inside the Tree of Life synagogue in Pittsburgh. This was an attack that appeared to have been motivated by anti-Semitism and inspired by his extensive involvement in white supremacy and alt-right online networks.

In March 2019, a lone gunman armed with semi-automatic weapons burst into the mosque in Christchurch, New Zealand. This white nationalist, in what was a gruesome terrorist attack, was broadcasting live on Facebook and Twitter, and 51 worshippers were killed.

There are so many more examples we could provide that show the accessibility of online hate and how it's affecting the real-life hate we are witnessing today.

I think it's absolutely critical, if not fundamental, to embark on such studies as this and to look a lot further into this issue with a deep thought process in place.

Online hate is a key factor in enforcing hate in all forms—Islamophobia, anti-Semitism, radicalization, violence, extremism and potentially death. This is why we must take immediate action to work on prevention, monitoring and enforcement.

In order to combat online hate, AMPAC has come up with three recommendations. Number one is to employ artificial intelligence on online materials to identify any form of hate speech. Number two is to reopen the Canadian Human Rights Act for a comprehensive review. Number three is to have transparency and accountability for social media platforms.

Allow me to delve a little further into the first recommendation, employing artificial intelligence on online materials to identify any form of hate speech.

Right-wing extremist groups are using social media platforms such as Facebook and Twitter to create and promote their groups, share messages, assemble people and more. The question is, how can we remove their access, block IP addresses or even discover these types of groups? There are some tools being used today, such as text analysis, to combat online hate, but these groups are becoming much smarter, and they're using images such as JPEGs to help deter that monitoring.

While we are happy to see that the new digital charter incorporates elements of an approach that involves industry, we believe that the government must itself fund innovative technological solutions to track online hate and aid in developing artificial intelligence that can combat it.

The AI technology needs to be comprehensive so as to encompass text analysis and languages, and so as to cover all forms of social media that are used to facilitate online hate. We believe that there is space in Canada, especially within Alberta, to build that capacity.

Our second recommendation, to reopen the Canadian Human Rights Act for a comprehensive review, is quite near and dear to our hearts.

The moment freedom of speech or freedom of expression puts another group, organization or individual in any form of danger, it can no longer be justified as freedom of speech or expression. This is now freedom of hate, which has no place in the Canadian Charter of Rights and Freedoms, nor in any pluralistic society that we live in. It has been far too long since the Canadian Human Rights Act has been revisited.

Keep in mind the following: For the last few years, hate has led to the murder of innocent civilians. Also keep in mind the importance of reviewing how online access and other media have been used to propel such hate and extremist perceptions.

AMPAC recommends not simply revisiting and reviving section 13, but reviewing the Canadian Human Rights Act in its entirety. The review itself needs to consider facts on the rise of Islamophobia, anti-Semitism, xenophobia and all other forms of hate. Questions need to be asked in terms of what determines hate and how we can bring enforcement into the picture with respect to the Charter of Rights and Freedoms.

Part of our third recommendation that we talked about is transparency and accountability for social media platforms. While we're pleased with the signing of the digital charter, we think that there is a lot more to be done in terms of regulating social media companies. We recognize that social media platforms have been trying to curtail hate speech through reporting options, but there is a lack of accountability in what follows that reporting, which in turn minimizes any sort of enforcement. Social media platforms such as Facebook, Twitter and YouTube must be held accountable by government authorities for reporting the data and for any follow-up measures.

We're quite aware of the challenges that such regulations can bring to freedom of expression related to this recommendation, but we believe in a statement that New Zealand's Prime Minister Jacinda Ardern gave. Her insistence on controlling the amplification of online hate is not about curbing freedom of expression. I will quote some of her words. She says, “...that right does not include the freedom to broadcast mass murder.” She also says, “This is not about undermining or limiting freedom of speech. It is about these companies and how they operate.”

Working alongside social media companies, holding them accountable, and imposing some form of financial repercussions or other necessary measures are part of this recommendation. We hope to see a requirement for online platforms to be transparent in their reporting come to light with this initiative.

To end, I'll go back to the key factors that are priorities for us: to look at prevention, monitoring and enforcement. Today the recommendations that we've talked about—implementing a comprehensive artificial intelligence tool that spans major social media platforms, implementing language-text-image analysis, reopening the Canadian Human Rights Act for an extensive review, reviving section 13 and holding social media platforms accountable for sharing data—are just the initial steps that we believe can help to curb online hate.

With a 600% increase in the amount of intolerant hate speech in social media posts from November 2015 to November 2016, I can only try to fathom or understand where those statistics are today.

Additionally, with the clear evidence of online hate, including the horrific killing of innocent people, there is absolutely no better time than the present to enact immediate, government-legislated change. We cannot allow hate to inflate any further. We most certainly cannot allow any more lives to be taken.

I'd like to end this by echoing the statement of Prime Minister Justin Trudeau: “Canadians expect us to keep them safe, whether it’s in real life or online....”

Thank you so much.

9:15 a.m.


The Chair Liberal Anthony Housefather

Thank you very much.

Now we'll go to the Friends of Simon Wiesenthal Center for Holocaust Studies.

Go ahead, Mr. Benlolo.

9:15 a.m.

Avi Benlolo President and Chief Executive Officer, Friends of Simon Wiesenthal Center for Holocaust Studies

Good morning, everyone. Thank you very much for having us here today and for actually doing this. This is very important work that you're all doing.

I'd like to begin my statement by first telling you a little bit about our institution. We're an international human rights organization. We have a network of offices worldwide, monitoring and responding to anti-Semitism, fighting hate and discrimination and promoting human rights. The organization has status with the United Nations, UNESCO, the OSCE and many other notable global organizations. Additionally, the Simon Wiesenthal Center has won Academy Awards and developed museums. We are currently building a human rights museum in Jerusalem.

In Canada, we have won the Canadian Race Relations Foundation's award for our tolerance training workshops on the Tour for Humanity and in the classroom. We educate about 50,000 students each year, including those in law enforcement, faith leaders and teachers.

The organization has been tracking online hate for more than two decades. Twenty years ago, online hate was primarily found on websites. They were fairly easy to track, document and, in some cases, bring down through the help of Internet service providers. In fact, we used to produce an annual report called “Digital Hate” in the early days.

Section 13 of the Canadian Human Rights Act allowed us to bring down several online hate sites simply by bringing them to the attention of the ISP. Our ability to sanction hate sites became limited when section 13 was repealed in 2013. We lost an invaluable tool that provided a red line for the public. If that tool was in existence today, it's unlikely that anti-Semitic websites based in Canada, like the Canadian Association for Free Expression or Your Ward News and others, would so easily find a home on Canadian servers.

The advent of social networking sites like Facebook, Instagram, Twitter and the like introduced a tsunami of hate into the social sphere. According to one study, roughly 4.2 million anti-Semitic tweets were posted and reposted on Twitter between January 2017 and January 2018. By contrast, according to Statistics Canada's 2017 hate crime report, there were 364 police-reported cyber-hate crimes in Canada between 2010 and 2017. Of those, 14% were aimed at the Jewish community.

I'm telling you this because this number is actually really low. You might be surprised to hear that, but I think it's low, given a recent Leger Marketing poll showing that 60% of Canadians report seeing hate speech on social media. That would mean something like 20 million Canadians have witnessed hate online.

Moreover, through our own polling, the Friends of Simon Wiesenthal Center found that on average across the country, 15% of Canadians hold anti-Semitic attitudes. That represents about five million Canadians. That's kind of the low end of that threshold; in Quebec, that number surges to an incomprehensible 27%.

Social networking platforms must be held to account for allowing online hate to proliferate. We note that these platforms have begun banning white supremacist and extreme terror groups. This is certainly one step forward. However, since they are operating in Canada, we must demand that platforms conform to our Criminal Code, specifically section 318 on advocating genocide, subsection 319(1) on publicly inciting hatred, and subsection 319(2) on wilfully promoting hatred.

It's possible that Canada requires a CRTC-like office with a mandate to regulate online content and specifically ensure that online hate is curtailed. Indeed, one CRTC mandate is to “protect” Canadians. The CRTC says, “We engage in activities that enhance the safety and interests of Canadians by promoting compliance with and enforcement of its regulations, including those relating to unsolicited communications.” It's in their mandate.

That appears to be consistent with our interest here to limit the proliferation of hate online in accordance with Canadian law.

The Christchurch Call to Action to eliminate terrorist and violent extremist content online is a positive step forward. However, it must be implemented by Canada with concrete tools. Friends of Simon Wiesenthal Center recommends the following actions that could help stem the promulgation of hateful acts against all communities through online platforms.

One, reinstitute section 13 of the Canadian Human Rights Act to make it illegal to utilize communications platforms to discriminate against a person and/or an identifiable group.

Two, the section should also make platforms and service providers liable for ensuring they are not hosting hate websites and for moderating their online social networking feeds. Fines should be imposed and criminal sanctions placed on violators.

Three, expand Statistics Canada's mandate to collect and share hate crime statistics from across the country. At the moment, Canadian policy-makers and organizations are mostly guessing. This is where I get back to those police numbers. We really are guessing at the extent of hate online and beyond. We need better information collected across the country to make better policy.

On that point, I held a hate crimes conference last fall and I invited Statistics Canada. It was the first time they attended a hate crimes conference with police units from across the country. I was shocked that this hadn't happened before.

Fourth is to improve police capacity and ability to track and respond to hate crime. Through our research, we discovered an inconsistency of hate crime units across the country. Some cities lack the resources to implement and deploy hate crime investigators, as you just heard. Last fall, we initiated the hate crimes conference. I'm repeating myself.

This country lacks a best-practices model for policing hate crimes, understanding hate crimes and the law around them, and collecting and delivering that information to Statistics Canada, which will in turn deliver it to the policy-makers.

Number five is to improve communication between the provincial attorneys general, as well as police, when it comes to investigating and prosecuting hate crime and hate speech offenders. This will require additional training for prosecutors and police officers so that victims of hate speech crimes feel their needs are addressed.

We have specific examples, which I can get into later, of the mishandling in how prosecutors are working with the police and the disjointed communication between them in finding and prosecuting hate crime offenders.

Number six is education. This is, for us institutionally, one of the most important elements. Education on responsible usage of social networking sites and websites is required now more than ever. We dedicate literally millions of dollars a year to deploying our educational programs to bring that to students. We have, for example, cyber-hate and cyber-bullying workshops, where we aim to educate students.

Take visiting a website about the Holocaust as one example. How do you know which website is legitimate? How do you know which one is fake? Further education needs to happen in schools across the country so that students, the young people, the next generation, will understand what hate speech and hate crime really are and be able to differentiate.

Finding a balance between protecting free speech and protecting victims of hate is essential. Our freedom and democracy must be protected. At the same time, we must recognize that there are victimized groups that need protection too, and leaving the issue to the marketplace will bring about unpredictable consequences.

Even The Globe and Mail admitted in an editorial last week that times have changed since the Supreme Court of Canada struck down a law in 1992 that made it a crime to “spread false news”. The Globe says, “Much has changed since then. Mr. Zundel printed and handed out crude pamphlets”, whereas today the same hateful message can be viewed by millions of people at once and inspire violent action.

We know this. The recent terror attacks in New Zealand, Sri Lanka, San Diego, Pittsburgh, etc., must motivate government and civil society to take immediate action. Terrorism can be prevented with the right placement of instruments, instruments that include a combination of enhanced legal measures, advanced monitoring and prevention, increased resources for law enforcement and hate crime units, and broader educational programs that promote tolerance, compassion and good citizenship.

We hope the committee makes recommendations for immediate amendments to the Canadian Human Rights Act to end incitement of hatred on online platforms.

Thank you.

9:25 a.m.


The Chair Liberal Anthony Housefather

Thank you very much.

We'll now move to questions.

Go ahead, Mr. Cooper.

9:25 a.m.


Michael Cooper Conservative St. Albert—Edmonton, AB

Thank you, Mr. Chair.

First of all, Mr. Suri, I take great umbrage with your defamatory comments to try to link conservatism with violent and extremist attacks. They have no foundation. They're defamatory and they diminish your credibility as a witness.

Let me, Mr. Chair, read into the record the statement of Brenton Tarrant, who is responsible for the Christchurch massacre. He left a 74-page manifesto in which he stated, “conservatism is corporatism in disguise, I want no part of it”, and “The nation with the closest political and social values to my own is the People's Republic of China.”

I certainly wouldn't attempt to link Bernie Sanders to the individual who shot up Republican members of Congress and nearly killed Congressman Scalise, so you should be ashamed.

Now, with respect—

9:30 a.m.


Iqra Khalid Liberal Mississauga—Erin Mills, ON

I have a point of order, Mr. Chair—

9:30 a.m.


Randy Boissonnault Liberal Edmonton Centre, AB

On a point of order, Mr. Chair, that is unacceptable behaviour of a member of Parliament to witnesses at our committee. I have the speech in front of me, Mr. Chair, and there is nothing linking conservatism to that movement. If alt-right is limited to conservatism, that's conservatism's issue—

9:30 a.m.


The Chair Liberal Anthony Housefather

Guys, guys—

9:30 a.m.


Michael Cooper Conservative St. Albert—Edmonton, AB

He said conservatism and conservative commentators.

9:30 a.m.


Tracey Ramsey NDP Essex, ON

Mr. Chair, you cannot have a member of this committee calling for witnesses to be ashamed. That's unacceptable.

9:30 a.m.


The Chair Liberal Anthony Housefather

Again, I certainly don't agree—