Evidence of meeting #148 for Justice and Human Rights in the 42nd Parliament, 1st Session. (The original version is on Parliament’s site, as are the minutes.)

A recording is available from Parliament.

Also speaking

Mohamed Labidi  Former President, Centre culturel islamique de Québec
Jasmin Zine  Professor, Sociology and Muslim Studies Option, Wilfrid Laurier University, As an Individual
Bernie M. Farber  Chair, Canadian Anti-Hate Network
Mustafa Farooq  Executive Director, National Council of Canadian Muslims
Seifeddine Essid  Social Media Officer, Centre culturel islamique de Québec
Robert Dennis  Assistant Professor, Department of Religious Studies, University of Prince Edward Island, As an Individual
Leslie Rosenblood  Policy Advisor, Canadian Secular Alliance
Andrew P.W. Bennett  Director, Cardus Religious Freedom Institute
Greg Oliver  President, Canadian Secular Alliance

8:50 a.m.

Conservative

Michael Cooper Conservative St. Albert—Edmonton, AB

I'll call the committee to order.

We are continuing our study on online hate. I'd like to welcome all of the witnesses.

We are going to start with the witnesses who are joining us via video conference: Seifeddine Essid and Mohamed Labidi from the Centre culturel islamique de Québec.

You have eight minutes.

8:50 a.m.

Mohamed Labidi Former President, Centre culturel islamique de Québec

Good morning, honourable members.

First, we would like to thank the House of Commons Standing Committee on Justice and Human Rights for inviting us to appear as witnesses. This work is very important for the future of the country, and the Centre culturel islamique de Québec is honoured to participate in it.

The virtual world is becoming increasingly significant when it comes to transactions and communications. Just as the market is moving to the virtual world, so are social and media interactions.

Although hate speech has always existed, the shift to virtual communication, particularly through social media, has exacerbated the situation as a result of two factors: easy access to an audience and the perception of anonymity online.

Unfortunately, we're witnessing a form of impunity online. Some statements are considered serious in the physical world, but are trivialized in the virtual world. In addition, federal and provincial public law enforcement institutions are powerless to deal with the serious statements that benefit from some form of immunity in the virtual world. A useful exercise would be to introduce into the real world some of the rampant hate speech that goes unpunished in the virtual world.

I'll let you look at some examples of hate messages found on social media. This is only a tiny sample of what's unfolding before our eyes. The goal is simply to give you an idea of what's happening.

In the virtual world, there are two sources of hate speech: journalistic and media entities represented by writers and columnists, and individuals who act openly or under the guise of partial anonymity.

I'll now talk about the impact of hate speech in the virtual world. Whether the hate speech concerns calls for the murder and extermination of minorities, the glorification of hate crimes, or direct and indirect threats, Canadian society is suffering tremendous harm. The impact of hate speech in the virtual world can be summarized as follows. It undermines the well-being and sense of security of victims; undermines the sense of belonging of victims; marginalizes groups of people; leads to widespread radicalization of consumers of hate speech; and leads to the risk that sympathizers of hate speech will take action.

With regard to the last point, it's worthwhile to look back at what happened at the Grande Mosquée de Québec on January 29, 2017. The killer, although he had few friends, became radicalized as a result of his Internet use. This theory is confirmed in the judgment. Lastly, we can't separate the virtual world from the physical world, hence the need to update the legislation to counter the imbalance.

The current legislation is clearly insufficient and not enough of a deterrent. As a result, we must strengthen our legislation to protect all Canadians from rampant online violence. To address this issue, we're submitting the following three recommendations.

First, we must legislate against hate speech in the virtual world. Freedom of expression must be protected. However, it mustn't undermine public order. As a result, criminal consequences must be established to stop the spread of hate speech.

Second, we must give law enforcement institutions the necessary tools. Law enforcement must have not only the warrants it needs, but also the resources to confront this scourge. We can see that crime is shifting to the virtual world. Our law enforcement agencies must be given the tools to prevent and fight crime in all its forms.

Third, we must make media platforms accountable. The Canadian Radio-television and Telecommunications Commission should have an official mandate to oversee media players and impose consequences for non-compliance. Media platforms must report to the authorities any hateful messages or messages that incite violence, and log or delete hateful and violent content.

Thank you for your time.

8:50 a.m.

Conservative

Michael Cooper Conservative St. Albert—Edmonton, AB

Thank you very much for that.

We will now move to Professor Jasmin Zine, professor of sociology and Muslim studies option, Wilfrid Laurier University.

Welcome. You have eight minutes.

8:55 a.m.

Dr. Jasmin Zine Professor, Sociology and Muslim Studies Option, Wilfrid Laurier University, As an Individual

Thank you to the chair and members of the committee for this opportunity to contribute to the parliamentary study of online hate. I am a professor of sociology and Muslim studies at Wilfrid Laurier University. I specialize in anti-racism and Islamophobia studies.

Currently, I'm conducting a study funded by the Social Sciences and Humanities Research Council on mapping the Canadian Islamophobia industry, along with the National Council of Canadian Muslims. By the term “Islamophobia industry”, I am referring to a constellation of individuals, groups, think tanks, politicians, academics, institutions, grassroots organizations, media outlets and donors who manufacture, produce, distribute and attempt to normalize fear, bigotry and hatred toward Islam and Muslims.

The research I am doing examines and maps the political, ideological, institutional and economic networks that foment Islamophobic fear and moral panic in Canada. This is essentially an industry of hate, which operates through a variety of tacit and overt means and intersects within a broad, interconnected transnational network. The web of associations in this network connects Canadian white supremacist and white nationalist groups—who, according to Barbara Perry’s research, have grown in number from upward of 100 in 2015 to almost 300 in 2018—with a variety of other groups, organizations and individuals that form the soft power behind this industry and the dog-whistle extremist rhetoric often guised in liberal discourses about upholding free speech, preserving Judeo-Christian democracy and safeguarding Canadian values from the threat of Muslim infiltration. They use online platforms to purvey their ideologies of hate, racism and xenophobia, and connect with other alt-right groups that are rooted in neo-fascism, misogyny, homophobia, transphobia and other forms of bigotry.

This is conceived of as an industry because there are donors who provide financing for the activities of these groups. According to a report by the Council on American-Islamic Relations and the University of California at Berkeley, in the United States a small, tightly networked group of donors, organizations and misinformation experts circulates some $200 million in funding to advance certain political interests. A recent report released by CAIR in the United States, entitled “Hijacked by Hate”, expands this funding base to include philanthropic and charitable donors contributing almost $1.5 billion to 39 Islamophobia network groups.

There is no doubt that this funding is being used to support, maintain and proliferate the online reach of the network of organizations to whom this money is being filtered on such a large scale. Many of these U.S.-based groups have interests tied to Canadian counterparts. We've seen some evidence of Canadian organizations that promote Islamophobic agendas being funded by U.S. donors, which increases the base of their ideological support and opportunities for political mobilization. For example, the anti-Muslim think tank Middle East Forum in the United States—headed by Daniel Pipes, a key player in the Islamophobia industry—provided funding to a conference in Canada for a group called Canadians for the Rule of Law. I attended this conference with some of my students, and was physically assaulted and forcibly removed for asking Christine Douglass-Williams about the kind of Islamophobic rhetoric that had her removed from the Canadian Race Relations Foundation's board.

Further, there are global ties and transnationally linked spheres of influence that circulate with impunity both online and within the public sphere promoting widespread hate and bigotry. A recent report by the U.K. group Faith Matters investigated Rebel Media, described as a platform for the globalization of hate that promotes white nationalism and Islamophobic fearmongering to an audience with over 1.5 million subscribers on YouTube. Rebel Media also received $2 million of funding from the Middle East Forum.

We also know that online forums are the primary site where radicalization is taking place. Even online gaming sites have become spaces for organizing and role-playing racist forms of violence. Online hate propagation creates an ideological breeding ground to inspire terrorists like Alexandre Bissonnette, responsible for the Quebec massacre, as well as the New Zealand shooter and Anders Breivik in Norway.

I provide this context from my research as a preamble to contextualize the formation and scope of contemporary industries of hate, bigotry and Islamophobia that operate online and in the public sphere. I’d like to put forth two areas of consideration for the committee today. The first is the hate speech versus free speech controversy. The second is a set of recommendations from the European Commission against Racism and Intolerance of the Council of Europe, specifically its “General Policy Recommendation No. 15 on Combating Hate Speech”, which was adopted in December 2015.

First I'll speak to the hate speech versus free speech controversy.

Section 13 of the Canadian Human Rights Act was challenged on the basis that its provisions, which prohibited speech inciting hatred of people based on race, religion, sexual orientation or other protected characteristics, would violate freedom of speech.

Free speech is not an unbridled right, so it is important to consider its limits. It is vital to differentiate between the legitimate dissent that may include unpopular or controversial views, and speech acts that incite hatred and create poisoned and threatening environments. This critical discernment is what these politically fraught times require and is the work that must be done to balance free speech as a limited right with the protection of human rights, dignity and equity. Only then will we be able to uphold the greater good for all.

Sacrificing human rights on the altar of free speech has become a strategy in the alt-right tool kit of bigotry. In the midst of growing concerns about neo-fascism, white supremacy and white nationalism, alt-right groups are weaponizing free speech and using it as a rhetorical prop in their campaigns of hate and ideological intimidation. These groups engage in tactics such as vandalism, harassment and online doxing under the cover of a free speech alibi. Now, newly emboldened neo-fascist groups are coming out from the shadows of Internet chat rooms and entering the public sphere.

As a flagship case in Canada, I want to remind us of James Keegstra, the Alberta high school teacher who communicated hateful rhetoric against the Jewish community in his classroom, depicting Jews as evil and denying the Holocaust. In 1984 he was prosecuted under subsection 319(2) of the Criminal Code for publicly and wilfully promoting hatred. The Supreme Court of Canada concluded that even though the legislation infringed on freedom of expression, it was a reasonable and justifiable limitation, in a free and democratic society, to protect target groups from hate propaganda.

9 a.m.

Conservative

Michael Cooper Conservative St. Albert—Edmonton, AB

Ms. Zine, you have one minute.

9 a.m.

Professor, Sociology and Muslim Studies Option, Wilfrid Laurier University, As an Individual

Dr. Jasmin Zine

Okay.

I'm going to jump ahead, but I want to emphasize the fact that the Supreme Court said that hate propaganda denotes any expression that is intended or likely to “circulate extreme feelings of opprobrium and enmity against a racial...group”. I think that's an important area to be included, if there is going to be, at this stage, a reconstitution of section 13.

I also want to point out—and it's in my brief, even though I cannot read it now—that there needs to be a clear definition of what constitutes hate. I've included in the brief, from Justice Rothstein in 2013, a definition that I think could be built upon and elaborated. I also point out that there needs to be consultations around this definition with academics, community organizations, NGOs, social media companies, Internet service providers and experts in new media and technology who can provide information relating to encryption software and artificial intelligence.

In the couple of minutes I have left, I want to refer—

9 a.m.

Conservative

Michael Cooper Conservative St. Albert—Edmonton, AB

You have four seconds, so perhaps we'll end it there. There will be an opportunity to pick it up when there are questions.

You've cited Justice Rothstein. Could you just cite the case, if you have it?

9 a.m.

Professor, Sociology and Muslim Studies Option, Wilfrid Laurier University, As an Individual

Dr. Jasmin Zine

In my notes, it's “speaking for the unanimous court, Justice Rothstein, with respect to the Saskatchewan Human Rights Code”.

9 a.m.

Conservative

Michael Cooper Conservative St. Albert—Edmonton, AB

Is it Whatcott? What paragraph?

Do you have that? If not, we can find it.

9 a.m.

Professor, Sociology and Muslim Studies Option, Wilfrid Laurier University, As an Individual

Dr. Jasmin Zine

It's Whatcott, yes. It's from 2013. It's paragraph 41.

9 a.m.

Conservative

Michael Cooper Conservative St. Albert—Edmonton, AB

Very good. Thanks for that.

9 a.m.

Professor, Sociology and Muslim Studies Option, Wilfrid Laurier University, As an Individual

Dr. Jasmin Zine

I'm happy to talk more about the recommendations during Q and A.

May 9th, 2019 / 9 a.m.

Conservative

Michael Cooper Conservative St. Albert—Edmonton, AB

Very good.

We'll now turn to Bernie Farber from the Canadian Anti-Hate Network.

You have eight minutes.

9 a.m.

Bernie M. Farber Chair, Canadian Anti-Hate Network

Thank you, Mr. Chair.

Honourable members, thank you for inviting me today.

Good morning, my name is Bernie Farber. I am the past CEO of the Canadian Jewish Congress, where I worked for almost three decades. I am also the son of a Holocaust survivor, a survivor who was the sole Jewish person to survive in his village of over 1,500 Jews, so I have some visceral understanding of what hate is.

During my time at the congress, I spent much of it monitoring hate, extremism, white supremacy, racism, anti-Semitism and xenophobia. We undertook this work because, of all people, we understood that hatred run wild is a deadly virus without a cure.

Today I am retired, or as I prefer to say “rewired”, since I still act as a social justice consultant with various boards of education, as well as Human Rights Watch and Community Living. I also chair the Anti-Hate Network. The Anti-Hate Network itself is non-partisan. We monitor, expose and counter hate groups. We are journalists, researchers, court-recognized experts, lawyers and leaders in the community. We've held workshops in schools with law enforcement. Our investigations have shut down some of Canada's worst neo-Nazis and exposed so-called patriot groups that are actually anti-Muslim hate groups. We've become the go-to experts nationwide.

Our strategy to counter hate is really one of containment. We monitor and expose the worst of the worst hate propagandists so that they face social consequences. We put pressure on platforms to make the principled decision to remove hate groups both online and in communities across Canada, and we file criminal complaints. I just returned yesterday from a meeting with Facebook. Facebook called this meeting to deal with this exact issue of online hate. I give them credit for becoming, finally, a corporate leader. Let's see where it goes.

I want to emphasize that online harassment is harassment, and that online threats are threats. Our laws apply to the Internet and we need to enforce them. That means holding individuals accountable for what they post and holding social media companies accountable for giving them a platform. Our goal should be to drive the worst hate groups offline, to de-platform them. Often we hear the counter-argument that driving hate groups off the largest online platforms gives them attention and helps them grow. We hear that people will seek them out in the darker corners of the Internet, or that it makes them more dangerous. I want to be very clear. There's no evidence for these arguments. They're simply not true.

Last year, our investigations took down one of the largest alt-right, neo-Nazi forums used by Canadians, and we had the opportunity to watch what happened very closely. They had user names on these forums, and they got to know each other, trust each other and vouch for each other. They had a huge audience. They had a network. They had propaganda materials. Suddenly it was all gone. When they lose these online platforms, one or more of them may try to move people to a new one, but many of them never make the switch. They lose their megaphone. They lose their network.

Most importantly, it means that the high schooler who has been watching hate propaganda on YouTube and has started to believe that women shouldn't have rights and that some races are biologically inferior is going to have a much harder time finding one of these online echo chambers, where he or she would be exposed to even more insidious propaganda and people trying to recruit him or her to hatred.

When we deal with online hate by targeting the platforms, we're preventing countless untold incidents of radicalization. It's similar with hate groups in the anti-Muslim movement. They mostly use Facebook, and when they get barred from Facebook, they usually come back with a new page, but they lose all of their previous work and have a tenth of their previous followers. It defangs them.

The problem is that while Facebook is taking a lead in responding to online hate, it's really only dealing with the tip of an iceberg. For example, it has yet to remove some of the worst Canadian groups out there. Take the Yellow Vests Canada page, for example. We and other organizations have documented hundreds of incidents of overt racism and death threats. That page is still up and we're worried that the next Quebec City mosque shooter is reading that page and pumping himself up with anger.

This is why we need the government to enforce the Canadian Human Rights Act when it comes to social media companies. It's the law that no company in Canada can discriminate in providing a good or service in this country. If I were a baker, I couldn't refuse to bake a wedding cake for a gay couple. Social media companies are breaking this law because different people have very different experiences on social media. Persons of colour, women, LGBTQ+ persons, Jews, when these Canadians go online, they are much more likely to experience harassment, threats and propaganda that dehumanizes them or calls them vermin.

The act says that every company has an obligation to give people non-discriminatory service. The government could give the Human Rights Commission a clear mandate and the resources to enforce the law and beef up our legislation with stricter financial penalties to hold social media companies accountable for their role in spreading hatred.

Of course, it's not just the platforms at fault here. Very bad people are spreading hate propaganda and they are getting away with it. We can deal with most haters by exposing them to the natural social consequences, but we do have subsection 319(2) of the Criminal Code, which makes spreading hate propaganda illegal. However, realistically, these investigations take a long time and few charges are laid.

Most importantly, we need section 13 of the Canadian Human Rights Act back. Section 13 allowed an individual to make a complaint about online hate speech to the Canadian Human Rights Commission. If the commission's investigation said it was a reasonable complaint, it would go to the tribunal. The Human Rights Tribunal would hear the case, render a decision based on the evidence, and could order the person spreading hate to stop and perhaps pay a small fine. The Supreme Court ruled this law was constitutional, but the government of the day repealed it in 2013 anyway. This was an effective law. It shut down some of the worst online purveyors of hate in its day and neutered a generation of white supremacists and neo-Nazi leadership.

Additionally, the CHRC has a central role in enforcing the act and protecting Canadians from the social destruction of online hate. It should be resourced accordingly. Simultaneously with the re-establishment of section 13, we need to continue to encourage individuals and groups within society to file complaints. Over the years, this has proven to be the best mechanism to enforce regulation. The loss of section 13 has left us terribly vulnerable. I can't stress this enough.

We also need the Human Rights Commission and the tribunal to have the resources to hear cases in a reasonably speedy manner.

In conclusion, we need the best tools possible. We've been fighting a losing battle. Our intelligence services acknowledge that they dropped oversight of extremist hate groups many years ago and only in the last year have they tried to re-establish a presence. Police services no longer have dedicated hate crime units so their expertise has waned and hatred is getting worse. It has moved from evil words to evil actions, from minor property damage to outright murder.

We count on our leaders to lead. I ask you today to lead. Be brave. Be bold. Give our country the tools it needs to protect us from this growing menace before it's too late.

Thank you.

9:10 a.m.

Conservative

Michael Cooper Conservative St. Albert—Edmonton, AB

Thank you, Mr. Farber.

We'll now turn to Mr. Mustafa Farooq and Ms. Leila Nasr from the National Council of Canadian Muslims. You have eight minutes.

9:10 a.m.

Mustafa Farooq Executive Director, National Council of Canadian Muslims

Thank you, Mr. Chair and members of this committee for the opportunity to offer our thoughts on this committee's study of online hate.

My name is Mustafa Farooq. I am the executive director at the National Council of Canadian Muslims. I am joined today by Leila Nasr, communications coordinator for the council.

By way of background, NCCM was founded in 2000 as an independent, non-partisan, non-profit grassroots organization dedicated to defending the human rights and civil liberties of Muslim communities in Canada.

The NCCM has a long-standing record of participating in major public inquiries, intervening in landmark cases before the Supreme Court of Canada and providing advice to security agencies on engaging communities and promoting public safety. With the independently documented rise in racism and Islamophobia faced by our communities, we are concerned about online hate. Since the Quebec mosque massacre—and we are here with our brothers and sisters from the Quebec mosque—many Canadian Muslims have been on edge.

Justice Huot, in his decision on Alexandre Bissonnette, held that it was clear from the evidence that Bissonnette had consulted sources on the Internet before carrying out attacks on our Canadian brothers and sisters. Bissonnette was on YouTube. He was on Facebook, and he consulted #muslimban on Twitter.

There is no clearer evidence of the existential threat presented by the dangers of online hate to the Canadian Muslim community but also to Canadians in general. Our brief goes into more detail in providing more empirical data and summarizing some of these potential harms relating to the effects of online hate on other communities, including the rise of anti-Semitism, the growth of the incel community, and issues around democracy and misinformation.

My submissions before you today are squarely around three key recommendations.

First, we are asking that the government reopen the Canadian Human Rights Act, the CHRA, for legislative review.

Second, we are asking that the government begin a specific parliamentary study on creating a new regulatory system that would include some form of penalizing social media companies for not taking down material that breaches the Criminal Code and human rights legislation. Such a study would focus on creating the framework for a regulator that is effective, does not limit freedom of expression and does not overly burden industry.

Third, we are asking that the government consider combatting online hate through digital literacy grants so that industry and civil society actors can conduct research and develop tools and programming to combat online hate.

First, let us discuss reopening the act. Many of our colleagues and friends have already made submissions before you on the question of the since-repealed section 13 of the CHRA. Indeed, in the invitation to the public from this committee, the repeal of section 13 was specifically identified as a gap in the legislation in countering online hate.

We take no position on the controversy that led to section 13 eventually being repealed. However, it is clear that many academics, activists and policy-makers believe that section 13 or a version of section 13 should be revisited by way of legislative amendment to the CHRA.

This is not our position. The case law around section 13 demonstrates that section 13's utilization was not in line with what we might deem to be best practice. Indeed, despite the controversy around section 13, complaints arising from that section constituted only 2% of the total number of complaints brought to the Canadian Human Rights Commission.

Rather, we recommend that the government initiate a comprehensive legislative review of the CHRA. The 181-page report of the Canadian Human Rights Act review panel in 2000, for instance, put forward a robust and well-considered analysis of the act, which at the time had not been comprehensively reviewed since 1977.

We believe that the CHRA is due for such a comprehensive review process, especially with the modern forms of hate, violence and discrimination that have arisen in the last two decades since the original review. Such a comprehensive review process would enable a panel to review not only the overall impact of a revised section 13 but also the impact of such a provision in light of the entire act. Revisiting the act would allow parliamentary study on other issues related to the commission, including addressing the well-known backlog of cases. I can address more questions about that, if questions arise.

Moving on to our next recommendation, it is clear that the current state of affairs, where online hate spreads rapidly through social media networks, is not healthy for democracy or safety in Canada. A number of jurisdictions, like Germany and Australia, have already acted to address these concerns.

However, we would not recommend that the government adopt a single model from a particular system. Rather, we recommend a formal parliamentary study on the question of regulating social media companies specifically. Such a study would give the question the thorough exploration it deserves, drawing on internal Government of Canada experts, external experts in human rights legislation, academics and industry.

The parliamentary study would explore how to create a new regulatory system that would include some form of penalizing social media companies for not taking down material that breaches the Criminal Code and human rights legislation. This study would ensure that the new regulatory system is effective, does not improperly limit freedom of expression and does not overly burden industry.

Lastly, as alluded to above, we recommend that government adopt changes to provide further digital literacy training to Canadians so that Canadians, and especially young Canadians, can deal with hate and misinformation online.

The UN report of the special rapporteur on minority issues in 2015 held that education and building resilience were key elements to combatting online hate targeting minority communities.

Our recommendation to the government is to consider creating a special grant program to develop digital literacy programming. Such a grant program would be available to academics, entrepreneurs, anti-racism organizations and NGOs that have expertise in thinking about digital literacy, democracy and online hate. It would also allow the government to foster further innovation in Canada. It could fund everything from psychologists conducting innovative research to programs directly addressing anti-Semitic beliefs among a given population.

I also note, in closing, that we expand significantly on the submissions before you today in our full brief that has been submitted. Subject to your questions, that concludes my submissions.

9:15 a.m.

Conservative

Michael Cooper Conservative St. Albert—Edmonton, AB

Thank you very much, Mr. Farooq.

We will now move to one round of questions starting with Mr. MacKenzie.

Mr. MacKenzie, you have six minutes.

9:15 a.m.

Conservative

Dave MacKenzie Conservative Oxford, ON

Thank you, Chair.

We've been studying this for a few days now. Obviously, we're not at the bottom of it and I don't know whether we're at the middle of it yet. I think one of the things that comes through loud and clear to all of us is that we do have a concern—all of us, again—about online hate, and that how we handle it is a very important part of the question.

Everybody would like us to find a way to police that online hate that goes across borders. It doesn't just occur in Canada. People can access the information from around the world. If we're going to do something with it, I would like to know if you have a suggestion on how we deal with that issue. It is in fact international. How do we get to the information in the first place so that we can ask those companies or tell those companies what they have to do?

I will go across the line here. Can you just give us some suggestions on what you believe?

Ms. Zine.

9:20 a.m.

Professor, Sociology and Muslim Studies Option, Wilfrid Laurier University, As an Individual

Dr. Jasmin Zine

Thank you.

I think that, obviously, these are transnational and global kinds of networks. I do think, though, that many other nations have taken it upon themselves to institute policies to curtail hate speech within their domain. I was trying to share, and have done in a full brief, what the Council of Europe has come up with as recommendations and considerations from their vantage point of how they can successfully curtail hate speech. Some of those are going to be included in the larger brief.

One of the things I want to point out from those recommendations is that they recognize that criminal prohibitions are not, in themselves, sufficient to eradicate the use of hate speech and are not always appropriate, but nevertheless, they are convinced that such use should be, in certain circumstances, criminalized.

It is important to have a national policy in Canada, one that also creates codes of conduct. This was another area that the European policy covered for the self-regulation of public and private institutions, including elected bodies, political parties, educational institutions and cultural and sports organizations, as a means of combatting the use of hate speech. They encourage the adoption of appropriate codes of conduct that would provide for suspension and other sanctions for a breach of their provisions, as well as effective reporting channels.

I think those are helpful and instructive for the Canadian context. I think also, in tandem with that, is the recommendation that they have to withdraw financial and other forms of support by public bodies from political parties or other organizations that use hate speech or fail to sanction its use by their members, while respecting the right to freedom of association and the possibility of prohibiting or dissolving such organizations, regardless of whether they receive any support from public bodies where their use of hate speech is intended—

9:20 a.m.

Conservative

Dave MacKenzie Conservative Oxford, ON

Thank you.

I only have a limited amount of time.

9:20 a.m.

Chair, Canadian Anti-Hate Network

Bernie M. Farber

I'll be very quick.

In my meetings yesterday with Facebook, it was very clear that they do have the technology to limit country by country. Places like France, for example, have some significant limitations that Facebook has to deal with. Germany is exactly the same. Holocaust denial is absolutely illegal in Germany. They have completely blocked it and Facebook has acquiesced, as have others. It can be done.

Last, I want to take issue with my colleague Mustafa here. I am a big proponent of bringing back some kind of facsimile, if you will, of section 13. It's true that it dealt with a small number of cases, but if you take a look at the cases it dealt with, they were the most egregious. If we can knock off people who are listening or hearing, even if it's 2%, I think it's well worthwhile.

9:20 a.m.

Conservative

Dave MacKenzie Conservative Oxford, ON

I have a view on section 13 too, and I may be closer to someone other than you on that one.

9:20 a.m.

Chair, Canadian Anti-Hate Network

Bernie M. Farber

That's quite all right.

9:20 a.m.

Conservative

Dave MacKenzie Conservative Oxford, ON

I think our problem, honestly, when we dealt with it here, was the abuses of section 13.

9:20 a.m.

Chair, Canadian Anti-Hate Network

Bernie M. Farber

That's why I say “a facsimile” thereof. You can tweak it and make changes, but we need it. It's a dangerous world out there.